AWS CLI with LABDRIVE

AWS CLI is a unified tool for managing AWS services. It is the most commonly used tool for transferring data in and out of AWS S3, it works with any S3-compatible cloud storage service, and it is the recommended way to upload or download content when using LABDRIVE.

In this recipe we will learn how to configure and use AWS CLI to manage data with LABDRIVE.

1. Prerequisites

  • A working LABDRIVE instance

  • Your LABDRIVE S3 access and secret keys.

2. Installation

Install AWS CLI from https://aws.amazon.com/cli/

3. Configuration

To configure AWS CLI, run aws configure and enter your LABDRIVE key information.

Replace the keys shown below with your own:

$ aws configure
AWS Access Key ID [None]: <your access key>
AWS Secret Access Key [None]: <your secret key>
Default region name [None]: (just press ENTER here for None)
Default output format [None]: (just press ENTER here for None)
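If you prefer not to store credentials interactively (for example in a batch job or CI pipeline), the AWS CLI also reads the standard AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables. A minimal sketch, using placeholder keys that you must substitute with your own:

```shell
# Placeholder credentials -- substitute your own LABDRIVE keys.
export AWS_ACCESS_KEY_ID="AKIAEXAMPLEKEY"
export AWS_SECRET_ACCESS_KEY="exampleSecretKey123"

# The AWS CLI picks these variables up automatically;
# no ~/.aws/credentials file is needed.
echo "Credentials exported for access key: $AWS_ACCESS_KEY_ID"
```

Environment variables take precedence over the credentials file, which makes this a convenient way to switch between LABDRIVE instances in scripts.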

4. Commands

Depending on the region and other settings, LABDRIVE keeps your data containers inside a particular S3 bucket. The data containers in your instance may all share the same S3 bucket, or they may be spread across several buckets.

The path to your files follows this convention:

s3://{S3 bucket name}/{container id}/{path to your file}
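As a concrete illustration, the URI can be assembled from its three parts. The bucket name labdrive-acme, container ID 5, and file name results.root below are hypothetical example values:

```shell
# Hypothetical example values -- replace with your own.
bucket="labdrive-acme"
container_id="5"
file_path="results.root"

# Assemble the S3 URI following the convention above.
s3_uri="s3://${bucket}/${container_id}/${file_path}"
echo "$s3_uri"   # s3://labdrive-acme/5/results.root
```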

To obtain the bucket name associated with the data container you want to use, see Getting your S3 bucket name.

List your containers

This will list the data containers you have in a given S3 bucket:

$ aws s3 ls s3://labdrive-acme/
                           PRE 1/
                           PRE 2/
                           PRE 3/
                           PRE 4/
                           PRE 5/
                           PRE 6/

Note that 1/, 2/, and so on are your container IDs. Each entry is preceded either by PRE (marking an S3 prefix, i.e. a folder) or by a date and time, depending on whether the entry is a prefix or an object.
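If you only want the container IDs from that listing, you can filter the PRE lines. The sketch below runs awk over a local copy of the sample output above, so it is self-contained:

```shell
# Sample `aws s3 ls` output, stored locally so the sketch is self-contained.
listing='                           PRE 1/
                           PRE 2/
                           PRE 3/'

# Keep only the PRE lines and strip the trailing slash to get the container IDs.
printf '%s\n' "$listing" | awk '$1 == "PRE" { sub("/$", "", $2); print $2 }'
```

In practice you would pipe the real command into awk instead: aws s3 ls s3://labdrive-acme/ | awk '...'.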

List contents inside a container

$ aws s3 ls s3://labdrive-acme/5/
2021-02-28 00:26:53      69297 results.root
2021-02-28 00:35:37      67250 mycode-2.5.0.tar.gz

Copy files to your container

$ aws s3 cp 1.root s3://labdrive-acme/5/1.root
upload: .\1.root to s3://labdrive-acme/5/1.root

Upload or download recursively

$ aws s3 cp myfolder s3://labdrive-acme/5/myfolder --recursive
upload: .\myfolder\1.root to s3://labdrive-acme/5/myfolder/1.root
upload: .\myfolder\2.root to s3://labdrive-acme/5/myfolder/2.root

Synchronize a local/remote folder

This will upload or download new or changed files only. It does not delete destination files that are missing from the source (add the --delete flag if you want that behavior).

$ aws s3 sync myfolder s3://labdrive-acme/5/myfolder
upload: .\myfolder\3.root to s3://labdrive-acme/5/myfolder/3.root

Delete files from your container

$ aws s3 rm s3://labdrive-acme/5/1.root
delete: s3://labdrive-acme/5/1.root
