AWS CLI is a widely used command-line tool for controlling the S3 service. It is written in Python.
## AWS CLI installation
To install AWS CLI, we recommend following the [official AWS documentation](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html). It covers the installation of AWS CLI on both Linux and Windows.
???+ note ""
If you need to install AWS CLI in a virtual environment, you can use [this guide](https://docs.aws.amazon.com/cli/latest/userguide/install-virtualenv.html).
## Configuration of AWS CLI
???+ note "User profile"
To configure AWS CLI, we recommend using the `--profile` option, which allows you to define multiple user profiles with different user credentials. You can also run all the commands without the `--profile` option; they stay the same, you just omit `--profile`, and AWS will then use the **default** profile.
!!! warning
In the configuration wizard, you have to hit the space bar at the **Default region name** prompt. If you do not put the space into **Default region name**, the config file will not contain the **region** parameter, and **aws s3** commands will then fail with an **InvalidLocationConstraint** error.
In the following, we demonstrate the AWS CLI configuration. The example commands below use the `--profile` option.
_AWS Access Key ID_ - the access key obtained from the data storage administrator
_Secret Access Key_ - the secret key obtained from the data storage administrator
_Default region name_ - here, just press the space bar! Some software tools can have special requirements, e.g. Veeam; in that case, insert the storage region value required by the tool
_Default output format_ - choose the output format (json, text, table)
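A minimal example of the configuration wizard, assuming the profile name `test-user` and placeholder keys (replace them with the credentials you obtained):

```
$ aws configure --profile test-user
AWS Access Key ID [None]: XXXXXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Default region name [None]:    <- hit the space bar here
Default output format [None]: text
```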
???+ note "Endpoint URL"
For smooth operation, it is necessary to use the `--endpoint-url` option with the particular S3 endpoint address provided by CESNET.
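For example, listing your buckets against one of the CESNET endpoints shown later in this guide (the profile name `test-user` is taken from the configuration example above):

```
aws s3 ls --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz
```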
!!! warning
**A single S3 upload (one PUT operation) is limited to 5 GB per file; larger files have to be transferred via multipart upload.** It is a best practice to use **aws s3** commands (such as `aws s3 cp`) for multipart uploads and downloads, because these commands perform multipart uploading and downloading automatically based on the file size. By comparison, **aws s3api** commands, such as `aws s3api create-multipart-upload`, should be used only when **aws s3** commands don't support a specific upload need, for example when the multipart upload involves multiple servers, when the upload is manually stopped and resumed later, or when the **aws s3** command doesn't support a required request parameter. More information can be found on the [AWS website](https://aws.amazon.com/premiumsupport/knowledge-center/s3-multipart-upload-cli/).
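A sketch of both approaches, using a placeholder bucket and file. `aws s3 cp` switches to multipart automatically for large files, while `aws s3api create-multipart-upload` only initiates the upload and returns an `UploadId` for the subsequent manual part uploads:

```
# automatic multipart upload of a large file (recommended)
aws s3 cp ./backup.tar s3://my-bucket/ --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz

# manual multipart upload: initiate it and note the returned UploadId
aws s3api create-multipart-upload --bucket my-bucket --key backup.tar --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz
```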
## Controls of AWS CLI - high-level (s3)
To show the help (available commands), run the command below. The **aws s3** tool also offers several advanced functions, see below.
```
aws s3 help
```
### Operations with buckets
???+ note ""
The bucket name has to be unique within the tenant. It may contain only lowercase letters, numbers, dashes, and dots. The bucket name has to begin with a letter or a number, and it cannot contain a dot adjacent to a dash or multiple consecutive dots. We also recommend not using a slash in the bucket name; a slash makes the bucket unusable via the API.
The following command always copies the content of the source folder, regardless of whether the source path ends with a slash. In this respect, **aws** behaves differently from rsync. If you want the source directory itself to appear in the destination, append its name to the destination path; **the AWS tool will then create the directory in the destination while copying the data**, see the example commands below. The same applies to directory downloads and to synchronization via **aws s3 sync**.
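Example commands, assuming a placeholder bucket `my-bucket` and the profile and endpoint from the configuration above:

```
# create a bucket
aws s3 mb s3://my-bucket --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz

# upload the CONTENT of the source directory (the directory itself is not created)
aws s3 cp ./source-dir s3://my-bucket/ --recursive --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz

# append the directory name to the destination to have it created there
aws s3 cp ./source-dir s3://my-bucket/source-dir --recursive --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz

# the same applies to synchronization
aws s3 sync ./source-dir s3://my-bucket/source-dir --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz
```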
The **aws** tool also provides the **aws s3api** module. This module offers advanced functions to control the S3 service, see below. The configuration of credentials and connection is the same as for **aws** at the beginning of this guide.
The set of available commands can be obtained by running the command below with the **help** option. Alternatively, the complete list is available on the **[AWS website](https://docs.aws.amazon.com/cli/latest/reference/s3api/index.html)**.
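```
aws s3api help
```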
## Exemplary configuration file for aws tool
After successful configuration, a configuration file is created; you can find an example below. The credentials file is located in the same directory.
???+ note "Config file"
Windows: C:/Users/User/.aws/config<br/>
Linux: /home/user/.aws/config

```
[profile test-user]
region =
output = text
```
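The matching credentials file (named `credentials`, in the same `.aws` directory) would look like this sketch, with placeholder keys; note that, unlike the config file, the profile header carries no `profile` prefix:

```
[test-user]
aws_access_key_id = XXXXXXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
```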
## Special functions of AWS-CLI
### Presign URLs
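A presigned URL lets you share an object with someone who has no credentials; the link expires after the given number of seconds. A minimal sketch, assuming a placeholder bucket and object:

```
# generate a download link valid for one hour (3600 seconds)
aws s3 presign s3://my-bucket/my-object --expires-in 3600 --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz
```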
### Bucket policies
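Bucket policies are managed via **aws s3api**. A sketch with a placeholder bucket and a policy document prepared in a local `policy.json` file:

```
# apply a policy from a local JSON document
aws s3api put-bucket-policy --bucket my-bucket --policy file://policy.json --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz

# show the policy currently applied to the bucket
aws s3api get-bucket-policy --bucket my-bucket --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz
```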
### Bucket versioning
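Versioning keeps older versions of overwritten or deleted objects. A sketch with a placeholder bucket:

```
# enable versioning on the bucket
aws s3api put-bucket-versioning --bucket my-bucket --versioning-configuration Status=Enabled --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz

# check the current versioning state
aws s3api get-bucket-versioning --bucket my-bucket --profile test-user --endpoint-url https://s3.cl3.du.cesnet.cz
```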
Detailed documentation for Object Storage services can be found at [du.cesnet.cz](https://du.cesnet.cz/en/navody/object_storage/start).
To configure the rclone tool using this guide, **you first have to download, unzip, and install rclone**; the instructions can be found in the [first section](#downloading-and-installation-of-rclone-tool).
???+ note "Command line in Windows and Linux"
**Windows users** need to run **Command Prompt** and then run the command below.
**Linux users** can open the **terminal window** and then run the command below.
You can use the exe installer from [the official website of CloudBerry](https://cloudberry-explorer-for-amazon-s3.en.softonic.com/). When you start the program, it will always inform you about the registration options. Registration is free: you receive a key via e-mail, and the pop-ups then no longer appear.
!!! warning
CloudBerry in the FREE version does not support Multipart Upload and Multithreading, which means that it cannot work with files larger than 5 GB. Encryption and compression are available only in the PRO version.
## Cloudberry Configuration
Storage configuration can be done via the **File** menu (1), where you select **Add New Account** (2). Do not select the Amazon S3 Accounts option, as it does not allow entering a custom service point (endpoint).
In the following section, you can find the recommended S3 clients. For all S3 clients you will need one of the following S3 endpoint addresses:

- cl3 - https://s3.cl3.du.cesnet.cz
- cl4 - https://s3.cl4.du.cesnet.cz
## S3 Browser (GUI Windows)
[S3 Browser](https://s3browser.com/) is a freeware tool for Windows to manage your S3 storage and to upload and download data. You can manage up to two user accounts (S3 accounts) for free. [The Guide for S3 Browser](s3browser.md).
## CloudBerry Explorer for Amazon S3 (GUI Windows)
[CloudBerry Explorer](https://cloudberry-explorer-for-amazon-s3.en.softonic.com/) is an intuitive file browser for your S3 storage. It has two panes: one shows the local disk and the other the remote S3 storage. You can drag and drop your files between these two panes. [The guide for CloudBerry explorer](cloudberry.md).
## AWS-CLI (command line, Linux, Windows)
[AWS CLI](https://aws.amazon.com/cli/) - Amazon Web Services Command Line Interface - is a standardized tool supporting the S3 interface. Using this tool, you can handle your data and set up your S3 data storage. You can control it from the command line or incorporate AWS CLI into your automated scripts. [The guide for AWS-CLI](aws-cli.md).
## Rclone (command line + GUI, Linux, Windows)
The tool [Rclone](https://rclone.org/downloads/) is suitable for data synchronization and data migration between multiple endpoints (even between different data storage providers). Rclone preserves time stamps and verifies checksums. It is written in the Go language and is available for multiple platforms (GNU/Linux, Windows, macOS, BSD, and Solaris). In the following guide, we demonstrate its usage on Linux and Windows systems. [The guide for rclone](rclone.md).
## s3cmd (command line, Linux)
[S3cmd](https://s3tools.org/download) is a free command line tool to upload and download your data. You can also control the setup of your S3 storage via this tool. S3cmd is written in Python. It is an open-source project available under the GNU General Public License v2 (GPLv2) for both personal and commercial use. [The guide for s3cmd](s3cmd.md).
## s5cmd for very fast transfers (command line, Linux)
If you have a 1-2 Gbps connection and wish to optimize transfer throughput, you can use the s5cmd tool. S5cmd is available as precompiled binaries for Windows, Linux, and macOS, as well as in the form of source code or Docker images. The suitable option always depends on the system where you wish to use s5cmd. A complete overview can be found at the [Github project](https://github.com/peak/s5cmd). [The guide for s5cmd](s5cmd.md).
## WinSCP (GUI Windows)
[WinSCP](https://winscp.net/eng/index.php) is a popular SFTP and FTP client for Microsoft Windows. It transfers files between your local computer and remote servers using the FTP, FTPS, SCP, SFTP, WebDAV, or S3 file transfer protocols. [The guide for WinSCP](winscp.md)
## CyberDuck (GUI Windows)
[CyberDuck](https://cyberduck.io/s3/) is a multifunctional tool for various types of data storage (FTP, SFTP, WebDAV, OpenStack, OneDrive, Google Drive, Dropbox, etc.). Cyberduck provides only elementary functionality; most of the advanced functions are paid. [The guide for CyberDuck](cyberduck.md)