Data Migration from Amazon S3 to Azure Blob

By Style Sync

It may be a requirement of your business to move your data and workload from one public cloud to another. In a world where there are many public cloud choices, this can be complex, cumbersome, and expensive; but, it is the business decision that has been made, and you have to comply.

Last year, Microsoft released version 10 of AzCopy to help customers and users, like you, to simply migrate data from Amazon S3 to Microsoft Azure Blob storage. The AzCopy feature I am going to discuss here is still in preview from Microsoft, but I have been using it for some time now, and am finding it very solid and trustworthy.

I had two customers last week asking about the best and the fastest way to migrate their data from Amazon S3 to Azure Blob. I recommended they use the AzCopy tool, and on this blog post, I am sharing with you the details of my preparation and configuration of AzCopy I used to start their data migration.

Downloading AzCopy

The good news is that AzCopy runs on 32-bit and 64-bit Windows, Linux, and macOS. Download AzCopy using this link.

After it has downloaded, extract the file to your hard drive and you are ready to get started.

Scenario

To keep this blog post focused on our task, let’s take the following scenario. You are using Amazon S3 as long-term data storage, but your business has decided to switch its cloud storage to Microsoft Azure. You must find the easiest and fastest way to migrate the data from Amazon S3 to Microsoft Azure Blob. The diagram below illustrates the challenge:

To accomplish this task in a few simple and fast steps, you can use Microsoft’s command-line tool, AzCopy. The command is included in the package you downloaded in the previous step.

Configuring AzCopy to Accomplish Your Task

Before starting with the task of migrating your data, let’s first take a look at the AzCopy command structure:

azcopy cp "<S3 URL>" "<Blob URL>?<SAS>" --recursive

To build the parameters of the AzCopy command, we need the Amazon S3 URL and the Microsoft Azure Blob storage SAS URL. The "--recursive" parameter instructs AzCopy to copy all containers, directories, and blobs.
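To make the quoting concrete, here is a small sketch that assembles the command from placeholder values and prints it for review before you run it (the bucket name, storage account, and token below are illustrative, not values from this migration):

```shell
# Placeholder values -- substitute your real bucket URL and Blob SAS URL.
S3_URL="https://s3.amazonaws.com/my-bucket"
BLOB_SAS_URL="https://myaccount.blob.core.windows.net/?sv=2019-02-02&ss=b&srt=sco&sp=rwdlacup&sig=REDACTED"

# Print the command first; quoting both URLs matters because the SAS token
# contains '&', which an unquoted shell argument would treat as a command
# separator.
echo "azcopy cp \"$S3_URL\" \"$BLOB_SAS_URL\" --recursive"
```

Once the printed command looks right, paste its output into your shell (or drop the echo) to start the copy.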

Next, we must acquire the Amazon S3 bucket URL, access key ID, and secret access key.

After you have acquired the bucket URL and keys, your next step is to add the keys to your environment as variables. You do that by running the following commands on Windows:

set AWS_ACCESS_KEY_ID=KIA357XVARPR

set AWS_SECRET_ACCESS_KEY=d89TpVOZeQLQewtm7r2LTX1JU8gKf6x

For Linux and macOS, use these commands:

export AWS_ACCESS_KEY_ID=KIA357XVARPR

export AWS_SECRET_ACCESS_KEY=d89TpVOZeQLQewtm7r2LTX1JU8gKf6x
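One pitfall worth noting: set and export only affect the current shell session, so the variables vanish if you open a new terminal. Here is a small guard you can run just before azcopy on Linux or macOS to fail early with a clear message (a sketch; the fallback key values are the example ones from above, not real credentials):

```shell
# Re-use the example keys from above as fallbacks so this sketch is
# self-contained; in practice you export your real credentials instead.
export AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID:-KIA357XVARPR}"
export AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY:-d89TpVOZeQLQewtm7r2LTX1JU8gKf6x}"

# Fail early if either credential is missing; azcopy itself would otherwise
# stop later with a less obvious "credential is not set" style error.
if [ -z "$AWS_ACCESS_KEY_ID" ] || [ -z "$AWS_SECRET_ACCESS_KEY" ]; then
    echo "AWS credentials are missing from this shell's environment" >&2
    exit 1
fi
echo "AWS credentials present"
```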

Next, you must generate and acquire the Azure Blob SAS string. You do that with these steps:

Browse to your Azure Storage Account and click on “Shared access signature”.

The string is generated after you configure the allowed services, resource types, permissions, and validity period.

Now that has been done, copy the Blob service SAS URL:
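If the portal gives you the endpoint and the SAS token as separate strings, the full SAS URL that AzCopy expects is simply the Blob endpoint with the token appended after a "?". A quick sketch of joining the two (the token here is a placeholder, not a working signature):

```shell
# Placeholder endpoint and token -- use the values from your own storage account.
BLOB_ENDPOINT="https://migratefroms3.blob.core.windows.net"
SAS_TOKEN="sv=2019-02-02&ss=bfqt&srt=sco&sp=rwdlacup&spr=https&sig=REDACTED"

# Join them: endpoint, '?', then the token.
BLOB_SAS_URL="${BLOB_ENDPOINT}/?${SAS_TOKEN}"
echo "$BLOB_SAS_URL"
```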

With that done, we now have all the information we need to run the AzCopy command to migrate the data.

Migrating Data from Amazon S3 to Azure Blob

After gathering the required URL, access key, secret key, and Azure Blob SAS, it is time to migrate our backup data. Below are the command lines I used to migrate my backup data from Amazon S3 to Azure Blob:

set AWS_ACCESS_KEY_ID=KIA357XVARPR

set AWS_SECRET_ACCESS_KEY=d89TpVOZeQLQewtm7r2LTX1JU8gKf6x

azcopy cp "https://s3.amazonaws.com/veeagrate" "https://migratefroms3.blob.core.windows.net/?sv=2019-02-02&ss=bfqt&srt=sco&sp=rwdlacup&se=2020-02-28T10:12Z&st=2020-02-28T04:10:12Z&spr=https&sig=q7RnBopZNktkTzdQJMIj9l%PjSdEbX%2BQz15sACY%3D" --recursive

This screenshot shows the report that I got from the command and parameters above:

Testing

After the migration completes (you can confirm this from the last line of the output, “Final Job Status: Completed”), it is time to test and validate that the data moved as expected. In my lab, I went through the following steps. First, I removed Amazon S3 from my Veeam Backup & Replication server:

The next thing I did was register the Azure Blob storage and add it to my Scale-Out Backup Repository.

On pressing the Finish button, I was greeted with the following notification, indicating that the new Azure Blob already contains an existing backup; Veeam also asked whether I wished to import it. I chose to import it at this stage:

I was presented with the following results after the import was completed:

To confirm that my import was successful, I browsed to Veeam Home, and under Backups I could see the newly imported data backups on my new Blob storage. You can see them in the screenshot below:

Conclusion

I was very impressed with the data throughput of Microsoft AzCopy. In the test I conducted for this blog, AzCopy sustained 1133 MB/s. An added bonus is that the data does not pass through the workstation from which you run the command: AzCopy copies directly between the Amazon S3 URL and the Microsoft Azure Blob URL. It is this direct data path that makes the migration process extremely fast.

Microsoft AzCopy can do much more, but in this blog post I have focused on the task I was asked to assist with: the migration of backup data from Amazon S3 to Azure Blob storage. To learn more about AzCopy, you can use this link. AzCopy is a simple, fast, and reliable command-line tool from Microsoft for migrating data between AWS and Azure.

What do you think?


4 Comments:
July 18, 2020

Hi, I set up my AWS credentials with the export command… but it’s throwing an error that the AWS credential is not set!

November 5, 2022

I use Gs Richcopy 360 instead of Azcopy and it makes it easy to migrate from S3 to Azure Blob, also it is very fast and integrates with AWS/Azure services very well

August 17, 2023

That’s great! I’m glad to hear that you found a tool that works well for you. Gs Richcopy 360 is a popular tool for migrating files between different cloud storage providers. It is easy to use and can preserve NTFS permissions and timestamps. It is also free for personal use, which is a great option if you are on a budget.

Azcopy is another popular tool for migrating files to Azure blob. It is a command-line tool, which means that it can be a bit more complex to use than a GUI tool like Gs Richcopy 360. However, Azcopy offers a wider range of features, such as the ability to transfer files over different protocols and the ability to resume transfers that have been interrupted.

Ultimately, the best tool for you will depend on your specific needs and requirements. If you are looking for a simple and easy-to-use tool that can preserve NTFS permissions and timestamps, then Gs Richcopy 360 is a good option. If you need a more powerful tool with a wider range of features, then Azcopy may be a better choice.

September 17, 2024

How can I use Gs Richcopy 360 free for this task ?
