Use GCP Data Transfer Service to Import Data from Azure Blob
In GCP there are various ways to get data from other clouds into GCP Storage; all of those options are listed in this article.
The focus of this article is how to copy data from Azure Storage to GCP Storage using the GCP Transfer Service.
GCP Transfer Service has three main areas, as listed below:
1️⃣ Transfer Service Cloud 👉 This allows transferring data from other clouds (and from GCP itself) into GCP Storage
2️⃣ Transfer Service On-Premises 👉 This allows transferring data from private data centres into Google Storage
3️⃣ Transfer Appliance 👉 This allows transferring a large amount of data to GCP by leasing a high-capacity server from GCP.
Step by Step: How to Use GCP Transfer Service Cloud to Copy Data from Azure Storage
Step 1️⃣: Select Transfer Service under the Storage section in the Google Cloud console and click Transfer Service Cloud
Step 2️⃣: Select Microsoft Azure Storage Container and fill in the storage account name, the container name, and the SAS access token.
You can find the above values in the Azure Portal, as shown in the screen below.
⑴ Storage Account Name
⑵ Click Containers, which will bring up all the containers in this storage account
⑶ Once you click Containers you can see the container names; select the container that holds the data you need to import to GCP Storage. Its name is what you enter in the container name field of the GCP transfer form.
⑷ To generate the SAS token, go to the Shared access signature section
👉 Allowed services must include Blob
👉 Allowed resource types must include Container and Object
👉 Set the token validity period using the start and expiry date/time
👉 Set the allowed protocols; HTTPS only is recommended
Once you click Generate SAS and connection string (number 3 in the screen above), the SAS token appears in the SAS token section; this is the value you need to paste into the access token field of the GCP transfer form.
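If you prefer to script this part of the Azure setup rather than use the portal, the azure-storage-blob Python SDK can generate an equivalent account-level SAS. The sketch below assumes placeholder account details; the read and list permissions shown are the minimum the transfer needs.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    ResourceTypes,
    generate_account_sas,
)

# Placeholder values -- replace with your own storage account details.
ACCOUNT_NAME = "mystorageaccount"
ACCOUNT_KEY = "<storage-account-key>"

# Account SAS scoped to the Blob service, Container + Object resource types,
# read/list permissions, valid for one day from now.
sas_token = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(container=True, object=True),
    permission=AccountSasPermissions(read=True, list=True),
    start=datetime.now(timezone.utc),
    expiry=datetime.now(timezone.utc) + timedelta(days=1),
)

print(sas_token)  # paste this into the GCP transfer access token field
```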
Step 3️⃣: Select the Google Storage bucket where you want the data from Azure Storage to be copied
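As a quick sanity check before creating the job, you can confirm that the destination bucket exists and that your credentials can reach it using the google-cloud-storage client. The bucket name below is a placeholder.

```python
from google.cloud import storage

# Placeholder destination bucket name -- replace with your own.
SINK_BUCKET = "my-gcp-destination-bucket"

client = storage.Client()
bucket = client.bucket(SINK_BUCKET)

# exists() performs a GET on the bucket; it returns False if the bucket
# is missing and raises if your credentials cannot access it.
if bucket.exists():
    print(f"Destination bucket gs://{SINK_BUCKET} is ready.")
else:
    print(f"Bucket gs://{SINK_BUCKET} not found; create it before running the transfer.")
```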
Step 4️⃣: Select the schedule for how often data needs to be copied from Azure Storage to GCP Storage, then click Create to create the job.
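The same job can also be created programmatically through the Storage Transfer Service API. Below is a minimal sketch using the Google API discovery client; the project ID, Azure storage account, container, SAS token, and destination bucket are placeholders, and the one-day schedule mimics the once-off run used in this example.

```python
from datetime import date

import googleapiclient.discovery

# Placeholder values -- replace with your own project and storage details.
PROJECT_ID = "my-gcp-project"
AZURE_STORAGE_ACCOUNT = "mystorageaccount"
AZURE_CONTAINER = "my-container"
AZURE_SAS_TOKEN = "<sas-token-from-step-2>"
SINK_BUCKET = "my-gcp-destination-bucket"

client = googleapiclient.discovery.build("storagetransfer", "v1")

today = date.today()
transfer_job = {
    "description": "Copy Azure Blob container to GCP Storage",
    "status": "ENABLED",
    "projectId": PROJECT_ID,
    # Start and end date on the same day => the job runs exactly once.
    "schedule": {
        "scheduleStartDate": {"year": today.year, "month": today.month, "day": today.day},
        "scheduleEndDate": {"year": today.year, "month": today.month, "day": today.day},
    },
    "transferSpec": {
        "azureBlobStorageDataSource": {
            "storageAccount": AZURE_STORAGE_ACCOUNT,
            "container": AZURE_CONTAINER,
            "azureCredentials": {"sasToken": AZURE_SAS_TOKEN},
        },
        "gcsDataSink": {"bucketName": SINK_BUCKET},
    },
}

result = client.transferJobs().create(body=transfer_job).execute()
print(f"Created transfer job: {result['name']}")
```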
Step 5️⃣: Monitor the scheduled job
In the example above, the job was set up to run once, immediately after it was created. Below are the statistics from that run with 6 files of various sizes, one of them larger than 1.5 GB.
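If you want to monitor a job outside the console, the same API exposes the underlying transfer operations, including byte and object counters. A minimal sketch, again with placeholder project and job names (the job name is returned when the job is created):

```python
import json

import googleapiclient.discovery

# Placeholders -- use your own project ID and the job name returned at creation.
PROJECT_ID = "my-gcp-project"
JOB_NAME = "transferJobs/1234567890"  # hypothetical job name

client = googleapiclient.discovery.build("storagetransfer", "v1")

# transferOperations.list expects a JSON-encoded filter string.
operation_filter = json.dumps({"projectId": PROJECT_ID, "jobNames": [JOB_NAME]})
response = (
    client.transferOperations()
    .list(name="transferOperations", filter=operation_filter)
    .execute()
)

for operation in response.get("operations", []):
    metadata = operation.get("metadata", {})
    counters = metadata.get("counters", {})
    print(
        metadata.get("status"),
        counters.get("objectsCopiedToSink"),
        "objects,",
        counters.get("bytesCopiedToSink"),
        "bytes copied",
    )
```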