Hi,
I have around 1.2 PB of data residing in S3 with private access. I would like to know whether it is possible to migrate the objects over a private channel.
Thanks in advance
Regards,
Ganesh
As shown in the documentation, Storage Transfer Service is the recommended way to move or back up your data from other cloud storage providers to Cloud Storage. Supported cloud storage providers include Amazon S3 and Microsoft Azure Blob Storage.
Note that transfers from Amazon S3, Microsoft Azure, URL lists, or Cloud Storage to Cloud Storage do not require agents and agent pools.
Before configuring your transfers, make sure you have configured access:
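For the access-key option, the AWS credentials you supply must be able to list and read the source bucket. A minimal AWS IAM policy along the following lines covers the permissions Storage Transfer Service uses for reading a source bucket (the bucket name is a placeholder; add `s3:DeleteObject` only if you want the transfer to delete source objects after copying):

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::my-s3-bucket"
    },
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-s3-bucket/*"
    }
  ]
}
```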
Then you can create transfers:
Go to the Storage Transfer Service page in the Google Cloud console.
Click Create transfer job. The Create a transfer job page is displayed.
Choose a source:
Under Source type, select Amazon S3.
Click Next step.
In the Bucket name field, enter the source bucket name. The bucket name is the name as it appears in the AWS Management Console.
Select your Amazon Web Services (AWS) authentication method. You can provide an AWS access key or an Amazon Resource Name (ARN) for identity federation:
Access key: Enter your access key in the Access key ID field and the secret associated with your access key in the Secret access key field.
ARN: Enter your ARN in the AWS IAM role ARN field, with the following syntax:
```
arn:aws:iam::ACCOUNT:role/ROLE-NAME-WITH-PATH
```
Where:
ACCOUNT: the AWS account ID with no hyphens.
ROLE-NAME-WITH-PATH: the AWS role name, including its path. For more information on ARNs, see IAM ARNs.
Click Next step.
Choose a destination:
In the Bucket or folder field, enter the destination bucket and (optionally) folder name, or click Browse to select a bucket from a list of existing buckets in your current project. To create a new bucket, click Create new bucket.
Click Next step.
Choose settings for the transfer job. Some options are only available for certain source/sink combinations.
Click Next step.
Choose your scheduling options:
Note: Storage Transfer Service displays transfer job schedules in your local timezone, but it stores those times in Coordinated Universal Time (UTC). If you are affected by Daylight Saving Time (DST), your transfer job schedule may shift when DST starts or ends.
From the Run once drop-down list, select one of the following:
From the Starting now drop-down list, select one of the following:
To create your transfer job, click Create.
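For a transfer of this size, it can also be convenient to create the job programmatically rather than through the console. The sketch below shows the shape of the JSON body for the Storage Transfer Service `transferJobs.create` REST call; the project ID, bucket names, and credential values are placeholders, not values from this thread:

```python
# Sketch of a transferJobs.create request body for the Storage Transfer
# Service REST API (POST https://storagetransfer.googleapis.com/v1/transferJobs).
# Project ID, bucket names, and credentials are placeholders.
transfer_job = {
    "description": "S3 to GCS migration",
    "projectId": "my-gcp-project",                       # placeholder project
    "status": "ENABLED",
    "transferSpec": {
        "awsS3DataSource": {
            # Source bucket name, as it appears in the AWS Management Console.
            "bucketName": "my-s3-bucket",
            # Access-key authentication; an IAM role ARN ("roleArn") can be
            # used here instead for identity federation.
            "awsAccessKey": {
                "accessKeyId": "AWS_ACCESS_KEY_ID",
                "secretAccessKey": "AWS_SECRET_ACCESS_KEY",
            },
        },
        "gcsDataSink": {"bucketName": "my-gcs-bucket"},  # destination bucket
    },
    "schedule": {
        # Run once, starting now: start and end date are the same day.
        "scheduleStartDate": {"year": 2024, "month": 1, "day": 1},
        "scheduleEndDate": {"year": 2024, "month": 1, "day": 1},
    },
}
```

The same job definition can be submitted with an authenticated HTTP POST, via the `google-cloud-storage-transfer` client library, or built for you by `gcloud transfer jobs create`.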
Note: Storage Transfer Service is currently available to transfer data from the following Amazon S3 regions: ap-east-1, ap-northeast-1, ap-northeast-2, ap-south-1, ap-southeast-1, ap-southeast-2, ca-central-1, eu-central-1, eu-north-1, eu-west-1, eu-west-2, eu-west-3, me-south-1, sa-east-1, us-east-1, us-east-2, us-west-1, us-west-2.
### Service Level Agreement
Storage Transfer Service currently does not provide an SLA; in particular, there are no SLAs for transfer performance or latency, so some performance fluctuation may occur.
Whether it is a private or a public S3 bucket, it should still work with STS. Is that correct?
That is correct. In step 3.d, an Amazon authentication method is required, which grants access to the S3 buckets; as long as the provided credentials have access to those buckets, the transfer will work whether they are private or public.
Whether it is a private or public S3 bucket, third-party tools such as Gs Richcopy 360, AvePoint, and Cloudsfer can also migrate S3 objects directly and securely to GCS.