Hi All,
This sample on GitHub shows how to copy files from SFTP to Google Cloud Storage (GCS) and vice versa. For example, if you have a partner that only uses SFTP but your process uses GCS, Application Integration can bridge the gap. Please follow the README file for instructions.
For convenience, I put both of these flows in one Integration. The GCS-to-SFTP flow has two types of triggers. The Pub/Sub Trigger uses the Pub/Sub notification feature that GCS supports: when you drop a new object into the configured bucket, Pub/Sub is notified and the Integration executes. Event-driven files! The API Trigger lets you test the flow without dropping more files into the GCS bucket: copy the payload from the logs of a Pub/Sub trigger execution and set it as the default value of the input variable ($PubSubPayload$), so you can re-test with the same GCS object over and over until everything is finalized.
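For reference, a GCS notification delivered through Pub/Sub carries the object's metadata as base64-encoded JSON in the message's `data` field, with summary attributes alongside it. The sketch below is a hedged illustration of that shape — the bucket and object names are placeholders, not values from the sample — showing how a subscriber decodes the payload:

```python
import base64
import json

# A minimal sketch of a Pub/Sub envelope for a GCS OBJECT_FINALIZE
# notification. The bucket/object values are placeholders.
envelope = {
    "message": {
        "attributes": {
            "bucketId": "my-bucket",            # placeholder bucket name
            "objectId": "incoming/report.csv",  # placeholder object name
            "eventType": "OBJECT_FINALIZE",
        },
        # "data" holds the object's metadata as base64-encoded JSON.
        "data": base64.b64encode(json.dumps({
            "bucket": "my-bucket",
            "name": "incoming/report.csv",
            "size": "2048",
        }).encode()).decode(),
    }
}

# Decode the payload the way any subscriber would.
metadata = json.loads(base64.b64decode(envelope["message"]["data"]))
print(metadata["bucket"], metadata["name"])
```

This is the kind of payload you would paste into the $PubSubPayload$ default value for repeat testing via the API Trigger.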
You can edit the .json files to replace your project-id, region, bucket-name, pub-sub-topic, sftp-connection-name, and gcs-connection-name, and then upload to your Application Integration environment in the GCP console. Or you can upload as-is and configure all of these settings in the Connector tasks, triggers, and the default values of the input variables. (Search and replace in a text editor may be easier to do in the .json file.) You can also use the IntegrationCLI tool to do the replacements.
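If you'd rather script the search and replace than do it by hand, a small helper like this works. This is a hedged sketch: the placeholder strings and replacement values below are assumptions for illustration, not the exact strings used in the sample's .json files.

```python
import json

# Hypothetical placeholder -> real-value mapping; substitute the
# actual strings found in the exported .json files.
replacements = {
    "my-project-id": "acme-prod",
    "my-region": "us-central1",
    "my-bucket-name": "acme-inbound",
    "my-pub-sub-topic": "gcs-object-events",
    "my-sftp-connection-name": "partner-sftp",
    "my-gcs-connection-name": "acme-gcs",
}

def replace_placeholders(path: str) -> None:
    """Rewrite every placeholder occurrence in an exported Integration file."""
    with open(path) as f:
        text = f.read()
    for old, new in replacements.items():
        text = text.replace(old, new)
    # Parse to confirm the result is still valid JSON before saving.
    json.loads(text)
    with open(path, "w") as f:
        f.write(text)
```

Running it once per exported file leaves you with .json files ready to upload to the console.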
Please see this community post for details on how to handle binary files.
Please see these docs for how to set up the GCS Notifications: https://cloud.google.com/storage/docs/reporting-changes#command-line
Enjoy. Please comment with any questions.
