I have various services built on GCP, including GCS and BigQuery. We have another service, with data behind an API, on AWS. How do I interface between these two platforms if I want to send/push my data regularly to Google Cloud Storage?
Hi Fikri, Apigee provides a platform for developing and managing APIs. If you have services on GCP and AWS, you could use Apigee to create API proxies for those services and let client applications and other services talk to each other via those proxies, while getting security, rate limiting, quotas, analytics, and many other API management features.
This would also apply to Google Cloud Storage. You could expose the Google Cloud Storage API via an Apigee API proxy if needed.
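To make that concrete, here is a minimal sketch of what a client call through such a proxy could look like. The proxy base URL and API key are hypothetical placeholders; the path mirrors the real GCS JSON API media-upload endpoint, and the sketch assumes the proxy attaches the GCP credentials itself so the caller only needs the Apigee key:

```python
import requests

PROXY_BASE = "https://api.example.com/gcs"  # hypothetical Apigee proxy base URL
API_KEY = "REPLACE_ME"                      # hypothetical key enforced by the proxy


def upload_via_proxy(bucket: str, object_name: str, payload: bytes) -> None:
    # The proxy forwards this to the GCS JSON API (uploadType=media) and
    # injects the GCP authorization, so callers never hold GCP credentials.
    resp = requests.post(
        f"{PROXY_BASE}/upload/storage/v1/b/{bucket}/o",
        params={"uploadType": "media", "name": object_name},
        headers={"x-api-key": API_KEY, "Content-Type": "application/octet-stream"},
        data=payload,
        timeout=60,
    )
    resp.raise_for_status()


upload_via_proxy("my-bucket", "hello.txt", b"hello from AWS")
```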
In this design, you may first want to consider whether you really need API management features for these APIs. If not, you could use the existing APIs directly without creating API proxies for them.
Hope this helps. If you could share more specifics of your requirements, we might be able to provide a better answer. Thanks!
We have API management using Apigee on AWS, where we have applications and an interface for external users or consumers.
What we are going to do is simply transfer data regularly or continuously from the applications or services on AWS, via their API, to Google Cloud Storage. This could also go in the reverse direction, where we collect data from GCP using the Apigee API on AWS.
Any suggestion on the correct and most efficient way to do this?
I am aware of this interoperability, but due to restrictions imposed by the organization and different managed areas, we're not allowed to pull raw data directly from storage on AWS; we may only use processed data, which is available via the API.
Similarly, external consumers may only access and query processed data. Any suggestion on how this can be done effectively?
Create a job on Cloud Scheduler that periodically fetches processed data from the above API proxy and writes it to GCS: https://cloud.google.com/scheduler/
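Since Cloud Scheduler itself only fires HTTP requests or Pub/Sub messages, a common way to implement this is to have the scheduler hit an HTTP-triggered Cloud Function that does the fetch and write. Below is a minimal sketch of such a function; the proxy URL, bucket name, and API key are assumptions you would replace with your own values:

```python
import datetime

import requests
from google.cloud import storage

API_PROXY_URL = "https://api.example.com/v1/processed-data"  # hypothetical Apigee proxy on AWS
BUCKET_NAME = "my-landing-bucket"                            # hypothetical GCS bucket
API_KEY = "REPLACE_ME"                                       # hypothetical API key for the proxy


def sync_to_gcs(request):
    """HTTP entry point invoked by the Cloud Scheduler job."""
    # Pull the latest processed data from the API proxy on AWS.
    resp = requests.get(API_PROXY_URL, headers={"x-api-key": API_KEY}, timeout=60)
    resp.raise_for_status()

    # Write the payload to a timestamped object in GCS.
    object_name = f"processed/{datetime.datetime.utcnow():%Y%m%dT%H%M%S}.json"
    bucket = storage.Client().bucket(BUCKET_NAME)
    bucket.blob(object_name).upload_from_string(
        resp.text, content_type="application/json"
    )
    return f"wrote gs://{BUCKET_NAME}/{object_name}", 200
```

You would then point the Cloud Scheduler job's HTTP target at this function's URL with whatever cron schedule matches how often the processed data changes.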