Seeking advice on securely uploading local database backups to Google Cloud Storage (Colombia, healthcare IPS)

Hello everyone,

I am a systems intern at a healthcare provider (IPS) in Colombia. Our clinical software generates daily .bak database backups that remain on a local server provided by the software company.

I am looking for recommendations and best practices for safely storing these backups in Google Cloud Storage, including encryption, storage class selection, and logging/access tracking to meet Colombian health data regulations.

Any advice, examples, or documentation would be greatly appreciated.

Hello @anox1,

I would say the simplest approach is to write a script that mirrors your backup folder, or uploads only the new backups, on a schedule (e.g., a daily cron job). You could also add an email notification once it's done to keep track of the backup process.
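A minimal sketch of such an incremental script, assuming the backups live in `/srv/backups` and a placeholder bucket named `my-ips-backups` (both are assumptions, replace with your own), using the `gcloud storage` CLI for the actual copy:

```python
import base64
import hashlib
import pathlib
import subprocess

def new_backups(local_dir, uploaded_names):
    """Return local .bak files whose names are not yet in the bucket."""
    return sorted(p for p in pathlib.Path(local_dir).glob("*.bak")
                  if p.name not in uploaded_names)

def gcs_md5(path):
    """Base64-encoded MD5 of a file, the same format GCS reports as an
    object's md5Hash, so you can compare it after upload to verify integrity."""
    return base64.b64encode(hashlib.md5(path.read_bytes()).digest()).decode("ascii")

if __name__ == "__main__":
    BUCKET = "gs://my-ips-backups"  # placeholder; use your own bucket
    # Determining what is already uploaded is left out of this sketch;
    # you could run `gcloud storage ls` and parse its output.
    already_uploaded = set()
    for bak in new_backups("/srv/backups", already_uploaded):
        print(f"uploading {bak.name} (md5 {gcs_md5(bak)})")
        subprocess.run(["gcloud", "storage", "cp", str(bak), BUCKET], check=True)
```

Alternatively, a single `gcloud storage rsync` call on the folder does the mirroring for you; the script above just makes the integrity check and notification hooks explicit.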

GCS is encrypted at rest, which means everything is encrypted by default, but you can use a different encryption option (customer-managed or customer-supplied keys) as described in Data encryption options. As I'm not familiar with the Colombian health data regulations, I will let someone else advise you on that point.
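If you do go beyond the default encryption, a customer-supplied encryption key (CSEK) is just a random 256-bit AES key that you base64-encode and supply with your requests (the client libraries, gsutil, and the REST headers all accept it in that form). A minimal sketch using only the standard library:

```python
import base64
import secrets

# Generate a 32-byte (256-bit) AES key for use as a CSEK.
raw_key = secrets.token_bytes(32)

# GCS expects the key base64-encoded when passed in request headers
# or client/gsutil configuration. Store this securely: if you lose
# the key, the objects encrypted with it are unrecoverable.
encoded_key = base64.b64encode(raw_key).decode("ascii")
print(encoded_key)
```

For most setups, the default encryption or CMEK (keys managed in Cloud KMS) is less operationally risky than managing raw CSEKs yourself.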

Note that there is no GCP datacenter in Colombia, so you will have to pick another region. I think that's a very important point that you should address before starting anything. Here in France, it's common to ask a DPO (Data Protection Officer) or a legal expert to clarify such things.

Otherwise, you have two options in South America: Santiago (southamerica-west1) and São Paulo (southamerica-east1). You can also check the full list of available regions to find one that suits you. Depending on your budget and your plan, you can use a single region or go with a dual-region or multi-region setup focused on data availability and durability.

Concerning the GCS bucket, the Archive storage class is recommended for long-term storage with infrequent to no access: it is the lowest-cost, highly durable option for data archiving, online backup, and disaster recovery. You can also apply Object Lifecycle Management to automatically delete objects that are too old (e.g., after x months or years).
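As an illustration, a lifecycle configuration that deletes objects older than one year (applied with `gcloud storage buckets update --lifecycle-file=...` or via the console; adjust the retention to whatever your regulations require) would look something like:

```json
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
```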

Last but not least, the IAM setup should respect the Principle of Least Privilege (POLP), as explained on the page Use IAM securely. You should create a service account dedicated to this job, whose sole purpose and permissions are to write the .bak files into your bucket, and grant it the roles/storage.objectCreator role directly on the bucket.

If you plan to do more than that on GCP, I highly encourage you to use an Infrastructure as Code tool like Terraform or OpenTofu.
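If you do go the IaC route, the dedicated service account and the bucket-level binding I mentioned might look like this in Terraform (the resource names and the `google_storage_bucket.backups` reference are assumptions; the bucket itself would be defined elsewhere in your configuration):

```hcl
resource "google_service_account" "backup_writer" {
  account_id   = "bak-uploader"  # assumed name
  display_name = "Writes .bak backups to the backup bucket"
}

resource "google_storage_bucket_iam_member" "backup_writer_creator" {
  bucket = google_storage_bucket.backups.name
  role   = "roles/storage.objectCreator"
  member = "serviceAccount:${google_service_account.backup_writer.email}"
}
```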

For safely storing your healthcare backups in Google Cloud Storage: use encryption (default or CMEK for stricter control), choose a storage class based on access needs (Standard/Nearline for recent backups, Coldline/Archive for older), enable audit logging and IAM controls for access tracking, and ensure regional compliance for Colombian health data. Automate uploads, verify integrity, retain local/cloud copies, and document policies to meet regulatory requirements.