Google Cloud Storage bucket stops growing after 400 objects

I have a Firestore database that contains around 3 million documents. I want to back up every document to a Google Cloud Storage bucket. I have written a script to accomplish this. The script writes the documents in batches, concurrently. I've noticed that the bucket stops growing after around 400 objects. I still get success callbacks from the script indicating that I've written far more than 400 documents, but when I inspect the bucket, or use a client library to count the objects, I always get around 400. The documentation says there are no restrictions on writes. Why could this be happening?

I've also played around with the batch size. When the batches are around 50 documents, the writes seem to execute successfully; when there are around 100 documents in a batch, they don't seem to execute properly. Note that my script never throws any errors. All the writes appear to succeed, but when I retrieve the number of objects, it's always around 400 regardless of how many documents the script thinks it has written.


How big are the documents? Are you able to see more than 400 objects in the GCS console? Is it possible that you are overwriting existing objects because the object names are not unique?
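If the object names come from something non-unique (a per-batch index, a truncated timestamp, etc.), GCS will silently replace the existing object with the same name, and each upload still reports success, which would match exactly what you're seeing. You can sanity-check that hypothesis locally before touching the bucket, by simulating the bucket as a dict keyed by object name (the naming schemes below are made up for illustration):

```python
def simulate_uploads(names):
    """Simulate uploading one object per name; return the object count.

    A dict mirrors GCS semantics here: writing to an existing key
    (object name) overwrites it instead of adding a new entry.
    """
    bucket = {}
    for name in names:
        bucket[name] = "payload"  # same name => overwrite, not append
    return len(bucket)

# Names derived from a per-batch index repeat across batches:
colliding = [f"doc-{i % 100}.json" for i in range(3000)]
# Names derived from a globally unique document ID do not:
unique = [f"doc-{i}.json" for i in range(3000)]

print(simulate_uploads(colliding))  # 100  -> bucket growth plateaus
print(simulate_uploads(unique))     # 3000 -> one object per document
```

In the real script, deriving the object name from each document's full Firestore path (which is unique) would rule this out. You can also set an `if_generation_match=0` precondition on uploads so that an attempt to overwrite an existing object fails loudly instead of silently succeeding.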