Vertex AI Batch Prediction: How to output generated images as GCS URIs instead of inline Base64?

Hello,

I am running Batch Predictions on Vertex AI for image generation (image + prompt input).

The Setup:

  • Input: My input JSONL file works as expected; source images are referenced via gs:// URIs.

  • Output Destination: The results are correctly saved to my specified GCS bucket.
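For reference, each line of my input JSONL is shaped roughly like this (bucket, paths, and prompt are placeholders; the exact request schema depends on the model you call):

```json
{"request": {"contents": [{"role": "user", "parts": [
  {"fileData": {"mimeType": "image/png", "fileUri": "gs://my-bucket/inputs/cat.png"}},
  {"text": "Make this cat wear a wizard hat"}
]}]}}
```

(Pretty-printed here for readability; in the actual file each record is a single line.)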

The Problem: The resulting prediction-*.jsonl file contains the generated images as inline Base64 data rather than references. Since each image is ~2MB, a batch of 500 prompts results in a ~1GB JSONL file.

I plan to scale this to thousands of prompts per batch, at which point a single JSONL file becomes impractical to parse or hold in memory.

The Question: Is there a specific parameter or configuration flag for Vertex AI Batch Prediction that forces the generated output images to be saved as individual objects in GCS, resulting in a JSONL file that contains only the gs:// URIs?

I am trying to avoid post-processing a massive text file just to extract images.
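In case no such flag exists, here is the interim workaround I've sketched: stream the prediction file line by line and decode each inline image to its own file, so the full JSONL never has to sit in memory. Note the `bytesBase64Encoded` field name is my guess at the response schema (adjust it to whatever your output actually contains), and I write locally here rather than uploading back to GCS:

```python
import base64
import json
from pathlib import Path


def _find_fields(obj, field):
    """Recursively yield every string value stored under `field`
    anywhere in a nested dict/list structure."""
    if isinstance(obj, dict):
        for key, value in obj.items():
            if key == field and isinstance(value, str):
                yield value
            else:
                yield from _find_fields(value, field)
    elif isinstance(obj, list):
        for item in obj:
            yield from _find_fields(item, field)


def extract_images(jsonl_path, out_dir, field="bytesBase64Encoded"):
    """Stream a prediction-*.jsonl file and decode inline Base64 images
    to individual files. Returns the list of paths written.

    `field` is an assumption about the output schema, not a documented
    constant -- inspect one record of your own output to confirm it.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    with open(jsonl_path, "r", encoding="utf-8") as f:
        for line_no, line in enumerate(f):
            if not line.strip():
                continue
            record = json.loads(line)  # one record at a time, O(1) memory
            for img_no, b64 in enumerate(_find_fields(record, field)):
                path = out / f"image_{line_no:05d}_{img_no}.png"
                path.write_bytes(base64.b64decode(b64))
                written.append(str(path))
    return written
```

Because it only ever holds one line in memory, this should cope with the thousands-of-prompts case, but it is exactly the post-processing step I'd rather the platform did for me.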

Thanks!