temp_location is set at template execution time, but the job still uses the compile-time value

I know you can set temp_location at compile time, e.g.:


python template_name \
  --runner DataflowRunner \
  --project project \
  --setup_file ./setup.py \
  --temp_location gs://path/to/compile/temp

When I then pass a different temp_location at job creation time (e.g. gs://path/to/execution/temp), the job still insists on writing to gs://path/to/compile/temp. In fact, I notice it actually writes to both buckets. Is there any way to make it write only to the bucket specified at job creation time?
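For reference, job creation from the staged template looks roughly like this (a sketch, assuming gcloud and a classic template; the job name, template path, and region are placeholders):

```shell
# Launch a job from the staged template, passing a different
# staging/temp bucket at job creation time.
# Job name, template path, and region are placeholders.
gcloud dataflow jobs run my-job \
  --gcs-location gs://path/to/templates/template_name \
  --region us-central1 \
  --staging-location gs://path/to/execution/temp
```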

Hi!

Welcome to Google Cloud Community!

You said "it still insists to write to compile/temp" - can you provide a screenshot of where this happens? That will help us replicate your use case.

Hello!

Just to give a bit more context: we compile the template and push it to one bucket, and run Dataflow jobs with a temp bucket in a separate bucket. The service running the Dataflow jobs fails to fetch the template unless it also has write access to the template's bucket (we want it to have read-only access).
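For context, the read-only access we intend the job's service account to have on the template bucket is along these lines (a sketch; the service account and bucket names are placeholders):

```shell
# Grant the job's service account read-only access to the template
# bucket. Service account and bucket names are placeholders.
gsutil iam ch \
  serviceAccount:dataflow-runner@project.iam.gserviceaccount.com:roles/storage.objectViewer \
  gs://template-bucket
```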

This is the error we get in Dataflow (it is attempting to upload into the bucket it shouldn't be writing to):

[snip]

As of now, this is expected behavior and there is no workaround at the moment. You may file a feature request for this at https://cloud.google.com/support.