I'm getting the error below when running Dataflow from the command line. What is wrong with this command, and how do I run Dataflow successfully using `gcloud dataflow jobs run`? I can't find these options covered in detail in the GCP documentation.
Below is the template that I'm passing in `--gcs-location`:
https://github.com/GoogleCloudPlatform/DataflowTemplates/blob/main/v1/src/main/java/com/google/cloud/teleport/templates/WordCount.java
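From the repository README, my understanding is that this `WordCount.java` is classic-template source code that first has to be built and staged to GCS as a template spec, and that the `.java` file itself may not be what `--gcs-location` expects. As far as I can tell, the staging step looks roughly like this (the bucket paths here are my own placeholders, not something I have verified):

```shell
# Stage the classic template to GCS by running the pipeline once with
# --templateLocation; this writes a JSON template spec (not the Java source)
# that "gcloud dataflow jobs run --gcs-location" can then point at.
mvn compile exec:java \
  -Dexec.mainClass=com.google.cloud.teleport.templates.WordCount \
  -Dexec.args="--project=raj-box \
    --stagingLocation=gs://raj/staging \
    --templateLocation=gs://raj/templates/WordCount \
    --runner=DataflowRunner"
```

Is this staging step what I'm missing, or is something else wrong with the command itself?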
Below is the command and the error:

```shell
rajkumar@cloudshell:~ (raj-box)$ gcloud dataflow jobs run WordCountDemocli \
  --project raj-box \
  --gcs-location "gs://raj/Default Templates/WordCount.java" \
  --region asia-south1 \
  --num-workers 2 \
  --staging-location gs://raj/temp \
  --subnetwork https://www.googleapis.com/compute/v1/projects/raj-box/regions/asia-south1/subnetworks/default \
  --disable-public-ips \
  --parameters inputLocations="{"location1":"gs://dataflow-samples/shakespeare/kinglear.txt"}",outputLocations="{"location1":"gs://raj/wordcountoutput/wordcount"}"
```
```
ERROR: (gcloud.dataflow.jobs.run) FAILED_PRECONDITION: Unable to parse template file 'gs://raj/Default Templates/WordCount.java'.
- '@type': type.googleapis.com/google.rpc.PreconditionFailure
  violations:
  - description: "Unexpected end of stream : expected '{'"
    subject: 0:0
    type: JSON
```