Hi Folks,
I have hit a problem and would like to know if you have any suggestions for getting around it, ideally without switching to an alternate ingestion method such as insertAll or loading a CSV file into the table via GCS (or directly).
I'm hitting an error on an INSERT SQL statement issued from a Java application via Bigquery.query(sqlStatement), where sqlStatement looks like:
INSERT INTO project_id.dataset_id.ts_image_t (SMRY_KEY,field1…) VALUES (….),(….),(….)
My INSERT statement contains many rows of values; the full string is 1,023,722 characters long.
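For context, the statement is assembled roughly like this (the table, columns, and values below are placeholders, not my real schema):

```java
public class InsertSizeDemo {
    // Builds the multi-row INSERT the way my application does (placeholder schema).
    static String buildInsert(int rows) {
        StringBuilder sql = new StringBuilder(
                "INSERT INTO project_id.dataset_id.ts_image_t (SMRY_KEY, field1) VALUES ");
        for (int i = 0; i < rows; i++) {
            if (i > 0) sql.append(", ");
            sql.append("('key_").append(i).append("', 'val_").append(i).append("')");
        }
        return sql.toString();
    }

    public static void main(String[] args) {
        String sqlStatement = buildInsert(40000);
        // This single string is what gets passed to bigquery.query(...),
        // and its total length is what trips the query-size limit.
        System.out.println("Statement length: " + sqlStatement.length());
    }
}
```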
The error returned by BigQuery is:
GoogleJsonResponseException: 400 Bad Request
POST https://www.googleapis.com/bigquery/v2/projects/project_id/queries
{
  "code": 400,
  "errors": [
    {
      "domain": "global",
      "message": "The query is too large. The maximum standard SQL query length is 1024.00K characters, including comments and white space characters.",
      "reason": "invalid"
    }
  ],
  "message": "The query is too large. The maximum standard SQL query length is 1024.00K characters, including comments and white space characters.",
  "status": "INVALID_ARGUMENT"
}
BigQueryException: The query is too large. The maximum standard SQL query length is 1024.00K characters, including comments and white space characters.
com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc:114
com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc:728
com.google.cloud.bigquery.BigQueryImpl$35:1349
com.google.cloud.bigquery.BigQueryImpl$35:1346
Can anyone guide me on how to implement a parameterized query for my INSERT statement to fix this error? Any other approach that avoids the limit would also be much appreciated.
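For what it's worth, the only workaround I can see so far is splitting the VALUES list across several smaller INSERT statements, each kept safely under the length limit. A minimal sketch of that idea (placeholder schema again; the actual bigquery.query call is shown as a comment so the snippet stands alone):

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedInsert {
    static final String PREFIX =
            "INSERT INTO project_id.dataset_id.ts_image_t (SMRY_KEY, field1) VALUES ";

    // Splits the row tuples into several INSERT statements, each at most maxChars long.
    static List<String> buildChunkedInserts(List<String> rowTuples, int maxChars) {
        List<String> statements = new ArrayList<>();
        StringBuilder sql = new StringBuilder(PREFIX);
        boolean empty = true;
        for (String tuple : rowTuples) {
            // Flush the current statement if appending this tuple would exceed the cap.
            if (!empty && sql.length() + tuple.length() + 2 > maxChars) {
                statements.add(sql.toString());
                sql = new StringBuilder(PREFIX);
                empty = true;
            }
            if (!empty) sql.append(", ");
            sql.append(tuple);
            empty = false;
        }
        if (!empty) statements.add(sql.toString());
        return statements;
    }

    public static void main(String[] args) {
        List<String> rows = new ArrayList<>();
        for (int i = 0; i < 50000; i++) rows.add("('key_" + i + "', 'val_" + i + "')");
        // Keep each statement well under the 1024.00K-character query limit.
        List<String> statements = buildChunkedInserts(rows, 900_000);
        // Each chunk would then be run separately, e.g.:
        // for (String stmt : statements)
        //     bigquery.query(QueryJobConfiguration.newBuilder(stmt).build());
        System.out.println(statements.size() + " statements");
    }
}
```

This keeps the plain-INSERT approach at the cost of multiple query jobs, so I'd still prefer a parameterized solution if one exists.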
Thanks,
Vigneswar Jeyaraj