def function():
    try:
        if partition_key is None:
            df.write.format('bigquery') \
                .option('table', 'table_name') \
                .mode('append').save()
        else:
            df.write.format('bigquery') \
                .option('table', 'table_name') \
                .option('partitionField', partition_key) \
                .mode('append').save()
        return 'Success'
    except Exception as e:
        print(str(e))
        return 'error'
Hi all, the `df.write.format(...)` line in the PySpark code above prints a Java runtime exception (an error while writing to BigQuery), but it is not caught by the Python try/except block. I know why the exception occurs; my question is how to catch these Java runtime exceptions in PySpark. As it stands, the function returns 'Success' even when the write to BigQuery has failed. (I am running this code on a Dataproc cluster with the Spark BigQuery connector.)
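For context, here is a minimal, self-contained sketch of the pattern I am trying to achieve: the status string should reflect whether the save raised. `fake_save` below is a hypothetical stand-in for `df.write...save()` (it is not part of my job); in the real code the idea would be to catch the specific wrapper exception PySpark raises on the driver for Java-side failures (`py4j.protocol.Py4JJavaError`) rather than the generic `RuntimeError` used here:

```python
# fake_save is a hypothetical stand-in for df.write...save().
# In real PySpark code, a Java-side failure surfaces on the driver
# as py4j.protocol.Py4JJavaError; RuntimeError is used here only so
# the sketch runs without a Spark cluster.

def fake_save(should_fail):
    # Simulates the BigQuery write: raises like a failed save() would.
    if should_fail:
        raise RuntimeError("java.lang.RuntimeException: error while writing to BigQuery")

def write_with_status(should_fail):
    try:
        fake_save(should_fail)
        return 'Success'
    except RuntimeError as e:
        # Log the Java-side error message, then report failure.
        print(str(e))
        return 'error'

print(write_with_status(False))  # Success
print(write_with_status(True))   # error
```

The expected behaviour is that `write_with_status` returns 'error' whenever the save raises; my real function instead returns 'Success' because the exception apparently never reaches the Python except block.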
Spark BigQuery connector: gs://spark-lib/bigquery/spark-bigquery-latest_2.12.jar
Cluster details: image 2.0 (Ubuntu 18.04 LTS, Hadoop 3.2, Spark 3.1), 1 master, 0 workers.