try/except not able to catch Java runtime exception in PySpark code

def function():
    # df and partition_key are assumed to be defined in the enclosing scope
    try:
        if partition_key is None:
            df.write.format('bigquery') \
                .option('table', 'table_name') \
                .mode('append').save()
        else:
            df.write.format('bigquery') \
                .option('table', 'table_name') \
                .option('partitionField', partition_key) \
                .mode('append').save()
        return 'Success'
    except Exception as e:
        print(str(e))
        return 'error'

Hi guys, the line `df.write.format…` in the PySpark code above prints a Java runtime exception (an error while writing to BigQuery), but it is not caught by the try/except block in Python. I know why the exception occurs, but how do I catch these Java runtime exceptions in PySpark? The code above returns 'Success' even when the write to BigQuery has failed. [I am running this code on a Dataproc cluster with the Spark BigQuery connector.]
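One thing worth checking: when `.save()` fails on the driver, PySpark raises `py4j.protocol.Py4JJavaError`, which is a subclass of `Exception`, so a plain `except Exception` should normally catch it. Catching the Py4J error explicitly also lets you inspect the Java-side stack trace via its `java_exception` attribute. Below is a minimal sketch of that pattern; it uses a stand-in exception class so it runs without a Spark cluster, and the function and variable names are illustrative, not from the original post:

```python
# Stand-in for py4j.protocol.Py4JJavaError so this sketch runs without Spark.
# On a real cluster you would instead do:
#     from py4j.protocol import Py4JJavaError
class Py4JJavaError(Exception):
    pass

def write_with_status(save_action):
    """Run a write action (e.g. lambda: df.write...save()) and report status."""
    try:
        save_action()
        return 'Success'
    except Py4JJavaError as e:
        # In real py4j code, e.java_exception holds the underlying Java error.
        print('JVM-side failure: %s' % e)
        return 'error'

def failing_save():
    # Simulates the connector raising a Java runtime exception.
    raise Py4JJavaError('error while writing to BigQuery')

print(write_with_status(failing_save))   # error
print(write_with_status(lambda: None))   # Success
```

If 'Success' is returned even though errors appear in the logs, the exception may be coming from executor task retries that ultimately succeed, or from a different statement than the one inside the try block.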

Spark-to-BigQuery connector: gs://spark-lib/bigquery/spark-bigquery-latest_2.12.jar
Cluster details: image 2.0 (Ubuntu 18.04 LTS, Hadoop 3.2, Spark 3.1), 1 master, 0 worker nodes


Could you try the code I’m sharing with you?

df.write \
    .format("bigquery") \
    .option("temporaryGcsBucket", "bucket/temp") \
    .mode("append") \
    .save("gcp-bankier.sof.table1")

If it doesn’t help you, there might be something wrong in the code that you are using.

@RC1 I’m facing the same issue. How did you fix it? I’m developing on Databricks.