Spark dynamic partition overwrite to BigQuery overwrites the entire table

Hi

I'm trying to overwrite a single partition (a date column, dt) of a BigQuery table from Spark, but the write replaces the entire table's data instead. Can someone help me with this?

Spark configurations:
"spark.sql.sources.partitionOverwriteMode": "dynamic"
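For what it's worth, I also tried applying this on the SparkSession itself rather than only as a write option, since for most sources partitionOverwriteMode is read from the session/cluster conf (a minimal sketch; `spark` is the existing session):

```python
# Set dynamic partition overwrite at the session level.
# Note: many sources ignore this when passed as a per-write .option().
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
```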

df.write.mode("overwrite")
.format("bigquery")
.option("partitionField", "dt")
.option("spark.sql.sources.partitionOverwriteMode", "dynamic")
.option("temporaryGcsBucket", "xxxx")
.option("table", "project.dataset.table")
.save()
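One workaround I'm considering is BigQuery's partition decorator syntax (table$YYYYMMDD), where an overwrite targets just that one partition. Below is a minimal sketch; the helper function is my own, and the commented-out write call assumes the spark-bigquery-connector:

```python
from datetime import date

def partition_decorator(table: str, dt: date) -> str:
    """Build a BigQuery partition-decorator table name (table$YYYYMMDD).

    Writing with mode("overwrite") to such a name should replace only
    that single partition rather than the whole table.
    """
    return f"{table}${dt.strftime('%Y%m%d')}"

# Target only the 2024-01-15 partition (example date).
target = partition_decorator("project.dataset.table", date(2024, 1, 15))
# target == "project.dataset.table$20240115"

# The write itself (needs a Spark session with the connector on the
# classpath, so it is not runnable standalone):
# df.write.mode("overwrite") \
#     .format("bigquery") \
#     .option("temporaryGcsBucket", "xxxx") \
#     .save(target)
```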

It still overwrites the entire table's content. Can someone help me out?