I am trying to write a dataframe into a PostgreSQL database table. When I write it into Heroku's Postgres database, everything works fine, no problems. For Heroku Postgres, I use the connection string
It looks like the same post [1] on Stack Overflow already has a good answer. Per that answer, take a look at using the Cloud SQL Python Connector to manage your connections and take care of the connection string for you. It also supports the pg8000 driver, which should help resolve your trouble. See the sample code implementation from the post or from the repo [3].
Lastly, the GOOGLE_APPLICATION_CREDENTIALS environment variable can be set either by running gcloud auth application-default login [4] in the Cloud SDK, or by setting os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "path/to/file.json" under the import statements.
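To illustrate the second option, a minimal sketch of what "under the import statements" means; the key-file path here is a placeholder you would replace with the path to your own service-account JSON key:

```python
import os

# Set this BEFORE importing/creating any Google client objects, so that
# Application Default Credentials picks up the key file. The path is a
# placeholder for your own service-account key.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "path/to/file.json"

print(os.environ["GOOGLE_APPLICATION_CREDENTIALS"])  # → path/to/file.json
```

The point is ordering: libraries read this variable when they first build their credentials, so it must be set before the first client or engine is constructed.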
Hello Horace, unfortunately the response on Stack Overflow did not solve the problem. I am still unable to authenticate using os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "path/to/file.json".
What do you mean by "setting os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "path/to/file.json" under the import statements"?
Could you share an example code snippet for understanding? Thank you so much!
I too am experiencing this problem. I am using "postgres+pg8000://" as my connection string, but otherwise my setup is the same as the OP's. I have tried:
- using the dtype kwarg in df.to_sql with the Integer object from SQLAlchemy
- converting any integer-based columns in my data using df.col.astype(np.int32)
- removing columns individually to try to narrow down the problem
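For anyone comparing notes, here is a minimal, runnable sketch of the first two attempts above. An in-memory SQLite engine stands in for the Cloud SQL Postgres engine (the table and column names are made up); in real use you would swap in your pg8000 engine:

```python
import numpy as np
import pandas as pd
from sqlalchemy import Integer, create_engine

# In-memory SQLite stands in for the Cloud SQL Postgres engine here;
# replace with your pg8000 engine in real use.
engine = create_engine("sqlite://")

df = pd.DataFrame({"id": [1, 2, 3], "name": ["a", "b", "c"]})

# Attempt 2 from the list: force the column to a 32-bit integer dtype.
df["id"] = df["id"].astype(np.int32)

# Attempt 1 from the list: pin the SQL column type via the dtype kwarg.
df.to_sql("demo", engine, index=False, if_exists="replace",
          dtype={"id": Integer()})

out = pd.read_sql("SELECT * FROM demo", engine)
print(out["id"].tolist())  # → [1, 2, 3]
```

Note the pandas keyword is `dtype` (singular), mapping column names to SQLAlchemy types, and it controls the SQL column type independently of the in-memory NumPy dtype.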
I am also already using GOOGLE_APPLICATION_CREDENTIALS. Do you have any other ideas?