Hello Team,
I am trying to set up a data transfer job from BigQuery to an on-prem MySQL DB, but I'm getting the attached error.
It complains about credentials, yet the same credentials work locally.
What could be the issue?
Hi @fmugambi ,
A few things to check:
- Make sure the on-prem MySQL is accessible from Google Cloud (check firewall, VPN, or Cloud Interconnect setup).
- Verify the user has remote login permissions (GRANT with host `%`), not just local.
- Check if the encryption mode (FULL) matches what the MySQL server expects.
- Ensure you’ve selected at least one MySQL object to transfer — the red error suggests this is missing.
If the local credentials work but fail here, it’s often a network access or permissions issue, not the password itself.
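To check the remote-login point above, you can query the user table on the MySQL server itself. A quick sketch (`transfer_user` is a placeholder for your actual account name):

```sql
-- List the hosts this account is allowed to connect from.
-- A host of 'localhost' or '127.0.0.1' means local-only access;
-- '%' (or your GCP subnet's address range) is needed for remote logins.
SELECT user, host FROM mysql.user WHERE user = 'transfer_user';
```

If the only row that comes back has `host = 'localhost'`, the account simply cannot log in from GCP, no matter how correct the password is.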
It's confusing since a VM on the GCP subnet can reach the on-prem DB; also note that the network attachment is on this same GCP subnet.
So what could I be missing?
How does one check for remote logins? I'm not sure how to check this.
How can I be sure whether the DB expects FULL encryption or not?
Permissions on the GCP end or the DB end? I tested even with root, which has all privileges, but still hit the same error.
The DB was not set to accept encryption.
We turned encryption off, and it worked.
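In case it helps anyone else, this is how the server's encryption setting can be checked (standard MySQL variables; run on the DB server):

```sql
-- OFF means the server accepts unencrypted connections, so an
-- encryption mode of FULL on the transfer side will mismatch.
SHOW VARIABLES LIKE 'require_secure_transport';

-- Whether SSL support is available on this server at all.
SHOW VARIABLES LIKE 'have_ssl';
```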
Thank you!
Hi @fmugambi thanks for sharing the error! Based on the message “Request contains an invalid argument: Please check your login information” and the interface you showed, it looks like you’re setting up a BigQuery Data Transfer to an on-premises MySQL database, and the error appears when you click “Browse” in the MySQL objects to transfer field.
Here are a few key things to check:
Possible causes:
Incorrect or restricted credentials
Even if the credentials work locally (e.g., via DBeaver or the CLI), it’s possible that:
- The MySQL user doesn’t have sufficient read (SELECT) permissions on all the necessary tables.
- Access is restricted by IP — Google’s Data Transfer service must be explicitly allowed through the firewall.
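To verify the permissions side, you can inspect exactly what the account is allowed to do from remote hosts. A sketch (`transfer_user` is a placeholder):

```sql
-- Show the privileges granted to the account for remote ('%') logins;
-- you want to see at least SELECT on the schemas being transferred.
SHOW GRANTS FOR 'transfer_user'@'%';
```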
Blocked port or IP
If your MySQL server is on-prem, make sure port 3306 (or your custom port) is open and reachable from GCP’s IP ranges. You might want to allow Google IPs temporarily just for testing.
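Besides the firewall, it's worth confirming the server is actually listening on a remotely reachable interface (standard MySQL variables; a sketch):

```sql
-- If bind_address is 127.0.0.1, MySQL only accepts local connections
-- regardless of firewall rules; 0.0.0.0 (or a routable IP) is needed.
SHOW VARIABLES LIKE 'bind_address';

-- Confirm the port the server is really listening on.
SHOW VARIABLES LIKE 'port';
```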
Incorrect hostname
If you’re using localhost or 127.0.0.1, switch to a public IP or a DNS name that’s reachable from outside your network — Google Cloud can’t connect to local-only addresses.
Security/SSL configuration issues
Some setups require an SSL connection. If your MySQL instance enforces SSL and the connector isn’t properly configured with the right certs, the connection will fail.
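SSL can also be enforced per account rather than globally. A quick check on the server (`transfer_user` is a placeholder):

```sql
-- A non-empty ssl_type (e.g. ANY, X509) means this account may only
-- connect over SSL, even if the global setting allows plain connections.
SELECT user, host, ssl_type FROM mysql.user WHERE user = 'transfer_user';
```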
Some troubleshooting ideas:
- Try connecting to the database from a GCP VM using the same credentials — this can help confirm if the issue is network-related.
- Create a new MySQL user with limited permissions just for this transfer, and test with it.
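A minimal sketch of such a dedicated user (the account name, schema name, and password are placeholders; narrow the `%` host to your GCP subnet if you can):

```sql
-- Read-only account used only by the transfer, allowed to log in remotely.
CREATE USER 'bq_transfer'@'%' IDENTIFIED BY 'use-a-strong-password';
GRANT SELECT ON mydb.* TO 'bq_transfer'@'%';
FLUSH PRIVILEGES;
```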
- If you’re using a VPN or SSH tunnel, make sure it’s active and functioning correctly.
Easier alternative:
If you’re looking for a simpler way to move data between BigQuery and MySQL (whether on-prem or cloud), you could consider tools like Windsor.ai, Stitch, Fivetran, or DataFusion; these platforms can handle data syncing with less manual setup and configuration.
Hope this helps you identify the problem!
