Having a conversation with your data platforms
Using Application Integration & Vertex Agents to activate BigQuery Data
Let’s have a chat with BigQuery.
In this technical demonstration, we will explore the native connectivity features of Application Integration and show how BigQuery data can be used to enhance AI applications and empower users in their daily tasks.
Scope of the Demo:
- Vertex AI Agent: Configuring an agent that uses an OpenAPI Specification to access data and responds to questions about the information stored in BigQuery.
- Application Integration: Native connectivity. Configuring a simple integration with the data platform and restructuring the data for consumption.
- BigQuery: A dataset will be configured to act as sample data. BigQuery will act as a data warehouse where valuable business data rests.
Demonstration Setup
Let’s bring this implementation to life!
To demonstrate the value of the platforms, in this section you will be able to replicate an end-to-end data activation scenario. In summary, this is the list of activities we will cover:
- Configure BigQuery: load a demo dataset.
- Create BigQuery Connector: enable native connectivity against the dataset.
- Create Integration: using the provided template, configure connectors.
- OpenAPI Specification (OAS): an OAS will be provided. This is the detailed specification of the data exposed by Application Integration.
- Configure Vertex Agent: Load OpenAPI Specification and interact with BigQuery using a chat interface
Prerequisites - Before we start:
I. A new project can be created to avoid configuration collisions or changes.
II. Permissions and Service Accounts:
- Grant the roles/connectors.admin IAM role to the user configuring the connector.
- Create a service account with the following roles:
- roles/bigquery.dataEditor
- roles/bigquery.readSessionUser
- roles/bigquery.jobUser
- Create the DialogFlow service agent and grant the following permissions:
- roles/integrations.integrationInvoker
- Note: the service agent is identified by service-PROJECT_NUMBER@gcp-sa-dialogflow.iam.gserviceaccount.com
Use the following commands by opening a Cloud Shell and populating your own details:
##Populate variables with your own data
PROJECT_NUMBER="{your project number}"
PROJECT_ID="{your project ID}"
EMAIL="{your user email}" #The email/principal configuring the demo.
SERVICE_ACCOUNT_NAME="big-query-data-editor"
SERVICE_ACCOUNT=$SERVICE_ACCOUNT_NAME@$PROJECT_ID.iam.gserviceaccount.com
#Set project
gcloud config set project $PROJECT_ID
#Enable APIs
gcloud services enable integrations.googleapis.com \
secretmanager.googleapis.com \
connectors.googleapis.com \
bigquery.googleapis.com --project=$PROJECT_ID
#Add role to user
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member=user:$EMAIL \
--role='roles/connectors.admin'
#Create service account
gcloud iam service-accounts create $SERVICE_ACCOUNT_NAME \
--display-name=$SERVICE_ACCOUNT_NAME
#Add roles to service account
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member=serviceAccount:$SERVICE_ACCOUNT \
--role='roles/bigquery.dataEditor'
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member=serviceAccount:$SERVICE_ACCOUNT \
--role='roles/bigquery.jobUser'
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member=serviceAccount:$SERVICE_ACCOUNT \
--role='roles/bigquery.readSessionUser'
#Create DialogFlow service identity and Add role to DialogFlow Service Agent
gcloud beta services identity create --service=dialogflow.googleapis.com \
--project=$PROJECT_ID
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member=serviceAccount:service-$PROJECT_NUMBER@gcp-sa-dialogflow.iam.gserviceaccount.com \
--role='roles/integrations.integrationInvoker'
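Before moving on, it can help to double-check the principals the script just granted roles to. The snippet below only re-derives and prints them from the same variables (no cloud calls, example values shown), so a typo in PROJECT_ID or PROJECT_NUMBER is easy to spot:

```shell
# Re-derive the two principals from the same inputs used above (example values)
PROJECT_ID="my-demo-project"
PROJECT_NUMBER="123456789012"
SERVICE_ACCOUNT_NAME="big-query-data-editor"

SERVICE_ACCOUNT="${SERVICE_ACCOUNT_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"
DIALOGFLOW_AGENT="service-${PROJECT_NUMBER}@gcp-sa-dialogflow.iam.gserviceaccount.com"

echo "Service account:  ${SERVICE_ACCOUNT}"
echo "Dialogflow agent: ${DIALOGFLOW_AGENT}"
```

If the printed values do not match what you granted roles to, re-run the bindings with the corrected variables.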
III. Copy the contents of the following repository; in it you will find the required dependencies:
Repository: Application Integration Samples Repository
- users-export.csv - sample data for the BigQuery table.
- vertex-agents-bigquery.json - The integration flow responsible for native connectivity to BigQuery (and 100+ other platforms and apps).
- user_management_api.yaml - OpenAPI Specification file of the integration.
- user-management-agent-app.zip - A Vertex AI Agent application, bundled and ready to be tested.
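For orientation, the operations inside user_management_api.yaml follow the standard OpenAPI shape; a sketched fragment looks roughly like the one below (the summary and operationId values here are illustrative, the authoritative version is the file in the repository):

```yaml
# Illustrative fragment only; see user_management_api.yaml in the repository
paths:
  "/v2/projects/your-gcp-project/locations/us-central1/integrations/big-query-integration:execute?triggerId=api_trigger/get-user-inventory":
    post:
      summary: List all users stored in BigQuery
      operationId: getUserInventory
```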
BigQuery Setup
- Let’s navigate to BigQuery in the GCP console.
- The next step is the creation of a dataset and a table. For this purpose, the sample file users-export.csv will be used.
- Click +ADD.
- Select the Local File option.
- A window will appear with options to fill. Make sure the form is filled following the example provided:
Source
- Create table from: Upload
- Select file *: Browse the file provided: users-export.csv
- File format: CSV
Destination
- Project *: Use your project id
- Dataset: Click on CREATE NEW DATASET
(see table below for guidance)
- Table: users
- Table Type: Leave default (Native Table)
Create dataset: Fill in the information when prompted to create a new dataset and click CREATE DATASET.
- Dataset ID: UserDepartment
- Location type: Region
- Region: us-central1 (Iowa)
**Schema:** Leave “Auto Detect” checked.
Leave the rest of the values at their defaults and click CREATE TABLE.
The dataset and table should now be visible. If you navigate to the Preview tab of the table, the imported data will be displayed:
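If you want a quick sanity check on the CSV before uploading it, a couple of shell commands are enough. The column names below are made up for illustration; inspect your copy of users-export.csv for the real header:

```shell
# Hypothetical stand-in for users-export.csv -- substitute the real file
cat > users-export.csv <<'EOF'
id,name,department
1,Alice,Engineering
2,Bob,Sales
EOF

head -n 1 users-export.csv            # the header row Auto Detect uses for the schema
tail -n +2 users-export.csv | wc -l   # number of data rows that will land in the table
```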
Application Integration Setup
It’s time to present the star of the show. Let’s navigate to the following link: https://console.cloud.google.com/integrations
In this section the following file will be imported: vertex-agents-bigquery.json. Before the integration flow can be imported, Application Integration has to be provisioned. On the main page of Application Integration you will see a quick start option.
- Region: us-central1 (Iowa)
- Click QUICK SETUP
Once the setup is complete, you will be presented with the application integration dashboard.
- Navigate to integrations list page
- Create a new integration
- A prompt will appear requiring a Name. Use: big-query-integration
- Click CREATE
Once the integration is created, a blank page will be presented (not for long). In the top-right corner there is a button with three dots. Click on it, pick the Upload option, and choose the file vertex-agents-bigquery.json.
The integration flow should be as follows:
Let’s understand the integration:
- Application Integration has many triggers. Triggers determine how the flow gets started. A flow can be configured to subscribe to a Pub/Sub topic, run on a schedule, or, as in this case, start from an API call.
- Two connector tasks are configured. These tasks enable the flow to reach BigQuery and extract or update data.
- The “Set List Response” and “Set Item Response” tasks perform data mapping and help shape the data structure for consumption.
Application Integration Connectors Setup
In order to connect to BigQuery we need to provision a connector and configure the task.
- Click on the “BigQuery List Users” task.
- The following section will appear. Click on CONFIGURE CONNECTOR.
- Use the following table as a reference for filling the form:
Select Connection
- Region: us-central1 (Iowa)
- Connection: CREATE CONNECTION
(a new page will appear)
Connection Details
- Connector: BigQuery
- Connection: big-query-connection
- Description: This connector enables operations against BigQuery.
- Enable Cloud Logging: leave as default
- Service Account *: Select the service account you previously created (it should have the roles/bigquery.dataEditor role).
- Project ID: Use the Project ID of the project where the BigQuery dataset is created.
- Dataset ID: UserDepartment
(this is the id of the dataset we created in earlier steps)
Note: if a warning appears suggesting that you add roles/permissions to your service account, click GRANT.
- After completing this information, click NEXT. In Authentication, click NEXT. Finally, click CREATE.
- Once the connector has been created, this is how it should look:
- Click NEXT and select Entities. Click NEXT again.
- The Entity field loads the tables and schemas directly from BigQuery. Pick “users”. This is the table we created during the BigQuery configuration.
- In Operation, leave “List” selected and click NEXT.
- Click DONE.
For the “BigQuery Get Item” task, the same configuration is required. This time, reuse the already created connection.
Application Integration Testing
After configuring both connectors, you can proceed to test the integration flow. Locate the test button in the top-right corner of the integration editor. Upon clicking it, two options will be presented. To fetch a list of users, click on “Get User Inventory.”
The following section will appear:
For Input Variables, leave them blank and click TEST INTEGRATION.
Result of the integration execution:
To obtain a single user, let’s grab a single identifier from the previous integration execution result and save it for now. Click on TEST and select “Get User” option to fetch a single user:
Result of the integration execution:
At this point the integration is working and data is being extracted natively from BigQuery.
Finally, the integration has to be published in order to be executed via API Call. In the top-right corner click on the Publish button. This will enable the agent that will be configured on a later section to execute the integration flow and fetch data.
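Once published, the flow can be reached over HTTP. The snippet below only builds and prints the execute URL for the list trigger (the names match this demo; the host and invocation style mirror the paths in the provided OpenAPI spec, so verify against your own copy). The actual call requires an OAuth token, so it is shown commented out:

```shell
# Build the execute URL for the "get-user-inventory" API trigger
PROJECT_ID="your-gcp-project"
URL="https://integrations.googleapis.com/v2/projects/${PROJECT_ID}/locations/us-central1/integrations/big-query-integration:execute?triggerId=api_trigger/get-user-inventory"
echo "$URL"

# To invoke it (requires gcloud authentication):
# curl -s -X POST "$URL" \
#   -H "Authorization: Bearer $(gcloud auth print-access-token)" \
#   -H "Content-Type: application/json" -d '{}'
```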
Note: if more details about the execution flows are needed, the logs can be checked.
Vertex AI Agent Setup
In this section an agent will be configured using an OpenAPI Specification. The file user_management_api.yaml has already been configured into an agent; we will create an agent and import an application, enabling the user to interact with BigQuery using natural language.
Let’s start by navigating to: https://console.cloud.google.com/gen-app-builder/start. The following page will appear:
You can choose to opt-out by not checking the box. Let’s click CONTINUE AND ACTIVATE THE API.
- The following page will load multiple options. Pick “Agent”.
- Use “user-management-agent-app” as the display name.
- Pick “us-central1” as the region.
- Click AGREE & CREATE.
After the creation of the agent application, the following page should be loaded:
Fortunately, an agent can be restored/imported. Once you are on this page, notice the two arrows pointing in opposite directions in the top-right corner of the screen.
- Click on the Arrows
- Click on Restore app
- Select Upload and import the file user-management-agent-app.zip
After the import process succeeds, this is what configurations should look like:
The right side of the page is a chat that enables the user to interact with the agent by prompting natural language. The left side of the page contains configurations.
The last step involves the OpenAPI Specification: this file must point to the project you are working in, which requires modifying the Tool configured in the agent. Follow the instructions below:
- Click the tools section in the sidebar.
- Click on user-management-api-tooling
- Scroll down until the OpenAPI Spec is visible under the Schema section:
- Locate lines 37 and 89. These lines are the API operations and should point to the project your integration is running in. Follow the examples below to make the required changes:
#Line 37:
"/v2/projects/your-gcp-project/locations/us-central1/integrations/big-query-integration:execute?triggerId=api_trigger/get-user-inventory":
#Line 89:
"/v2/projects/your-gcp-project/locations/us-central1/integrations/big-query-integration:execute?triggerId=api_trigger/get-user":
Replace your-gcp-project with the corresponding project id.
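If you saved the spec locally, the substitution can also be scripted instead of edited by hand. The snippet below works on a scratch copy containing just the two placeholder lines, so it is safe to try as-is; point sed at your real user_management_api.yaml when doing it for real:

```shell
PROJECT_ID="my-demo-project"   # substitute your real project id

# Scratch copy with the two placeholder lines (lines 37 and 89 in the real file)
cat > user_management_api.yaml <<'EOF'
"/v2/projects/your-gcp-project/locations/us-central1/integrations/big-query-integration:execute?triggerId=api_trigger/get-user-inventory":
"/v2/projects/your-gcp-project/locations/us-central1/integrations/big-query-integration:execute?triggerId=api_trigger/get-user":
EOF

# Replace every occurrence of the placeholder, keeping a .bak backup
sed -i.bak "s/your-gcp-project/${PROJECT_ID}/g" user_management_api.yaml
grep -c "$PROJECT_ID" user_management_api.yaml   # prints 2
```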
Let’s start asking our agent some questions to access BigQuery with natural language:
Example Prompt 1:
Example Prompt 2:
Example Prompt 3:
Example Prompt 4:
Disclaimer and Notes:
- The agent is a very simple implementation, but it enables a simple and powerful data activation scenario. Some fine-tuning is needed to reduce hallucinations and enrich its outputs.
- Feel free to play around with the configurations and examples on any of the components.
- Share your findings, feedback, or your own implementation!
Wrapping up & Next Steps
A use case and proof of concept as simple as the one implemented in this demo should be enough to ignite some creativity about enterprise scenarios and ways of materializing value for the organization. Chatting with an AI agent in natural language is a feat on its own, but being able to access, share, activate, and connect natively with SAP, Salesforce, Oracle, BigQuery, and over 100 other platforms is absolutely fundamental. Whatever value generative AI delivers depends deeply on the data it can consume, and consuming that data requires an integration effort that someone has to absorb; here, it has been solved by Application Integration and its pre-built connectors.
For more context on data activation, see the following article: Powering AI Applications
For more details about Application Integration: https://cloud.google.com/application-integration/docs/overview
Free Tier & Costs
Application Integration provisions infrastructure dedicated to the user/organization. For this reason, if you exceed the free tier, charges may be applied to the configured GCP billing account.