You can use this post to learn about building an Integration and deploying it using Cloud Build. However, for the latest update on CI/CD, please refer to this new article - Part 1 and Part 2
Introduction
Application Integration is an Integration-Platform-as-a-Service (iPaaS) solution in Google Cloud that offers a comprehensive set of core integration tools to connect and manage the multitude of applications and data required to support various business operations.
Application Integration is a low-code/no-code platform that allows developers to create integration flows. Integration flows are sequences of tasks or activities that connect and coordinate the exchange of data between different systems. It is a common practice to develop these assets and promote them to various environments just like source code.
The best practice for CI/CD with Application Integration is to use a separate GCP project for each phase of the SDLC. In this example, we will demonstrate how to automate the promotion of Application Integration and Integration Connectors artifacts between two SDLC phases: the development (dev) and qa environments.
Prerequisites
- Application Integration and Integration Connectors are enabled and provisioned. If you have not done so already, please follow this guide
- Basic working knowledge of Application Integration and Integration Connectors.
- GitHub is used as the example source code repository. A GitHub account is necessary to complete the instructions
- Download gcloud and configure the default project
gcloud config set project $project
Principles
There are a few (opinionated) principles followed in this example:
- An integration flow developed in the development environment must be deployed unchanged in production. Treat the integration flow like one would source code.
- The portions of an integration flow that do change from one environment to another are externalized. Only those properties change between environments. For example, the HTTP URL configured for a REST Task.
- There must be traceability between what is stored in a source code repository and an integration flow version deployed in an environment.
- Maintain a single repository for each deployable service
- Automate the deployment between environments
- Test in a clone of the production environment
Steps to Automate Application Integration Deployments
In this example you will see how to store integration and connector assets in a source code repository, promote those assets from one SDLC environment to another, and finally automate such deployments.
Sample Integration
We will build a sample integration with minimal complexity to demonstrate this use case. This Integration flow calls an API and publishes the response from the API to a Pub/Sub topic. This sample is meant to illustrate the use of REST and Connector tasks.
Create a topic
gcloud pubsub topics create mytopic
Create a connection for Pub/Sub
In the GCP console, navigate to “Integration Connectors” from the left menu and click the “+ CREATE NEW” button to create a new connector.
Please change the Service Account and Project ID appropriately. Ensure the service account has privileges to publish to the Pub/Sub topic. Provisioning may take a few minutes.
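The publisher grant mentioned above can also be scripted. The project and service account names below are placeholders; substitute whichever account you configured on the connection:

```shell
# Placeholder values - substitute your own project and the service
# account configured on the Pub/Sub connection.
PROJECT=my-dev-project
SA="connectors-sa@${PROJECT}.iam.gserviceaccount.com"

# Allow the connection's service account to publish to the topic.
gcloud pubsub topics add-iam-policy-binding mytopic \
  --project="$PROJECT" \
  --member="serviceAccount:${SA}" \
  --role="roles/pubsub.publisher"
```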
Create an Integration Flow as follows
- Create a new Integration “sample” with description “Sample Integration for CI/CD” and select your appropriate region. Click “CREATE”
- In the Integration designer, click “+ CREATE” in the Variables section to create a config variable named “ENV”. Make sure you select “Config Variable for Integration” in the Variable Type dropdown
- Similar to the step above, create another config variable called “URL”
- Create one more config variable called “CONN_NAME”
- Create another config variable called “TOPIC”
Once all the variables are created, you should see them in the Variables section
- Add an API Trigger and wire it to the REST task. In the REST Task config, select the URL config variable
Add the config variable “ENV” to a custom HTTP header “X-INTEGRATION-ENV” in the REST task as shown below. NOTE: the variable name will be CONFIG_ENV
- Add a Connectors Task, do not wire it yet. Select the PubSub connection created previously
Select the connection
Click “NEXT”. Select “Actions” as the TYPE. Click “NEXT” again
Select “Publish Message” for “Action” and click “NEXT”
Click DONE. This should configure the Connector task.
In the connector task, change the “Connection Name” to use the “CONN_NAME” variable
- Add a data mapper task and wire the tasks as shown below
- Click the Data Mapping task to open the Data Mapper editor. In the Data Mapper editor, drag the responseBody variable of the REST task to the input section and the “message” attribute of connectorInputPayload to the output section. Similarly, drag CONFIG_TOPIC to the input section and the “topic” attribute of connectorInputPayload to the output section, as shown in the diagram below
- Test the integration to ensure the flow works successfully. To test, click the “TEST” button and provide the following values in the input section:
| Config Variable | Value |
|---|---|
| CONFIG_ENV | dev |
| CONFIG_URL | https://httpbin.org/get |
| CONFIG_CONN_NAME | projects/$PROJECT/locations/$REGION/connections/pubsub |
| CONFIG_TOPIC | projects/$PROJECT/topics/mytopic |
NOTE: Replace PROJECT and REGION with your appropriate values
Now hit “Test integration”. You should get a successful response.
- Now click “Edit Integration”. In the Variable panel on the left, click “connectorOutputPayload” and select “View Details”
In the Variable panel, under Variable Type, change it from “None” to “Output from Integration” and click “SAVE”
Similarly set the “responseBody” variable type to “Output from Integration” and click “SAVE”
Now publish the Integration again with the same set of config variable values:
| Config Variable | Value |
|---|---|
| CONFIG_ENV | dev |
| CONFIG_URL | https://httpbin.org/get |
| CONFIG_CONN_NAME | projects/$PROJECT/locations/$REGION/connections/pubsub |
| CONFIG_TOPIC | projects/$PROJECT/topics/mytopic |
NOTE: Replace PROJECT and REGION with your appropriate values
Now hit “Publish integration”. Click “Test”. You should get a successful response.
In the new response, you will see the output variables printed for both the HTTP call that was made and the response message ID from Pub/Sub. You should see “X-Integration-Env” in the output.
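A published integration with an API Trigger can also be invoked outside the designer through the Application Integration execute API. The trigger ID below is a placeholder; copy the actual trigger ID shown on your API Trigger in the designer:

```shell
# Placeholder values - replace with your project, region, and the
# trigger ID shown on your API Trigger in the designer.
PROJECT=my-dev-project
REGION=us-west1
TRIGGER_ID="api_trigger/sample_API_1"

# Invoke the published integration via the
# projects.locations.integrations:execute method.
curl -s --max-time 15 -X POST \
  "https://integrations.googleapis.com/v1/projects/${PROJECT}/locations/${REGION}/integrations/sample:execute" \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d "{\"triggerId\": \"${TRIGGER_ID}\"}"
```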
In this example, the following details may change between environments:
- The URL used in the REST task
- The project where the Pub/Sub connection exists (for example, the CONN_NAME variable)
- The topic name
- The variable content (for example, the X-INTEGRATION-ENV header value)
Prepare Github
Create a new repository. It is recommended that every integration flow uses a different repository.
mkdir app-integration-demo && cd app-integration-demo
git init
git checkout -b dev
In this step, the dev branch is created. The recommended folder structure for Integration and Connectors is as follows:
├── &lt;env&gt;
│   ├── connectors
│   │   └── &lt;connector-name&gt;.json # one file per connector; the connector name is the file name
│   ├── config-variables
│   │   └── &lt;integration-name&gt;-config.json # one file per integration
│   ├── authconfigs
│   │   └── &lt;authconfig-name&gt;.json # one file per authconfig; the authconfig name is the file name
│   ├── endpoints
│   │   └── &lt;endpoint-name&gt;.json # one file per endpoint attachment; the endpoint attachment name is the file name
│   ├── zones
│   │   └── &lt;zone-name&gt;.json # one file per managed zone; the managed zone name is the file name
│   ├── sfdcinstances
│   │   └── &lt;instance-name&gt;.json # one file per sfdc instance; the sfdc instance name is the file name
│   ├── sfdcchannels
│   │   └── &lt;instance-name_channel-name&gt;.json # one file per sfdc channel; the file name combines the sfdc instance name and the channel name
│   ├── overrides
│   │   └── overrides.json # always named overrides.json; the only file in this folder
│   └── src
│       └── &lt;integration-name&gt;.json # the only file in this folder; the integration name is the file name
└── cloudbuild.yaml # the Cloud Build deployment file
Introduction to integrationcli
integrationcli is a tool that lets you manage (create, delete, get, and list) integrations and connections in Application Integration, Integration Connectors, and the Apigee Integration/Connector APIs. This example uses the tool to automate deployments. You can see other examples and options here.
Install the CLI with the following command:
curl -L https://raw.githubusercontent.com/GoogleCloudPlatform/application-integration-management-toolkit/main/downloadLatest.sh | sh -
Set integrationcli preferences:
token=$(gcloud auth print-access-token)
project=<set DEV project here>
region=<set DEV region here>
integrationcli prefs set -p $project -r $region -t $token
Create a scaffold for the Integration
integrationcli integrations scaffold -n sample -s 1 -f app-integration-demo -e dev --cloud-build
Where “-n sample” is the name of the integration, “-s 1” is the snapshot number, “-e dev” is the environment name (which creates a directory under which the files are stored), and “-f app-integration-demo” is the folder in which to generate the artifacts. This command downloads the integration, the connections used by the integration, authconfigs, and sfdc instances and channels into the folder structure described previously.
Overrides file
Integration flows contain values that can change between environments. For example,
- The Connector task needs to be changed as you migrate from one project to another
- The REST task & Cloud Function tasks may have different values between environments
integrationcli makes it easy to generate an overrides file for values that typically change between environments.
The file should look like this (./dev/overrides/overrides.json)
{
"task_overrides":[
],
"connection_overrides":[
{
"taskId":"2",
"task":"GenericConnectorTask",
"parameters":{
"connectionName":"pubsub"
}
}
],
"param_overrides":[],
"integration_overrides":{
"databasePersistencePolicy":"DATABASE_PERSISTENCE_POLICY_UNSPECIFIED",
"enableVariableMasking":false,
"cloudLoggingDetails":{
"cloudLoggingSeverity":"CLOUD_LOGGING_SEVERITY_UNSPECIFIED",
"enableCloudLogging":false
}
}
}
The values of “databasePersistencePolicy”, “enableVariableMasking”, “cloudLoggingSeverity”, and “enableCloudLogging” can be overridden when the integration is promoted to other environments. Other overrides not automatically captured by the CLI, such as retries and default values, may be added to the file at this time.
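As an illustration, a manually added entry in task_overrides could pin a REST task parameter for a given environment. The taskId, task name, and parameter shape below are assumptions for this sketch; verify them against the task's JSON under the src folder before use (in this sample the URL is already externalized via CONFIG_URL, so this fragment is purely illustrative):

```json
{
  "task_overrides": [
    {
      "taskId": "1",
      "task": "GenericRestV2Task",
      "parameters": {
        "url": {
          "key": "url",
          "value": {
            "stringValue": "https://httpbin.org/get"
          }
        }
      }
    }
  ]
}
```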
The scaffold also creates a cloudbuild.yaml that can help you with your pipeline.
Check in all the content to the dev branch.
git add --all
git commit -m 'first draft'
git push
The folder structure will look like this:
├── dev
│   ├── config-variables
│   │   └── sample-config.json
│   ├── connectors
│   │   └── pubsub.json
│   ├── overrides
│   │   └── overrides.json
│   └── src
│       └── sample.json
└── cloudbuild.yaml
Now copy the “dev” directory and paste it as “qa”. The new structure of the repo is
├── dev
│   ├── config-variables
│   │   └── sample-config.json
│   ├── connectors
│   │   └── pubsub.json
│   ├── overrides
│   │   └── overrides.json
│   └── src
│       └── sample.json
├── qa
│   ├── config-variables
│   │   └── sample-config.json
│   ├── connectors
│   │   └── pubsub.json
│   ├── overrides
│   │   └── overrides.json
│   └── src
│       └── sample.json
└── cloudbuild.yaml
In the newly copied “qa” directory, make changes to the sample-config.json, overrides.json, and pubsub.json files that are specific to the qa environment. In this example, we will change the CONFIG_ENV variable to qa.
overrides.json fragment:
"integration_overrides": {
"databasePersistencePolicy": "DATABASE_PERSISTENCE_POLICY_UNSPECIFIED",
"enableVariableMasking": true,
"cloudLoggingDetails": {
"cloudLoggingSeverity": "INFO",
"enableCloudLogging": true
}
}
sample-config.json:
{
"`CONFIG_CONN_NAME`": "projects/$PROJECT/locations/$REGION/connections/pubsub",
"`CONFIG_ENV`": "qa",
"`CONFIG_URL`": "https://mocktarget.apigee.net/echo",
"`CONFIG_TOPIC`": "projects/$PROJECT/topics/mytopic"
}
Replace the PROJECT and REGION values with your QA project and region. Notice that the values of CONFIG_ENV and CONFIG_URL have changed.
Commit all the content to the dev branch.
git commit -am 'adding qa configurations'
git push
Create the QA branch
git checkout -b qa
git push
Manual deployments to QA
The artifacts in the qa branch can now be deployed to the QA environment (GCP project). Set integrationcli preferences for the QA environment:
token=$(gcloud auth print-access-token)
project=<set QA project here>
region=<set QA region here>
integrationcli prefs set -p $project -r $region -t $token
integrationcli can be used to apply the changes generated by scaffold. This automatically provisions all the necessary entities in the right order:
integrationcli integrations apply -f . -e qa --wait=true
Once you test the integration, you should see that it points to the new URL and sends “qa” in the custom header, as set in the sample-config.json file.
Integration with Cloud Build
To promote changes to upper environments such as QA and Stage, you need to set up a DevOps pipeline using, for example, Jenkins, GitLab CI, or Cloud Build. For this demo, we are going to use Cloud Build; you can learn more about Cloud Build here.
This example uses a custom cloud builder called integrationcli-builder:
us-docker.pkg.dev/appintegration-toolkit/images/integrationcli-builder:latest
There are a few options in the Cloud Build YAML to consider:
substitutions:
  _LOCATION: "us-west1" # update if your Integration region is different
  _CREATE_SECRET: "false" # whether to create Secret Manager secrets
  _GRANT_PERMISSIONS: "true" # whether to grant service account permissions
  _ENCRYPTED: "false" # whether the contents are encrypted with Cloud KMS
  _DEFAULT_SA: "false" # whether to use the default service account
  _SERVICE_ACCOUNT_NAME: # the service account for connectors
  _KMS_RING_NAME: # the name of the KMS key ring
  _KMS_KEY_NAME: # the name of the Cloud KMS key
  _WAIT: "true" # whether to wait for the deployment to complete
Please change them to appropriate values, especially if using Cloud KMS for encryption.
Manual triggering of builds
gcloud builds submit --config=cloudbuild.yaml --project=$project --region=$region
NOTE: The integration revision is labeled (userLabel field) with the SHORT_SHA of the commit in GitHub. This provides traceability between what is stored in the source code repository and the published revision.
Automate deployments with triggers
Set up a Cloud Build trigger as shown in the screenshot below
This will trigger deployments to the qa environment automatically as soon as changes are merged with the qa branch.
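The same trigger can be created from the command line. The repo owner and name below are placeholders; substitute your own GitHub details:

```shell
# Placeholder repo details - substitute your GitHub owner and repository.
REPO_OWNER=my-github-org
REPO_NAME=app-integration-demo

# Create a trigger that runs cloudbuild.yaml on merges to the qa branch.
gcloud builds triggers create github \
  --name="deploy-qa" \
  --repo-owner="$REPO_OWNER" \
  --repo-name="$REPO_NAME" \
  --branch-pattern="^qa$" \
  --build-config="cloudbuild.yaml" \
  --region="$region"
```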
Similar to QA, you can create a directory called “prod”, a “prod” branch, and a pipeline that deploys to your “prod” environment.
Advanced Options
When developing integrations, there may be a need to store sensitive information. For example,
- An authconfig may contain API keys or passwords
- Connections to databases, FTP servers etc. require username and password
Read more about advanced options here to use Cloud KMS to encrypt sensitive information.
Special thanks to Nandan Sridhar for his collaboration on this article and for all his contributions to integrationcli