Application Integration: Integration over Kafka

Hello!

Did you know that you can use Application Integration to easily manage and propagate your Kafka events to the cloud?

Kafka is a distributed event store and stream-processing platform. It lets you, for example, produce events in real time from your digital platforms, something that is much harder to achieve with a point-to-point approach.

Your IT systems can then consume the events from the topic depending on their needs (for example, to be notified of a user action or a bug in the platform).
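To give an idea of what consuming from a topic involves, here is a minimal sketch of the subscriber loop each system would otherwise have to run itself, using the confluent-kafka Python client. The broker address (localhost:9092) and topic name ("events") are placeholder assumptions, not values from this setup:

```python
import json
from confluent_kafka import Consumer

# Minimal consumer loop: roughly what each IT system would need to implement
# on its own. "localhost:9092" and "events" are placeholder assumptions.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "my-it-system",       # each consuming system gets its own group
    "auto.offset.reset": "earliest",  # read from the beginning on first run
})
consumer.subscribe(["events"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1 second for an event
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value().decode("utf-8"))
        print(f"Received event {event['Id']} of type {event['type']}")
finally:
    consumer.close()
```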

The only issue is that integrating IT systems with Kafka can be complex, because most of these systems were not designed to subscribe to a topic.

This is where Application Integration comes into play: a simple, managed way to modernize your digital experience using Kafka. Application Integration subscribes to the topic and propagates the right information in real time to your IT systems using APIs or connectors.

It also provides a mediation layer between Kafka and your IT systems.

I made a few demos with different IT systems that work pretty well.

Here is a quick guide to start on your own:

  1. Install Kafka (locally or in the cloud)

Follow the Kafka quickstart to download Kafka, start the Kafka environment, and create a Kafka topic.
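If you prefer to script this step, you can also create the topic programmatically with the confluent-kafka Admin API. This is a sketch equivalent to the quickstart's shell command; the topic name "events" and the local broker address are assumptions:

```python
from confluent_kafka.admin import AdminClient, NewTopic

# Create the demo topic; equivalent to the quickstart's kafka-topics.sh --create.
# "events" and "localhost:9092" are placeholder assumptions.
admin = AdminClient({"bootstrap.servers": "localhost:9092"})
futures = admin.create_topics([NewTopic("events", num_partitions=1, replication_factor=1)])
for topic, future in futures.items():
    future.result()  # raises a KafkaException if creation failed
    print(f"Topic {topic} created")
```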

  2. Configure the Kafka Connector

Application Integration now offers a native Kafka trigger. You can subscribe directly to the topic you created.

  • Create the Kafka Connection: In the Google Cloud Console, navigate to Integration Connectors. Create a new connection, select Kafka, and input your Kafka environment details (Bootstrap servers, authentication, etc.) to establish the link between your cloud environment and your Kafka instance.

  • Configure the Kafka Trigger: Open your Integration designer and select the Kafka Trigger.

  • Select the Kafka Connection you created in the previous step.

  • Enter the name of the Kafka topic you created earlier.

  • This trigger will now listen for new events in real time and initiate your integration flow automatically.

  3. Create the Kafka trigger with the topic name

  4. Create a Data Mapping task to extract the "type" variable from the Kafka message (JSON); a sketch of this routing logic follows the list

  5. Create an Email task that sends the message content to your email address

  6. Create a REST task to send the message via API (for example, to an Apigee endpoint)

  7. Use a ServiceNow connector instance to create a new incident

  8. Use a Salesforce connector instance to push a new event to the Platform Event you defined
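As promised above, here is the routing logic these steps implement, expressed as plain Python for illustration. The helper functions are hypothetical stand-ins for the Email, REST, ServiceNow, and Salesforce tasks; in Application Integration you configure these visually rather than in code:

```python
import json

# Hypothetical stand-ins for the integration tasks; in Application Integration
# these are configured in the designer, not written as code.
def send_email(user, message):
    print(f"Email task -> {user}: {message}")

def post_to_api(event):
    print(f"REST task -> Apigee endpoint: {event}")

def create_servicenow_incident(event):
    print(f"ServiceNow connector -> new incident: {event['message']}")

def publish_platform_event(event):
    print(f"Salesforce connector -> Platform Event: {event}")

def route(raw_message: str) -> None:
    event = json.loads(raw_message)  # the Data Mapping task parses the JSON...
    kind = event["type"]             # ...and extracts the "type" variable
    if kind == "email":
        send_email(event["user"], event["message"])
    elif kind == "api":
        post_to_api(event)
    elif kind == "incident":
        create_servicenow_incident(event)
    elif kind == "salesforce":
        publish_platform_event(event)

# Example: route one of the sample messages from below
route('{"Id":"1234","user":"user@domain.com","type":"email","message":"Customer asked to be called by sales"}')
```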

Here are a few sample Kafka messages to test the flow on your end:

Mail over Kafka: {"Id":"1234","user":"user@domain.com","type":"email","message":"Customer asked to be called by sales"}

APIs over Kafka: {"Id":"12345","user":"user@domain.com","type":"api","message":"Profile changed to developer"}

Incidents over Kafka: {"Id":"123456","user":"user@domain.com","type":"incident","message":"Bug report on UI 504 error code"}

Salesforce over Kafka: {"Id":"1234567","user":"user@domain.com","type":"salesforce","message":"New eval created on Apigee"}
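If you want to push these samples into the topic from a script, a quick producer sketch works too (same assumptions as above: confluent-kafka Python client, local broker, topic named "events"):

```python
import json
from confluent_kafka import Producer

# Publish the four sample events so each branch of the integration fires.
# "localhost:9092" and "events" are placeholder assumptions.
samples = [
    {"Id": "1234",    "user": "user@domain.com", "type": "email",      "message": "Customer asked to be called by sales"},
    {"Id": "12345",   "user": "user@domain.com", "type": "api",        "message": "Profile changed to developer"},
    {"Id": "123456",  "user": "user@domain.com", "type": "incident",   "message": "Bug report on UI 504 error code"},
    {"Id": "1234567", "user": "user@domain.com", "type": "salesforce", "message": "New eval created on Apigee"},
]

producer = Producer({"bootstrap.servers": "localhost:9092"})
for event in samples:
    producer.produce("events", value=json.dumps(event))
producer.flush()  # block until all messages are delivered
```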

Have fun and let me know if you have any questions!


Hi, thank you for your post, it helps a lot. Do we need any subscription here? Please help.

Hello! You need access to Application Integration: https://cloud.google.com/application-integration?hl=en

There is a pay-as-you-go offer.