I have a requirement to run a long-running thread that will subscribe to a JMS queue and perform some action when a message is available, like an MDB-style proxy.
Can I use either Java or Node.js to run this in Apigee?
I don’t think there is a way to directly implement an MDB-style proxy. Although, I’m tempted to try deploying a Spring JMS message listener and see what happens.
If we can’t get that working, the other idea is to use the node-scheduler npm. Have the scheduler send an HTTP request to a second API proxy, which triggers that second proxy to read the available messages from the queue.
Good point, Carlos. If you decide to run something like this in Apigee, you will basically lose control over where it runs (at least partially) and how many instances are run, and you should also take into account that coordination among the instances is not possible.
Order: If order matters, you can only use a single queue and a single thread to consume it. I wouldn’t want to design any application like that (with or without Apigee). Messages are timestamped. I would try to use that property to know which message was sent when.
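A sketch of using the timestamp property to restore ordering downstream. JMS stamps each message with a `JMSTimestamp` header; the plain `timestamp` field name on the consumer side here is an assumption about how you map messages into objects:

```javascript
// Sort a batch of consumed messages by the send timestamp each one
// carries, so downstream processing sees them in send order.
function sortByTimestamp(messages) {
  // Copy before sorting so the input batch is not mutated.
  return messages.slice().sort((a, b) => a.timestamp - b.timestamp);
}
```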
Concurrency: If Spring JMS is used, then you can define connection pool settings in the Spring configuration file. I have not tried this for a message listener in Apigee. The Node.js method uses batch processing (hence the scheduler). Each time the scheduler is invoked, you read all the messages currently available on the queue.
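The "read all available messages" step can be sketched as a drain loop. `receiveNoWait` mirrors the non-blocking receive in the JMS API; here it is just any function that returns the next message or `null` when the queue is empty:

```javascript
// Drain everything currently available from the queue, stopping as soon
// as a non-blocking receive reports empty. Later arrivals are picked up
// on the next scheduler invocation.
function drainAvailable(receiveNoWait) {
  const batch = [];
  let msg;
  while ((msg = receiveNoWait()) !== null) {
    batch.push(msg);
  }
  return batch;
}
```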
We run many scheduled jobs at Apigee, some of which follow this pattern:
Expose an endpoint (an Edge Node.js proxy) which:
Receives a batch of items from a queue service, a DB query, etc., OR from the request payload directly
Processes the new items received
Reports status out
We call this from a scheduler (Apache Airflow, cron or Azkaban), or directly from a webhook (SFDC, GitHub, etc.), or from a command line (curl, etc.)
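The three steps above (receive a batch, process the items, report status) can be sketched as one handler function. The item shape, the injected `processItem` callback, and the result fields are all assumptions for illustration:

```javascript
// Handler core for the batch-processing proxy: process each received
// item and accumulate a status report to return as the response body.
function handleBatch(items, processItem) {
  const results = { processed: 0, failed: 0, errors: [] };
  for (const item of items) {
    try {
      processItem(item);
      results.processed += 1;
    } catch (err) {
      results.failed += 1;
      results.errors.push(String(err.message || err));
    }
  }
  return results;
}
```

Keeping the handler a pure function of the batch makes the proxy easy to call from a scheduler, a webhook, or curl alike, and makes retries safe when `processItem` is idempotent.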
A few things to consider are the batch's memory footprint and runtime-length requirements. Our proxies are better solutions for requirements that are quick, atomic and idempotent.
This is essentially a serverless batch-processing or integration pattern.