Can you suggest some best practices around Caching within an API Proxy?

Hi there,

I would like to ask: what are the best practices for using cache, especially when we have multiple conditional flows and multiple target endpoints in the proxy?

Typically the Apigee gateway acts as a “proxy”, a middle-actor between the client (requester) and the upstream service. Rather than calling the service directly, the client calls the proxy. The proxy can then attend to “cross-cutting concerns” - that is to say, stuff that you can factor out of each individual service in your upstream portfolio. This might be authentication (like verifying an API key or token), request mediation, security mediation (applying some different credential to the call to the upstream system), caching, rate limiting, monetization, and so on.

The way I think about it, there are two opportunities to use the cache that is built in to Apigee: use the response cache, and use the cache primitives.

  • ResponseCache. You can use the ResponseCache policy to tell Apigee to respond to a request using information stored in an in-memory cache, rather than calling the upstream system. This saves a call to the upstream system, which means lower load on your systems, and potentially quicker responses delivered to your clients. This can make a significant difference at scale. A typical upstream system might have a mean response time of 150ms or even 500ms. If Apigee can serve a cached response, that can make a big difference for the end-user experience. Typically you want to enable a ResponseCache only for GET requests, as the PUT/POST/DELETE verbs can mutate resources. So in general you should not respond to a POST or a PUT or a DELETE with cached data. A GET, on the other hand, simply reads data. (That’s the convention, anyway.) So it’s safe to respond to GET requests with cached data. How long should you allow items to exist in cache? That depends on the scenario. How often does the data change? And what is the downside of delivering “stale” data to a client request? Take, for example, a request to get locations of retail stores. Those locations change VERY RARELY. Therefore it’s probably quite safe to cache that data for 24 hours or more. On the other hand, a request to retrieve current inventory in a particular store will read data that changes more often. So in that case maybe the Apigee cache should have a 5-minute time-to-live (or expiry). Or maybe 15 minutes. It’s up to you.
  • Cache Primitives. With the PopulateCache and LookupCache policies, you can configure your API proxy to store ANY data in cache. This might be user profile information, routing tables, validation keys, and all sorts of other data. Use these policies to improve performance when many requests need to access the same data, and that data is remote (accessed via ServiceCallout for example).

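To make the first point concrete, here is a sketch of a ResponseCache configuration for the store-locations scenario described above. The policy name, cache resource name, and 24-hour TTL are illustrative choices, not fixed requirements; the SkipCacheLookup/SkipCachePopulation conditions restrict caching to GET requests as discussed.

```xml
<ResponseCache name="RC-StoreLocations">
  <!-- illustrative cache resource name; use one defined in your environment -->
  <CacheResource>my-cache</CacheResource>
  <CacheKey>
    <!-- vary the cache entry by the request URI -->
    <KeyFragment ref="request.uri" />
  </CacheKey>
  <Scope>Exclusive</Scope>
  <ExpirySettings>
    <!-- 24 hours, appropriate for rarely-changing data like store locations -->
    <TimeoutInSec>86400</TimeoutInSec>
  </ExpirySettings>
  <!-- never serve or store cached responses for non-GET requests -->
  <SkipCacheLookup>request.verb != "GET"</SkipCacheLookup>
  <SkipCachePopulation>request.verb != "GET"</SkipCachePopulation>
</ResponseCache>
```

Note that a ResponseCache policy is attached twice in the proxy: once in the request flow (to check for a cached response) and once in the response flow (to populate the cache).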
In general I’d say you don’t need caching unless you want to optimize latency for your users or reduce load on your upstream systems.
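As an illustration of the cache primitives mentioned above, here is a hypothetical LookupCache/PopulateCache pair wrapped around a remote data fetch. The policy names, cache key fragments, variable names, and 5-minute TTL are all assumptions for the sketch.

```xml
<!-- Attach in the request flow: check the cache before fetching remotely -->
<LookupCache name="LC-UserProfile">
  <CacheResource>my-cache</CacheResource>
  <Scope>Exclusive</Scope>
  <CacheKey>
    <KeyFragment>userprofile</KeyFragment>
    <KeyFragment ref="request.queryparam.userid" />
  </CacheKey>
  <!-- on a cache hit, the stored value lands in this variable -->
  <AssignTo>cached.profile</AssignTo>
</LookupCache>

<!-- Attach after the ServiceCallout, conditioned on a cache miss -->
<PopulateCache name="PC-UserProfile">
  <CacheResource>my-cache</CacheResource>
  <Scope>Exclusive</Scope>
  <CacheKey>
    <KeyFragment>userprofile</KeyFragment>
    <KeyFragment ref="request.queryparam.userid" />
  </CacheKey>
  <!-- store the body of the callout response (illustrative variable name) -->
  <Source>calloutResponse.content</Source>
  <ExpirySettings>
    <TimeoutInSec>300</TimeoutInSec>
  </ExpirySettings>
</PopulateCache>
```

You would then condition the ServiceCallout (and the PopulateCache) on a cache miss, for example with a step condition like `lookupcache.LC-UserProfile.cachehit == false`, so the remote fetch happens only when the data is not already cached.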

BTW, I recommend that you DO NOT attempt to wrap caches around token validation logic. The OAuthV2 policy with the VerifyAccessToken operation in Apigee is already designed to use the cache implicitly. It will already be fast. Likewise with VerifyAPIKey. It uses caching already.

@dchiesa1 Thank you very much for your thorough explanation of the caching policies.