We’re putting together a weather API that is generally open to everyone, but authorised users who send their own API key and username/password (as headers) get a different response from /locations and are allowed to request data for those locations. Since the URL is the same, a simple ResponseCache policy would populate the cache with either the general or the user-specific response, and the other type of user would then be served it. We have tried adjusting the CacheKey to include the unique values the specific user sends, and that does keep the responses going to the correct users.
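For reference, the policy is along these lines (a simplified sketch; the cache name, header name and timeout are placeholders rather than our real values):

```xml
<ResponseCache name="Cache-Weather-Response">
  <!-- Placeholder cache resource name -->
  <CacheResource>weather-cache</CacheResource>
  <CacheKey>
    <!-- Keep different resources (e.g. /locations) in separate entries -->
    <KeyFragment ref="proxy.pathsuffix" />
    <!-- User-specific fragment so authorised responses never reach public callers;
         the header name is illustrative -->
    <KeyFragment ref="request.header.apikey" />
  </CacheKey>
  <ExpirySettings>
    <TimeoutInSec>3600</TimeoutInSec>
  </ExpirySettings>
</ResponseCache>
```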
The problem comes when we get a new forecast. I can call the public endpoints with the header specified in SkipCacheLookup, which avoids checking the cache but still repopulates it from our backend server. However, I can’t send a request that emulates a specific user’s request with SkipCacheLookup, because that would require their credentials.
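The refresh mechanism is roughly this (again, the header name is a placeholder); since there is no SkipCachePopulation condition, the fresh backend response goes straight back into the cache:

```xml
<ResponseCache name="Cache-Weather-Response">
  <CacheResource>weather-cache</CacheResource>
  <!-- Sending this header bypasses the cache lookup but still repopulates the entry -->
  <SkipCacheLookup>request.header.X-Refresh-Cache = "true"</SkipCacheLookup>
  <!-- CacheKey and ExpirySettings as above -->
</ResponseCache>
```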
Idea #1
If the user is specifying credentials we use a different cache and clear that via the Management API when a new forecast is made. But that means we need an automated script with our management credentials (which seems to require Basic Auth).
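A sketch of that split, with a separate cache resource attached only on the flow for credentialed partners (the cache and header names are made up); clearing it would then be a Management API call against that environment cache (something like .../environments/{env}/caches/{cache}/entries?action=clear, if I have the path right), which is where the scripted Basic Auth comes in:

```xml
<!-- Attached only on the conditional flow for credentialed partners -->
<ResponseCache name="Cache-Partner-Response">
  <!-- Separate cache resource so it can be cleared independently of the public one -->
  <CacheResource>partner-weather-cache</CacheResource>
  <CacheKey>
    <KeyFragment ref="proxy.pathsuffix" />
    <KeyFragment ref="request.header.apikey" />
  </CacheKey>
  <ExpirySettings>
    <TimeoutInSec>3600</TimeoutInSec>
  </ExpirySettings>
</ResponseCache>
```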
Idea #2
If the user is specifying credentials we use a different cache with a much shorter expiry time. But that means our partners have to make slow requests to our server more often.
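In that case the partner cache policy from Idea #1 would only differ in its expiry, e.g. (value illustrative):

```xml
<ExpirySettings>
  <!-- Much shorter TTL so stale forecasts age out quickly -->
  <TimeoutInSec>300</TimeoutInSec>
</ExpirySettings>
```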
Idea #3
I set up a public endpoint, only reachable with a specific API key (maybe controlled via an appropriate API Product), that can flush the “different cache” described above. This would mean the partners still make occasional slow requests, but only when new data is available. But it also means I need a way to invalidate a cache from within an API Proxy, and I believe that is only achievable through the Management API.
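Roughly what I have in mind for that flush endpoint (policy and header names are placeholders, and the invalidate step is exactly the part I don’t know how to do from inside the proxy):

```xml
<!-- Conditional flow on the proxy endpoint for the flush operation -->
<Flow name="Flush-Partner-Cache">
  <Condition>proxy.pathsuffix MatchesPath "/cache/flush" and request.verb = "POST"</Condition>
  <Request>
    <!-- Only callers with the dedicated key / API Product get through -->
    <Step><Name>Verify-Flush-API-Key</Name></Step>
    <!-- The missing piece: emptying the partner cache from within the proxy -->
    <Step><Name>Invalidate-Partner-Cache</Name></Step>
  </Request>
</Flow>

<VerifyAPIKey name="Verify-Flush-API-Key">
  <!-- Header name is illustrative -->
  <APIKey ref="request.header.apikey" />
</VerifyAPIKey>
```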
Idea #4
Could the Invalidate Cache policy take a wildcard to invalidate all keys that include a specific user marker?
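Purely hypothetical, but this is the shape of what I’d like to be able to do (names reused from the sketches above; I’m not claiming the policy actually supports a wildcard here):

```xml
<InvalidateCache name="Invalidate-Partner-Cache">
  <CacheResource>partner-weather-cache</CacheResource>
  <CacheKey>
    <!-- Ideally this fragment could act as a wildcard / prefix match, so every
         cached entry carrying this user marker would be invalidated in one go -->
    <KeyFragment ref="request.header.apikey" />
  </CacheKey>
</InvalidateCache>
```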