Saturday, September 21, 2019

Improve Azure API Management performance by 10x through Caching


Abstract


I have written in many of my Azure API Management blog posts that API Management will become mission critical in your Azure cloud architecture and strategy, and customer after customer this continues to be proved. Any component of an enterprise-wide architecture that becomes critical will also give birth to critical problems and test your architecture skills in every respect.

I have seen a recurring pattern at customers who have embarked on the Azure API Management journey: once load becomes massive, performance issues are inevitable and you need to tune Azure API Management so it continues to serve at an optimum level. In the history of web applications, caching mechanisms have by far delivered the biggest performance boosts. The same is true with Azure API Management, and I will explore it in this blog post.

Let’s go!

Caching in Azure API Management highlights


When we say “caching in API Management”, we are essentially talking about caching the RESPONSE of the backend API inside the built-in storage [not Azure Storage] of Azure API Management. You should always, and only, cache GET methods.

Warning - Technically you can cache POST methods as well, but it is not recommended; I would call it a hack. PUT and DELETE methods should never be cached.

GET method caching is the recommended approach for caching API responses. If you use Azure API Management, it acts as a middleware and provides a caching facility for all backend APIs, saving you from implementing caching in each individual backend API.
Refer to the diagram below –



When a user requests data from API Management using a REST GET method for the first time, the response is served by the backend API. This response is cached in API Management's built-in storage and also passed on to the user. The next time the user calls the same operation, the cached response is sent back without actually forwarding the request to the backend API, until the time-to-live elapses. This saves a round trip to the backend API, so the response is delivered to the user at lightning speed, improving the overall response times and latency of your APIs.
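In policy terms (which we will configure step by step later in this post), this flow boils down to a cache-lookup check on the inbound path and a cache-store on the outbound path. A minimal sketch, with an illustrative 120-second time-to-live:

<policies>
    <inbound>
        <base />
        <!-- serve the response from the built-in cache if a fresh copy exists -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" />
    </inbound>
    <outbound>
        <!-- otherwise store the backend response for 120 seconds (illustrative TTL) -->
        <cache-store duration="120" />
        <base />
    </outbound>
</policies>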

Azure API Management Caching Levels

In Azure API Management, when you add an API, the API is the high-level container and it holds the operations. An API can also belong to multiple Products; an individual API operation can never exist in a Product independently. Caching in API Management is configured using Policies. So, in essence, caching can be implemented at the following levels in Azure API Management – as per the diagram below –



Note – If your API contains operations with a mix of HTTP methods, then you will have to configure the caching policy in Azure API Management at the level of each individual operation. This can become a bottleneck, and managing the policy configuration of every operation quickly gets hectic.

Therefore, as a best practice, you can design APIs that contain only GET operations and define separate APIs for the remaining methods. Then add the GET-only APIs to Products and define the caching policy in Azure API Management at the Product level, as sketched below. This simplifies the design of your APIs in Azure API Management a lot, and implementing caching becomes super easy.
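A rough sketch of such a Product-scope policy is shown below; the 120-second duration and the public downstream caching type are only illustrative values:

<policies>
    <inbound>
        <base />
        <!-- applied at Product scope: every GET-only API added to this Product is cached -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" downstream-caching-type="public" />
    </inbound>
    <outbound>
        <cache-store duration="120" />
        <base />
    </outbound>
</policies>

The <base /> elements keep any policies defined at the higher (global) scope in effect alongside the caching policy.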

Configuring the Caching policy in API Management


In this part, let us configure caching in API Management. Creating an API Management instance in the Azure portal and adding backend APIs are out of the scope of this article. I already have an Azure API Management instance created, with a sample Get Books API added to it. Let us configure the caching policy.

Caching policy at an individual API operation in Azure API Management

My API Management instance with the Books sample API looks as follows -



As you can see in the above screenshot, as of now there is no caching policy added on any of the GET operations. I ran a few test calls from the Developer Portal, and below are the results and response times.



I did multiple runs and found that the response time averaged around 220 ms. So let us enable the caching policy on the same API. Click on the desired API operation [it should be a GET operation], then inside the Inbound processing tab click on “Add Policy” as shown below –



Then select the Cache lookup policy option as shown below –



On the Cache Response page, keep the setting at BASIC, set the duration to 120 seconds and then click Save. Now click on the “Policy code editor” icon next to the policy, as we need to update the attribute known as “downstream-caching-type” to the value public as shown, and save.
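Once saved, the resulting operation policy should look roughly like the sketch below; the exact markup generated by the portal form may differ slightly:

<policies>
    <inbound>
        <base />
        <!-- downstream-caching-type updated to public as described above -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" downstream-caching-type="public" must-revalidate="true">
            <vary-by-header>Accept</vary-by-header>
            <vary-by-header>Accept-Charset</vary-by-header>
        </cache-lookup>
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <!-- cache the response for the 120 seconds set on the Cache Response form -->
        <cache-store duration="120" />
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>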




Why did we change downstream settings?

Every Cache-Control header carries a cacheability directive, whose values include no-cache, private, public and no-store. Setting it to no-cache forces caches to submit the request to the origin server for validation before releasing a cached copy. Setting it to no-store means the browser will never cache the response at all. Setting it to public means the response may be stored by any cache, including shared caches; however, data held in a public cache is visible to every user, so it is only appropriate for non-sensitive responses. Since we are just doing a demo of cache control in Azure API Management, we are fine using the public directive, which surfaces downstream as a header along the lines of Cache-Control: public, max-age=120. Reference – see this Stack Overflow question.


Open the Developer Portal again and run a few tests on the same API. Below are the results and response times.



From the above screenshot it is clearly visible that with caching the response time is under 10 ms, which is more than a 10x improvement. Also, the “max-age” value counts down from 120. When it reaches zero, the next request will again take around 200 ms because the data is fetched from the origin server; all subsequent requests then drop back to under 10 ms for the next 120 seconds.

Caveats of using the Azure API Management built-in/default cache


While the Azure API Management cache works great, you should also consider the limitations below in your architecture design and approach.

Cache Storage size

The Azure API Management cache works great and is natively built into the API Management service instance. As I noted in my first architecture block diagram above, the storage offered by the built-in cache in Azure API Management is 1 GB, and it cannot expand beyond this limit. For an enterprise application with a large volume of cached data, this may become a cache storage problem.

The solution can be to implement an external cache store such as Azure Cache for Redis. But that’s another blog for another day!
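As a small pointer toward that future post: once an external Redis cache is attached to the API Management instance, the cache-lookup policy can choose which cache to use through its caching-type attribute (internal, external, or prefer-external). A hedged sketch, assuming the external cache has already been configured on the instance:

<inbound>
    <base />
    <!-- prefer-external: use the configured external Redis cache, fall back to the built-in cache if none is configured -->
    <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" caching-type="prefer-external" downstream-caching-type="public" />
</inbound>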

Unavailability of monitoring metrics for cache storage

The default cache of Azure API Management does not have any metrics or alerts available. Imagine a situation where you have implemented this 1 GB cache in your API Management instance, the volume of cached data is high, and it reaches the 1 GB limit. What next? As of now I do not know what happens when the cache storage limit is hit; it is still an unsolved, un-experienced mystery. We also cannot get an alert email when the cache storage is about to fill up, so we cannot take corrective action before hitting this mysterious stage for the API Management cache.

The solution is to use an external cache such as Redis, so that you can also monitor the cache storage used by Azure API Management.

That said, I have seen the 1 GB cache be good enough for large volumes of GET requests when you keep the cache expiration low, and it works great.

Conclusion

I hope this post has helped you focus on caching GET-based REST APIs in Azure API Management and has demonstrated how drastically performance improves.
Happy Caching!!

A humble request!

The internet is creating a lot of digital garbage. If you feel this is a quality blog and someone will benefit from it, don't hesitate to hit the share button below. Your one share will save a developer many precious hours. Thank you.


7 comments:

  1. My only concern is setting downstream-caching-type to public.

  2. Really nice article! I have a requirement to implement a caching policy for an HTTP POST operation. When I implement the above policy for any POST operation, I get the below message in the Trace section

    cache-lookup (0 ms)
    "Request has a non-cacheable HTTP method. Cache Lookup policy was not applied."


    Can you please help us with sample example?

    Regards,
    Naresh

    Replies
    1. You will not be able to cache a POST operation in API Management, and POST operations should not be cached. If you still want to cache POST responses, implement the caching at the backend API level, not in API Management.

  3. Hi, I have an authorization jwt policy at the "All APIs" scope. I have many APIs, and many operations within each API.

    For some operations within an API, I have set a very basic configured cache policy as you have shown (all of the calls return static data). However, the cache policy never works. Using trace, it says
    cache-lookup (0 ms)
    "Request has a non-cacheable HTTP method. Cache Lookup policy was not applied."
    and on cache-store it says the response is not cacheable.

    Do I need to do something with my code?

  4. What is the maximum cached response size? Can we increase it by using Redis cache?
    Why can we not use private as the downstream caching type?

  5. When I tried to autoscale APIM, it took 36 minutes. Is there a way to make it quicker?
