Sunday, September 29, 2019

Azure VMs – Export to CSV

Abstract

For the human race, there are some common daily tasks you simply must do. Humans can survive without them, but you can't say they are "living the life". Taking a bath is one such task: you can survive without it, but you will not like it. These are the tasks that bring "life" to humans' daily lives and make them enjoy their stay on earth.

Among these humans there is a special category – I call them "Humans who work in the Software field". They can be sub-categorized as below –

1. Software developers
2. IT administrators
3. Project managers
4. Software Engineers
5. Solution Architects

To all these sub-categories you can apply prefixes such as Senior, Junior, Principal, Full Stack, and Distinguished, and in recent times we have a new addition to the prefix list known as "CLOUD". These are humans who can easily survive without a bath, but there is one thing without which none of them can survive, and it is known as "Export to CSV". It is as important to them as eating food and drinking water.

No matter how many cutting-edge features and services Microsoft Azure brings, we still feel a product or service is not complete unless it offers "the" functionality of "Export to CSV". Surprisingly, Azure VM export to CSV does not exist on the Azure portal, and you need to write a PowerShell script for this. Therefore my lazy followers and friends asked me to write PowerShell to export Azure VMs to CSV. So here we are!

Let’s go!

Current state of feedback

Users have provided feedback asking for export-to-CSV functionality for all Azure resources on the portal. It is on the roadmap. You can view the details here - https://feedback.azure.com/forums/216843-virtual-machines/suggestions/37934101-virtual-machine-list-export-to-csv.

Why yet another new script?


I spoke to multiple people who deal with Azure VMs on a daily basis, and their requirements for CSV export were an eye opener. Many of the scripts available today provide only minimal details about a VM when exported to CSV, and most Azure administrators find them not so useful. So after a quick survey of a few Lazy followers, I received the below list of top asks for Azure VM to CSV export. The number one ask was to retrieve the Azure VNET and the VNET subnet of an Azure VM using PowerShell. I have addressed this as well in the script.

So, the current script provides CSV output of Azure VMs in great detail. Refer to the below column list –
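A minimal sketch of such an export is below. It assumes the Az PowerShell module and an already authenticated session (Connect-AzAccount); the columns shown are illustrative, not the full list, and the VNET/subnet names are parsed from the NIC's subnet resource ID.

```powershell
# Sketch: export Azure VMs to CSV, including VNET and subnet per VM.
# Requires the Az module and an authenticated session (Connect-AzAccount).
$report = foreach ($vm in Get-AzVM) {
    # The VM references its NIC by resource ID; the NIC's IP configuration
    # carries the subnet ID, which embeds the VNET and subnet names.
    $nicId = $vm.NetworkProfile.NetworkInterfaces[0].Id
    $nic   = Get-AzNetworkInterface -ResourceGroupName ($nicId.Split('/')[4]) `
                                    -Name ($nicId.Split('/')[-1])
    $subnetId = $nic.IpConfigurations[0].Subnet.Id
    [PSCustomObject]@{
        Name          = $vm.Name
        ResourceGroup = $vm.ResourceGroupName
        Location      = $vm.Location
        VmSize        = $vm.HardwareProfile.VmSize
        OsType        = $vm.StorageProfile.OsDisk.OsType
        # .../virtualNetworks/<vnet>/subnets/<subnet>
        VNet          = $subnetId.Split('/')[8]
        Subnet        = $subnetId.Split('/')[-1]
        PrivateIP     = $nic.IpConfigurations[0].PrivateIpAddress
    }
}
$report | Export-Csv -Path .\AzureVMs.csv -NoTypeInformation
```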

Saturday, September 21, 2019

Improve Azure API Management performance by 10x through Caching


Abstract


I have written in many of my Azure API Management blog posts that API Management will become very mission critical in your Azure cloud architecture and strategy, and customer after customer, this continues to be proven. When any component of an enterprise-wide architecture becomes critical, it also gives birth to critical problems and tests your architecture skills in every respect.

I have seen a repetitive pattern at customers who have embarked on the journey of Azure API Management: performance issues are inevitable beyond a certain massive load, and you need to tweak Azure API Management so it continues to serve at an optimum level. By far, in the history of web applications, caching mechanisms have given the most awesome performance boosts. The same is true of Azure API Management, and I will explore it in this blog post.

Let’s go!

Caching in Azure API Management highlights


When we say "caching in API Management", we are essentially talking about caching the RESPONSE of the backend API inside the storage [not Azure Storage] of Azure API Management. You should always, always cache only GET methods.

Warning - Technically you can cache POST methods too, but it is not recommended; I would call it a hack. PUT and DELETE methods are never to be cached.

GET method caching is the recommended approach for caching API responses. If you use Azure API Management, it acts as middleware and provides the caching facility for all backend APIs, saving you from implementing caching in each individual backend API.
Refer to below diagram –



When a user requests data from API Management using a REST API GET method for the first time, the response is served by the backend API. This response is cached in API Management's built-in storage and also passed on to the user. The next time the user calls the same method, the cached response is sent back without the request ever reaching the backend API, until the time-to-live has elapsed. This saves a round trip to the backend API, so the response is delivered to the user at lightning speed, resulting in an overall improvement in API response times and latency.

Azure API Management Caching Levels

In Azure API Management, when you add an API, you have the API at a high level and it contains operations. An API can also belong to multiple Products; an individual API operation can never exist in a Product independently. Caching in API Management is configured using Policies. So, in essence, caching can be implemented at the below levels in Azure API Management, as per the below diagram –



Note – If your API contains operations with a mix of HTTP methods, then you will have to configure the caching policy in Azure API Management at the level of individual operations in the API. Managing the policy configuration at every operation level can become a bottleneck, or at least hectic work.

Therefore, as a best practice, you can design APIs that contain only GET methods and define separate APIs for the rest of the methods. Then add the GET-method-based APIs to Products and define the caching policy in Azure API Management at the Product level. This simplifies the design of your APIs in Azure API Management a lot, and the implementation of caching becomes super easy.
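As an illustration, the policy below sketches the standard cache-lookup/cache-store pair; applied at the Product scope it covers every GET-only API in that Product. The duration is the TTL in seconds, and the `version` query parameter in the vary-by clause is just an example of something you might cache separately per value.

```xml
<policies>
    <inbound>
        <base />
        <!-- Check the API Management built-in cache before calling the backend -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false">
            <vary-by-query-parameter>version</vary-by-query-parameter>
        </cache-lookup>
    </inbound>
    <outbound>
        <base />
        <!-- Store the backend response for 10 minutes (TTL in seconds) -->
        <cache-store duration="600" />
    </outbound>
</policies>
```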

Configuring the Caching policy in API Management

Wednesday, September 4, 2019

Azure DevOps – Build and Release pipeline to Azure File Storage


Abstract

Azure File Storage is a ubiquitous service. It is so useful that I have hardly seen an enterprise not making use of it. Recently a Premium tier of File storage was introduced with 100,000 IOPS, as opposed to standard File storage, which used to offer only 1,000 IOPS. This premium tier now makes Azure File Storage a natural choice even for applications demanding high performance.

I have been architecting many scenarios and implementations where I used Azure File Storage for running web applications. In such a scenario your application binaries, DLLs, and application files [or jar and war files in the case of Java] live on Azure File Storage, and this file share is mapped as a drive to Azure VMs. The Azure VM then runs a web server such as Tomcat or IIS and points its website path to the drive mapped from Azure File Storage. And this works awesome!
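For reference, mapping the share as a drive on the VM is a one-liner; the storage account, share name, and drive letter below are placeholders, and the key is your storage account access key.

```
rem Sketch: map an Azure File share as drive Z: on a Windows VM (names are placeholders)
net use Z: \\mystorageacct.file.core.windows.net\webcontent /user:AZURE\mystorageacct <storage-account-key> /persistent:yes
```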

In today’s enterprise world DevOps has become a regular practice, so naturally, when you plan to deploy web applications, you use pipelines in Azure DevOps. In the above scenario you would need to create Build and Release pipelines that deploy your application to Azure File Storage. Unfortunately, no default task exists in Azure DevOps that can publish the build output to Azure File Storage. This is what I am going to build in this post.

Let’s go!

Setting up pre-requisites

For this blog post I am going to use a .NET Core web application as my sample; it doesn’t matter if you use an example based on another language. I already have a DevOps Organization created, and in it I have created one Project named “MyNETCoreApp”, as shown below –



It doesn’t matter which Agile process option you select while creating the project in Azure DevOps. That selection matters in other important cases of setting up your business process, not in Build and Release pipelines. But that is another blog for another day. Not today!

The next part is about creating the project in Visual Studio [or your favorite IDE] and pushing it to this Azure DevOps project’s repo. I have done this already, because I think it is straightforward and no step-by-step guide is required. So my project repo looks as follows –




So we are all set to create Build and Release pipelines that release to an Azure File Share.

Don’t believe your eyes - Clearing up the confusion

When I asked my regular blog followers about the current topic, most of them responded saying “Azure DevOps already offers a way/task to release the code to Azure Files.” I said: where? How and when did this happen? Then my dearest followers [most of them - not all] sent the below screenshot to me [without the red and green highlights] –



They were not wrong, because they had only given a quick look at the heading. The red-highlighted heading is perfectly fine, and it will make you believe that Azure DevOps does offer a task to publish to an Azure File Share from a release pipeline. However, if you focus on the description of the task, highlighted in green, you will see that the Azure File Copy task does not publish to Azure Files but only to Azure Blob storage and virtual machines. Confusion because of names!

I am sure you know the difference between Azure Blob [HTTPS] storage and Azure File Share [SMB] storage. So, the moral of the story – no default task is provided in Azure DevOps to release code to an Azure File share.

High Level Solution
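Since no built-in task exists, one way to sketch the missing piece is a script step in the release stage that copies the published build artifact to the file share with AzCopy v10, which supports Azure Files endpoints with a SAS token. The storage account, share name, artifact path, and SAS variable below are all placeholders for illustration, not a definitive implementation.

```yaml
# Sketch of a release-stage step: push build output to an Azure File share.
# Assumes AzCopy v10 is available on the agent and $(FileShareSasToken)
# is a pipeline secret variable holding a SAS token for the share.
steps:
- task: CmdLine@2
  displayName: 'Copy build output to Azure File share'
  inputs:
    script: >
      azcopy copy "$(System.DefaultWorkingDirectory)/drop/*"
      "https://mystorageacct.file.core.windows.net/webcontent?$(FileShareSasToken)"
      --recursive
```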