Saturday, October 18, 2014

Multiple web applications in a single cloud service web role with multiple SSL certificates and custom domains over HTTPS using Server Name Indication


I know the title of this post is quite long and hard to grasp in the first go, but I wanted to highlight the important aspects I am going to touch on in this article, hence the long name.
Ok, so in this post I am going to demo how you can host multiple web applications in a single cloud service. You may say there are numerous posts on the same topic, so what is different here? The difference is this: most of those posts talk about hosting multiple web sites in a single cloud service web role, but they do not focus on running them over HTTPS with custom domain names. That is exactly what I am explaining in this post.
So in essence, we will first see how to host multiple web applications in a single cloud service web role. Then we will see how the same applications can be hosted over HTTPS using SSL certificates in a single cloud service web role. After that, we will extend the discussion to hosting the same apps on custom domains using a new feature of IIS 8.0 called Server Name Indication (SNI).
Applicable software stack – VS 2013 and above, Azure SDK 2.3 and above, IIS 8.0 and above [IIS 7 is not supported].
Create required project in Visual Studio
This part is a pretty straightforward implementation. To start with, I created a simple cloud service project in VS2013 and added one web role to it. I renamed the web role to ManagementAppWebRole as shown below –

I changed the Default.aspx UI text to identify it as the Management app. Then I added another ASP.NET web project in the same solution; this will be my Survey app. Here too I changed the Default.aspx text to identify it as the Survey app. So overall, my project structure is as follows –

Hosting multiple web applications in the same cloud service web role
Okay, so here I am presenting a way by which you can have two web sites hosted in the same cloud service web role. In my case, one of the applications is the Management app web role itself and the other is the Survey web application, which is not part of the cloud service. So I will run the Survey app as part of the Management app web role. To configure this, the full IIS capability can be leveraged. The cloud service project has a special file known as the ServiceDefinition.csdef file. The current structure of my .csdef file is as shown below –


The <Sites> tag is the key to making this possible. I just added another <Site> entry in the .csdef file and provided the actual physical path of my Survey web application. Most importantly, notice that I have provided the host header for my Survey app as “surveyapp.kunal.com”. This is the main differentiator for the Survey app running in the Management app web role. IIS does not allow more than one site to listen on the same port without a host header, and both of our applications will be running on the same endpoint port, which is 80. Therefore there has to be some way for IIS to understand which application should serve a given user's request, and the host header is that way. The complete structure of my .csdef file is as shown below –
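For reference, here is a minimal sketch of what this <Sites> section can look like (the service, site, and endpoint names and the physicalDirectory path are illustrative, not my exact project values):

<ServiceDefinition name="MultiWebAppDemo" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="ManagementAppWebRole" vmsize="Small">
    <Sites>
      <!-- Default site: the Management app itself -->
      <Site name="Web">
        <Bindings>
          <Binding name="HttpIn" endpointName="Endpoint1" />
        </Bindings>
      </Site>
      <!-- Second site: the Survey app, differentiated by its host header -->
      <Site name="SurveyApp" physicalDirectory="..\..\..\SurveyApp">
        <Bindings>
          <Binding name="SurveyHttpIn" endpointName="Endpoint1" hostHeader="surveyapp.kunal.com" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    </Endpoints>
  </WebRole>
</ServiceDefinition>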

Now, in reality my host header site “surveyapp.kunal.com” does not exist. Therefore, to make the Survey app resolve to this URL, I need to update the “hosts” file of my machine. The hosts file is located at C:\Windows\System32\drivers\etc\hosts. I copied the file to the desktop, opened it in Notepad, and added an entry as highlighted below –
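The entry simply maps the host header name to the loopback address; in its standard form it looks like this:

127.0.0.1    surveyapp.kunal.com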
Then I copied the hosts file from the desktop back to its original location.
Now let's run the project in debug mode by pressing F5; it opens the browser with the Management app at a URL similar to http://127.0.0.1:60123/. Here 60123 is the port opened by my emulator; this will differ in your case. Open another browser window, copy the same Management app URL, and just replace 127.0.0.1 with our host header name, i.e. surveyapp.kunal.com. So the final URL in the new browser window will be http://surveyapp.kunal.com:60123/ and bingo!! I see my Management app and Survey app running on the same port, as shown below –

Let's run the same implementation in Full IIS and Full Emulator mode. This can be configured from the cloud service project properties: right-click the cloud service project and select Properties.
Note – With Azure SDK 2.4 the Full Emulator has been deprecated; however, I guess you should be able to run Full IIS with Emulator Express. This is just what I think!!
Ok, now let's run the project by pressing Ctrl+F5 [DO NOT RUN IN DEBUG MODE!!] and here is what I see in the browser and in IIS –

As you can see, the host header has been applied to my Survey application. This is how we can run multiple web applications in a single cloud service web role.
Advantage – You are running the two different web apps in the same IIS. In other words, you are running two different web applications on the same Azure role instance, and hence you save money. With two different roles for the two web apps, you would be charged for the role VM instances of both applications; in this case you are charged for only one role's VM instances.
In the next part we will see how we can run these web applications over HTTPS with custom domains. That means, just as I have the custom domain “surveyapp.kunal.com” mapped for the Survey app, we will add a custom domain entry in the hosts file for the Management app as “managementapp.kunal.com”, and then run both of them over HTTPS with certificates.

Monday, August 25, 2014

Using SAS, SAS renewal, and the REST API to upload large files to Azure blob storage in parallel and async


First of all, thanks for the overwhelming response to my earlier blog post on uploading large files to an Azure block blob in parallel and async.
At the bottom of this post you will find the link for downloading the code sample.
In the current post I will extend the same code library to support the REST API for performing Azure blob uploads. Also, as a best practice, you should always use SAS (Shared Access Signature) for performing any operation on blob storage; therefore I will extend the code to support SAS as well.
Why do we need to renew or extend an Azure SAS –
Using SAS with blob storage is a great option for adding one more level of security to Azure storage operations; however, anyone who gets access to the SAS URL can perform malicious operations. Therefore, keeping the SAS expiry time to a minimum is always a good practice. But again, if we are uploading fairly large files, then a minimal SAS expiry alone will not help.
For example, let's consider a scenario: say I upload a file of size 100 MB to Azure blob storage using the REST API and SAS. As a best practice, and as already stated in my previous blog post on uploading large files to an Azure block blob in parallel, the files to be uploaded need to be sliced into chunks.
Now suppose I have set the expiry time to 5 minutes. Of course, the upload of a 100 MB file will not complete in 5 minutes, and hence my block upload will fail in between.
Solution -
To continue uploading large files to an Azure block blob seamlessly after the SAS expiry time, I need to renew the SAS token. This is exactly what I am doing in this blog post and code sample.
So essentially, I check the response status code of every block upload. If the response status code is 201 Created, it means the current block was uploaded, and I continue to the next one. If the response status code is 403 Forbidden, it means the block upload failed due to an invalid SAS; this is where I mark that particular block as Not Added. Once the loop is over, I retrieve the list of failed blocks and perform the same operation with a new SAS. To perform the same upload operation with the new SAS, I am using a popular and basic concept you probably already know, called a RECURSIVE FUNCTION.
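To make the idea concrete, here is a minimal sketch of that retry loop (the getSas delegate and the shape of the block list are hypothetical stand-ins for the code in the downloadable sample):

using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

public static class BlockUploader
{
    // Uploads the given blocks over the REST API; any block rejected with
    // 403 Forbidden (expired SAS) is retried recursively with a fresh SAS.
    // Each pair is (base64 block ID, block bytes).
    public static void UploadBlocks(List<KeyValuePair<string, byte[]>> blocks, Func<string> getSas)
    {
        // getSas returns the blob URI with the SAS query string already attached.
        string sasUri = getSas();
        var failedBlocks = new List<KeyValuePair<string, byte[]>>();

        foreach (var block in blocks)
        {
            // Put Block: PUT <blobUri>?<sas>&comp=block&blockid=<base64 id>
            var request = (HttpWebRequest)WebRequest.Create(
                sasUri + "&comp=block&blockid=" + Uri.EscapeDataString(block.Key));
            request.Method = "PUT";
            request.ContentLength = block.Value.Length;
            using (Stream body = request.GetRequestStream())
                body.Write(block.Value, 0, block.Value.Length);

            try
            {
                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    // 201 Created => block stored, continue to the next one.
                }
            }
            catch (WebException ex)
            {
                var response = ex.Response as HttpWebResponse;
                if (response != null && response.StatusCode == HttpStatusCode.Forbidden)
                    failedBlocks.Add(block);   // SAS expired: mark as Not Added
                else
                    throw;
            }
        }

        if (failedBlocks.Count > 0)
            UploadBlocks(failedBlocks, getSas);   // recursive retry with a new SAS
    }
}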

Thursday, August 21, 2014

JavaScript runtime error: '$' is undefined – Azure web role and jQuery

I wanted to use jQuery in my Azure ASP.NET web role. I created a cloud service project and then added one ASP.NET web role to it.
Then, to use jQuery, I decided to use an HTML page; therefore I added an HTML page to my web role sample. I wrote some jQuery code, and at runtime I received the following error –
JavaScript runtime error: '$' is undefined
This error is encountered because jQuery is referenced in the HTML page, but its path is incorrect.
If you open Solution Explorer you will find that jQuery is already added to your web role project under the Scripts folder, as shown below –

However, in the HTML page it is referenced with an incorrect path, as shown below –
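Reconstructed from the description that follows, the broken reference pointed at a non-existent js folder, roughly like this:

<script src="js/jquery-1.10.2.min.js"></script>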

The js folder referenced above does not exist; the jQuery files are present in the Scripts folder. Therefore I changed the path of the jQuery file as below –
<script src="Scripts/jquery-1.10.2.min.js"></script>
And you are done. No runtime error was displayed after this change.
Hope this helps.
Cheers!!!

Tuesday, August 5, 2014

Session management using SQL Azure and Update on automatic session clearance


Session management using InProc in Azure roles is a dangerous thing!!!
Why is InProc dangerous?
I created an Azure cloud service project and added a web role to it. Then I added a sample page named SessionDemo. To test saving and retrieving session data, I added the following controls inside the <form> element on the page –
<asp:TextBox ID="TextBox1" runat="server"></asp:TextBox>
<asp:Button ID="btnAddToSession" runat="server" OnClick="btnAddToSession_Click" Text="Add To Session!!" />
<br />
<br />
<asp:Button ID="btnReadFromSession" runat="server" OnClick="btnReadFromSession_Click" Text="Read From Session!!" />
<asp:Label ID="Label1" runat="server" Text="Label"></asp:Label>
<br />
<br />
<asp:Label ID="lblInstanceId" runat="server" Text="Label"></asp:Label>
In the code-behind I added the simple code below to add a value to the session and retrieve it. Also, in the Page_Load event I retrieve the ID of the role instance, to check which instance is serving my current request.
public partial class SessionDemo : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        lblInstanceId.Text = "You are being served from - " + RoleEnvironment.CurrentRoleInstance.Id;
    }

    protected void btnAddToSession_Click(object sender, EventArgs e)
    {
        Session["MySession"] = TextBox1.Text;
    }

    protected void btnReadFromSession_Click(object sender, EventArgs e)
    {
        if (Session["MySession"] != null)
            Label1.Text = Session["MySession"].ToString();
    }
}

In the web.config file, as shown below, I have configured my session state as InProc under the <system.web> tag –
<sessionState mode="InProc"/>
Now right-click the cloud service project, select Properties, and set the application to run in Full Emulator and Full IIS mode as shown below –
 
Note – The option to run the cloud service project locally in Full Emulator mode is applicable only up to Azure SDK 2.3; in Azure SDK 2.4 the full emulator is deprecated. I don't know why they deprecated it.
We should always set the number of instances for an Azure role to a minimum of 2. Therefore, right-click the web role node under the cloud service project and, in its properties, set the instance count to 2.
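For reference, this setting lands in the ServiceConfiguration.cscfg file as an instance count on the role, roughly like this (the role name here is illustrative):

<Role name="WebRole1">
  <Instances count="2" />
</Role>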
 
 
 
We are all set to see how InProc session management in an Azure web role is a devil. I set the cloud service project as the startup project, set SessionDemo.aspx as the start page, and pressed F5. Here is what I see on my page and in IIS –

So as you can see, two instances got created in IIS as expected, and below that the UI shows the simple design for testing sessions. Also, as you can see in IE, my current request is being served from INSTANCE 0.
Now let's say I add the value “kunal” to the session by clicking the “Add to Session” button and retrieve it by clicking the “Read from Session” button. I find that my requests are still being served from INSTANCE 0 and the session value is retrieved correctly. Now I right-click the instance 0 website in IIS and select Manage Web Site | Stop. This actually stops the 0th instance running in IIS. Remember, my project is still running in debug mode; I have just stopped one of the role instances of my Azure application.

Once the IIS website is stopped, you can see the black (stopped) icon on the website as shown. This confirms that the website is stopped.

Now I refresh my browser 2-3 times. My requests will now start getting served from INSTANCE 1, because instance 0 is stopped, and here is what I see –

As you can see above, my request is now served from INSTANCE 1 and my session value is not retrieved on refresh. Therefore I click the “Read from Session” button again and DAMN!!!
I do not receive the value “kunal” that I had added to my session. This is the problem!!! Ideally I should receive my session values irrespective of which role instance serves my request.
In the Azure environment there is always a possibility that a role instance gets recycled due to hardware failures or other errors. This means that if you configure only 1 instance for your role, there is a chance your role instance goes down and you DON'T GET HIGH AVAILABILITY for your application. Therefore it is always a BEST PRACTICE TO HAVE AT LEAST 2 INSTANCES for your Azure role.
But again, if you have 2 instances configured for your web role, then session management using InProc will not work, as shown above!!!
What is the Solution?
You should always store the session state outside your role environment. How do I do that?
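Given the title of this post, the direction is SQL Azure. A hedged sketch of the web.config change, using the ASP.NET Universal Providers session state provider pointed at a SQL Azure database (server, database, and credential values are illustrative):

<connectionStrings>
  <add name="SessionStateDb"
       connectionString="Server=tcp:yourserver.database.windows.net;Database=SessionState;User ID=youruser@yourserver;Password=YourPassword;Encrypt=True;"
       providerName="System.Data.SqlClient" />
</connectionStrings>
<system.web>
  <sessionState mode="Custom" customProvider="DefaultSessionProvider">
    <providers>
      <add name="DefaultSessionProvider"
           type="System.Web.Providers.DefaultSessionStateProvider, System.Web.Providers"
           connectionStringName="SessionStateDb" />
    </providers>
  </sessionState>
</system.web>

With this in place, session data survives instance recycles because every instance reads and writes the same external store.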

Saturday, August 2, 2014

Azure Automation - shutdown azure virtual machine – step by step

Update - 21st Nov 2019

This document is old now. You can refer to the article below to understand the concepts of Azure Automation and Azure VM shutdown. If you plan to do hands-on work and implement one, refer to the updated and recommended guide here - https://www.sanganakauthority.com/2017/07/start-stop-multiple-azure-vms-on.html



Why?
Managing Azure resources using the Management portal is a cumbersome job for infrastructure guys. At one of my customers' companies (where I am supporting an Azure project as a consultant), the developers used to start their Azure VMs in the morning, and many times they never stopped them while leaving the office. Of course, this incurred huge costs. Therefore, the IT team used to check whether the development Azure VMs were running and, if so, shut them down from the Azure portal to save the cost of running them overnight. The number of VMs to stop was around 30 to 50, and they had been doing this manually with a combination of PowerShell scripts and a few operations from the management portal. This simple activity consumed 25% of the IT people's bandwidth and was a real problem for them. Thanks to Azure Automation, the IT guys don't have to shut down Azure VMs manually anymore, and they are happy now.
What?
So, Azure Automation is a new service in preview. Well, Azure Automation is not the only service that can automate common Azure activities; PowerShell has existed for a long time, and in fact PowerShell is the basis of the Azure Automation service. You can automate the creation, deployment, monitoring, and maintenance of resources in your Microsoft Azure environment using runbooks, which ultimately use Windows PowerShell workflows.
Of course, Chef and Puppet also do the same automation job very well; however, I find them pretty complex. I know many of you may not agree, but I feel Chef and Puppet are best for Linux/Unix-based Azure VMs. For Windows-based Azure VMs, Azure Automation with PowerShell is your key.
In this post, I will give a step-by-step approach to shutting down your Azure virtual machine using PowerShell and Azure Automation. So let's start!!
You may not have observed it, but I feel the concept of Azure Automation is very similar to what Chef does for automation. See the comparison below between Chef and Azure Automation –
Chef jargon – Recipe, Cookbook
Azure Automation jargon – Job, Runbook
Cookbook – Runbook!!! Of course, it is just an observation.
Runbook – A runbook is a set of PowerShell commands that get executed based on a schedule set in Azure Automation. So the book has sentences (commands) that run in the Azure Automation service. In Azure Automation we always execute PowerShell scripts inside a runbook.
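To give a feel for it, here is a hedged sketch of such a runbook (the workflow name, cloud service name, and VM name are placeholders; the connection setup is covered in the steps that follow):

workflow Stop-DevVm
{
    # Connection/certificate setup for the subscription goes here
    # (shown step by step later in this post).

    # Stop and deallocate the VM so it stops incurring compute charges.
    Stop-AzureVM -ServiceName "YourCloudServiceName" -Name "YourVMName" -Force
}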
Activate
At the time of writing this post, Azure Automation is in preview mode, and hence you need to enable it for your subscription from here - https://account.windowsazure.com/PreviewFeatures
Once it becomes generally available, this step will not be required.
Automation Account
First, create an Automation account as shown in the screenshots below –

 
As a preview feature, Automation accounts are supported only in the US region as of now.
Certificate Management
An Automation credential is either a username and password that can be used with Windows PowerShell commands, or a certificate that is uploaded to the server. We will use the certificate-based approach; therefore we need a certificate to authenticate against the Azure subscription. The best way is to use a self-signed certificate, created either with the makecert command or from IIS itself.
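For the makecert route, a typical invocation for a self-signed management certificate looks roughly like this (the certificate and file names are illustrative):

makecert -sky exchange -r -n "CN=AzureAutomation" -pe -a sha1 -len 2048 -ss My "AzureAutomation.cer"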
Let's look at the IIS way. Open the Run window and type INETMGR to open IIS Manager. Select the local machine name and double-click the Server Certificates option as shown below –

Saturday, July 26, 2014

Azure Virtual Machine using Powershell – common operations


This is a very common scenario I have observed: many times IT guys need to perform day-to-day Azure tasks with the help of PowerShell commands, and they find dozens of ways to do it. In this post I am providing a list of all the common, basic operations on an Azure virtual machine using PowerShell. If I have missed any operation on an Azure virtual machine with PowerShell, feel free to comment.
There are numerous ways to perform any task in PowerShell; I will be showing the easiest way to achieve each one. As expected, this post is also going to be a big one!!!
Setting up your Azure account in the PowerShell window
This is a mandatory step; without it you cannot perform a single operation using Azure PowerShell. There are two ways to set up your credentials in Azure PowerShell: one is to use an Azure publish settings file and import it into PowerShell; the other is to log in directly with your credentials using Add-AzureAccount. Here I am showing the second way –
#add azure account to powershell current session
Set-ExecutionPolicy RemoteSigned
Add-AzureAccount
Select-AzureSubscription -SubscriptionName  "Your Subscription name" 

Setting up Storage Account
Azure virtual machine disks and images are stored in Azure Storage, therefore we need a storage account. If you have an existing storage account you can certainly use it, or create a new one. The following commands show how you can create a new storage account and how you can turn geo-replication ON or OFF for the account. I did not want geo-replication, therefore, to save on cost, I am making the account locally redundant by specifying -GeoReplicationEnabled $false.
Here are the complete commands –
#this section is required only if you don't have a storage account available; if you have one ready then use that.
#set storage account details
$location = "West Europe" #this can be any location of your choice
$storageAccountName = "YourStorageAccountName"

#create the new storage account in the chosen location
New-AzureStorageAccount -StorageAccountName $storageAccountName -Location $location

#set geo replication to locally redundant. If you wish to set it to geo redundant then set the parameter below to $true
Set-AzureStorageAccount -StorageAccountName $storageAccountName -GeoReplicationEnabled $false

#set newly created storage account for current subscription
Set-AzureSubscription -SubscriptionName "YourSubscriptionName" -CurrentStorageAccountName $storageAccountName

Setting up an Azure cloud service (optional)
The following command actually creates a cloud service in your subscription; if you have an existing cloud service, you can use that to deploy your VM –
#create cloud service; if one is already present then you can use that
$cloudServiceName = "YourCloudServiceName"
New-AzureService -ServiceName $cloudServiceName -Location $location


Selecting the latest published Azure virtual machine image using PowerShell
First run the command –
Get-AzureVMImage
This lists all the available VM images. I will base the rest of this article on an Azure virtual machine running SQL Server, therefore I searched for the SQL Server family as shown below –
 
I used the following command to list images by published date in descending order, so that I can select the latest image available in the location; the result screenshot follows.
#retrieve the latest image of your choice based on published date
$images = Get-AzureVMImage | where { $_.ImageFamily -eq "SQL Server 2012 SP1 Enterprise on Windows Server 2012" } | where { $_.Location.Split(";") -contains $location } | Sort-Object -Descending -Property PublishedDate 

$latestImage = $images[0]
$latestImage

 
The highlighted area shows the latest SQL Server version available as of today in the West Europe region, and it got selected correctly.

Set up credentials for the Azure virtual machine using PowerShell
In this step let's set up the credentials for the Azure virtual machine. First, declare the variables with appropriate values –
#set virtual machine variables
$vmName = "YourVMName"
$adminLogin = "YourUserName"
$adminPasswd = "YourPassword"
Create an Azure virtual machine using PowerShell
Let's first define the VM-specific variables as follows –
#set virtual machine variables
$vmName = "powershellvm"#should not be more that 13 characters as of now
$instanceSize = "Basic_A1"
#Possible values - #ExtraSmall,Small,Medium,Large,ExtraLarge,A5,A6,A7,A8,A9,Basic_A0,Basic_A1,Basic_A2,Basic_A3, Basic_A4
All the commands below this point use the variables declared above, so in case you cannot tell where a particular variable's value comes from, do not hesitate to look back up. :)
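Putting the variables above together, a minimal sketch of the actual VM creation (classic service management cmdlets; tweak to your needs) looks like this:

#build the VM configuration from the latest image selected earlier
$vmConfig = New-AzureVMConfig -Name $vmName -InstanceSize $instanceSize -ImageName $latestImage.ImageName |
            Add-AzureProvisioningConfig -Windows -AdminUsername $adminLogin -Password $adminPasswd

#create the VM in the existing cloud service (omit -Location because the service already exists)
New-AzureVM -ServiceName $cloudServiceName -VMs $vmConfig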

Sunday, July 13, 2014

Upload large files to Azure block blob storage in parallel and Async using C#

Alright guys, read carefully. This blog post explains the code by which you can UPLOAD MULTIPLE LARGE FILES TO AZURE BLOB STORAGE IN PARALLEL (PRECISELY, BLOCK BLOBS).
What am I doing, and what am I not?
1. Ask the user to provide a list of large files and blob names to upload in one go, in parallel.
2. The code will use the TPL (a Parallel.ForEach loop, precisely) to perform simultaneous uploads of Azure blobs.
3. The code will chop the large files into multiple blocks.
4. Every block of a file is uploaded synchronously; I am not performing any async operation while uploading individual blocks to the Azure blob. (Precisely, I will be using the PutBlock and PutBlockList methods; see the sketch after this list.)
5. However, the UI project (in my case a console application; it could also be a worker role) calls the blob upload method asynchronously with the help of BeginInvoke and EndInvoke.
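Before diving in, here is a minimal sketch of the block-upload core for a single file (the block size and ID scheme are illustrative; the full sample wraps calls like this in a Parallel.ForEach over the file list):

using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.WindowsAzure.Storage.Blob;

public static class LargeBlobUploader
{
    // Splits the file into 4 MB blocks, uploads each with PutBlock,
    // then commits the whole list with PutBlockList.
    public static void UploadFile(CloudBlockBlob blob, string filePath)
    {
        const int blockSize = 4 * 1024 * 1024;
        var blockIds = new List<string>();

        using (FileStream file = File.OpenRead(filePath))
        {
            var buffer = new byte[blockSize];
            int read, blockNumber = 0;
            while ((read = file.Read(buffer, 0, blockSize)) > 0)
            {
                // Block IDs must be base64 strings of equal length within a blob.
                string blockId = Convert.ToBase64String(BitConverter.GetBytes(blockNumber++));
                blockIds.Add(blockId);

                using (var blockData = new MemoryStream(buffer, 0, read))
                    blob.PutBlock(blockId, blockData, null);
            }
        }

        blob.PutBlockList(blockIds); // commit the uploaded blocks as the final blob
    }
}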
Applicable technology -
I am using VS2013, Azure SDK 2.3, and the storage library “Windows Azure Storage 4.1.0”, released on 23rd June 2014.
Implementation
In a real-world scenario we usually perform large file uploads to Azure blob storage from a worker role; here, however, I will show the code in a console application. Don't worry, it can easily be converted to worker-role-specific code. :)
Alright so, let’s start with it!!
Again – this might also be a long post due to heavy code blocks, so be prepared.
Reference – 70% of this code is based on the solution provided on CodePlex in this post - http://azurelargeblobupload.codeplex.com/SourceControl/latest#AzurePutBllockExample/
It is a great solution for uploading large files to Azure blob storage. Just awesome!! I will change it a bit so that we can perform parallel uploads of large files.
Let's understand a few important components.