Thursday, December 29, 2011

Synchronous vs asynchronous

Synchronous messaging –
The dictionary meaning of synchronous is "occurring at the same time".
Synchronous messaging follows the dictionary meaning. It occurs when two systems or applications transmit data back and forth: a message first enters a queue and is then delivered to the recipient.
No further communication can take place until a response is received back by the sender, so the sender waits until the recipient replies. Synchronous messaging is also called synchronous communication.
In synchronous communication, information is transferred at regular intervals, provided a response is received from the recipient each time.
A conversation over a mobile phone is a good example of synchronous communication: you do not take the conversation ahead until you hear a response from the speaker at the other end.
Asynchronous messaging –
The dictionary meaning of asynchronous is "not occurring at the same time".
In asynchronous messaging, information transfer also takes place between two applications or systems; however, the sender does not wait for a response from the recipient. The message first enters a queue and stays there until the recipient picks it up.
The sender can continue sending messages even if no response has been received. The response data may still be required, but not immediately.
Asynchronous messaging is also called "fire and forget" information exchange and is typically provided by message-oriented middleware (MOM).
The best example of asynchronous messaging is email: the sender can keep sending mails even if no response from the recipient has arrived.

An additional component is required in asynchronous messaging to ensure that a message placed in the queue actually reaches the intended recipient; this component affects the performance and reliability of the system.
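To make the contrast concrete, here is a tiny, purely illustrative C# sketch (all names are hypothetical, not from any particular messaging product): the synchronous caller blocks until it gets a reply, while the asynchronous caller simply drops the message onto a queue and carries on.

using System;
using System.Collections.Concurrent;

class MessagingSketch
{
    // stands in for a message queue owned by messaging middleware
    static readonly ConcurrentQueue<string> queue = new ConcurrentQueue<string>();

    // synchronous style: the caller blocks until the recipient answers
    static string SendAndWaitForReply(string request)
    {
        // ...transmit the request, then wait here for the response...
        return "reply to " + request;   // stand-in for the real response
    }

    static void Main()
    {
        // synchronous: nothing else happens until the reply arrives
        string reply = SendAndWaitForReply("order #1");
        Console.WriteLine(reply);

        // asynchronous (fire and forget): enqueue and continue immediately;
        // a separate receiver drains the queue whenever it is ready
        queue.Enqueue("order #2");
        Console.WriteLine("continuing without waiting for a reply...");
    }
}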
Hope the example and explanation above make the difference between synchronous and asynchronous clear.

Cheers…
Happy Messaging!!!

Tuesday, December 27, 2011

Azure Storage Analytics for Azure Storage – Blob, Queue and Tables – Metrics example

Metrics – Using Metrics to track storage usage –
To enable Azure Storage Analytics metrics on Azure storage, refer to the library code in the post – Azure Storage Analytics for Azure Storage Services – Logging example.
You need the complete library code; use the btnSetServiceSettings_Click method to enable Storage Analytics metrics for storage and the btnGetServiceSettings_Click method to verify the settings.
So at this point we have already enabled Storage Analytics metrics for blob storage. The metrics data for Azure blob storage will now be collected in tables named $MetricsCapacityBlob and $MetricsTransactionsBlob.

The screenshot above shows the list of tables that hold the Storage Analytics metrics data.
To read metrics information from the above tables, we need to add two more classes to our storage extension library. I have named these classes MetricsEntities and MetricsHelper. The detailed code for the two classes can be found at the link – Azure Storage Analytics Metrics Classes.
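Just to give a feel for what reading these tables looks like, here is a rough sketch (not the code from the linked post) that queries $MetricsCapacityBlob with the StorageClient data service context. The entity class and its property names are my assumption, mirroring the documented capacity metrics schema, so treat it only as an illustration.

using System;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// hypothetical entity for the $MetricsCapacityBlob table (one row per day);
// property names mirror the documented capacity metrics schema
public class CapacityMetricsEntity : TableServiceEntity
{
    public long Capacity { get; set; }
    public long ContainerCount { get; set; }
    public long ObjectCount { get; set; }
}

public static class CapacityMetricsSample
{
    // lists the daily blob capacity rows for the given storage account
    public static void DumpBlobCapacity(CloudStorageAccount storageAccount)
    {
        CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
        TableServiceContext context = tableClient.GetDataServiceContext();

        var query = (from row in context.CreateQuery<CapacityMetricsEntity>("$MetricsCapacityBlob")
                     where row.RowKey == "data"   // "data" = your blobs; "analytics" = the analytics data itself
                     select row).AsTableServiceQuery();

        foreach (CapacityMetricsEntity row in query.Execute())
        {
            // PartitionKey is the day; Capacity is reported in bytes
            Console.WriteLine("{0}: {1} bytes in {2} blobs", row.PartitionKey, row.Capacity, row.ObjectCount);
        }
    }
}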
So our final project structure will be as shown below –

Azure Storage Analytics for Azure Storage – Blob, Queue and Tables – Logging example


Today I am going to explain how to turn on logging and metrics for Azure storage – blobs, queues and tables. Logging and metrics are the two parts of Storage Analytics in Windows Azure. I will also list a few code samples.
Before going further, I highly recommend that you first read Azure – Need for Storage Analytics and what Storage Analytics is.
Assuming you have gone through the above link and have a fair idea about Storage Analytics, let's understand how Azure Storage Analytics can be used for Azure storage.
Logging – using logs to track storage requests for Azure storage –
Create a sample cloud application with one web role in it. We will use this web role to test the Azure Storage Analytics logging and metrics data.
To enable logging for Azure storage, we will first create a library containing all the necessary functions. The Microsoft Azure storage assembly – Microsoft.WindowsAzure.StorageClient – does not have any methods specific to Storage Analytics logging and metrics, so we have to write extension methods for all the storage services. An MSDN blog has an excellent article, listed here along with code. Create a library project with a suitable name and copy the code for the classes AnalyticsSettings, AnalyticsSettingsExtensions and SettingsSerializableHelper. Just for the sake of my own understanding, I have renamed the class AnalyticsSettings to StorageAnalyticsSettings; a rough sketch of its shape is shown below.
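Just for orientation, here is a rough sketch of the shape of the renamed settings class and the enums, inferred purely from how they are used in the button handlers below; the authoritative implementation is the one you copy from the MSDN article.

using System;

[Flags]
public enum LoggingLevel
{
    None = 0,
    Delete = 1,
    Read = 2,
    Write = 4
}

public enum MetricsType
{
    None,
    All
    // the MSDN code may define further values; only these two are used in this post
}

public class StorageAnalyticsSettings
{
    public LoggingLevel LogType { get; set; }
    public bool IsLogRetentionPolicyEnabled { get; set; }
    public int LogRetentionInDays { get; set; }
    public bool IsMetricsRetentionPolicyEnabled { get; set; }
    public int MetricsRetentionInDays { get; set; }
    public MetricsType MetricsType { get; set; }
}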
So your final project in Solution Explorer, along with the library, will look as follows.

Add a reference to the storage extension library in the web role. I have added a page named Analytics.aspx to my web role to test the storage analytics library.

Add using statements for the following namespaces in addition to the existing ones –
using System.Configuration;
using Microsoft.WindowsAzure.StorageClient;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure;
using AzureStorageExtensions; // or whatever you named your storage extension library
using System.IO;

I have added a button named Set Service Settings. On clicking it, I apply all the Azure Storage Analytics settings to the Azure storage services.

protected void btnSetServiceSettings_Click(object sender, EventArgs e)
{
    StorageCredentialsAccountAndKey storageCredentials = new StorageCredentialsAccountAndKey("YourStorageAccountName", "YourStorageAccountKey");
    // alternatively, you can read the account name and key from the web.config of your web role project
    CloudStorageAccount storageAccount = new CloudStorageAccount(storageCredentials, true);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
    CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

    StorageAnalyticsSettings settings = new StorageAnalyticsSettings()
    {
        // to turn on logging and metrics
        LogType = LoggingLevel.Delete | LoggingLevel.Read | LoggingLevel.Write,
        IsLogRetentionPolicyEnabled = true,      // enable the log retention policy
        LogRetentionInDays = 4,
        IsMetricsRetentionPolicyEnabled = true,  // enable the metrics retention policy
        MetricsRetentionInDays = 7,
        MetricsType = MetricsType.All

        // to turn off logging and metrics, use these instead:
        // LogType = LoggingLevel.None,
        // IsLogRetentionPolicyEnabled = false,
        // IsMetricsRetentionPolicyEnabled = false,
        // MetricsType = MetricsType.None
    };

    // apply the above settings to the blob service
    StorageAnalyticsSettingsExtension.SetServiceSettings(blobClient, settings);
    // apply the above settings to the table service
    StorageAnalyticsSettingsExtension.SetServiceSettings(tableClient, settings);
    // apply the above settings to the queue service
    StorageAnalyticsSettingsExtension.SetServiceSettings(queueClient, storageAccount.QueueEndpoint, settings);
}

The settings above are applied to the blob, table and queue services when this code is executed. Now, if you want to retrieve the settings that have been applied, you can use the following code –

protected void btnGetServiceSettings_Click(object sender, EventArgs e)
{
    StorageCredentialsAccountAndKey storageCredentials = new StorageCredentialsAccountAndKey("YourStorageAccountName", "YourStorageAccountKey");
    // alternatively, you can read the account name and key from the web.config of your web role project
    CloudStorageAccount storageAccount = new CloudStorageAccount(storageCredentials, true);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
    CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

    // retrieve blob service settings
    StorageAnalyticsSettings blobSettings = StorageAnalyticsSettingsExtension.GetServiceSettings(blobClient);
    // retrieve table service settings
    StorageAnalyticsSettings tableSettings = StorageAnalyticsSettingsExtension.GetServiceSettings(tableClient);
    // retrieve queue service settings
    StorageAnalyticsSettings queueSettings = StorageAnalyticsSettingsExtension.GetServiceSettings(queueClient, storageAccount.QueueEndpoint);
    // you can write these settings to a file, display them in your web page, or inspect them while debugging in Visual Studio
}

Once you are done with the above settings and have verified with the get-settings code that they are applied to blob, queue and table, open your storage account in any cloud storage explorer tool. I personally use Cerebrata Cloud Storage Studio, as it is free for the first 30 days.
If you upload or delete a file in blob storage, add or delete an entity in table storage, or similarly perform any operation on a queue, these transactions will be recorded in the logs with the current date and time.
You will see that the $logs container has log blobs created under it, as shown below –

To retrieve and show the names of these log files, I added a simple ListBox and populate it with all the logs on a button click. The code is as follows –
protected void btnListLoggingBlobs_Click(object sender, EventArgs e)
        {
            StorageCredentialsAccountAndKey storageCredentials = new StorageCredentialsAccountAndKey("YourStorageAccount", "YourStorageAccountKey");
            CloudStorageAccount storageAccount = new CloudStorageAccount(storageCredentials, true);
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("$logs");

            BlobRequestOptions blobRequest = new BlobRequestOptions();
            blobRequest.BlobListingDetails = BlobListingDetails.All;
            blobRequest.UseFlatBlobListing = true;

            ResultSegment<IListBlobItem> result = container.ListBlobsSegmented(blobRequest);          
           
            foreach (var blobItem in result.Results)
            {               
                lstBlobs.Items.Add(blobItem.Uri.AbsoluteUri); // lstBlobs is the ID of the ListBox I added
            }
        }
These Storage Analytics logs can then be used for various purposes; a small example of reading one of them follows.
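Here is a small sketch of a hypothetical button handler (btnDownloadLog_Click is my made-up name) that downloads the log blob selected in the list box above and dumps its raw text; each line in a Storage Analytics log is a semicolon-delimited record describing one request.

protected void btnDownloadLog_Click(object sender, EventArgs e)
{
    StorageCredentialsAccountAndKey storageCredentials = new StorageCredentialsAccountAndKey("YourStorageAccount", "YourStorageAccountKey");
    CloudStorageAccount storageAccount = new CloudStorageAccount(storageCredentials, true);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

    // lstBlobs holds the absolute URIs populated by btnListLoggingBlobs_Click
    CloudBlob logBlob = blobClient.GetBlobReference(lstBlobs.SelectedValue);
    string logText = logBlob.DownloadText();

    // each line is one semicolon-delimited log record (request time, operation, status, and so on)
    foreach (string line in logText.Split(new[] { '\n' }, StringSplitOptions.RemoveEmptyEntries))
    {
        System.Diagnostics.Debug.WriteLine(line);
    }
}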
Hope this helps.
Cheers…

Azure Storage Analytics for Azure Storage – Blob, Queue and Tables – Metrics example.

Happy Analyzing!!

Azure Storage Analytics – Metrics classes

This blog is a follow-up post in the blog series on Azure Storage Analytics. To go back to the main article and understand the background of this post, click – Azure Storage Analytics for Azure Storage – Blob, Queue and Tables – Metrics example.
Following is the code for the classes required to read analytics metrics information for Azure storage – blob, queue and table.
Class MetricsHelper.cs -
public class MetricsHelper
    {
        /// <summary>
        /// Given a service name and a start and end time to search for, this method retrieves the metrics rows for requests and exports them to a CSV file.
        /// </summary>
        /// <param name="tableClient">The table client used to query the metrics tables</param>
        /// <param name="serviceName">The name of the service of interest (Blob, Table or Queue)</param>
        /// <param name="startTimeForSearch">Start time for reporting</param>
        /// <param name="endTimeForSearch">End time for reporting</param>
        /// <param name="fileName">The file to write the report to</param>
        public static void DumpTransactionsMetrics(CloudTableClient tableClient, string serviceName, DateTime startTimeForSearch, DateTime endTimeForSearch, string fileName)
        {
            string startingPK = startTimeForSearch.ToString("yyyyMMddTHH00");
            string endingPK = endTimeForSearch.ToString("yyyyMMddTHH00");

            // table names are case insensitive
            string tableName = string.Format("$MetricsTransactions{0}", serviceName);
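            // NOTE: the original class continues beyond this point; the complete implementation
            // (the query plus the CSV export) is at the linked Azure Storage Analytics Metrics Classes post.
            // What follows is only a hedged sketch of how the method could finish, assuming a
            // MetricsTransactionsEntity class (PartitionKey = hourly time bucket, RowKey = access type)
            // is defined in MetricsEntities.cs; it also needs using directives for System.IO and System.Linq.
            TableServiceContext context = tableClient.GetDataServiceContext();

            var query = (from entity in context.CreateQuery<MetricsTransactionsEntity>(tableName)
                         where entity.PartitionKey.CompareTo(startingPK) >= 0
                               && entity.PartitionKey.CompareTo(endingPK) <= 0
                         select entity).AsTableServiceQuery();

            using (StreamWriter writer = new StreamWriter(fileName))
            {
                foreach (var row in query.Execute())
                {
                    // one CSV line per metrics row: time bucket and access type
                    writer.WriteLine("{0},{1}", row.PartitionKey, row.RowKey);
                }
            }
        }
    }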

Azure – Need for Storage Analytics and what Storage Analytics is

What is software optimization?
Program optimization, or software optimization, is the process of modifying a software system to make some aspect of it work more efficiently or use fewer resources. In general, a computer program may be optimized so that it executes more rapidly, operates with less memory or other resources, or draws less power.
To perform software optimization we need to identify the problem areas in the software that are not working efficiently. To identify them, we need to analyze the program and cover aspects such as the memory used by particular instructions and the time taken by function executions or code segments. This allows you to determine the improvements required in your application.
The process of measuring, for example, memory usage, the usage of particular instructions, or the duration of function calls is called profiling (also program profiling or software profiling). The tool used for profiling is called a profiler.
Profilers available for .NET include the CLR Profiler and the Perfmon.exe tool. Similarly, there are different profilers available for Java, mainframes and SQL Server.
In the same way, to perform profiling on Windows Azure Storage we use Storage Analytics. In other words, Windows Azure Storage Analytics offers profiler-like capabilities for Windows Azure Storage.

Tuesday, December 20, 2011

What is VPN – Virtual Private Network

Networking has always amazed me. Sometimes I feel I should have studied networking more. Anyway, back to the point: I will try to describe what a "Virtual Private Network" actually is. Let's take a real-life example.
As a company's business expands, it opens more offices across various countries around the world. The people working from remote locations are usually salespeople. Salespeople always need to travel across countries and work while travelling or from remote locations around the globe. In such cases, people working remotely need a fast, secure and reliable way to share information across the company's computer networks.
The technology that helps you achieve this is the Virtual Private Network. A VPN is a network that uses a public network, usually the internet, to share information with remote locations. The information passed over a VPN is always encrypted, so it cannot be read by anyone who intercepts it. When using a VPN over the internet, the connections are still routed through the internet, yet it feels as though we are working inside a private network because we can access the desired network resources. The VPN does not establish an actual physical private network; it works over the public network, but it gives the feel of a private network by securing the information and providing seamless connectivity to the desired network resources. That is why it is referred to as "virtual".

To understand VPN, let's consider a real-life example and the following diagram –

Monday, December 19, 2011

Science for Kids - Refraction of Light – Spoon in glass experiment

Thanks to all the kids and their parents who responded so enthusiastically to my first science article on Ice on a String. Today I am going to take you through another interesting experiment and some cool science concepts. Here we go!!!
Spoon!!! A spoon is made of a very hard material – steel. It is difficult, if not impossible, to break a spoon with your hands. What if I break a spoon first and then bring it back to its original straight shape? Believe me, it is very easy. The experiment we are going to look at today will teach you how to "break" a spoon and bring it back to its original shape.
Take a glass and fill it halfway with water. Put a steel spoon in the glass and watch. What do you see? The spoon in the glass looks broken. Now take out the spoon. What do you see? The spoon is back to its original shape and is not broken.

Tuesday, December 13, 2011

Science for Kids - Ice on a String – Pick Ice cube using thread experiment

If I told you to pick up an ice cube using a thread, what would your reaction be? Of course, the condition is that you must not tie the thread to the ice cube with a knot. I know some of you must already be thinking I am a fool.
OK, before you think too hard, let's try the experiment. Today's science project with ice will require the following things – one ice cube, a thread about 1 foot long, salt, and a small cup of water.
The things mentioned above are easily available. Here we go: take the thread and dip it into the cup of water. Let the thread stay immersed until it is completely wet. Then lay the string or thread onto the ice cube. Make sure that at least an inch of thread is touching the ice surface. Now sprinkle salt on the area of thread that is touching the ice surface. Wait for at least 30 seconds and then lift the thread. You will be amazed to see that the ice cube gets picked up along with the thread.
How did the ice get picked up by the thread even though I did not tie the ice to the thread?
Science Fact –
Let's understand the facts. If we mix a saline substance into another substance and it dissolves, the melting point of that other substance is reduced.
What is a melting point? The melting point of a solid substance is the temperature at which it changes state from solid to liquid. For example, the temperature at which ice turns into water is called the melting point of ice. The normal melting point of ice is 0 degrees Celsius; that is, 0 degrees is the temperature at which ice turns back into water.
Salt (scientific name NaCl – sodium chloride) is a saline substance. When we sprinkle salt on ice, the melting point of the ice is reduced. This means the ice will not turn to water at 0 degrees but at a temperature below 0 degrees. Hence the melting point of the ice is lowered because of the salt.
When you lay the wet string on the ice and sprinkle salt, the surface of the ice in contact with the salt starts melting at a temperature well below 0 degrees, so it turns into water. As the salt is diluted and the surface regains its original temperature, the melted water freezes back into ice. The thread becomes part of the ice, and the ice cube lifts up easily.
If we mix a saline liquid with another substance, the melting point is reduced because the dissolved salt resists crystallization of the other substance. The more salt in the liquid, the greater the reduction in melting point. A pure solid will always have a higher melting point than an impure solid.
If you live in a place that gets a lot of snow, this is why highway departments sprinkle salt on roads during winter: it lowers the melting point of the snow, the roads become free of snow, and traffic can keep moving.

If you use ice and salt in the correct ratio, you can lower the temperature to about -23 degrees Celsius. This mixture is widely used in many industries and, of course, to prepare everyone's favourite – ice cream.
I hope the experiment above was a good learning experience for you and that you did not get bored.

Stay tuned for more in coming days…
Cheers…

Happy Salting!!!

Click here to go back to the experiments list…

Science for kids

Today I have decided to share my knowledge of science experiments with you. The experiments I am going to demonstrate are very basic and most suitable for kids aged 7 to 16.
In these experiments I will explain a few concepts and the reasoning behind the behaviour of matter, along with a few images demonstrating the experiments.

It was a fantastic experience for me to conduct these experiments on my own. I hope you will also gain some knowledge from them, which will help you in your career and will definitely shape your learning curve. I will make sure that you enjoy a wide range of information and science facts that are sure to surprise and amaze you.

Here is the list of topics. Click on the links to visit a particular experiment –

Ice on a String – Wow, Pick Ice cube using thread…

Refraction of light – Breaking spoon in a glass of water...

Stay tuned for more…

Monday, December 12, 2011

Uploading large file to Azure blob storage and httpRuntime maxRequestLength

Recently I was asked to write code for uploading large files to Azure blob storage from an ASPX page. I asked about the expected file sizes and was told that the files to be uploaded would vary from 1 MB upwards, with no fixed limit.
When I actually implemented the file upload code, things worked smoothly for files up to 4 MB. For all files larger than 4 MB, I received many errors.
Those errors included timeouts, maxRequestLength exceeded, "operation could not be completed", and my favourite – "Internet Explorer cannot display the web page".
I had used the normal ASP.NET file upload control in my web role and very standard code to upload the file to Azure blob storage. After tweaking some code segments it finally worked. The following is the code segment, run on a button click, for uploading big files to Azure blob storage from a web role or web site page.
First, read all the required configuration values. I am using a web role project here, so my code reads configuration values as shown below. If you are using a plain ASP.NET web site project, your code for reading configuration values will be different.
protected void btnUpload_Click(object sender, EventArgs e)
{
    // some local variables
    string packageFilePath = string.Empty;
    string packageFileName = string.Empty;
    string referenceFileName = string.Empty;
    string packageFileURL = string.Empty;

    #region read required configuration values
    Container = RoleEnvironment.GetConfigurationSettingValue("ContainerName");
    AccountName = RoleEnvironment.GetConfigurationSettingValue("AccountName");
    SharedKey = RoleEnvironment.GetConfigurationSettingValue("SharedKey");
    blobEndPoint = RoleEnvironment.GetConfigurationSettingValue("BlobStorageEndpoint");
    tableEndpint = RoleEnvironment.GetConfigurationSettingValue("TableStorageEndpoint");
    #endregion

    // create the cloud blob client
    CloudBlobClient blobClient = new CloudBlobClient(blobEndPoint, new StorageCredentialsAccountAndKey(AccountName, SharedKey));

    // for large file copies you need to set a custom timeout period, and the parallel setting spreads the copy across multiple threads;
    // if you have plenty of bandwidth you can increase the thread count below, because Azure accepts the blocks of a blob in any order of arrival
    blobClient.Timeout = new System.TimeSpan(1, 0, 0);
    blobClient.ParallelOperationThreadCount = 2;

    // get the container reference
    CloudBlobContainer container = blobClient.GetContainerReference(Container);

    // set up permissions on the container to be public
    var permissions = new BlobContainerPermissions();
    permissions.PublicAccess = BlobContainerPublicAccessType.Container;
    container.SetPermissions(permissions);

    // retrieve file details; uploadCSPKG is the name of my file upload control
    if (uploadCSPKG.HasFile)
    {
        try
        {
            packageFileName = Path.GetFileName(uploadCSPKG.FileName);

            // attach the application name before the file name to create a hierarchy in the blob container;
            // txtApplicationName is a text box where the user provides a name for the logical folder under which the file will be uploaded
            referenceFileName = txtApplicationName.Text + "/" + packageFileName;

            uploadCSPKG.SaveAs(Server.MapPath("~/") + packageFileName);
            packageFilePath = Server.MapPath("~/") + packageFileName;
        }
        catch (Exception)
        {
            throw; // rethrow, preserving the stack trace
        }
    }

    // open the stream and upload the package file to blob storage (only if a file was actually selected)
    if (!string.IsNullOrEmpty(packageFilePath))
    {
        using (FileStream fs = File.OpenRead(packageFilePath))
        {
            CloudBlob blob = container.GetBlobReference(referenceFileName);
            blob.UploadFromStream(fs);
            packageFileURL = blob.Uri.AbsoluteUri;
        }
    }
}

The above is the complete code required to upload a file to Azure blob storage. Apart from this, you will need one more configuration change related to the ASP.NET runtime, using the httpRuntime tag in your application's web.config.
Add the following line in web.config under <system.web> –
<httpRuntime maxRequestLength="51200" enable="true" executionTimeout="45"/>

Here I assume that my file size will never exceed the value provided in the maxRequestLength attribute. If you feel the size of the file to be uploaded can exceed this value, change it as per your need. With the above maxRequestLength setting, you should be able to upload files up to 50 MB in size; if you wish to upload files larger than 50 MB, increase the value of the maxRequestLength attribute.
maxRequestLength indicates the maximum request (file upload) size supported by ASP.NET, specified in kilobytes. The default is 4096 KB (4 MB).
That is why I was previously able to upload a 4 MB file without any difficulty, whereas files larger than 4 MB failed with various errors. The code segment and configuration above resolved the issue.
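One related point, offered as an assumption to verify against your own hosting setup rather than something I hit in this scenario: when the site runs under IIS 7 in integrated pipeline mode, IIS enforces its own request size limit (maxAllowedContentLength, specified in bytes, with a default of about 30 MB) on top of ASP.NET's maxRequestLength, so uploads approaching 50 MB may also need an entry like the following under <configuration> in web.config –

<system.webServer>
  <security>
    <requestFiltering>
      <!-- 52428800 bytes = 50 MB; keep this in sync with maxRequestLength above -->
      <requestLimits maxAllowedContentLength="52428800" />
    </requestFiltering>
  </security>
</system.webServer>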
Hope this helps.
Cheers…
Happy Programming!!!