There are numerous ways to log and trace your application: both those built into the framework, like the Debug and Trace listeners, and third-party ones like log4net and ELMAH. Common to all these scenarios is writing to a file or the Windows event log. Useful features like rolling files and event logs that only grow to a given size are a must for production systems.
Moving to the cloud and Azure, this becomes a bit new. Where is the log written now? And how can I access it? There is no easy way of accessing the file system, and behind the scenes there might even be multiple server instances. Once I got this working, nothing was really strange or difficult. But googling around and reading blogs sometimes made me and my code even more confused. This is apparently a minefield of typos and of changes from one version to another. What seemed to work for some surely did not for me.
I will walk you through the steps needed to successfully write trace entries to Azure Storage, and how to read them back.
First the cloud service needs a setting. Here every other blog I read pointed me in a different direction. This is what worked for me.
Entries in web.config:
<system.diagnostics>
  <trace autoflush="true">
    <listeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
           name="AzureDiagnostics">
        <filter type="" />
      </add>
    </listeners>
  </trace>
</system.diagnostics>
The class WebRole.cs:
public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // To enable the AzureLocalStorageTraceListener, uncomment the relevant section in web.config
        DiagnosticMonitorConfiguration diagnosticConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();
        diagnosticConfig.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
        diagnosticConfig.Directories.DataSources.Add(AzureLocalStorageTraceListener.GetLogDirectory());

        // For information on handling configuration changes,
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
        diagnosticConfig.Logs.ScheduledTransferPeriod = TimeSpan.FromSeconds(15);
        diagnosticConfig.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
        DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagnosticConfig);

        Trace.TraceInformation("OnStart completed.");

        return base.OnStart();
    }
}

ServiceConfiguration.cscfg:
<Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=myaccountname;AccountKey=longcryptingstringwithlotsofnumbersandchars" />
The name should be “Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString”. Quite a few blog posts use another setting name, but that must be from previous versions. The value replaces the default setting “UseDevelopmentStorage=true”. The values for AccountName and AccountKey are found in the portal where you created the storage account:
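For completeness: in this SDK the Diagnostics plugin is enabled by importing the module in ServiceDefinition.csdef, and importing it is what declares the connection-string setting above in the first place. A minimal sketch — the service and role names here are placeholders, not from the project above:

```xml
<ServiceDefinition name="MyCloudService"
                   xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="MyWebRole">
    <Imports>
      <!-- Importing the Diagnostics module declares the
           Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString setting -->
      <Import moduleName="Diagnostics" />
    </Imports>
  </WebRole>
</ServiceDefinition>
```

If this import is missing, the setting in ServiceConfiguration.cscfg will be rejected as unknown.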
From now on tracing should be as easy as

Trace.TraceInformation("Add some trace info.");

The transferred entries end up in the WADLogsTable table of the storage account. For reading them I found Azure Diagnostics Manager from Cerebrata; they also have an online tool, Cloud Storage Studio. A screenshot from the former looks like this:
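All the standard Trace methods go through the same listener, and the ScheduledTransferLogLevelFilter set in OnStart decides which levels are transferred to storage. A small sketch — the messages are made up for illustration:

```csharp
using System.Diagnostics;

// Each call is recorded with its own log level:
Trace.TraceInformation("User {0} logged in.", "alice"); // Information
Trace.TraceWarning("Disk usage above 80%.");            // Warning
Trace.TraceError("Could not reach payment service.");   // Error
```

With the filter set to LogLevel.Verbose, as above, all three levels are transferred; a stricter filter such as LogLevel.Error would drop the first two.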
Mission complete. My traces are now logged to the cloud. Once it all works, none of this is particularly complex. If only I had had this recipe in advance…
