Log to Azure Table Storage in ASP.NET Core

ASP.NET Core abstracts logging operations behind the ILogger interface. Instead of wiring up yet another logging framework, you just add a logging provider such as Console, Serilog, or the Windows Event Log. In this post, I will create a custom logging provider that logs to Azure Table Storage.

Why Azure Table Storage?

Azure Table Storage can store petabytes of semi-structured data with high-performance inserts and updates. It is also cheap, which makes it very suitable for storing log messages. And it lets us reach the log messages from many tools, such as Azure Storage Explorer, unlike when you log messages to the file system on an Azure App Service.

Structure of a custom logger

Logger

This is where the actual logging code lives, and it must implement the ILogger interface:

using System;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public class AzureTableLogger : ILogger
{
    private readonly CloudTable loggingTable;

    public AzureTableLogger(string connectionString, string tableName)
    {
        var storageAccount = CloudStorageAccount.Parse(connectionString);
        var cloudTableClient = storageAccount.CreateCloudTableClient();
        loggingTable = cloudTableClient.GetTableReference(tableName);

        // Fire-and-forget for brevity; a production logger should not
        // start async work in a constructor.
        loggingTable.CreateIfNotExistsAsync();
    }

    public IDisposable BeginScope<TState>(TState state)
    {
        // Scopes are not supported in this minimal logger.
        return null;
    }

    public bool IsEnabled(LogLevel logLevel)
    {
        // Log everything; filtering can be added later.
        return true;
    }

    public void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception exception, Func<TState, Exception, string> formatter)
    {
        var log = new LogEntity
        {
            EventId = eventId.ToString(),
            LogLevel = logLevel.ToString(),
            Message = formatter(state, exception),
            // Partition by day so a day's logs can be queried together.
            PartitionKey = DateTime.Now.ToString("yyyyMMdd"),
            RowKey = Guid.NewGuid().ToString()
        };

        var insertOp = TableOperation.Insert(log);

        // Blocking on async for simplicity; consider batching in real code.
        loggingTable.ExecuteAsync(insertOp).GetAwaiter().GetResult();
    }
}

public class LogEntity : TableEntity
{
    public string LogLevel { get; set; }
    public string EventId { get; set; }
    public string Message { get; set; }
}

As you can see, we just use the Azure Table Storage SDK to connect to the storage account and create a cloud table client. The logger accepts the table name and connection string as constructor parameters, which will be injected by the logging provider. This is a minimal version of a logger and it needs a lot of improvement, but it is a good start.

LoggingProvider

public class AzureTableLoggerProvider : ILoggerProvider
{
    private readonly string connectionString;
    private readonly string tableName;

    public AzureTableLoggerProvider(string connectionString, string tableName)
    {
        this.connectionString = connectionString;
        this.tableName = tableName;
    }

    public ILogger CreateLogger(string categoryName)
    {
        // The category name is ignored in this minimal provider.
        return new AzureTableLogger(connectionString, tableName);
    }

    public void Dispose()
    {
        // Nothing to dispose in this minimal provider.
    }
}

In this provider, we just use the two parameters passed in the constructor to create a new logger instance. Note that the category name is ignored for now.

LoggingProviderExtension

public static class AzureTableLoggerExtensions
{
    public static ILoggerFactory AddTableStorage(this ILoggerFactory loggerFactory, string tableName, string connectionString)
    {
        loggerFactory.AddProvider(new AzureTableLoggerProvider(connectionString, tableName));
        return loggerFactory;
    }
}

This last block is not a must, but it's there for the sake of clean code and clarity. In this step, we just register our logging provider with the ILoggerFactory. When the application starts, we will call the AddTableStorage method to enable the new logger.

Add the provider in Startup.cs

public class Startup
{
    // This method gets called by the runtime. Use this method to add services to the container.
    // For more information on how to configure your application, visit https://go.microsoft.com/fwlink/?LinkID=398940
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddControllers();
    }

    // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
    public void Configure(IApplicationBuilder app, IWebHostEnvironment env, ILoggerFactory loggerFactory)
    {
        if (env.IsDevelopment())
        {
            app.UseDeveloperExceptionPage();
        }

        loggerFactory.AddTableStorage("LoggingSample", "UseDevelopmentStorage=true");

        app.UseRouting();
        app.UseEndpoints(endpoints =>
        {
            endpoints.MapDefaultControllerRoute();
        });
    }
}

Now, our application is ready to log messages to Azure Table Storage. The sample above uses the local storage emulator and will create a table named LoggingSample. Let's create a controller and log some messages:

[ApiController]
[Route("[controller]")]
public class CustomersController : Controller
{
    private readonly ILogger<CustomersController> logger;

    public CustomersController(ILogger<CustomersController> logger)
    {
        this.logger = logger;
    }

    public IActionResult Index()
    {
        logger.LogInformation("This is a message from customers controller");
        return Ok();
    }
}

In the Index action, we call the logger to add a test message. Notice that you don't have to specify anything other than the ILogger<CustomersController> dependency; ASP.NET Core's dependency injection will provide the proper logger at runtime.

Notice the new columns added to the table, such as EventId and LogLevel. These are properties of the LogEntity class added in the first code snippet. Notice also that the table captures all messages logged by the ASP.NET Core infrastructure, not just the test message that we added.
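If you ever need to read the entries back in code, a minimal sketch using the same SDK could look like this (run inside an async method, with loggingTable being the CloudTable from the logger):

// A sketch: query today's log entries from the logging table.
var query = new TableQuery<LogEntity>().Where(
    TableQuery.GenerateFilterCondition(
        "PartitionKey", QueryComparisons.Equal, DateTime.Now.ToString("yyyyMMdd")));

TableContinuationToken token = null;
do
{
    var segment = await loggingTable.ExecuteQuerySegmentedAsync(query, token);
    token = segment.ContinuationToken;

    foreach (var entry in segment)
    {
        Console.WriteLine($"{entry.LogLevel}: {entry.Message}");
    }
} while (token != null);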

Next steps

Our next step will be adding a middleware that shows a nice UI displaying the log messages. This can come in handy when you want to debug something and need to see the error messages.

Migrate your ASP.NET settings to Azure App Configuration

Before ASP.NET Core, we only had the web.config file to configure and read application settings from code. The framework didn't have a pluggable way of adding extra configuration providers; any other configuration data source meant a custom solution and custom code to read and update configuration keys.

With ASP.NET Core, we got the new appsettings.json as one source of configuration. Along with it came environment variables, additional files, and user secrets, which can keep connection strings and passwords out of source control.

Things got even better with Azure App Service, which has a settings/configuration section that can override the application settings and connection strings sections. This applies to both ASP.NET Framework and ASP.NET Core.

Isn’t this enough already?

It is good as a start, but it has a major issue: if you change any value in your configuration keys, the application must be restarted, which affects logged-in users and may cause data loss.
Another issue is that our application code and configuration live in the same place, which contradicts the twelve-factor app methodology. These configuration keys also cannot be shared with other applications except by copy/paste.

Azure App Configuration

Azure App Configuration is a service, still in preview, that can be a central place for all your configuration keys and feature flags.
It is easy to add to your ASP.NET Core configuration providers using the provided NuGet packages.

Add the needed packages

First, install the package Microsoft.Azure.AppConfiguration.AspNetCore. You may need to check that you have enabled prerelease versions if you are using Visual Studio.

In your Program.cs file, add the following code, which will inject Azure App Configuration as a configuration provider. Notice that you will have to add a key named “AppConfig” in the ConnectionStrings section of appsettings.json to store the connection string for your Azure App Configuration instance.

public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        .ConfigureAppConfiguration((hostingContext, config) =>
        {
            var settings = config.Build();
            config.AddAzureAppConfiguration(settings["ConnectionStrings:AppConfig"]);
        })
        .UseStartup<Startup>();

Now, your application is ready to read any configuration keys from Azure App Configuration. All you need to do is create a new instance by following this link.
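To verify it works, you can read a key through the standard IConfiguration abstraction. The key name "TestApp:Settings:Message" below is hypothetical; use whatever key you created in your App Configuration instance:

public class HomeController : Controller
{
    private readonly IConfiguration configuration;

    public HomeController(IConfiguration configuration)
    {
        this.configuration = configuration;
    }

    public IActionResult Index()
    {
        // Keys from Azure App Configuration appear like any other configuration source.
        var message = configuration["TestApp:Settings:Message"];
        return Content(message ?? "Key not found");
    }
}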

Reuse your existing configuration keys

Not only is it easy to add Azure App Configuration to your ASP.NET Core app, it's also easy to migrate your existing keys. You don't want to end up in a situation where you have to spend hours moving your keys manually or writing scripts to do it. Azure App Configuration can do that for you.

As you can see above, it is very simple. Just choose Import/export from the left menu, then choose where you want App Configuration to import the values from:

  1. App Configuration: import the values from another Azure App Configuration instance
  2. App Service: perfect in our case, since it will import the values from your Azure App Service
  3. Configuration File: pass your appsettings.json directly and it will import the values from there

Once you have done that, you will have all your keys and connection strings imported and ready to go.

Conclusion

Azure App Configuration is a great service that is still in preview, but it enables you to offload configuration keys and connection strings from your application. It also has auto-refresh, which allows your app to always have the latest configuration keys without restarting the application and impacting your users.
It also has feature management, which is a great addition, but we will talk about it in future posts.
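For reference, registering keys for auto-refresh looks roughly like the sketch below, replacing the simple AddAzureAppConfiguration overload in Program.cs. The exact API surface has changed across preview releases, so treat this as illustrative; the key name is hypothetical:

config.AddAzureAppConfiguration(options =>
    options.Connect(settings["ConnectionStrings:AppConfig"])
           .ConfigureRefresh(refresh =>
           {
               // Reload this (hypothetical) key periodically without restarting the app.
               refresh.Register("TestApp:Settings:Message")
                      .SetCacheExpiration(TimeSpan.FromSeconds(30));
           }));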

Azure Functions anatomy – Part 2 – What is inside a Function App?

In the previous post (Azure Functions 2.0 Anatomy - Part 1), we saw the structure of a function app. In this part, I will explain the different components of an Azure Function.

Components

Every Azure Function 2.0 consists of the following components:

  • Trigger
  • Bindings (one of which is the trigger); a binding can be input or output
  • Value Converter/Binder
  • Listener

A binding can be a blob file, a queue message, an event from Event Hubs or Event Grid, or any custom binding developed by anyone. One of these bindings can initiate the function call, in which case it is called a trigger.
A binding can also be input or output: an input binding delivers data to the function, and an output binding is used by the function to write data.

Now, assume we have the following function and its configuration

        [FunctionName("SampleFunction")]
        public static void Run([QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")]string myQueueItem, [Blob("sample-output")]Stream output, TraceWriter log)
        {
            log.Info($"C# Queue trigger function processed: {myQueueItem}");
        }

This function has two binding parameters: myQueueItem, which is both an input binding and a trigger, and output, which is an output binding.
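As an aside, here is a hedged variation of the sample (not from the original post) that actually uses the output binding by copying the queue message into a new blob. The blob path "sample-output/{rand-guid}" and the FileAccess.Write flag are assumptions about how the output binding would be configured:

using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class SampleFunctionWithOutput
{
    [FunctionName("SampleFunctionWithOutput")]
    public static async Task Run(
        [QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")] string myQueueItem,
        [Blob("sample-output/{rand-guid}", FileAccess.Write)] Stream output,
        TraceWriter log)
    {
        log.Info($"C# Queue trigger function processed: {myQueueItem}");

        // Copy the queue message into the blob bound to the output parameter.
        using (var writer = new StreamWriter(output))
        {
            await writer.WriteAsync(myQueueItem);
        }
    }
}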

How does Azure Functions call this method and convert the queue message to a string and the stream to a blob file?

Binding, Listener, Value Provider and Converter

When the Azure Functions app starts, it scans all functions that exist in the app directory, as we saw in the previous post, and for each function it reads the input bindings, output bindings, and trigger from the function's config JSON file.

The scanning process uses a descriptor provider, which creates the input binding, output binding, trigger, and an invoker.

Once all functions are loaded, the runtime tries to load the types for each function that was developed outside the portal. It uses the scriptFile and entryPoint properties, as you can see below. The entry point in the JSON below is the FunctionApp1.Function2.Run method, which exists in the assembly FunctionApp1.dll.

{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.14",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "timerTrigger",
      "schedule": "0 */5 * * * *",
      "useMonitor": true,
      "runOnStartup": false,
      "name": "myTimer"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/FunctionApp1.dll",
  "entryPoint": "FunctionApp1.Function2.Run"
}

Each function app can have either functions compiled into a DLL or functions created in the portal, but not both.

Now, let's see what the other components inside a binding are:

  • Listener (IListener): This one has a method named StartAsync, which basically starts listening for an event that should trigger the function. For an Azure Blob trigger, it scans the storage logs for new or modified files. For a Queue trigger, it checks for new messages in the specified queue.
  • Value Provider (IValueProvider): Once the listener has the event that should trigger the function, it uses the value provider to fill all input parameters. If the trigger is a blob trigger, the value will be the file: its metadata, its name, and its content. For a queue, it will be the content of the message, and so on.
  • Value Converter (IAsyncConverter): This is used to convert the values found in the value provider into the input parameter types configured for the function. In the previous example, StorageQueueMessageToStringConverter is used to convert the CloudQueueMessage instance to the string parameter named myQueueItem. The same mechanism is used to write data to output parameters: the framework chooses the converter that best matches the output parameter type (Stream) and the type of variable used to write to the stream, e.g. a string or a stream writer. A minimal converter sketch follows this list.
  • Executor (ITriggerExecutor): Now that we know the function should be triggered, have the data from the input binding, and have bound all input parameters, we need to call the function body. This takes place through an implementation of ITriggerExecutor, which can use reflection to call the function or, in one special case, execute it as a web hook, as with the HTTP trigger.
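Below is a minimal sketch of what such a converter boils down to. This is illustrative only, not the actual SDK source; it assumes the IAsyncConverter interface from the WebJobs SDK and the CloudQueueMessage type from the storage SDK.

using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage.Queue;

// Illustrative sketch: roughly what StorageQueueMessageToStringConverter boils down to.
internal class QueueMessageToStringConverter : IAsyncConverter<CloudQueueMessage, string>
{
    public Task<string> ConvertAsync(CloudQueueMessage input, CancellationToken cancellationToken)
    {
        // The queue payload becomes the value bound to the string parameter.
        return Task.FromResult(input.AsString);
    }
}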

In the next post, I will dig deeper into the Azure Function App to explore the extensions and the host.

Azure Functions 2.0 anatomy – Part 1

In this series of posts, I will explain the internals of Azure Functions and how they work. By the end of it, you should understand what happens from the moment an Azure Function is deployed until it is triggered and the invocation completes.

Is it a Web App?

Yes, every Azure Function App that uses .NET as a language runs as an ASP.NET Core web app. When the app starts, it loads all your functions and proxies and sets up the routes and listeners for each function.

Web App Structure

Just like any other Azure Web App, it consists of the following folders:

  • data: This folder contains another folder named "functions", which has another folder called "extensions". The extensions folder has a list of all extensions installed in your Functions App. Each file has a reference to a NuGet package that contains the extension implementation.
  • Log Files
  • site: it has the wwwroot folder, which hosts the ASP.NET Core web app and all the functions
Data -> Functions folder structure

But for an Azure Functions App, the following folders are added:

.nuget

Contains all NuGet packages installed when you install a new extension. A reference for each installed extension exists in a file under data\functions\extensions. The sample extension file below specifies the package Microsoft.Azure.WebJobs.Extensions.ServiceBus.

In Azure Functions 1.0, the runtime ran on .NET 4.6 and already included all the supported triggers and bindings. With version 2.0, the runtime only includes the HTTP and timer bindings; all the other bindings can be developed and installed separately as extensions. This allows you to develop your own extensions and upload them without waiting for a new version of Azure Functions.

{
  "Id": "d3a33dd0-3941-4acf-a303-72773995901d",
  "Status": 1,
  "StartTime": "2019-01-02T02:49:23.5568341+00:00",
  "EndTime": "2019-01-02T02:50:50.1029788+00:00",
  "Error": null,
  "Properties": {
    "id": "Microsoft.Azure.WebJobs.Extensions.ServiceBus",
    "version": "3.0.0"
  }
}

ASP.NET

This folder contains the data protection keys used by the web app.

To have a look at this folder structure and explore it yourself, use the following URL: https://[FUNCTION_APP_NAME].scm.azurewebsites.net/DebugConsole

wwwroot folder structure


As you can see, there are 3 folders. Each folder represents a single function. The DLLs for each function exist in the bin folder.

The host.json file by default only has the version number for the function app:

{
  "version": "2.0"
}

Inside each function folder, there is a function.json file that has the configuration for this function:

{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.14",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "timerTrigger",
      "schedule": "0 */5 * * * *",
      "useMonitor": true,
      "runOnStartup": false,
      "name": "myTimer"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/FunctionApp1.dll",
  "entryPoint": "FunctionApp1.Function2.Run"
}

The Azure Functions runtime uses this file to load the metadata for each function during application startup. The most important properties are scriptFile, which points to the DLL that has the code, and entryPoint, which has the exact C# method name that will be executed when the function triggers.
The configurationSource property specifies where to read the function configuration from, such as connection strings for blob storage or Service Bus; "attributes" means it is retrieved from code.
The bindings key lists all bindings for this function. For this sample it is a timer, and it is the trigger, so there is a single binding. For other functions, there can be bindings that represent inputs, outputs, and a trigger, as in the sketch below.
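For example, a hand-written function.json for a queue-triggered function with a blob output could look like the following sketch; the names and paths are illustrative, not taken from the sample app:

{
  "bindings": [
    {
      "type": "queueTrigger",
      "queueName": "myqueue-items",
      "connection": "AzureWebJobsStorage",
      "name": "myQueueItem"
    },
    {
      "type": "blob",
      "direction": "out",
      "path": "sample-output/{rand-guid}",
      "connection": "AzureWebJobsStorage",
      "name": "output"
    }
  ],
  "scriptFile": "../bin/FunctionApp1.dll",
  "entryPoint": "FunctionApp1.Function1.Run"
}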

In the next post, I will explain the internals of each function and how the bindings and triggers work.

How Azure Functions Blob Trigger works

Introduction

Azure Functions enables you to quickly build a piece of functionality that can be triggered by an external system such as Azure Blob Storage, Storage Queues, Cosmos DB, Event Hubs, and the list goes on. In this post, I will explain how the Azure Blob trigger works.

Sample Function

If you create a new Azure Function using Visual Studio, you will end up with the following code:

public static class Function1
{
    [FunctionName("Function1")]
    public static void Run([BlobTrigger("samples-workitems/{name}", Connection = "")]Stream myBlob, string name, TraceWriter log)
    {
        log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
    }
}

The secret lies in the [BlobTrigger] attribute. It accepts the path to the storage container that Azure WebJobs will monitor. As you may know, Azure Functions is based on Azure WebJobs, so the same triggers are used.

How does Azure WebJobs know about a file being added, updated, or removed in a container?

Run the sample app you just created in Visual Studio; it should work with the local storage emulator. Now, you should be able to invoke the function by creating a blob container called samples-workitems in your development storage account, and you should end up with the structure shown below. If you prefer to create the container and a test blob from code, see the sketch that follows.
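A rough sketch using the same storage SDK (run inside an async method; the file name and content are arbitrary):

var account = Microsoft.WindowsAzure.Storage.CloudStorageAccount.Parse("UseDevelopmentStorage=true");
var blobClient = account.CreateCloudBlobClient();

// Create the container that the [BlobTrigger] attribute points to.
var container = blobClient.GetContainerReference("samples-workitems");
await container.CreateIfNotExistsAsync();

// Uploading any blob here should invoke the function.
var blob = container.GetBlockBlobReference("test.txt");
await blob.UploadTextAsync("hello from the emulator");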

blob container

You can see the container "samples-workitems", which Azure WebJobs will monitor, invoking the function whenever a file changes there. But there is also another container named "azure-webjobs-hosts", which is the secret to how Azure monitors the files in that container.

If you open that container, you will find a folder called "blobreceipts", which has a folder for the function name.

Inside the folder with the function name, there are some other folders with strange names, like below.

etags

These strange names are the ETags of the blob files added, edited, or removed. So, when you change a file in any container, WebJobs will create a new folder here named after the file's new ETag, and inside this folder you will find the same structure as the file being edited: in our case, a folder called samples-workitems, and inside it the file that was modified.

When a new file is added, Azure will check whether its ETag exists in the azure-webjobs-hosts container; if not, it will call the Azure Function. This prevents duplicate calls for the same file. This pattern is called blob receipts.

Note that this process depends on Azure Blob Storage logging, which can take up to 10 minutes to write to the azure-webjobs-hosts container in order to improve performance. If you need your function to trigger faster, consider using a Storage Queue trigger instead, as sketched below.
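A hedged sketch of that alternative; the queue name is hypothetical:

[FunctionName("QueueFunction")]
public static void Run(
    [QueueTrigger("samples-workitems-queue", Connection = "AzureWebJobsStorage")] string message,
    TraceWriter log)
{
    // Queue messages are polled directly, so this fires within seconds.
    log.Info($"C# Queue trigger function processed: {message}");
}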


How to build a web application on Azure?

Microsoft Azure is one of the leading cloud platforms, enabling us to build scalable and high-performing web applications. In this article, I will list the available options so you can quickly understand what you need in order to build your application and which services to use in each layer.

What do we need to build a web application?

Every web application needs some services in order to work properly: for example, a web server, a database server, a queuing system, etc.

In the past few years, I have found many people using Azure wrongly or inefficiently. Some used virtual machines when they should have used Web Apps; others used a huge virtual machine size because they didn't know they could scale their VM without losing data. Accordingly, I will do my best to give you a summary of the Azure services that you can utilize to build your next fantastic web app.

Application Life-cycle Management

So, first things first. You need a place to store your use cases, test cases, and source code, run unit tests, and perform continuous integration and deployment.

Although it may not be part of Azure itself, Team Services integrates seamlessly with Azure, and many Azure services like Web Apps, Mobile Apps, and Azure Functions support continuous integration with it.

Team Services helps you have a central repository for your source code, documents, and requirements. It supports both Team Foundation Version Control and Git for source control. It is free and supports an unlimited number of projects with 5 free users.

Hosting

Now you have developed your web application and tested it, and it is time to host it. Azure supports hosting for applications built with .NET, Java, PHP, Node.js, and Python.

In this phase you have more than one option:

  1. Virtual Machine: This is the most basic option and should not be used unless you have no other way, e.g. migrating a legacy application that cannot work with a PaaS offering. In this case you simply have a virtual machine in the cloud: you can remotely connect to it, install the needed software, and deploy your application. Note that this is the most expensive option.
  2. Cloud Services: This is the same as a virtual machine except that it offloads some of the work from you, such as setting up software and Windows updates. It has 2 types: a web role, which is set up to host a web application directly, and a worker role, which enables you to host an executable application such as a Windows service or a console app. The good part about Cloud Services is that you provide it with a package that has your code, and it manages the deployment of the application. If the virtual machine hosting the app goes down, Azure will automatically create another virtual machine and deploy the code to it.
  3. App Service: This is the Platform as a Service (PaaS) offering from Azure; all you have to worry about here is your code. It gives you a virtual directory in the cloud, and all you have to do is deploy your app to it through Web Deploy, FTP, or a package upload. It also offers continuous integration and monitoring capabilities. This is the most flexible and cheapest option, and you can autoscale it according to CPU usage, RAM usage, and other factors, so you can start small and scale as you need later. Azure App Service includes Web Apps to host your application, Mobile Apps which provide a backend as a service for your mobile apps, Logic Apps which let you build your business logic and integration between different systems, and finally API Apps which are used to host your REST API web apps.

Integration

If your application integrates with other systems, you can use Azure Logic Apps, Azure Functions, BizTalk Services, and Service Bus.

Most of these services give you a very nice visual designer that allows you to orchestrate your business logic, with out-of-the-box seamless integration with many external systems such as Office 365, Dynamics CRM, or on-premises systems through BizTalk Services.

Data Storage

Azure offers many services to store your data, all of them based on the Azure Storage account, which is the main storage system for all services in Azure. Each account has some storage and throughput limitations; you can check these limitations from here.

  1. If you need to store text or video files, you can use page and block blobs. Each type is suitable for specific file types; you can read more about them from here.
  2. Use Azure Media Services if you have to store and stream media files.
  3. Azure Table Storage is suitable if you need a NoSQL database; it provides high-throughput storing and retrieving of entities that have no referential integrity between them.
  4. Azure DocumentDB is a NoSQL document database similar to MongoDB; it is a born-in-the-cloud database that you can use if you need a low-latency, high-performing database.
  5. Azure SQL Database is a SQL Server database in the cloud where you don't need to worry about SQL Server installation, backup and restore, or anything else; you just create a database and use it.
  6. Azure File Share can replace a legacy file system share in a legacy application.


Now you should have some basic knowledge of the features you can use to build your next web application.

Azure Services and Components

In the previous post, What is Microsoft Azure?, we introduced Azure and cloud computing in general; we also explained how to sign up for an account using either the free trial or MS Dev Essentials.

In this post, I will explain the different parts and services in Azure.

How are services grouped in Azure?

Azure has different services for different needs; they can be organized in several ways, as below.

Services

  1. Compute
  2. Storage
  3. Networking
  4. Integration

Layers

  1. Application
  2. Data
  3. Integration

We can also divide them according to the consumption model (IaaS, PaaS, or SaaS), but we will stick to the above categorization for now.

In this post, we will only explain the services part

Compute

This category has many wonderful services, like Virtual Machines and Cloud Services. A cloud service is a type of virtual machine that is more managed by Azure; it has 2 types: a web role, used to host web sites, and a worker role, to host other code that runs on any Windows machine.

Compute also includes App Service, which as a developer you will always be using, as it offers everything you could wish for, from databases to DevOps. App Service has the following features:

  1. Web App: similar to a cloud service but more PaaS-like and lightweight. The same web app can host both a web application and a timer job using WebJobs. Note that it was long known as Azure Web Sites, so if you hear that name, don't be confused; it is the same thing.
  2. Mobile App: as the name implies, a backend as a service for your mobile application. It offers a lot of features, such as storage for your data, authentication, offline sync, and push notifications. It has also been known as Azure Mobile Services for a long time.
  3. Logic App: used to build workflow applications that handle different business logic scenarios, e.g. receive a message, post an order to the inventory service, send an email, assign a task to a shipping employee, and so on. Logic Apps are based on BizTalk Services, an online version of BizTalk Server.
  4. API App: used to host REST APIs with a lot of features like throttling, payment for usage, reports, etc.

Storage

This is one of the most important services in Azure; without it, you can do almost nothing. It is used for storing anything from simple files and VM VHD files to databases.

Storage is divided into storage accounts; each account can hold up to 500 TB, with some limits on the number of operations per second. To learn more about Azure service limits, you can read more from this link.

You can create more than one account and use them for different services; e.g. you can have an account for your databases and another for virtual machines. But you have to take care of the limits to avoid any issues.

Networking

Azure enables you to create a VPN just like on-premises, and only virtual machines, web apps, or other services joined to the VPN can see each other. Not only this, but you can join the VPN to an on-premises VPN in either of the following ways:

  1. Site to Site: this joins a whole on-premises VPN to the Azure VPN; think of it as expanding your datacenter
  2. Point to Site: this joins a single on-premises machine to the Azure VPN

Integration

This feature allows you to integrate applications deployed on Azure with other applications running on-premises, or even on Azure, using BizTalk Services.


There are a lot of other services like Azure Active Directory, SQL Database, DocumentDB, and Search. I will write about these services in the next post; then I will start a series where we build an application and utilize most of Azure's services in one place.


If you have any suggestions, please leave a comment.