Azure Bot Service Overview

Today I started learning Azure Bot Service, and I will summarise each section into a blog post.

What is a Bot?

A bot is a modern way of user interaction. Instead of clicking buttons and typing data into text fields, the user chats or talks with an application that understands human language and responds accordingly.
This, of course, increases engagement with your site or application and also decreases the need for employees to handle user requests.
Imagine a user who wants a car insurance quote. Instead of calling the insurance company, they can chat with the bot, or even call it, and ask about prices; the bot can respond as if it were a human, or delegate the call to a human if it is not able to assist. This can save the company a lot.

How Does a Bot Work?

Users interact with the bot through a channel. A channel can be Direct Line, as when you chat with the bot directly from a web site, or it can be Facebook Messenger, Twitter, or any other popular social media platform.
The chat between users and the bot can consist of text messages, cards that display graphic content, voice messages, or even an interactive form that asks the user to enter information such as an SSN or a policy number.
A bot can also use AI services such as Azure Cognitive Services to make it smarter.

https://haithamshaddad.files.wordpress.com/2019/02/84064-1xmuakcv_xkoszd4odvrika.png

Bot HTTP Communication

The communication between the bot and users/channels takes place through HTTP POST requests. When a user joins a conversation, an HTTP message is posted to the bot service, and the bot acknowledges it with a 200 OK response. Then the bot joins the conversation, another HTTP POST message is sent to the channel, and the channel replies with a 200 OK. These messages are called ConversationUpdate messages.
All these messages between the bot and the channel are called activities. During the ConversationUpdate activity, you can check which members were added or removed.
Once the connection is established, messages from the channel to the bot arrive as HTTP POST requests, and the bot replies with a separate HTTP POST rather than in the response to the original request. So, as you can see in the image below, the channel sends a Hi message and the bot simply echoes it (though it could send a different message). The channel then confirms the echo with a 200 OK, and finally the bot answers the original Hi message with a 200 OK.
With the current version (4), the HTTP POST messages may not be very obvious, but in the older version (3) they were: when you created a new project, you could clearly see an API controller called MessagesController, reachable via /api/messages.
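In v4 the endpoint still exists; it is just wired up for you by the SDK. A minimal sketch of what the v4 messages endpoint looks like in an ASP.NET Core project (the controller below follows the standard v4 template shape; IBotFrameworkHttpAdapter and IBot come from the Microsoft.Bot.Builder packages):

using Microsoft.AspNetCore.Mvc;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.Integration.AspNet.Core;
using System.Threading.Tasks;

[Route("api/messages")]
[ApiController]
public class BotController : ControllerBase
{
    private readonly IBotFrameworkHttpAdapter _adapter;
    private readonly IBot _bot;

    public BotController(IBotFrameworkHttpAdapter adapter, IBot bot)
    {
        _adapter = adapter;
        _bot = bot;
    }

    [HttpPost]
    public async Task PostAsync()
    {
        // The adapter deserializes the incoming activity, authenticates the
        // request, and invokes the bot's turn handler.
        await _adapter.ProcessAsync(Request, Response, _bot);
    }
}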

activity diagram
https://docs.microsoft.com/en-us/azure/bot-service/v4sdk/media/bot-builder-activity.png

Defining Turns

Humans talk one at a time: one talks and the others listen (at least that is what should happen 🙂).

activity processing stack
https://docs.microsoft.com/en-us/azure/bot-service/v4sdk/media/bot-builder-activity-processing-stack.png

The important things to note in the above image are the TurnContext, which holds information about whose turn it is and what data was sent (typically JSON), and the middleware, which holds components that execute in order to do something with the received activity. After all middleware components finish executing, control passes to the turn handler, which runs the application logic that decides the response sent back to the channel/user.
The middleware executes on the incoming activity received from the channel and also on the outbound activity sent from the bot back to the channel.
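To make this concrete, here is a minimal sketch of a v4 middleware component; the console logging is my own illustration, not part of the SDK:

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;

public class InspectionMiddleware : IMiddleware
{
    public async Task OnTurnAsync(ITurnContext turnContext, NextDelegate next,
        CancellationToken cancellationToken = default(CancellationToken))
    {
        // Runs on the inbound activity, before the turn handler.
        Console.WriteLine($"Incoming: {turnContext.Activity.Type}");

        // Hook the outbound activities the bot sends back to the channel.
        turnContext.OnSendActivities(async (ctx, activities, nextSend) =>
        {
            Console.WriteLine($"Outgoing: {activities.Count} activities");
            return await nextSend();
        });

        // Hand control to the next middleware component / the turn handler.
        await next(cancellationToken);
    }
}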


Azure Functions anatomy – Part 2 – What is inside a Function App?

In the previous post (Azure Functions 2.0 Anatomy – Part 1), we saw the structure of a function app. In this part, I will explain the different components of an Azure Function.

Components

Every Azure Functions 2.0 function consists of the following components:

  • Trigger
  • Bindings (one of which is the trigger); a binding can be input or output
  • Value Converter/Binder
  • Listener

A binding can be a blob file, a queue message, an event from Event Hubs or Event Grid, or any custom binding developed by anyone. One of these bindings can initiate the function call, and in that case it is called a trigger.
A binding can also be input or output: an input binding delivers data to the function, and an output binding is used by the function to write data.

Now, assume we have the following function and its configuration:

[FunctionName("SampleFunction")]
public static void Run(
    [QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")] string myQueueItem,
    [Blob("sample-output")] Stream output,
    TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
}

This function has two binding parameters: myQueueItem, which is both an input binding and a trigger, and output, which is an output binding.

How does Azure Functions call this method, convert the queue message to a string, and map the stream to a blob file?

Binding, Listener, Value Provider and Converter

When the Azure Function app starts, it scans all functions that exist in the app directory, as we saw in the previous post, and for each function it reads the input, output, and trigger from the function's config JSON file.

The scanning process uses a descriptor provider, which creates the input binding, output binding, trigger, and an invoker.

Once all functions are loaded, the runtime tries to load the types for each function that was developed outside the portal. It uses the scriptFile and entryPoint properties, as you can see below. The entry point in the sample below is the FunctionApp1.Function2.Run method, which exists in the assembly FunctionApp1.dll.

{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.14",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "timerTrigger",
      "schedule": "0 */5 * * * *",
      "useMonitor": true,
      "runOnStartup": false,
      "name": "myTimer"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/FunctionApp1.dll",
  "entryPoint": "FunctionApp1.Function2.Run"
}
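Conceptually, resolving that entry point boils down to plain reflection. This is a simplified sketch, not the actual runtime code:

using System.Reflection;

static class EntryPointLoader
{
    // Sketch of resolving "scriptFile" + "entryPoint" from function.json.
    public static MethodInfo Load(string scriptFile, string entryPoint)
    {
        var assembly = Assembly.LoadFrom(scriptFile);           // e.g. "../bin/FunctionApp1.dll"
        var lastDot = entryPoint.LastIndexOf('.');
        var typeName = entryPoint.Substring(0, lastDot);        // "FunctionApp1.Function2"
        var methodName = entryPoint.Substring(lastDot + 1);     // "Run"
        // The invoker later calls this MethodInfo with the bound parameter values.
        return assembly.GetType(typeName).GetMethod(methodName);
    }
}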

A function app can either contain DLL-based functions or functions created in the portal, but not both.

Now, let's see the other components inside a binding:

  • Listener (IListener): This one has a method named StartAsync, which basically starts listening for an event that should trigger the function. For the Azure Blob trigger, it scans the logs for new or modified files. For the Queue trigger, it checks for new messages in the specified queue.
  • Value Provider (IValueProvider): Once the listener has the event that should trigger the function, it uses the value provider to fill all input parameters. If the trigger is a blob trigger, the value will be the file: its metadata, its name, and its content. For a queue, it will be the content of the message, and so on.
  • Value Converter (IAsyncConverter): This is used to convert the values found in the value provider into the input parameters configured for the function. In the previous example, StorageQueueMessageToStringConverter is used to convert the CloudQueueMessage instance to the string parameter named myQueueItem. The same mechanism is used to write data to output parameters; the framework chooses the converter that best matches the output parameter type (Stream) and the type of variable used to write to the stream, e.g. a string or any stream writer.
  • Executor (ITriggerExecutor): Now that we know the function should be triggered, we have the data from the input binding, and we have bound all input parameters, we need to call the function body. This takes place using an implementation of ITriggerExecutor, which can use reflection to call the function or, in a very special case, execute it as a web hook, as with the HTTP trigger.

In the next post, I will dig deeper into the Azure Function App to explore the extensions and the host.

Azure Functions 2.0 anatomy – Part 1

In this series of posts, I will explain the internals of Azure Functions and how they work. By the end of it, you should understand what happens from the moment an Azure Function is deployed until it is triggered and the invocation completes.

Is it a Web App?

Yes. Every Azure Function App that uses .NET as a language runs as an ASP.NET Core web app. When the app starts, it loads all your functions and proxies and sets up the routes and listeners for each function.

Web App Structure

Just like any other Azure Web App, it consists of the following folders:

  • data: This folder contains another folder named "functions", which in turn has a folder called "extensions". The extensions folder has a list of all extensions installed to your Function App. Each file has a reference to a NuGet package that has the extension implementation.
  • LogFiles
  • site: it has the wwwroot folder, which hosts the ASP.NET Core web app and all the functions
Data -> Functions folder structure

But for an Azure Function App, the following folders are added:

.nuget

Contains all NuGet packages installed when you install a new extension. A reference to each installed extension exists in a file under data\functions\extensions. The sample extension file below specifies the package Microsoft.Azure.WebJobs.Extensions.ServiceBus.

In Azure Functions 1.0, the runtime ran on .NET 4.6 and already included all the supported triggers and bindings. With version 2.0, the runtime only includes the HTTP and timer bindings, and all the other bindings can be developed and installed separately as extensions. This allows you to develop your own extensions and upload them without waiting for a new version of Azure Functions.

{
  "Id": "d3a33dd0-3941-4acf-a303-72773995901d",
  "Status": 1,
  "StartTime": "2019-01-02T02:49:23.5568341+00:00",
  "EndTime": "2019-01-02T02:50:50.1029788+00:00",
  "Error": null,
  "Properties": {
    "id": "Microsoft.Azure.WebJobs.Extensions.ServiceBus",
    "version": "3.0.0"
  }
}
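Files like this one are written for you by the tooling when you register an extension; with the Azure Functions Core Tools (v2), installing the Service Bus extension shown above would look something like this from the command line:

func extensions install --package Microsoft.Azure.WebJobs.Extensions.ServiceBus --version 3.0.0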

ASP.NET

This folder contains all the data protection keys used in the web app.

To have a look at this folder structure and explore it yourself, use the following URL: https://[FUNCTION_APP_NAME].scm.azurewebsites.net/DebugConsole

wwwroot folder structure

As you can see, there are three folders. Each folder represents a single function. All DLLs for each function exist in the bin folder.

By default, the host.json file only has the version number for the function app:

{
  "version": "2.0"
}

Inside each function's folder, there is a function.json file that has the configuration for this function:

{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.14",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "timerTrigger",
      "schedule": "0 */5 * * * *",
      "useMonitor": true,
      "runOnStartup": false,
      "name": "myTimer"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/FunctionApp1.dll",
  "entryPoint": "FunctionApp1.Function2.Run"
}

The Azure Functions runtime uses this file to load the metadata for each function during application startup. The most important properties are scriptFile, which points to the DLL that has the code, and entryPoint, which has the exact C# method name that will be executed when the function triggers.
The configurationSource property specifies where to read the function configuration, such as connection strings for blob storage or Service Bus; attributes means it is retrieved from code.
The bindings key lists all bindings for this function. For this sample it is a timer, which is also the trigger, so there is a single binding. For other functions, there can be bindings that represent input, output, and a trigger.

In the next post, I will explain the internals of each function and how the bindings and triggers work.

How Azure Functions Blob Trigger works

Introduction

Azure Functions enable you to quickly build a piece of functionality that can be triggered by an external system such as Azure Blob Storage, Storage Queues, Cosmos DB, Event Hubs, and the list goes on. In this post, I will explain how the Azure Blob trigger works.

Sample Function

If you create a new Azure Function using Visual Studio, you will end up with the following code:

public static class Function1
{
    [FunctionName("Function1")]
    public static void Run(
        [BlobTrigger("samples-workitems/{name}", Connection = "")] Stream myBlob,
        string name,
        TraceWriter log)
    {
        log.Info($"C# Blob trigger function processed blob\n Name: {name} \n Size: {myBlob.Length} bytes");
    }
}

The secret lies in the [BlobTrigger] attribute. It accepts the path to the storage container that Azure WebJobs will monitor. As you may know, Azure Functions is built on top of the Azure WebJobs SDK, so the same triggers are used.

How does Azure WebJobs know about a file being added, updated, or removed in a container?

Run the sample app you just created in Visual Studio; it should work against the local storage emulator. Now you should be able to invoke the function by creating a blob container called samples-workitems in your development storage account, and you should end up with the following structure.

blob container

You can see the container "samples-workitems", which Azure WebJobs monitors, invoking the function whenever a file changes there. But there is also another container, named "azure-webjobs-hosts", and it is the secret to how Azure monitors the files in the first container.

If you open that container, you will find a folder called "blobreceipts", which has a folder for the function name.

Inside the folder with the function name, there are some other folders with strange names, like below:

etags

These strange names are the ETags of the blob files that were added, edited, or removed. So, when you change a file in any container, WebJobs creates a new folder here named after the file's new ETag, and inside this folder you will find the same structure as the file being edited. In our case, there should be a folder called samples-workitems, and inside it the file that was modified.

When a new file is added, Azure checks whether its ETag exists in the azure-webjobs-hosts container, and if not, it calls the Azure Function. This prevents duplicate calls for the same file. This pattern is called a blob receipt.

Note that this process depends on Azure Blob Storage logging, which, to improve performance, can take up to 10 minutes to write to the azure-webjobs-hosts container. If you need your function to trigger faster, consider using a Storage Queue trigger instead.

Download Attachments in a Single Page App and ASP.NET Core

Introduction

If you have a SPA built with any JavaScript framework that has an attachment feature, and the app authenticates users with tokens, you will eventually hit the point where you need to allow the user to download an attachment.

The problem

With normal forms authentication based on cookies, the browser simply sends the authentication cookie with each request to your web server without you doing anything. If you have a link that allows the user to download a file from your server, the browser automatically sends the authentication cookie when the user clicks the link. This makes things easy for you. But if you are using token-based authentication, it is your responsibility to send a token with each Ajax request to the server, using the Authorization header.

Unfortunately, you cannot control the headers sent to the server when the user opens a link in a new browser window, so the user ends up with an unauthorized request.

The solution

Download the file using an Ajax request

In this solution, you request the endpoint that downloads the file using an Ajax request, which will include the Authorization header, then get the file content into a variable and push the content to the user. This works fine if the file is very small. But imagine downloading a 500MB file: this is not going to work, since the whole file is held in a JavaScript variable before the download takes place.

Make the API that downloads the attachment anonymous

If the endpoint that downloads the file doesn't require authentication, then we are good. But now the file is available for everyone to download, so we have to find a way to secure the file even when the endpoint is anonymous.

If you have some experience with Azure Storage, you may have heard of Azure Storage Shared Access Signatures. The idea is simple: when the user requests a file, generate a token, save it to temporary storage, and append it to the URL of the download endpoint. When the user clicks the link, the endpoint is called, the token is validated against the temporary storage, and if it matches, the file contents are sent. This way we can be sure the link was generated by the application for that user. If the link is shared with another user, he will still be able to download the file, but that is another issue that we can worry about later.

Implementation

We will create a new ASP.NET Core site with an endpoint to download files, but I will not create a SPA in this article; that is left to the reader. I will test the idea using Postman, though.

Open Visual Studio and create a new project of type "ASP.NET Core Web Application", then choose "API" in the next dialog. You can also choose "Web Application (Model-View-Controller)". I will leave authentication at the default, "No Authentication".

Right-click on the Controllers folder and choose "New Controller", choose "API Controller – Empty", and name it AttachmentsController. You should end up with the following:

[Produces("application/json")]
[Route("api/Attachments")]
//[Authorize]
public class AttachmentsController : Controller
{
}

Notice that I have commented out the [Authorize] attribute, since I didn't set up authentication in this demo. In a real-life scenario, you would set up authentication and authorization using token-based authentication.

Create a folder named Services and then create a new interface called ISecureUrlGenerator. The content should look like the following:

public interface ISecureUrlGenerator
{
    string GenerateSecureAttachmentUrl(string id, string url);
    bool ValidateUrl(string url, string id);
    bool ValidateToken(string id, string token);
}

Now, add a class to implement the previous interface:

using Microsoft.Extensions.Caching.Memory;
using System;
using System.Collections.Generic;

namespace SecureAttachmentsDownload.Services
{
    public class SecureUrlGenerator : ISecureUrlGenerator
    {
        private readonly IMemoryCache memoryCache;

        public SecureUrlGenerator(IMemoryCache memoryCache)
        {
            this.memoryCache = memoryCache ?? throw new ArgumentNullException(nameof(memoryCache));
        }

        public string GenerateSecureAttachmentUrl(string id, string url)
        {
            var token = Guid.NewGuid().ToString().ToLower();
            StoreToken(id, token);
            var separator = url.Contains("?") ? "&" : "?";
            return $"{url}{separator}token={token}";
        }

        public bool ValidateToken(string id, string token)
        {
            var tokens = memoryCache.Get<List<string>>(id);
            return tokens != null && tokens.Contains(token);
        }

        public bool ValidateUrl(string url, string id)
        {
            var uri = new Uri(url);
            // Uri.Query includes the leading '?', so trim it before splitting
            var queryStringParams = uri.Query.TrimStart('?').Split('&');
            foreach (var param in queryStringParams)
            {
                var values = param.Split('=');
                if (values[0].ToLower() == "token")
                {
                    return ValidateToken(id, values[1]);
                }
            }

            return false;
        }

        private void StoreToken(string id, string token)
        {
            var tokens = memoryCache.Get<List<string>>(id) ?? new List<string>();
            tokens.Add(token);
            memoryCache.Set(id, tokens);
        }
    }
}

In this implementation, I am storing the tokens in the ASP.NET Core memory cache. To enable this feature, you have to add the caching service in the Startup.cs file:

public void ConfigureServices(IServiceCollection services)
{
    services.AddMemoryCache();
    services.AddMvc();
}

You can replace the memory cache with a database if you want the tokens to be permanent, and in that case you have to add an expiration date.
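Even with the in-memory cache, it is wise to let tokens expire. A small sketch (the 10-minute sliding window is my assumption) of what StoreToken could become:

private void StoreToken(string id, string token)
{
    var tokens = memoryCache.Get<List<string>>(id) ?? new List<string>();
    tokens.Add(token);

    // Entries silently disappear 10 minutes after last use,
    // so stale download links stop validating.
    memoryCache.Set(id, tokens, new MemoryCacheEntryOptions
    {
        SlidingExpiration = TimeSpan.FromMinutes(10)
    });
}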

Before we utilise the secure URL generator, we need a class to hold the attachment metadata, since the user will request the list of attachments first and then download them.
Create a folder called Models and put the following class in it:

namespace SecureAttachmentsDownload.Models
{
    public class AttachmentMetadata
    {
        public int Id { get; set; }

        public string DownloadUrl { get; set; }

        public string Name { get; set; }

        public string ContentType { get; set; }

        public int FileSize { get; set; }
    }
}

Now, let's get to the part where we utilise our secure URL generator.
The flow will be as below:

  1. The user requests the endpoint that returns the list of attachments to display. Here, the DownloadUrl of each item already carries the token.
  2. The SPA displays this list to the user as links or buttons the user can click to download the file. The href of the anchor tag will be the DownloadUrl property.
  3. The user clicks the link to download the attachment.
  4. The AttachmentsController is called, and the endpoint validates the token and returns either the file or a 401.

Open the AttachmentsController file and add the following two action methods:

using System.Collections.Generic;
using System.IO;
using System.Linq;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc;
using SecureAttachmentsDownload.Models;
using SecureAttachmentsDownload.Services;

namespace SecureAttachmentsDownload.Controllers
{
    [Produces("application/json")]
    [Route("api/Attachments")]
    //[Authorize]
    public class AttachmentsController : Controller
    {
        private readonly ISecureUrlGenerator _secureUrlGenerator;
        private readonly IHostingEnvironment _hostingEnvironment;

        private readonly List<AttachmentMetadata> Attachments = new List<AttachmentMetadata>()
            {
                new AttachmentMetadata
                {
                    Id = 1,
                    Name = "bitcoin.pdf",
                    ContentType = "application/pdf",
                    FileSize = 1024
                },
                  new AttachmentMetadata
                {
                    Id = 2,
                    Name = "report 1.pdf",
                    FileSize = 3024
                },
                  new AttachmentMetadata
                {
                    Id = 3,
                    Name = "report 2.pdf",
                    FileSize = 2024
                }
            };

        public AttachmentsController(ISecureUrlGenerator secureUrlGenerator, IHostingEnvironment hostingEnvironment)
        {
            _secureUrlGenerator = secureUrlGenerator;
            _hostingEnvironment = hostingEnvironment;
        }

        [HttpGet]
        [Route("")]
        public IActionResult Get()
        {
            foreach (var attachment in Attachments)
            {
                var url = Url.Action(nameof(AttachmentsController.Get), "Attachments", new { attachment.Id }, Url.ActionContext.HttpContext.Request.Scheme);
                attachment.DownloadUrl = _secureUrlGenerator.GenerateSecureAttachmentUrl(attachment.Id.ToString(), url);
            }

            return Ok(Attachments);
        }

        [HttpGet]
        [Route("{id}")]
        [AllowAnonymous]
        public IActionResult Get(int id, string token)
        {
            if (!_secureUrlGenerator.ValidateToken(id.ToString(), token))
                return Forbid();

            var attachment = Attachments.FirstOrDefault(a => a.Id == id);
            if (attachment == null)
                return NotFound();

            var stream = new FileStream($"{_hostingEnvironment.WebRootPath}\\Files\\{attachment.Name}", FileMode.Open);

            return File(stream, attachment.ContentType);
        }
    }
}

Now run the application and open the URL /api/Attachments. You will get the following exception:

InvalidOperationException: Unable to resolve service for type ‘SecureAttachmentsDownload.Services.ISecureUrlGenerator’ while attempting to activate ‘SecureAttachmentsDownload.Controllers.AttachmentsController’.

To fix it, open the Startup.cs file and add the following line to the ConfigureServices method:

public void ConfigureServices(IServiceCollection services)
{
    services.AddMemoryCache();
    services.AddScoped<ISecureUrlGenerator, SecureUrlGenerator>();
    services.AddMvc();
}

Now open the URL /api/attachments again and you should see the following JSON response:

[
  {
    "id": 1,
    "downloadUrl": "http://localhost:53098/api/Attachments/1?token=b78763c2-0109-4c12-b771-5f5cc5d19017",
    "name": "bitcoin.pdf",
    "fileSize": 1024,
    "contentType": "application/pdf"
  },
  {
    "id": 2,
    "downloadUrl": "http://localhost:53098/api/Attachments/2?token=12497a4a-8f08-44ba-b9f6-914c4b484cc5",
    "name": "report 1.pdf",
    "fileSize": 3024,
    "contentType": null
  },
  {
    "id": 3,
    "downloadUrl": "http://localhost:53098/api/Attachments/3?token=8647bb52-e47f-4580-8149-0b1d238ab0e2",
    "name": "report 2.pdf",
    "fileSize": 2024,
    "contentType": null
  }
]

As you can see, the downloadUrl property has the absolute URL for the file, with the `token` query string parameter appended. If you open the first link in a new browser window, the action Get(id) will be called and the token will be bound to the token parameter.
In my implementation, I have put some files in a folder called Files under the wwwroot folder, but in actual projects you may retrieve the files from a database, FTP, or any document management system.

If you want to make sure it is really working, just change any character in the token query string and you should get a Forbid response from the server. In this example you will actually get an exception, InvalidOperationException: No authenticationScheme was specified, and there was no DefaultForbidScheme found.
This is because I didn't configure the authentication middleware.

You can find the source code for this article on GitHub.

This implementation has a flaw: the list of attachments is returned with download URLs, but the tokens are saved in memory, so if the user clicks a link only after some time, the token may have already expired. So either save the tokens in a database, or, before the link is clicked, fire an Ajax request to an endpoint that gets the metadata for a single attachment; this way the downloadUrl will always be fresh and working.

If you have any questions or suggestions, please leave a comment below.

Web API with Windows Authentication on ASP.NET Core 2

Most REST services built with ASP.NET Core today use token-based authentication, either via the ASP.NET Core authentication middleware or third-party products such as IdentityServer. But sometimes you only need to build your APIs for internal use within your organization, which happens to be using Windows Authentication.

In this post, I will explain how to build a Web API that uses AD for authentication and AD groups for authorization, and how to integrate it with authorization policies.

Creating the project

Open Visual Studio 2017, create a new ASP.NET Core Web Application and name it AspnetCoreWindowsAuth, then press OK. Choose Web API as the project template, change the authentication method to Windows, then press OK to create the project.

If you select the project in Solution Explorer and press F4, you will find nothing to set the authentication mode to Windows or to enable/disable anonymous access, as you used to do in a normal MVC web application. That is because these settings moved to the launchSettings.json file under the Properties folder. If you want to change them, you have to open the file and edit the value of the iisSettings JSON property, which looks like below:

IIS Settings

You can also modify the URL and SSL settings.
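For reference, the relevant part of launchSettings.json looks roughly like this (the port number is just an example):

{
  "iisSettings": {
    "windowsAuthentication": true,
    "anonymousAuthentication": false,
    "iisExpress": {
      "applicationUrl": "http://localhost:53098/",
      "sslPort": 0
    }
  }
}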

Now, if you run the project, it will run just fine: you can call the default Values controller and see the output. Windows authentication will already be working, and you can get the name of the logged-in user through the User.Identity.Name property, which returns Domain\\username, even though we haven't added any authentication code to the pipeline yet.

Add the Windows authentication middleware

Now, let's add the authentication middleware into the request processing pipeline. Add the line app.UseAuthentication(); in the Configure method just before app.UseMvc();. Remember that middleware runs in the same order it was added in the Configure method.

Add the following code in the ConfigureServices method, before services.AddMvc();:

services.Configure<IISOptions>(options =>
{
    options.AutomaticAuthentication = true;
});

services.AddAuthentication(IISDefaults.AuthenticationScheme);

To make sure this is working fine, you can edit the Authorize attribute on the ValuesController and add a role name, which should be an AD group name, e.g. Employees:

[Authorize(Roles = "Employees")]

Now you have ASP.NET Core working with Active Directory, and you can authorize users according to the AD groups they belong to.

Using Authorization Policies

If you need more fine-grained control over your controllers and you need to add more authorization logic, you can go for authorization policies, which are really easy to configure, as you can see below. Just add the following lines in the ConfigureServices method before the AddMvc statement:

services.AddAuthorization(options =>
{
    options.AddPolicy("OnlyEmployees", policy =>
    {
        policy.AddAuthenticationSchemes(IISDefaults.AuthenticationScheme);
        policy.RequireRole("S-1-5-4");
    });
});

Here we defined a policy called OnlyEmployees; it requires users to be Windows authenticated and in the role named Employees, which is mapped to the AD group named Employees. Notice that I didn't write the name Employees in the RequireRole method. Instead, the value "S-1-5-4" was used, which is the SID for the AD group named Employees. I found that this is how group names are mapped to roles in ASP.NET Core: even if you retrieve the list of claims the user has, they translate to the SIDs of the groups the user belongs to in AD.

To utilize this policy, annotate the controller or method with it as below:

[Authorize(Policy = "OnlyEmployees")]
[Route("api/[controller]")]
public class ValuesController : Controller
{
}

By now, you should have a working solution that depends on Windows authentication and AD groups. Notice that this will only work on Windows, and most probably only under IIS.

You can find the code on GitHub if you want to use it or add to it.

https://github.com/haitham-shaddad/aspnetcore-windows-auth

MVC 5 – OWIN and Katana

OWIN was the first step towards an elegant design for the whole ASP.NET stack. By separating the hosting concerns from the framework, Microsoft was able to build modular services and add them to the stack, such as Web API, SignalR, and now the whole ASP.NET Core.

Katana was the first implementation of OWIN on IIS. You won't feel much change, as it still runs on the System.Web assembly, but at least it enables you to run your middleware under IIS.

To understand more about the motivation, the moving parts, and samples of OWIN, you can watch the following video in Arabic.

Security Options for Asp.Net (Arabic)

This is a series of videos explaining the available options to secure your ASP.NET application. Although the demonstration uses the MS stack, it applies to any technology.

In each video, we take one authentication type, explain its concept, and show a demo on an ASP.NET application. We will cover the following authentication types:

  1. Basic
  2. Digest
  3. Windows
  4. Certificate
  5. Forms
  6. Claims
  7. OAuth

Here is the link to the YouTube playlist; I will add each video once it is recorded. Feel free to leave any comments or suggestions.

How to build a web application on Azure?

Microsoft Azure is one of the leading cloud platforms, enabling us to build scalable and highly performing web applications. In this article, I will list the available options so you can quickly understand what you need in order to build your application and which services to use in each layer.

What do we need to build a web application?

Each web application needs some services in order to work properly: for example, a web server, a database server, a queuing system, etc.

In the past few years, I have found many people using Azure wrongly or inefficiently. Some used virtual machines where they should have used Web Apps; others used a huge virtual machine size because they didn't know they could scale their VM without losing data. Accordingly, I will do my best to give you a summary of the Azure services you can utilize to build your next fantastic web app.

Application Life-cycle Management

So, first things first: you need a place to store your use cases, test cases, and source code, run unit tests, and perform continuous integration and deployment.

Although it may not strictly be a part of Azure, Team Services integrates seamlessly with Azure, and many Azure services, like Web Apps, Mobile Apps, and Azure Functions, support continuous integration with it.

Team Services helps you have a central repository for your source code, documents, and requirements. It also supports Team Foundation Version Control and Git as source control. It is free and supports an unlimited number of projects with 5 free users.

Hosting

Now you have developed your web application and tested it, and it is time to host it. Azure supports hosting for applications built with .NET, Java, PHP, NodeJS, and Python.

In this phase you have more than one option:

  1. Virtual Machine: This is the most basic option and should not be used unless you have no other way, e.g. migrating a legacy application that cannot work with a PaaS offering. In this case you simply have a virtual machine in the cloud, and you can remotely connect to it, install the needed software, and deploy your application. Note that this is the most expensive option.
  2. Cloud Services: This is the same as a virtual machine, except that it offloads some of the work you would otherwise do, such as setting up software and Windows updates. It has two types: a web role, which is set up to host a web application directly, and a worker role, which enables you to host an executable application such as a Windows service or a console app. The good part about Cloud Services is that you provide a package with your code and it manages the deployment. If the virtual machine hosting the app goes down, Azure automatically creates another virtual machine and deploys the code to it.
  3. App Service: This is the Platform as a Service (PaaS) offering from Azure; all you have to worry about here is your code. It gives you a virtual directory in the cloud, and all you have to do is deploy your app to it, either through Web Deploy, FTP, or by uploading a package. It also offers continuous integration and monitoring capabilities. This is the most flexible and cheapest option, and you can autoscale it according to CPU usage, RAM usage, and other factors; so basically, you can start small and scale as needed later. Azure App Service has Web Apps to host your application, Mobile Apps, which is a backend as a service for your mobile apps, Logic Apps, which gives you the option to build your business logic and integration between different systems, and finally API Apps, which are used to host your REST API web apps.

Integration

If your application integrates with other systems, you can use Azure Logic Apps, Azure Functions, BizTalk Services, and Service Bus.

Most of these services give you a very nice visual designer that lets you orchestrate your business logic, and they offer out-of-the-box seamless integration with many external systems, such as Office 365, Dynamics CRM, or on-premises systems through BizTalk Services.

Data Storage

Azure offers many services to store your data, all of them based on the Azure Storage account, which is the main storage system for all services in Azure. Each account has some storage and throughput limitations; you can check these limitations from here.

  1. If you need to store text or video files, you can use page and block blobs. Each type is suitable for specific file types; you can read more about them from here.
  2. Use Azure Media Services if you have to store and stream media files.
  3. Azure Table storage is suitable if you need a NoSQL database; it provides high-throughput storage and retrieval for entities that have no referential integrity between them.
  4. Azure DocumentDB is a NoSQL document database similar to MongoDB; it is a born-in-the-cloud database that you can use if you need a low-latency, high-performing database.
  5. Azure SQL Database is a SQL Server database in the cloud, where you don't need to worry about SQL Server installation, backup and restore, or anything else; you just create a database and use it.
  6. Azure File Share can replace a legacy file system share in any legacy application.

Now you should have some basic knowledge of the features you can use to build your next web application.

Tips and tricks to enhance ASP.NET security

There are many ways to enhance the security of an ASP.NET web application, whether at the web, back-end, or database level. In this post, I will list some useful tips and tricks that will help you make your web application more secure.

Although the article discusses these concerns in an ASP.NET context, they are applicable to any other framework.

Web Level

Connection String

There are two ways to configure security for your database connection in the connection string. The first is Windows authentication: your application connects to the database server using the application pool identity. This is the recommended way, as it doesn't involve writing any user passwords in your connection string, not to mention the need to edit web.config with a new password when the old one expires, which causes a restart of the IIS application pool.

The second way is to connect through SQL authentication. In this case, you should not have the password added as plain text; you must secure your connection string by encrypting it. .NET already supports this by running the following command:

aspnet_regiis -pe "connectionStrings" -app "/MyCustomApplication"

The aspnet_regiis executable can be found in your Microsoft.NET folder under C:\Windows\Microsoft.NET; you should find a folder with the version of your framework, e.g. v2.0.*.

The -pe parameter specifies which section of your web.config you want to encrypt, and the -app parameter specifies which IIS web site you want to use.
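To reverse the encryption later, for example to edit the section, the same tool has a decrypt switch:

aspnet_regiis -pd "connectionStrings" -app "/MyCustomApplication"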

The algorithm used to encrypt is RSA. For more information about how this works, please refer to https://msdn.microsoft.com/en-us/library/dtkwfdky.aspx?f=255&MSPPError=-2147217396

Service Account Privileges

The service account is used to run the application pool, which in turn runs the web application. It is the account your web application uses as its identity when it needs to connect to the database (if you specified Integrated Security=True) or when it needs access to the file system, e.g. for logging.

This account must have the least privilege. On the file system, only give it read access to a specific folder if that is all it needs. On the database, don't get lazy and give it db_owner; only grant it the permissions it needs. If it only needs read or write on certain tables, then let it be so, and don't give it access to everything.

Why is that? Simply because if someone succeeds in getting hold of this service account, they could harm your servers. I have seen many cases where people use a domain admin as a service account; imagine what a hacker could do with such an account.

Cross-Site Request Forgery (CSRF)

Imagine you have a web application, and there is a page with the URL http://somesite/dashboard/pay?amount=1000.

This page is requested using GET and requires the user to be authenticated, so the user can only access it if he is logged in. But think of the following scenario:

You visit a site named http://someothersite, and on this site there is an image with src=http://somesite/dashboard/pay?amount=1000&toAccount=123-123-123.

What happens is that your browser automatically tries to fetch the source of the image, making a GET request to that URL, and by design the browser automatically sends the cookies for the somesite domain, including the authentication cookie. The result is the execution of the URL, paying the amount of 1000, because the request seems legitimate.

To prevent this issue, you must implement anti-forgery tokens. This simply means that with each request, the site puts a hidden field in the page and stores its value in the user's session or a cookie. When the user posts the page back to the server, the server validates that the value in the hidden field equals the value stored in the session or cookie, then clears the value from the session. This way, if the user tries to submit the page again or, as in our example, another site requests the URL on behalf of the user, the server denies the request because it does not include the correct anti-forgery token value.

MVC has excellent support for this: call @Html.AntiForgeryToken() in the view and annotate the action method with the [ValidateAntiForgeryToken] attribute.
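A minimal sketch (the Pay action and its parameters are made up for illustration):

// In the Razor view: render the hidden token field inside the form.
//   @using (Html.BeginForm("Pay", "Dashboard")) {
//       @Html.AntiForgeryToken()
//       ...
//   }

// In the controller: reject posts whose token doesn't match the cookie.
[HttpPost]
[ValidateAntiForgeryToken]
public ActionResult Pay(decimal amount, string toAccount)
{
    // Runs only when the hidden field and the anti-forgery cookie agree.
    return View();
}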

To read more about it, follow these links: Anti-Forgery in ASP.NET, Anti-Forgery in Web API.

SQL Injection

Although this is a very old issue, and nearly all existing ORM solutions handle it, some people still fall into this trap. The danger of this issue is that the hacker can get to your database and your database server, and from there to all your other servers. So always validate user input, always use parameterized queries, and escape user-supplied strings if you are building dynamic SQL.
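For example, a parameterized query with plain ADO.NET looks like this (the table, column, and variable names are placeholders):

using System;
using System.Data.SqlClient;

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(
    "SELECT Name FROM Customers WHERE Email = @email", connection))
{
    // The value travels as data, never concatenated into the SQL text,
    // so malicious input cannot change the query's structure.
    command.Parameters.AddWithValue("@email", userSuppliedEmail);
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            Console.WriteLine(reader.GetString(0));
        }
    }
}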

Web.Config Encryption

This is the same issue as the connection string, but it also applies to keys stored in appSettings. You can call the same command to encrypt the appSettings section. You don't need to worry about the decryption, because .NET does it automatically for you.

HTTPS

If your application has sensitive data and you want to eliminate any chance of the traffic between the user and the web server being sniffed, enable HTTPS. It works by encrypting the traffic between the browser and the server with a public/private key pair, using a certificate.

Cookies

If your site uses cookies for authentication or for maintaining client state, make sure no sensitive data is in them. Also, if you don't need to access the cookie in the client browser, mark the cookie as HttpOnly: a cookie with HttpOnly enabled is not accessible through JavaScript, and the document.cookie JavaScript object will not contain it.

Another trick is to mark your cookie as secure-only. This makes the client send it to the server only over HTTPS, which helps ensure your cookie is safe.
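In ASP.NET, both flags can be applied globally from web.config; a sketch, assuming the whole site runs over HTTPS:

<system.web>
  <!-- HttpOnly for all cookies; requireSSL marks them secure-only -->
  <httpCookies httpOnlyCookies="true" requireSSL="true" />
</system.web>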

Files under the virtual directory

Don’t leave any backups in your virtual directory, sometimes we make a backup from web.config in the shape of web.bak or some other format, these files may be served to the client and a hacker can use the info inside it to hack your application.

Session Hijacking

If you are using session variables, your data is stored in the server session or a state server, depending on your configuration. The web server has to have a key for your session in order to retrieve that data; this key is called the session ID, and by default it is stored in a cookie named ASP.NET_SessionId. If someone gets hold of this cookie value, they can log in to the site as you: using any cookie-editor extension, they can set their session ID to yours, and from there they are logged in as you. So if your site has an edit-profile page or a page that lets the user add credit card info, this data can be stolen.

To prevent session hijacking, you have to do the following:

  1. Use SSL. This way no one can capture the traffic between your users and the web application and, consequently, the session ID cookie value.
  2. Make the session ID value hard to guess. The worst thing to do is make the session ID an incremental value: if I log in and find my session ID is 12345, I can easily edit it to another number and obtain another user's session.
  3. Set the session ID cookie to always be HttpOnly. This prevents any JavaScript from reading its value and makes XSS attacks harder.
  4. Regenerate the session ID. This makes things harder for the attacker: even if he gets the session ID, it won't be valid for long.
  5. Prevent concurrent sessions. If your user is logged in from one session and tries to log in from another, there is a probability that it is not the same person. You can be more certain by validating the IP and time used; e.g. if the user's IP is in the USA and at the same time the user tries to connect from Europe, it is most probably an attack.

Eliminate unneeded headers

Some headers are not necessary in your web page response, such as Server, X-AspNet-Version, and so on. These headers not only add a slight performance overhead, but they also make the attacker's job easier, since he already knows what framework, servers, and environment he will be attacking. So always edit your web.config or the IIS console and remove these unneeded headers.
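Two of them can be suppressed straight from web.config; a sketch (the Server header itself needs extra IIS configuration):

<system.web>
  <!-- drops the X-AspNet-Version header -->
  <httpRuntime enableVersionHeader="false" />
</system.web>
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <!-- drops the X-Powered-By header added by IIS -->
      <remove name="X-Powered-By" />
    </customHeaders>
  </httpProtocol>
</system.webServer>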

Database Level

Data Encryption

A very simple case for protecting data is user passwords: never store them in plain text (in fact, passwords should be hashed with a strong algorithm rather than encrypted). Use a strong algorithm to encrypt the important data in your database, but note that this affects application performance, because the web application has to do extra work to encrypt the data before persistence and decrypt it after retrieval; you will also not be able to write queries that include any of the encrypted columns.

SQL Server 2016 has a great feature called Dynamic Data Masking. It allows you to mask the data stored in a column in some predefined formats, e.g. credit card, social security number, or email address. Permissions can be granted to specific users, and only those users will see the data unmasked.

To read more about the topic, please follow this link

IPSec

This feature blocks all incoming connections to a specific server on all ports and lets you allow only certain IPs and ports. In this case, the database server will not accept any connections except from your application's web servers.

IPsec Configuration

SQL Server Transparent Data Encryption (TDE)

In case all the above precautions fail to protect your database, you can at least protect the data at rest. TDE allows you to encrypt the database's data file itself so that no one can open it on another computer without a secret key. To encrypt your database file, you need a master key, a certificate protected by this master key, and a database encryption key protected by the certificate; finally, you set the database to use encryption.
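Those steps map to a handful of T-SQL statements; a sketch (the database name, certificate name, and password are placeholders):

USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
CREATE CERTIFICATE TdeCert WITH SUBJECT = 'TDE certificate';

USE MyDatabase;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TdeCert;

ALTER DATABASE MyDatabase SET ENCRYPTION ON;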

To read more about it, follow this link: https://msdn.microsoft.com/en-us/library/bb934049.aspx