
Create a RESTful API with authentication using Web API and Jwt

Web API is a feature of the ASP .NET framework that dramatically simplifies building RESTful (REST-like) HTTP services that are cross-platform and device and browser agnostic. With Web API, you can create endpoints that can be accessed using a combination of descriptive URLs and HTTP verbs. Those endpoints can serve data back to the caller as either JSON or XML that is standards compliant. With JSON Web Tokens (JWT), which are typically stateless, you can add an authentication and authorization layer, enabling you to restrict access to some or all of your API.

The purpose of this tutorial is to develop the beginnings of a Book Store API, using Microsoft Web API (with C#), which authenticates and authorizes each request, exposes OAuth2 endpoints, and returns data about books and reviews for consumption by the caller. The caller in this case will be Postman, a useful utility for querying APIs.

In a follow up to this post we will write a front end to interact with the API directly.

Set up

Open Visual Studio (I will be using Visual Studio 2015 Community edition, you can use whatever version you like) and create a new Empty project, ensuring you select the Web API option;

Where you save the project is up to you, but I will create my projects under *C:\Source*. For simplicity you might want to do the same.

New Project

Next, packages.

Packages

Open up the packages.config file. Some packages should have already been added to enable Web API itself. Add the following additional packages (run these from the Package Manager Console);

install-package EntityFramework
install-package Microsoft.AspNet.Cors
install-package Microsoft.AspNet.Identity.Core
install-package Microsoft.AspNet.Identity.EntityFramework
install-package Microsoft.AspNet.Identity.Owin
install-package Microsoft.AspNet.WebApi.Cors
install-package Microsoft.AspNet.WebApi.Owin
install-package Microsoft.Owin.Cors
install-package Microsoft.Owin.Security.Jwt
install-package Microsoft.Owin.Host.SystemWeb
install-package System.IdentityModel.Tokens.Jwt
install-package Thinktecture.IdentityModel.Core

These are the minimum packages required to provide data persistence, enable CORS (Cross-Origin Resource Sharing), and enable generating and authenticating/authorizing JWTs.

Entity Framework

We will use Entity Framework for data persistence, using the Code-First approach. Entity Framework will take care of generating a database, adding tables, stored procedures and so on. As an added benefit, Entity Framework will also upgrade the schema automatically as we make changes. Entity Framework is perfect for rapid prototyping, which is what we are in essence doing here.

Create a new IdentityDbContext called BooksContext, which will give us Users, Roles and Claims in our database. I like to add this under a folder called Core, for organization. We will add our entities to this later.

namespace BooksAPI.Core
{
    using Microsoft.AspNet.Identity.EntityFramework;

    public class BooksContext : IdentityDbContext
    {

    }
}

Claims are used to describe useful information that the user has associated with them. We will use claims to tell the client which roles the user has. The benefit of roles is that we can prevent access to certain methods/controllers to a specific group of users, and permit access to others.
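
For example, once authorization is wired up later in the post, a controller action can be restricted to a given role like this (the action name here is purely illustrative);

// Only callers whose token carries the "Admin" role claim can reach this action.
[Authorize(Roles = "Admin")]
[HttpGet]
public IHttpActionResult GetAdminReport()
{
    return Ok("Top secret admin data");
}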

Add a DbMigrationsConfiguration class and allow automatic migrations, but prevent automatic data loss;

namespace BooksAPI.Core
{
    using System.Data.Entity.Migrations;

    public class Configuration : DbMigrationsConfiguration<BooksContext>
    {
        public Configuration()
        {
            AutomaticMigrationsEnabled = true;
            AutomaticMigrationDataLossAllowed = false;
        }
    }
}

Whilst losing data at this stage is not important (we will use a seed method later to populate our database), I like to turn this off now so I do not forget later.

Now tell Entity Framework how to update the database schema using an initializer, as follows;

namespace BooksAPI.Core
{
    using System.Data.Entity;

    public class Initializer : MigrateDatabaseToLatestVersion<BooksContext, Configuration>
    {
    }
}

This tells Entity Framework to go ahead and upgrade the database to the latest version automatically for us.

Finally, tell your application about the initializer by updating the Global.asax.cs file as follows;

namespace BooksAPI
{
    using System.Data.Entity;
    using System.Web;
    using System.Web.Http;
    using Core;

    public class WebApiApplication : HttpApplication
    {
        protected void Application_Start()
        {
            GlobalConfiguration.Configure(WebApiConfig.Register);
            Database.SetInitializer(new Initializer());
        }
    }
}

Data Provider

By default, Entity Framework will configure itself to use LocalDB. If this is not desirable, say you want to use SQL Express instead, you need to make the following adjustments;

Open the Web.config file and delete the following code;

<entityFramework>
    <defaultConnectionFactory type="System.Data.Entity.Infrastructure.LocalDbConnectionFactory, EntityFramework">
        <parameters>
            <parameter value="mssqllocaldb" />
        </parameters>
    </defaultConnectionFactory>
    <providers>
        <provider invariantName="System.Data.SqlClient" type="System.Data.Entity.SqlServer.SqlProviderServices, EntityFramework.SqlServer" />
    </providers>
</entityFramework>

And add the connection string;

<connectionStrings>
    <add name="BooksContext" providerName="System.Data.SqlClient" connectionString="Server=.;Database=Books;Trusted_Connection=True;" />
</connectionStrings>

Now we’re using SQL Server directly (whatever flavour that might be) rather than LocalDB.

JSON

Whilst we’re here, we might as well configure our application to return camel-case JSON (thisIsCamelCase), instead of the default pascal-case (ThisIsPascalCase).

Add the following code to your Application_Start method;

// Formatting and CamelCasePropertyNamesContractResolver come from Json.NET
// (using Newtonsoft.Json and Newtonsoft.Json.Serialization respectively).
var formatters = GlobalConfiguration.Configuration.Formatters;
var jsonFormatter = formatters.JsonFormatter;
var settings = jsonFormatter.SerializerSettings;
settings.Formatting = Formatting.Indented;
settings.ContractResolver = new CamelCasePropertyNamesContractResolver();

There is nothing worse than pascal-case JavaScript.

CORS (Cross-Origin Resource Sharing)

Cross-Origin Resource Sharing, or CORS for short, is a mechanism that allows a client to request a resource (an image, or say, data from an endpoint) from an origin (domain) that is different from the domain the client itself was served from.

This step is completely optional. We are adding in CORS support here because when we come to write our client app in subsequent posts that follow on from this one, we will likely use a separate HTTP server (for testing and debugging purposes). When released to production, these two apps would use the same host (Internet Information Services (IIS)).

To enable CORS, open WebApiConfig.cs and add the following code to the beginning of the Register method;

// EnableCorsAttribute lives in the System.Web.Http.Cors namespace.
var cors = new EnableCorsAttribute("*", "*", "*");
config.EnableCors(cors);
config.MessageHandlers.Add(new PreflightRequestsHandler());

And add the following class (in the same file if you prefer for quick reference);

// Requires using directives for System.Net, System.Net.Http, System.Threading and System.Threading.Tasks.
public class PreflightRequestsHandler : DelegatingHandler
{
    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // Short-circuit CORS preflight (OPTIONS) requests with the appropriate access headers.
        if (request.Headers.Contains("Origin") && request.Method.Method == "OPTIONS")
        {
            var response = new HttpResponseMessage {StatusCode = HttpStatusCode.OK};
            response.Headers.Add("Access-Control-Allow-Origin", "*");
            response.Headers.Add("Access-Control-Allow-Headers", "Origin, Content-Type, Accept, Authorization");
            response.Headers.Add("Access-Control-Allow-Methods", "*");
            var tsc = new TaskCompletionSource<HttpResponseMessage>();
            tsc.SetResult(response);
            return tsc.Task;
        }
        return base.SendAsync(request, cancellationToken);
    }
}

In the CORS workflow, before sending a DELETE, PUT or POST request, the browser sends an OPTIONS (preflight) request to check whether the server will accept a request from the originating domain. If the request domain and the server domain are not the same, the server must include various access headers that describe which domains have access. To grant access to all domains, we simply respond with an origin header (Access-Control-Allow-Origin) set to an asterisk.

The Access-Control-Allow-Headers header describes which headers the API can accept/is expecting to receive. The Access-Control-Allow-Methods header describes which HTTP verbs are supported/permitted.

See Mozilla Developer Network (MDN) for a more comprehensive write-up on Cross-Origin Resource Sharing (CORS).

Data Model

With Entity Framework configured, let's create our data structure. The API will expose books, and books will have reviews.

Under the Models folder add a new class called Book. Add the following code;

namespace BooksAPI.Models
{
    using System.Collections.Generic;

    public class Book
    {
        public int Id { get; set; }
        public string Title { get; set; }
        public string Description { get; set; }
        public decimal Price { get; set; }
        public string ImageUrl { get; set; }

        public virtual List<Review> Reviews { get; set; }
    }
}

And add Review, as shown;

namespace BooksAPI.Models
{
    public class Review
    {
        public int Id { get; set; }    
        public string Description { get; set; }    
        public int Rating { get; set; }
        public int BookId { get; set; }
    }
}

Add these entities to the IdentityDbContext we created earlier;

public class BooksContext : IdentityDbContext
{
    public DbSet<Book> Books { get; set; }
    public DbSet<Review> Reviews { get; set; }
}

Be sure to add in the necessary using directives.
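
For reference, the completed context (with its using directives) should look something like this;

namespace BooksAPI.Core
{
    using System.Data.Entity;
    using BooksAPI.Models;
    using Microsoft.AspNet.Identity.EntityFramework;

    public class BooksContext : IdentityDbContext
    {
        public DbSet<Book> Books { get; set; }
        public DbSet<Review> Reviews { get; set; }
    }
}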

A couple of helpful abstractions

We need a couple of small abstractions around the Identity classes we will be using, to keep our code clean and ensure that everything works correctly.

Under the Core folder, add the following classes;

// UserManager lives in Microsoft.AspNet.Identity; IdentityUser in Microsoft.AspNet.Identity.EntityFramework.
public class BookUserManager : UserManager<IdentityUser>
{
    public BookUserManager() : base(new BookUserStore())
    {
    }
}

We will make heavy use of the UserManager<T> in our project, and we don’t want to have to initialise it with a UserStore<T> every time we want to make use of it. Whilst adding this is not strictly necessary, it does go a long way to helping keep the code clean.

Now add another class for the UserStore, as shown;

public class BookUserStore : UserStore<IdentityUser>
{
    public BookUserStore() : base(new BooksContext())
    {
    }
}

This code is really important. If we fail to tell the UserStore which DbContext to use, it falls back to some default value.

A network-related or instance-specific error occurred while establishing a connection to SQL Server

I’m not sure what the default value is; all I know is that it doesn’t seem to correspond to our application’s DbContext. This code will help prevent you from tearing your hair out later wondering why you are getting the super-helpful error message shown above.

API Controller

We need to expose some data to our client (when we write it). Let’s take advantage of Entity Framework’s Seed method. The Seed method will pre-populate some books and reviews for us automatically.

Instead of dropping the code in directly for this class (it is very long), please refer to the Configuration.cs file on GitHub.

This code gives us a little bit of starting data to play with, instead of having to add a bunch of data manually each time we make a change to our schema that requires the database to be re-initialized (not really an issue in our case, as we have an extremely simple data model, but in larger applications this is very useful).
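
If you want a feel for it without opening GitHub, here is a minimal sketch of what such a Seed method might look like inside the Configuration class we created earlier. The book data, administrator credentials and role name below are illustrative assumptions; refer to the GitHub version for the code this post actually uses;

// Requires using directives for System.Collections.Generic, System.Data.Entity.Migrations,
// BooksAPI.Models, Microsoft.AspNet.Identity and Microsoft.AspNet.Identity.EntityFramework.
protected override void Seed(BooksContext context)
{
    // Illustrative data only.
    context.Books.AddOrUpdate(b => b.Title,
        new Book
        {
            Title = "Sample Book",
            Description = "A seeded book to play with.",
            Price = 9.99m,
            ImageUrl = "http://example.com/sample.jpg",
            Reviews = new List<Review>
            {
                new Review { Description = "Great read", Rating = 5 }
            }
        });

    // Seed a role and a user so we have something to authenticate with later
    // (assumed to match the credentials used in the Testing section).
    var roleManager = new RoleManager<IdentityRole>(new RoleStore<IdentityRole>(context));
    if (!roleManager.RoleExists("Administrator"))
    {
        roleManager.Create(new IdentityRole("Administrator"));
    }

    var userManager = new BookUserManager();
    if (userManager.FindByName("administrator") == null)
    {
        var user = new IdentityUser("administrator");
        userManager.Create(user, "administrator123");
        userManager.AddToRole(user.Id, "Administrator");
    }
}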

Books Endpoint

Next, we want to create the RESTful endpoint that will retrieve all the books data. Create a new Web API controller called BooksController and add the following;

public class BooksController : ApiController
{
    [HttpGet]
    public async Task<IHttpActionResult> Get()
    {
        using (var context = new BooksContext())
        {
            // Include and ToListAsync require a using directive for System.Data.Entity.
            return Ok(await context.Books.Include(x => x.Reviews).ToListAsync());
        }
    }
}

With this code we are exploiting a relatively recent addition to the .NET Framework: async and await. Writing asynchronous code in this manner allows the thread to be released whilst data (books and reviews) is being retrieved from the database and converted into objects for our code to consume. When the asynchronous operation is complete, the code picks up where it left off and continues executing. (By which we mean the hydrated data objects are passed to the underlying framework, converted to JSON/XML, and returned to the client.)

Reviews Endpoint

We’re also going to enable authorized users to post reviews and delete reviews. For this we will need a ReviewsController with the relevant Post and Delete methods.

Create a new Web API controller called ReviewsController and add the following code;

public class ReviewsController : ApiController
{
    [HttpPost]
    public async Task<IHttpActionResult> Post([FromBody] ReviewViewModel review)
    {
        using (var context = new BooksContext())
        {
            var book = await context.Books.FirstOrDefaultAsync(b => b.Id == review.BookId);
            if (book == null)
            {
                return NotFound();
            }

            var newReview = context.Reviews.Add(new Review
            {
                BookId = book.Id,
                Description = review.Description,
                Rating = review.Rating
            });

            await context.SaveChangesAsync();
            return Ok(new ReviewViewModel(newReview));
        }
    }

    [HttpDelete]
    public async Task<IHttpActionResult> Delete(int id)
    {
        using (var context = new BooksContext())
        {
            var review = await context.Reviews.FirstOrDefaultAsync(r => r.Id == id);
            if (review == null)
            {
                return NotFound();
            }

            context.Reviews.Remove(review);
            await context.SaveChangesAsync();
        }
        return Ok();
    }
}

There are a couple of good practices in play here that we need to highlight.

The first method, Post, allows the user to add a new review. Notice the parameter of the method;

[FromBody] ReviewViewModel review

The [FromBody] attribute tells Web API to look for the data for the method argument in the body of the HTTP message that we received from the client, and not in the URL. The second parameter is a view model that wraps around the Review entity itself. Add a new folder to your project called ViewModels, add a new class called ReviewViewModel and add the following code;

public class ReviewViewModel
{
    public ReviewViewModel()
    {
    }

    public ReviewViewModel(Review review)
    {
        if (review == null)
        {
            return;
        }

        BookId = review.BookId;
        Rating = review.Rating;
        Description = review.Description;
    }

    public int BookId { get; set; }
    public int Rating { get; set; }
    public string Description { get; set; }

    public Review ToReview()
    {
        return new Review
        {
            BookId = BookId,
            Description = Description,
            Rating = Rating
        };
    }
}

We are just copying all the properties from the Review entity to the ReviewViewModel and vice-versa. So why bother? The first reason is to help mitigate a well-known under/over-posting vulnerability (there is a good write-up about it here) inherent in most web services. The second is that it helps prevent unwanted information being sent to the client; with this approach we have to explicitly expose data to the client by adding properties to the view model.

For this scenario, this approach is probably a bit overkill, but I highly recommend it; keeping your application secure is important, as is preventing potentially sensitive information from leaking. A tool I’ve used in the past to simplify this mapping code is AutoMapper. I highly recommend checking it out.
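
As a rough illustration, with AutoMapper’s classic static API (newer versions use a MapperConfiguration instance instead) the mapping boils down to something like this;

// Typically done once at application start-up.
Mapper.CreateMap<Review, ReviewViewModel>();
Mapper.CreateMap<ReviewViewModel, Review>();

// Then wherever a mapping is needed, e.g. inside the Post method above;
var viewModel = Mapper.Map<ReviewViewModel>(newReview);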

Important note: In order to keep our API RESTful, we return the newly created entity (or its view model representation) back to the client for consumption, removing the need to re-fetch the entire data set.

The Delete method is trivial. We accept the Id of the review we want to delete as a parameter, then fetch the entity and finally remove it from the collection. Calling SaveChangesAsync will make the change permanent.

Meaningful response codes

We want to return useful information back to the client as much as possible. Notice that the Post method returns NotFound(), which translates to a 404 HTTP status code, if the corresponding Book for the given review cannot be found. This is useful for client side error handling. Returning Ok() will return 200 (HTTP ‘Ok’ status code), which informs the client that the operation was successful.
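
ApiController has helpers for most of the common status codes, so other failure cases can be reported just as cheaply. A quick illustration of a couple of other helpers, as they might be used inside an action such as Post above (illustrative only, not part of the tutorial’s controllers);

// Requires a using directive for System.Net.
if (review == null || review.Rating < 1 || review.Rating > 5)
{
    return BadRequest("Rating must be between 1 and 5"); // 400 Bad Request
}

return StatusCode(HttpStatusCode.NoContent); // 204 No Content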

Authentication and Authorization Using OAuth and JSON Web Tokens (JWT)

My preferred approach for dealing with authentication and authorization is to use JSON Web Tokens (JWT). We will open up an OAuth endpoint to client credentials and return a token which describes the users claims. For each of the users roles we will add a claim (which could be used to control which views the user has access to on the client side).

We use OWIN to add our OAuth configuration into the pipeline. Add a new class to the project called Startup.cs and add the following code;

using Microsoft.Owin;
using Owin;

[assembly: OwinStartup(typeof (BooksAPI.Startup))]

namespace BooksAPI
{
    public partial class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            ConfigureOAuth(app);
        }
    }
}

Notice that Startup is a partial class. I’ve done that because I want to keep this class as simple as possible; as the application becomes more complicated and we add more and more middleware, this class will grow considerably. You could use a static helper class here, but the preferred approach in the MSDN documentation seems to lean towards using partial classes.

Under the App_Start folder add a new class called Startup.OAuth.cs and add the following code;

using System;
using System.Configuration;
using BooksAPI.Core;
using BooksAPI.Identity;
using Microsoft.AspNet.Identity;
using Microsoft.AspNet.Identity.EntityFramework;
using Microsoft.Owin;
using Microsoft.Owin.Security;
using Microsoft.Owin.Security.DataHandler.Encoder;
using Microsoft.Owin.Security.Jwt;
using Microsoft.Owin.Security.OAuth;
using Owin;

namespace BooksAPI
{
    public partial class Startup
    {
        public void ConfigureOAuth(IAppBuilder app)
        {            
        }
    }
}

Note. When I wrote this code originally I encountered a quirk. After spending hours pulling out my hair trying to figure out why something was not working, I eventually discovered that the ordering of the code in this class is very important. If you don’t copy the code in the exact same order, you may encounter unexpected behaviour. Please add the code in the same order as described below.

OAuth secrets

First, add the following code;

var issuer = ConfigurationManager.AppSettings["issuer"];
var secret = TextEncodings.Base64Url.Decode(ConfigurationManager.AppSettings["secret"]);

  • Issuer – a unique identifier for the entity that issued the token (not to be confused with Entity Framework's entities)
  • Secret – a secret key used to secure the token and prevent tampering

I keep these values in the Web configuration file (Web.config). To be precise, I split these values out into their own configuration file called keys.config and add a reference to that file in the main Web.config. I do this so that I can exclude just the keys from source control by adding a line to my .gitignore file.

To do this, open Web.config and change the <appSettings> section as follows;

<appSettings file="keys.config">
</appSettings>

Now add a new file to your project called keys.config and add the following code;

<appSettings>
  <add key="issuer" value="http://localhost/"/>
  <add key="secret" value="IxrAjDoa2FqElO7IhrSrUJELhUckePEPVpaePlS_Xaw"/>
</appSettings>

Adding objects to the OWIN context

We can make use of OWIN to manage instances of objects for us, on a per request basis. The pattern is comparable to IoC, in that you tell the “container” how to create an instance of a specific type of object, then request the instance using a Get<T> method.

Add the following code;

app.CreatePerOwinContext(() => new BooksContext());
app.CreatePerOwinContext(() => new BookUserManager());

The first time we request an instance of BooksContext, for example, the lambda expression will execute and a new BooksContext will be created and returned to us. Subsequent calls to Get<T> during the same request will return that same instance.

Important note: The life-cycle of these object instances is per request. As soon as the request is complete, the instances are cleaned up.
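
Retrieving the registered instances is then a one-liner. We will do exactly this inside the OAuth provider later on, and you can do the same from a Web API controller via the request's OWIN context (the controller example here is just an illustration);

// Inside OWIN middleware / the OAuth provider (shown later in this post);
var db = context.OwinContext.Get<BooksContext>();
var userManager = context.OwinContext.Get<BookUserManager>();

// Inside a Web API controller (requires using directives for Microsoft.AspNet.Identity.Owin and System.Net.Http);
var db2 = Request.GetOwinContext().Get<BooksContext>();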

Enabling Bearer Authentication/Authorization

To enable bearer authentication, add the following code;

app.UseJwtBearerAuthentication(new JwtBearerAuthenticationOptions
{
    AuthenticationMode = AuthenticationMode.Active,
    AllowedAudiences = new[] { "Any" },
    IssuerSecurityTokenProviders = new IIssuerSecurityTokenProvider[]
    {
        new SymmetricKeyIssuerSecurityTokenProvider(issuer, secret)
    }
});

The key takeaways of this code;

  • State who is the audience (we’re specifying “Any” for the audience, as this is a required field but we’re not fully implementing it).
  • State who is responsible for generating the tokens. Here we’re using SymmetricKeyIssuerSecurityTokenProvider and passing it our secret key to prevent tampering. We could use the X509CertificateSecurityTokenProvider, which uses an X509 certificate to secure the token (but I’ve found these to be overly complex in the past and I prefer a simpler implementation).

This code adds JWT bearer authentication to the OWIN pipeline.

Enabling OAuth

We need to expose an OAuth endpoint so that the client can request a token (by passing a user name and password).

Add the following code;

app.UseOAuthAuthorizationServer(new OAuthAuthorizationServerOptions
{
    AllowInsecureHttp = true,
    TokenEndpointPath = new PathString("/oauth2/token"),
    AccessTokenExpireTimeSpan = TimeSpan.FromMinutes(30),
    Provider = new CustomOAuthProvider(),
    AccessTokenFormat = new CustomJwtFormat(issuer)
});

Some important notes with this code;

  • We’re going to allow insecure HTTP requests whilst we are in development mode. You might want to wrap this in an #if DEBUG directive so that you don’t allow insecure connections in production (see the sketch after this list).
  • Open an endpoint under /oauth2/token that accepts post requests.
  • When generating a token, make it expire after 30 minutes (1800 seconds).
  • We will use our own provider, CustomOAuthProvider, and formatter, CustomJwtFormat, to take care of authentication and building the actual token itself.
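
A minimal sketch of what that conditional compilation might look like (an assumption on my part, not code from the original project);

var oauthServerOptions = new OAuthAuthorizationServerOptions
{
#if DEBUG
    // Only allow plain HTTP while debugging locally.
    AllowInsecureHttp = true,
#endif
    TokenEndpointPath = new PathString("/oauth2/token"),
    AccessTokenExpireTimeSpan = TimeSpan.FromMinutes(30),
    Provider = new CustomOAuthProvider(),
    AccessTokenFormat = new CustomJwtFormat(issuer)
};

app.UseOAuthAuthorizationServer(oauthServerOptions);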

We need to write the provider and formatter next.

Formatting the JWT

Create a new class under the Identity folder called CustomJwtFormat.cs. Add the following code;

namespace BooksAPI.Identity
{
    using System;
    using System.Configuration;
    using System.IdentityModel.Tokens;
    using Microsoft.Owin.Security;
    using Microsoft.Owin.Security.DataHandler.Encoder;
    using Thinktecture.IdentityModel.Tokens;

    public class CustomJwtFormat : ISecureDataFormat<AuthenticationTicket>
    {
        private static readonly byte[] _secret = TextEncodings.Base64Url.Decode(ConfigurationManager.AppSettings["secret"]);
        private readonly string _issuer;

        public CustomJwtFormat(string issuer)
        {
            _issuer = issuer;
        }

        public string Protect(AuthenticationTicket data)
        {
            if (data == null)
            {
                throw new ArgumentNullException(nameof(data));
            }

            var signingKey = new HmacSigningCredentials(_secret);
            var issued = data.Properties.IssuedUtc;
            var expires = data.Properties.ExpiresUtc;

            return new JwtSecurityTokenHandler().WriteToken(new JwtSecurityToken(_issuer, null, data.Identity.Claims, issued.Value.UtcDateTime, expires.Value.UtcDateTime, signingKey));
        }

        public AuthenticationTicket Unprotect(string protectedText)
        {
            throw new NotImplementedException();
        }
    }
}

This is a complicated-looking class, but it’s pretty straightforward. We are just fetching all the information needed to generate the token (the claims, issued date, expiration date and signing key), then generating the token and returning it to the caller.

Please note: Some of the code we are writing today was influenced by JSON Web Token in ASP.NET Web API 2 using OWIN by Taiseer Joudeh. I highly recommend checking it out.

The authentication bit

We’re almost there, honest! Now we want to authenticate the user. Create a new class under the Identity folder called CustomOAuthProvider.cs and add the following code;

using System.Linq;
using System.Security.Claims;
using System.Security.Principal;
using System.Threading;
using System.Threading.Tasks;
using System.Web;
using BooksAPI.Core;
using Microsoft.AspNet.Identity;
using Microsoft.AspNet.Identity.EntityFramework;
using Microsoft.AspNet.Identity.Owin;
using Microsoft.Owin.Security;
using Microsoft.Owin.Security.OAuth;

namespace BooksAPI.Identity
{
    public class CustomOAuthProvider : OAuthAuthorizationServerProvider
    {
        public override Task GrantResourceOwnerCredentials(OAuthGrantResourceOwnerCredentialsContext context)
        {
            context.OwinContext.Response.Headers.Add("Access-Control-Allow-Origin", new[] {"*"});

            var user = context.OwinContext.Get<BooksContext>().Users.FirstOrDefault(u => u.UserName == context.UserName);
            if (!context.OwinContext.Get<BookUserManager>().CheckPassword(user, context.Password))
            {
                context.SetError("invalid_grant", "The user name or password is incorrect");
                context.Rejected();
                return Task.FromResult<object>(null);
            }

            var ticket = new AuthenticationTicket(SetClaimsIdentity(context, user), new AuthenticationProperties());
            context.Validated(ticket);

            return Task.FromResult<object>(null);
        }

        public override Task ValidateClientAuthentication(OAuthValidateClientAuthenticationContext context)
        {
            context.Validated();
            return Task.FromResult<object>(null);
        }

        private static ClaimsIdentity SetClaimsIdentity(OAuthGrantResourceOwnerCredentialsContext context, IdentityUser user)
        {
            var identity = new ClaimsIdentity("JWT");
            identity.AddClaim(new Claim(ClaimTypes.Name, context.UserName));
            identity.AddClaim(new Claim("sub", context.UserName));

            var userRoles = context.OwinContext.Get<BookUserManager>().GetRoles(user.Id);
            foreach (var role in userRoles)
            {
                identity.AddClaim(new Claim(ClaimTypes.Role, role));
            }

            return identity;
        }
    }
}

As we’re not checking the audience, when ValidateClientAuthentication is called we can just validate the request. When the request has a grant_type of password, which all our requests to the OAuth endpoint will have, the above GrantResourceOwnerCredentials method is executed. This method authenticates the user and creates the claims to be added to the JWT.

Testing

There are 2 tools you can use for testing this.

Technique 1 – Using the browser

Open up a web browser, and navigate to the books URL.

Testing with the web browser

You will see the list of books, displayed as XML. This is because Web API can serve up data as either XML or JSON. Personally, I do not like XML; JSON is my choice these days.

Technique 2 (Preferred) – Using Postman

To make Web API respond with JSON we need to send along an Accept header. The best tool to enable us to do this (for Google Chrome) is Postman. Download it and give it a go if you like.

Drop the same URL into the Enter request URL field, and click Send. Notice the response is in JSON;

Postman response in JSON

This worked because Postman automatically adds the Accept header to each request. You can see this by clicking on the Headers tab. If the header isn’t there and you’re still getting XML back, just add the header as shown in the screenshot and re-send the request.

To test the delete method, change the HTTP verb to Delete and add the ReviewId to the end of the URL. For example; http://localhost:62996/api/reviews/9

Putting it all together

First, we need to restrict access to our endpoints.

Add a new file to the App_Start folder, called FilterConfig.cs and add the following code;

public class FilterConfig
{
    public static void Configure(HttpConfiguration config)
    {
        config.Filters.Add(new AuthorizeAttribute());
    }
}

And call the code from Global.asax.cs as follows;

GlobalConfiguration.Configure(FilterConfig.Configure);

Adding this code will restrict access to all endpoints (except the OAuth endpoint) to requests that have been authenticated (requests that send along a valid JWT).

You have much more fine-grained control here, if required. Instead of adding the above code, you could add the AuthorizeAttribute to specific controllers or even specific methods. The added benefit is that you can also restrict access to specific users or specific roles;

Example code;

[Authorize(Roles = "Admin")]

The roles value (“Admin”) can be a comma-separated list. For us, restricting access to all endpoints will suffice.

To test that this code is working correctly, simply make a GET request to the books endpoint;

GET http://localhost:62996/api/books

You should get the following response;

{
  "message": "Authorization has been denied for this request."
}

Great, it’s working. Now let’s fix that problem.

Make a POST request to the OAuth endpoint, and include the following;

  • Headers
    • Accept application/json
    • Accept-Language en-gb
    • Audience Any
  • Body
    • username administrator
    • password administrator123
    • grant_type password

Shown in the below screenshot;

OAuth Request

Make sure you set the message type as x-www-form-urlencoded.

If you are interested, here is the raw message;

POST /oauth2/token HTTP/1.1
Host: localhost:62996
Accept: application/json
Accept-Language: en-gb
Audience: Any
Content-Type: application/x-www-form-urlencoded
Cache-Control: no-cache
Postman-Token: 8bc258b2-a08a-32ea-3cb2-2e7da46ddc09

username=administrator&password=administrator123&grant_type=password

The form data has been URL encoded and placed in the message body.
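
For completeness, here is a rough sketch of issuing the same token request from C# using HttpClient, which is the kind of call a client application would make (the port number matches the examples above);

// Requires using directives for System.Collections.Generic and System.Net.Http,
// and must be called from within an async method.
using (var client = new HttpClient())
{
    var form = new FormUrlEncodedContent(new Dictionary<string, string>
    {
        { "username", "administrator" },
        { "password", "administrator123" },
        { "grant_type", "password" }
    });

    var response = await client.PostAsync("http://localhost:62996/oauth2/token", form);
    var json = await response.Content.ReadAsStringAsync();
    // The token is returned in the "access_token" property of the JSON response.
}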

The web service should authenticate the request, and return a token (Shown in the response section in Postman). You can test that the authentication is working correctly by supplying an invalid username/password. In this case, you should get the following reply;

{
  "error": "invalid_grant"
}

This is deliberately vague to avoid giving any malicious users more information than they need.

Now to get a list of books, we need to call the endpoint passing in the token as a header.

Change the HTTP verb to GET and change the URL to; http://localhost:62996/api/books.

On the Headers tab in Postman, add the following additional headers;

Authorization Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1bmlxdWVfbmFtZSI6ImFkbWluaXN0cmF0b3IiLCJzdWIiOiJhZG1pbmlzdHJhdG9yIiwicm9sZSI6IkFkbWluaXN0cmF0b3IiLCJpc3MiOiJodHRwOi8vand0YXV0aHpzcnYuYXp1cmV3ZWJzaXRlcy5uZXQiLCJhdWQiOiJBbnkiLCJleHAiOjE0NTgwNDI4MjgsIm5iZiI6MTQ1ODA0MTAyOH0.uhrqQW6Ik_us1lvDXWJNKtsyxYlwKkUrCGXs-eQRWZQ

See screenshot below;

Authorization Header

Success! We have data from our secure endpoint.

Summary

In this introduction we looked at creating a project using Web API to issue and authenticate JWTs (JSON Web Tokens). We created a simple endpoint to retrieve a list of books, and added the ability to post and delete reviews in a RESTful way.

This project is the foundation for subsequent posts that will explore creating a rich client side application, using modern JavaScript frameworks, which will enable authentication and authorization.

How to create your own ASP .NET MVC model binder

Model binding is the process of converting POST data or data present in the URL into .NET object(s).  ASP .NET MVC makes this very simple by providing the DefaultModelBinder.  You’ve probably seen this in action many times (even if you didn’t realise it!), but did you know you can easily write your own?

A typical ASP .NET MVC Controller

You’ve probably written or seen code like this many hundreds of times;

public ActionResult Index(int id)
{
    using (ExceptionManagerEntities context = new ExceptionManagerEntities())
    {
        Error entity = context.Errors.FirstOrDefault(c => c.ID == id);

        if (entity != null)
        {
            return View(entity);
        }
    }

    return View();
}

Where did Id come from? It probably came from one of three sources; the Url (Controller/View/{id}), the query string (Controller/View?id={id}), or the post data.  Under the hood, ASP .NET examines your controller method, and searches each of these places looking for data that matches the data type and the name of the parameter.  It may also look at your route configuration to aid this process.
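
For reference, the default route that the standard MVC project templates register looks like this, which is why an {id} segment in the URL lines up with an id method parameter;

routes.MapRoute(
    name: "Default",
    url: "{controller}/{action}/{id}",
    defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
);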

A typical controller method

The code shown in the first snippet is very common in many ASP .NET MVC controllers.  Your action method accepts an Id parameter, your method then fetches an entity based on that Id, and then does something useful with it (and typically saves it back to the database or returns it back to the view).

You can create your own MVC model binder to cut out this step, and simply have the entity itself passed to your action method. 

Take the following code;

public ActionResult Index(Error error)
{
    if (error != null)
    {
        return View(error);
    }

    return View();
}

How much sweeter is that?

Create your own ASP .NET MVC model binder

You can create your own model binder in two simple steps;

  1. Create a class that inherits from DefaultModelBinder, and override the BindModel method (and build up your entity in there)
  2. Add a line of code to your Global.asax.cs file to tell MVC to use that model binder.

Before we forget, tell MVC about your model binder as follows (in the Application_Start method in your Global.asax.cs file);

ModelBinders.Binders.Add(typeof(Error), new ErrorModelBinder());

This tells MVC that if it stumbles across a parameter on an action method of type Error, it should attempt to bind it using the ErrorModelBinder class you just created.
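
The binder itself is just a class deriving from DefaultModelBinder; a minimal skeleton looks like this (using directives assumed), with the BindModel override filled in below;

using System.Web.Mvc;

public class ErrorModelBinder : DefaultModelBinder
{
    // Override BindModel here - see the implementation below.
}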

Your BindModel implementation will look like this;

public override object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
{
    if (bindingContext.ModelType == typeof(Error))
    {
        ValueProviderResult valueProviderValue = bindingContext.ValueProvider.GetValue("id");

        int id;
        if (valueProviderValue != null && int.TryParse((string)valueProviderValue.RawValue, out id))
        {
            using (ExceptionManagerEntities context = new ExceptionManagerEntities())
            {
                return context.Errors.FirstOrDefault(c => c.ID == id);
            }
        }
    }

    return base.BindModel(controllerContext, bindingContext);
}

The code digested;

  1. Make sure that we are only trying to build an object of type Error (this should always be true, but let's include this check anyway as a safety net).
  2. Get the ValueProviderResult of the value provider we care about (in this case, the Id property).
  3. Check that it exists, and that it's definitely an integer.
  4. Now fetch our entity and return it back.
  5. Finally, if any of our safety nets fail, just fall back to the default model binder and let that try and figure it out for us.

And the end result?

ErrorIsBound

Your new model binder can now be used on any action method throughout your ASP .NET MVC application.

Summary

You can significantly reduce code duplication and simplify your controller classes by creating your own model binder.  Simply create a new class that derives from DefaultModelBinder and add your logic to fetch your entity.  Be sure to add a line to your Global.asax.cs file so that MVC knows what to do with it, or you may get some confusing error messages.

Should I get certified?

The value of Microsoft certifications has split opinion for years, and both camps feel very passionate about their side of the argument.  In this post I’ll try and look constructively at the value of Microsoft certifications, so you can make the decision for yourself.  I’m specifically talking about Microsoft developer certifications here, but the concepts/points could likely be applied to any certifying body.

1. What are the current Microsoft Certification paths (for developers)?

There is a route for just about every job role in the industry, so I have had to narrow the criteria quite a bit just to stop this post from becoming long and boring.  Here is a high level overview;

MTA (Microsoft Technology Associate)
This is a foundation level certification targeting people getting started in their career.  There are 3 main routes; IT Infrastructure (up to 4 exams), Database (1 exam) and Developer (8 exams).  Reading the overview of each route shows that each exam is meant as an introduction to that particular field.

MCSD (Microsoft Certified Solutions Developer)
This is a middle-level certification (most developers will probably fall into this band) with several different routes; Web, Windows Store (JS, CSS, HTML), Windows Store (C#) and Application Lifecycle Management.

MCPD (Microsoft Certified Professional Developer)
Now all but defunct (replaced by MCSD), this was a more advanced path, again with several routes available, including Desktop, Windows Phone and Silverlight.  The desktop route had a strong focus on WPF, WCF and Entity Framework, as well as developing applications for the enterprise.

So many paths

The fact that there are so many paths available targeting all developers from novice to seasoned professional forms the basis of the first argument against getting certified.

The facts are quite simple.  Each exam costs around £99 to sit, and there may be up to 4 exams in any certification route (usually no fewer than 2).  This price does not include the £20-£40 for an official Microsoft training guide, or subscriptions to third party training providers such as Pluralsight, which will set you back another $29 a month.  If you opt for classroom training, you could easily be looking at £5000 for a 5 day intensive course, with no guarantee that you'll pass at the end.

Microsoft would probably argue that they are trying to provide useful training to developers of all skill levels, and that £99 is cheap compared to something like CISCO CCNA, costing between $150-$295.

Conclusion: Microsoft offer exams to make money, plain and simple.  Whilst not the most expensive, Microsoft and their partners make money from the exam itself and from supporting training material such as books.

2. Will getting certified help me get a job/earn more money?

The answer is not simple, “Yes with an if, no with a but”.

According to Mutually Human, employers hire developers based on the following criteria;

  • Experience
  • Skills
  • Education

When a company is looking to match a candidate to a role, they favour experience and skills over education.  Don’t be fooled, Microsoft certifications fall under the education banner, and aren’t necessarily “proof” that you are highly skilled.  This is true for all but the most junior positions.

Yes, if you are at the very beginning of your career.
Having a decent Microsoft certification will give you a good edge over your competition.  At worst, it shows that you know how to study training material and that you had the drive and determination to do so.  It’s also a couple of extra lines on your CV and a little “something extra” to chat about when you get invited to interview.

No, but exams like the ones offered by Microsoft will highlight areas that you are most passionate about.
If you can identify where your skills are strongest, you may become a specialist.  If your specialist skill is in high demand (a good SQL developer is always a good example) you will certainly be able to demand higher rates if you are a contractor/freelancer/consultant.

3. Is it worth the time/effort?

If you are serious about sitting a Microsoft exam, and I mean without cheating, then you will be required to make a significant time investment.

If you are already an experienced software developer looking to sit the MCSD Web Applications route, I estimate that it should take about 90 days to complete (30 days per exam [70-480, 70-486, 70-487]).  And really that is for developers who have already been actively using/developing code using those technologies on a daily basis for at least a year.  If you are not using these technologies regularly, you would have to at least double or triple the amount of time/effort required.

So, is it?

Another incredibly tough question to answer.  Honestly, it depends.  It all depends on your motives.

If you want to complete a Microsoft certification because you believe it will get you a promotion or a pay rise FORGET IT.  Don’t waste your time because it probably won’t happen.  And if you get lucky, you will truly be in the minority.  I’m speaking from personal experience because this was my motivation once upon a time.

If, say, you are a desktop developer looking to make the transition into web development then ABSOLUTELY.  You will gain valuable insight into a wide range of technologies, and gain some hands on experience.  The Microsoft exams tend to be at quite a high level, but they will certainly give you a taste for what you can expect from a full time position.  And for a couple of hundred quid, you can get a nice shiny certificate to show your current/next employer.

Summary

Microsoft offers a wide range of courses, from novice (MTA) to more advanced (MCPD), targeting desktop, mobile, and web developers.  Microsoft offers certifications because it generates revenue, not only from the exams but from books and supporting training material.  Getting certified likely won't help you get a promotion or a pay rise, but it probably will help you get your foot in the door if you are at a more junior level.  Microsoft exams, however, may help you decide on an area to specialise in, and specialists usually get paid more money (especially if you're a contractor/freelancer/consultant).  Microsoft exams are a big commitment no matter what level you're at, so be prepared to invest significant amounts of your time in them.

I’m 70/30 in favour of sitting Microsoft exams, as long as you’re doing it for the right reasons.

How to pass Microsoft Exam 70-487 (Developing Microsoft Azure and Web Services) in 30 days

Before you continue reading this blog post, you need to be aware of the following; This is not a “quick fix” or an “easy solution”.  I have not discovered some secret formula to guarantee you pass with 100% marks every time.  The exam is genuinely challenging and the only way you are going to pass is by working hard!  You will not find any brain dumps here!  If you’re afraid of working hard to achieve your goals, you best leave now …

Still here? Congratulations, you are taking your first steps towards passing Microsoft exam 70-487.  The purpose of this post is to link to all the resources that I used when revising for the exam myself. So why 30 days? Well, it's important to set yourself a target.  Setting yourself a target motivates you.  If you are paying for this exam yourself (like I did), you'll really want to make sure you give the exam your best shot (after all, £99 is a hefty amount of money!)

Know the exam objectives

Probably the most important thing you should do before starting studying for any exam is to find out what the exam objectives are.  Basically the exam objectives tell you what to study for! There is not much point in learning material that is of no relevance! You can find the exam objectives under the “Skills Measured” section on the official 70-487 exam page.

Books

Books aren't for everybody; some people find it hard to sit down and read a book… I get that, but you should at least try. Have a look at Exam Ref 70-487: Developing Windows Azure and Web Services, written by William Ryan, Wouter de Kort and Shane Milton.  It's no secret that I am generally not a fan of these books.  I usually find that they're not particularly well written, boring, and the examples arbitrary… but not this book.  I found this book to be a breath of fresh air, and actually pleasurable and enjoyable to read.

Each objective on the exam receives equal coverage, with helpful, realistic examples.  The book is not chatty (which I like) but is in no way robotic (like some Microsoft books I have read in the past).  There are some good insights into the various technologies at a high level, and the authors are clearly very experienced in this field.

You may also want to scrub up on your Entity Framework as well, as this is mentioned in the exam objectives several times.  Probably one of the best Entity Framework books I have ever read was written by Julie Lerman, named Programming Entity Framework.  If you want to become a top Entity Framework developer, I highly recommend that you check it out.  There are also lots of Entity Framework posts on my blog.

Microsoft Virtual Academy (MVA) JumpStart

Microsoft has provided some great FREE training videos on the Microsoft Virtual Academy website, so it's only polite that you fully exploit these resources. You will want to start with the Windows Azure Web Sites – Deep Dive Jump Start video series.  Just a note: you will need a free Microsoft account to access the videos.

Microsoft (Windows) Azure Deep Dive

Back in January, Microsoft put on an event in celebration of the awesomeness that is Microsoft (then called Windows) Azure.  This was 5 full days of Microsoft Azure training videos, hosted by the likes of Scott Gu and Scott Hanselman (et al).  I highly recommend that you check it out, but don’t spend too much time watching the videos targeted at DevOps or IT support people.  Just focus on the developer videos.

Pluralsight Training Videos

Pluralsight is a subscription (paid for) site offering training material for developers (and now IT professionals as well) at all levels, and in all stages of their careers.  If you don’t already have a subscription (??) you can get a free 10 day trial (up to 200 minutes) to give you a taste.  The subscription starts at a mere $29 (£17.08 ish) a month. Here are some of the videos I watched whilst preparing for this exam; (Make sure you follow along whilst the presenter is talking!)

My Honest Opinion

Brace yourselves, the truth is coming.  Don't spend too much time studying Microsoft Azure.  But wait, isn't this a Microsoft Azure exam? Well, yes it is, but the questions I was asked about Microsoft Azure were pretty straightforward, and any competent developer could have used their powers of deduction to figure out the answers.  Instead, you should focus more of your efforts on getting hands-on with WCF.  I wrote a few blog posts about various WCF topics; I recommend you check them out.

Summary

It is possible to pass Microsoft exams in 30 days, assuming you have some background knowledge in the subject and are prepared to work (very!) hard.  Microsoft make a lot of training resources available to you for free, and there are online training providers that can help you out as well (for a small fee).  There is no “one size fits all” or “silver bullet”, so you’ll want to try a range of resources to find what works best for you.  Don’t resort to cheating or you will be caught and banned for life! In case anybody is wondering, I passed the exam with a score of 93% in April 2014. If you found this article useful, please leave comments below!

Easy WCF Security and authorization of users

There are several steps involved in making your WCF service secure, and ensure that clients consuming your service are properly authenticated.  WCF uses BasicHttpBinding out-of-the-box, which generates SOAP envelopes (messages) for each request.  BasicHttpBinding works over standard HTTP, which is great for completely open general purpose services, but not good if you are sending sensitive data over the internet (as HTTP traffic can easily be intercepted).

This post discusses how to take a basic WCF service that uses BasicHttpBinding and upgrade it to use WsHttpBinding over SSL (with username/password validation). If you want to become a better WCF developer, you may want to check out Learning WCF: A Hands-on Guide by Michele Leroux Bustamante. This is a very thorough and insightful WCF book, with detailed and practical samples and tips.

Here is the basic sequence of steps needed;

  • Generate a self-signed SSL certificate (you would use a real SSL certificate for live) and add this to the TrustedPeople certificate store.
  • Add a UserNamePasswordValidator.
  • Switch our BasicHttpBinding to WsHttpBinding.
  • Change our MEX (Metadata Exchange) endpoint to support SSL.
  • Specify how the client will authenticate, using the ServiceCredentials class.

You may notice that most of the changes are configuration changes.  You can make the same changes in code if you so desire, but I find the process easier and cleaner when done in XML.

 

BasicHttpBinding vs. WsHttpBinding

Before we kick things off, I found myself asking this question (like so many others before me): what is the difference between BasicHttpBinding and WsHttpBinding?

If you want a very thorough explanation, there is a very detailed explanation written by Shivprasad Koirala on CodeProject.com.  I highly recommend that you check this out.

The TL;DR version is simply this;

  • BasicHttpBinding supports SOAP v1.1 (WsHttpBinding supports SOAP v1.2)
  • BasicHttpBinding does not support Reliable messaging
  • BasicHttpBinding is insecure, WsHttpBinding supports WS-* specifications.
  • WsHttpBinding supports transporting messages with credentials, BasicHttpBinding supports only Windows/Basic/Certificate authentication.

The project structure

You can view and download the full source code for this project via GitHub, see the end of the post for more details.

We have a WCF Service application with a Service Contract as follows;

[ServiceContract]
public interface IPeopleService
{
    [OperationContract]
    Person[] GetPeople();
}

And the implementation of the Service Contract;

public class PeopleService : IPeopleService
{
    public Person[] GetPeople()
    {
        return new[]
                    {
                        new Person { Age = 45, FirstName = "John", LastName = "Smith" }, 
                        new Person { Age = 42, FirstName = "Jane", LastName = "Smith" }
                    };
    }
}

The model class (composite type, if you will) is as follows;

[DataContract]
public class Person
{
    [DataMember]
    public int Age { get; set; }

    [DataMember]
    public string FirstName { get; set; }

    [DataMember]
    public string LastName { get; set; }
}

The initial configuration is as follows;

<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <behavior>
        <serviceMetadata httpGetEnabled="true" httpsGetEnabled="true"/>
        <serviceDebug includeExceptionDetailInFaults="false"/>
      </behavior>
    </serviceBehaviors>
  </behaviors>
  <protocolMapping>
    <add binding="basicHttpsBinding" scheme="https"/>
  </protocolMapping>
  <serviceHostingEnvironment aspNetCompatibilityEnabled="true" multipleSiteBindingsEnabled="true"/>
</system.serviceModel>

The WCF service can easily be hosted in IIS; simply add a service reference to the WSDL definition file and you're away. In the interest of completeness, here is the entire client code;

static void Main(string[] args)
{
    PeopleServiceClient client = new PeopleServiceClient();

    foreach (var person in client.GetPeople())
    {
        Console.WriteLine(person.FirstName);
    }

    Console.ReadLine();
}

Hosting in IIS

As briefly mentioned, you can (and probably always will) host your WCF service using Internet Information Services (IIS).

Generating an SSL certificate

Before doing anything, you need an SSL certificate.  Transport based authentication simply does not work if A) you are not on a secure channel, or B) your SSL certificate is not trusted.  You don't have to purchase an SSL certificate at this stage, as a self-signed certificate will suffice (with 1 or 2 extra steps).  You will want to purchase a real SSL certificate when you move your service to the production environment.

You can generate a self-signed SSL certificate in 1 of 2 ways.  You can either do it the hard way, using Microsoft's rather painful MakeCert.exe Certificate Creation Tool, or you can download a free tool from Pluralsight (of all places), which provides a super simple user interface and can even add the certificate to the certificate store for you.

Once you have downloaded the tool, run it as an Administrator;

SelfCert

For the purposes of this tutorial, we will be creating a fake website called peoplesite.local.  We will add an entry to the hosts file for this and set it up in IIS.  It's very important that the X.500 distinguished name matches your domain name (or it will not work!).  You will also want to save the certificate as a PFX file so that it can be imported into IIS and used for the HTTPS binding.

Once done, open up IIS, click on the root level node, and double click on Server Certificates.  Click Import (on the right hand side) and point to the PFX file you saved on the desktop.  Click OK to import the certificate.

Import

Next, create a new site in IIS called PeopleService.  Point it to an appropriate folder on your computer and edit the site bindings.  Add a new HTTPS binding and select the SSL certificate you just imported.

EditBinding

Be sure to remove the standard HTTP binding after adding the HTTPS binding, as you won't be needing it.

Update the hosts file (C:\Windows\System32\Drivers\etc\hosts) with an entry for peoplesite.local as follows;

127.0.0.1            peoplesite.local

Finally, flip back to Visual Studio and create a publish profile (which we will use later once we have finished the configuration).  The publish method screen should look something like this;

Publish

Configuration

OK, we have set up our environment; now it's time to get down to the fun stuff… configuration.  It's easier if you delete everything you have between the <system.serviceModel> elements and follow along with me.

Add the following skeleton code between the <system.serviceModel> opening and closing tags, we will fill in each element separately;  (update the Service Name to match that in your project)

<services>
  <service name="PeopleService.Service.PeopleService" behaviorConfiguration="ServiceBehaviour">
    <host>
    </host>
  </service>
</services>
<bindings>
</bindings>
<behaviors>
  <serviceBehaviors>
  </serviceBehaviors>
</behaviors>

Base Address

Start by adding a base address (directly inside the host element) so that we can use relative addresses;

<baseAddresses>
  <add baseAddress="https://peoplesite.local/" />
</baseAddresses>

Endpoints

Next, add two endpoints (one for the WsHttpBinding and one for MEX);

<endpoint address="" binding="wsHttpBinding" bindingConfiguration="BasicBinding" contract="PeopleService.Service.IPeopleService" name="BasicEndpoint" />
<endpoint address="mex" binding="mexHttpsBinding" contract="IMetadataExchange" name="mex" />

Note that we are using mexHttpsBinding because our site does not support standard HTTP binding.  We don’t need to explicitly add a binding for the MEX endpoint as WCF will deal with this automatically for us.  Add a wsHttpBinding as follows;

<wsHttpBinding>
  <binding name="BasicBinding">
    <security mode="TransportWithMessageCredential">
      <message clientCredentialType="UserName" />
    </security>
  </binding>
</wsHttpBinding>

Bindings

This is where we specify what type of security we want to use.  In our case, we want to validate that users are who they say they are, using a username/password combination.  The TransportWithMessageCredential security mode requires that the username/password combination be passed in the message header.  A snoop using an HTTP proxy tool (such as Fiddler) reveals this;

fiddler
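
If you prefer to see what this binding means in code, the sketch below is roughly the programmatic equivalent of the BasicBinding element above. It is for illustration only; the tutorial itself keeps everything in App.config.

using System.ServiceModel;

public static class BindingFactory
{
    // Programmatic equivalent of the BasicBinding configuration element (illustration only).
    public static WSHttpBinding CreateBasicBinding()
    {
        var binding = new WSHttpBinding(SecurityMode.TransportWithMessageCredential)
        {
            Name = "BasicBinding"
        };

        // Require the caller to supply a username/password in the message header.
        binding.Security.Message.ClientCredentialType = MessageCredentialType.UserName;

        return binding;
    }
}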

Service Behaviours

Finally, we need to update our existing service behaviour with a serviceCredentials element, as follows;

<behavior name="ServiceBehaviour">
  <serviceMetadata httpGetEnabled="true" httpsGetEnabled="true" />
  <serviceDebug includeExceptionDetailInFaults="true" />
  <serviceCredentials>
    <userNameAuthentication userNamePasswordValidationMode="Custom" customUserNamePasswordValidatorType="PeopleService.Service.Authenticator, PeopleService.Service" />
    <serviceCertificate findValue="peoplesite.local" storeLocation="LocalMachine" storeName="TrustedPeople" x509FindType="FindBySubjectName" />
  </serviceCredentials>
</behavior>

The two elements of interest are userNameAuthentication and serviceCertificate.
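
For completeness, if you ever self-host the service (rather than hosting it in IIS, as we do in this post), the same two elements can be set in code. This is a sketch only, assuming the PeopleService and Authenticator types used throughout this tutorial (the Authenticator class is created in the next section);

using System;
using System.Security.Cryptography.X509Certificates;
using System.ServiceModel;
using System.ServiceModel.Security;

// Sketch only - when hosting in IIS, the App.config settings shown above apply instead.
class SelfHostExample
{
    static void Main()
    {
        using (var host = new ServiceHost(typeof(PeopleService.Service.PeopleService)))
        {
            // Equivalent of the userNameAuthentication element, wired to our custom validator.
            host.Credentials.UserNameAuthentication.UserNamePasswordValidationMode =
                UserNamePasswordValidationMode.Custom;
            host.Credentials.UserNameAuthentication.CustomUserNamePasswordValidator =
                new Authenticator();

            // Equivalent of the serviceCertificate element.
            host.Credentials.ServiceCertificate.SetCertificate(
                StoreLocation.LocalMachine,
                StoreName.TrustedPeople,
                X509FindType.FindBySubjectName,
                "peoplesite.local");

            host.Open();
            Console.WriteLine("Service running. Press any key to stop.");
            Console.ReadKey();
        }
    }
}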

User Name Authentication

This is where we tell WCF about our custom authentication class. Let's go ahead and create this. Add a new class to your project called Authenticator.cs and add the following code;

using System.IdentityModel.Selectors;
using System.ServiceModel;

public class Authenticator : UserNamePasswordValidator
{
    public override void Validate(string userName, string password)
    {
        // Note: this must be || (or), not && - reject the call if either value is wrong.
        if (userName != "peoplesite" || password != "password")
        {
            throw new FaultException("Invalid user and/or password");
        }
    }
}

Basically, you can add whatever code you want here to do your authentication/authorisation. Notice that the Validate method returns void. If you determine that the credentials supplied are invalid, you should throw a FaultException, which will be handled automatically for you by WCF.

You should ensure that the customUserNamePasswordValidatorType attribute in your App.config file is the fully qualified type of your authenticator type.
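
As an illustration of what a slightly more realistic validator might look like, the sketch below checks a purely hypothetical in-memory user store. In practice you would replace the dictionary with a call to your database or membership provider; remember that WCF creates the validator for you, so it needs a parameterless constructor.

using System.Collections.Generic;
using System.IdentityModel.Selectors;
using System.ServiceModel;

// Sketch only: the dictionary stands in for a real database or membership store.
public class StoreBackedAuthenticator : UserNamePasswordValidator
{
    private static readonly Dictionary<string, string> KnownUsers = new Dictionary<string, string>
    {
        { "peoplesite", "password" } // illustrative seed data only
    };

    public override void Validate(string userName, string password)
    {
        string expectedPassword;

        if (!KnownUsers.TryGetValue(userName ?? string.Empty, out expectedPassword) ||
            expectedPassword != password)
        {
            // WCF turns this into a fault that is returned to the caller.
            throw new FaultException("Invalid user and/or password");
        }
    }
}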

Service Certificate

This is key; if this is not quite right, nothing will work. Basically you are telling WCF where to find your SSL certificate. It's very important that the findValue is the same as your SSL certificate name, and that you point to the correct certificate store. Typically you will install the certificate on the LocalMachine in the TrustedPeople certificate store. I would certainly recommend sticking with the FindBySubjectName search mode, as this avoids issues when you have multiple SSL certificates with similar details. You may need a little trial and error when starting out to get this right. If you have been following this tutorial throughout, you should be OK with the default.
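
If you are unsure whether the certificate has ended up in the right place, a quick throwaway console application can confirm that the lookup WCF will perform actually succeeds (assuming the store and subject name used throughout this tutorial);

using System;
using System.Security.Cryptography.X509Certificates;

class CertificateCheck
{
    static void Main()
    {
        // Open the same store and location named in the serviceCertificate element.
        var store = new X509Store(StoreName.TrustedPeople, StoreLocation.LocalMachine);
        store.Open(OpenFlags.ReadOnly);

        // Pass false for validOnly so self-signed certificates are included in the search.
        var matches = store.Certificates.Find(
            X509FindType.FindBySubjectName, "peoplesite.local", false);

        Console.WriteLine(matches.Count > 0
            ? "Certificate found - the serviceCertificate element should resolve it."
            : "Certificate not found - check the store name/location and subject name.");

        store.Close();
    }
}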

Supplying user credentials

We just need one final tweak to our test client to make all this work.  Update the test client code as follows;

PeopleServiceClient client = new PeopleServiceClient();
client.ClientCredentials.UserName.UserName = "peoplesite";
client.ClientCredentials.UserName.Password = "password";

We pass in the client credentials via the, you guessed it, ClientCredentials object on the service client.

If you run the client now, you should get some test data back from the service written out to the console window.  Notice that you will get an exception if the username/password is incorrect, or if the connection is not over SSL.
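
If you want to see those failure modes for yourself, the sketch below wraps the call in some basic exception handling. The GetPeople operation and the Name property are purely illustrative; use whatever your service contract actually exposes.

using System;
using System.ServiceModel.Security;

class Program
{
    static void Main()
    {
        var client = new PeopleServiceClient();
        client.ClientCredentials.UserName.UserName = "peoplesite";
        client.ClientCredentials.UserName.Password = "password";

        try
        {
            // GetPeople() and Name are illustrative only; substitute your own operation.
            foreach (var person in client.GetPeople())
            {
                Console.WriteLine(person.Name);
            }

            client.Close();
        }
        catch (MessageSecurityException)
        {
            // Typically surfaces when the username/password is rejected by the validator.
            client.Abort();
            throw;
        }
        catch (SecurityNegotiationException)
        {
            // Usually a certificate problem - see the troubleshooting notes below.
            client.Abort();
            throw;
        }
    }
}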

Troubleshooting

SecurityNegotiationException

As an aside, if you receive a SecurityNegotiationException please ensure that your self-signed certificate is correctly named to match your domain, and that you have imported it into the TrustedPeople certificate store.

SecurityNegotiationException

A handy trick for diagnosing the problem is to update the service reference; Visual Studio will advise you as to what is wrong with the certificate;

SecurityAlert

Summary

With a few small configuration changes you can easily utilise WS-Security specifications/standards to ensure that your WCF service is secure. You can generate a self-signed SSL certificate using a free tool from Pluralsight, and install it into your local certificate store and IIS. Then you add a UserNamePasswordValidator to take care of your authentication. Finally, you can troubleshoot and debug your service using Fiddler and Visual Studio.

The source code is available on GitHub

Publish your website to an IIS staging environment using Microsoft Web Deploy

One of the simplest and quickest ways to publish your website to a staging environment is, at least in my opinion, using Microsoft Web Deploy. This post is about how you approach this; a future article will discuss why you probably shouldn't do this.

Key points;

  1. The remote server should be running Internet Information Services (IIS) 7.0 or later.
  2. You can use the Microsoft Web Platform Installer to install all the extra bits you need to make this work.
  3. You need to set appropriate permissions to allow remote publishing.

Windows Server 2012 R2

On my local machine, for testing purposes, I have a Windows Server 2012 R2 virtual machine which is bare bones configured.

The first thing you need to do is install IIS.  You can do this using the Server Manager;

Open the Server Manager > click Add roles and features > select Role-based or feature-based installation > select the target server > and finally, select Web Server (IIS) and Windows Deployment Services.  Feel free to drill into each item and ensure you have the following selected (as well as whatever the defaults are);

  • Basic Authentication (very important)
  • ASP .NET 3.5 / 4.5
  • .NET Extensibility 3.5 / 4.5
  • IIS Management Console and Management Service (very important)

Once installed, you should be able to open IIS Manager by opening the Start menu, typing inetmgr and pressing Enter.

When IIS Manager opens (referred to herein as IIS), you should be prompted to download Microsoft Web Platform installer.  Ensure you do this.  Use the Web Platform installer to ensure you have all the following installed;

  • IIS 7 Recommended Configuration
  • IIS Management Service (should already be installed)
  • IIS Basic Authentication (should already be installed)
  • Web Deployment Tool (The current version is 3.5 at the time of writing, I also like to install Web Deploy for Hosting Servers as well)
  • Current version of the Microsoft .NET Framework
  • ASP .NET MVC 3 (as we will be publishing an ASP .NET MVC website)

I like to do a restart at this point, just to ensure that everything is tidied up (although I don't think it's 100% necessary, just ensure you restart IIS at the very least).

Enabling web deployment

The next step is to “switch on” the web management service.  This will allow remote users to connect up and deploy the website.

For the sake of simplicity, we will use basic authentication.  There are other means of authenticating users, but that is out of the scope of this tutorial.

Authentication

In IIS, select the server level node and then select the Authentication module (under the IIS grouping).

Simply right click on Basic Authentication, and then click Enable.

Next we need to configure the web management service to accept incoming connections.  Again, select the server level node, and select Management Service.

If the management service is already running, you need to stop it before continuing. To do this, go to the Start menu and type services.msc. This will open the Services manager. Search for Web Management Service, right click, and click Stop. I ran through this process twice from scratch; the first time the service wasn't running and the second time it was. I'm not sure what triggers it to run.

Tick Enable Remote Connections and feel free to accept the default settings for now.  You could always revisit this later.  Click Start on the right hand side to start the service.

Configure your website

I’m sure you’ve done this many times before, so I will not regurgitate the details here.

Add a new website, give it a host name if you like, and specify the physical path (remember this).  Please ensure that you set the application pool to .NET CLR Version 4.0.30319 to avoid errors running the website further down the line.

Set the appropriate permissions for IIS_IUSRS

IIS requires read permissions to access the files that make up your website.  The simplest way is to head over to the physical folder for your website (that path you’re remembering from earlier), right click the folder, click Properties > Security > Edit > Add.  Type IIS_IUSRS then click Check Names.  Click OK, then OK to close the properties windows.

Create a Web Deploy Publish Profile

Finally, you can create a web deploy publish profile (simply an XML file with a few basic settings)  which you can import into Visual Studio to save you the hassle of having to type anything.

Head back over to IIS, right click on your website, click Deploy > Configure Web Deploy Publishing.

You can (and definitely should) create a restricted user account and grant permission to publish to that account (either an IIS account or a Windows authentication based account).

Once you have selected a user, click Setup.  A message should appear in the Results text area;

Publish enabled for 'WIN-DLICU73MRD0\Jon'
Granted 'WIN-DLICU73MRD0\Jon' full control on 'C:\inetpub\wwwroot\testwebsite'
Successfully created settings file 'C:\Users\Jon\Desktop\WIN-DLICU73MRD0_Jon_TestWebsite.PublishSettings'

Success! This is your publish profile that you can import into Visual Studio.

Import your publish profile into Visual Studio

To import your publish profile, open your web solution and from the Build menu select Publish [your website].

PublishWeb

On the Profile tab, click Import… and browse to the publish profile you just created. Once imported, switch to the Connection tab and type the password for the account you selected on the Configure Web Deploy Publishing dialog you saw earlier.

If you’re feeling lucky, hit the Validate Connection button.  All should validate properly.  If not, please refer to this little hidden gem from the IIS team to help troubleshoot any error messages you might be receiving.

Browse to your website using the host name you specified earlier (don't forget to update your hosts file if you are “just testing”) and, congratulations, after the initial (feels-like-a-lifetime) compilation period, all should be good.

Next time you’re ready to publish, simply open the Publish dialog in Visual Studio, go straight to the Preview tab and hit Publish.  No more manual deployment for you my friend!

Summary

The Microsoft Web Deploy tool is a quick and convenient way to publish your website to a staging area for further testing. You use the tool to generate a publish profile, which can be imported into Visual Studio (saving you the hassle of having to type all that connection info). Visual Studio then calls the web management service and passes it a package, which is automatically deployed to the appropriate folders.

How to pass Microsoft Exam 70-486 (Developing ASP.NET MVC 4 Web Applications) in 30 days

Before you continue reading this blog, you need to be aware of the following; This is not a “quick fix” or an “easy solution”.  I have not discovered some secret formula to guarantee you pass with 100% marks every time.  I am not trying to sell you anything.  The exam is genuinely challenging and the only way you are going to pass is by working hard!  You will not find any brain dumps here!  If you’re afraid of working hard to achieve your goals, you best leave now …

Still Here? Congratulations, you are taking your first steps towards passing the Microsoft exam 70-486 in just 30 days.  The purpose of this post is to link to all the resources that I used when revising for the exam myself.

So why 30 days? Well, it's important to set yourself a target. Setting yourself a target motivates you. If you are paying for this exam yourself (like I did), you'll really want to make sure you give the exam your best shot (after all, £99 is a hefty amount of money!)

Know the exam objectives

Probably the most important thing you should do before you start studying for any exam is to find out what the exam objectives are. Basically, the exam objectives tell you what to study! There is not much point in learning material that is of no relevance!

You can find the exam objectives under the “Skills Measured” section on the official 70-486 exam page.

Books

Books aren't for everybody; some people find it hard to sit down and read a book … I get that, but you should at least try.

Have a look at Exam Ref 70-486: Developing ASP.NET MVC 4 Web Applications

Developing ASP .NET MVC 4 Web Applications

This book was written by William Penberthy, and it's not the best book I've ever read.

Pros; Each objective on the exam receives equal coverage.  There are some good insights into the various technologies at a high level, and the author is clearly very experienced in this field.

Cons; This is the official book from Microsoft for the 70-486 exam, and it is somewhat off the mark. The objectives/sections/chapters are disjointed and only covered at a very high level. The book is severely lacking in detail and code samples/walkthroughs.

If you really want to read a book, I highly recommend reading Professional ASP.NET MVC 4, which was written by Jon Galloway, Phil Haack, Brad Wilson, Scott Allen and Scott Hanselman.  Five people who are leading experts in this field.  I learnt a lot from this book, it flows well, there are sufficient code samples and the book is very engaging.

Microsoft Virtual Academy (MVA) JumpStart

Microsoft has provided some great FREE training videos on the Microsoft Virtual Academy website, so it's only polite that you fully exploit these resources.

You will want to start with the Building Web Apps with ASP .NET Jump Start video series.  Just a note, you will need a free Microsoft account to access the videos.

Pluralsight Training Videos

Pluralsight is a subscription (paid for) site offering training material for developers (and now IT professionals as well) at all levels, and in all stages of their careers.  If you don’t already have a subscription (??) you can get a free 10 day trial (up to 200 minutes) to give you a taste.  The subscription starts at a mere $29 (£17.62 ish) a month.

Here are some of the videos I watched whilst preparing for this exam; (Make sure you follow along whilst the presenter is talking!)

You may also want to scrub up on your HTML 5, JavaScript (jQuery) and CSS, as these are mentioned in the exam objectives as well.

Summary

It is possible to pass Microsoft exams in 30 days, assuming you have some background knowledge in the subject and are prepared to work (very!) hard.  Microsoft make a lot of training resources available to you for free, and there are online training providers that can help you out as well (for a small fee).  There is no “one size fits all” or “silver bullet”, so you’ll want to try a range of resources to find what works best for you.  Don’t resort to cheating or you will be caught and banned for life!

In case anybody is wondering, I passed the exam with a score of 94% in January 2014.

If you found this article useful, please leave comments below!

How to create a new Outlook 2013 Email using C# in 3 simple steps

It has traditionally been quite painful to interact with any part of the Microsoft Office product family from a C# application, but thanks to the introduction of dynamic typing and optional parameters in recent years, the process has improved dramatically.

Step 1 – Prerequisites and Assembly References

Before doing anything, it is important to note that you must have Microsoft Office 2013 installed for this to work. Seems obvious, but it's still worth mentioning.

You also need two references;

Microsoft.Office.Core
Microsoft.Office.Interop.Outlook

The quickest way to add these references to your project is to right click on the References folder in your project, and click Add Reference. The Reference Manager dialog window will appear as shown below;

Reference Manager

  1. Click the COM tab
  2. Type Outlook into the search box
  3. Tick Microsoft Outlook 15.0 Object Library
  4. Click OK

You should now see that the appropriate references have been added to your project;

References

Step 2 – Using Directives and Initialization

Next, add the appropriate using directives to your code file.

using Microsoft.Office.Interop.Outlook;
using OutlookApp = Microsoft.Office.Interop.Outlook.Application;

The second directive creates an alias, which is recommended to avoid ambiguity with other classes named Application.

In the constructor of your application (or wherever you want this code to go), create an instance of the Outlook Application and create a new MailItem object, as shown;

OutlookApp outlookApp = new OutlookApp();
MailItem mailItem = (MailItem)outlookApp.CreateItem(OlItemType.olMailItem); // CreateItem is not strongly typed, so cast the result

Step 3 – Format and display the email to the user

Finally you can begin to flesh out your email.

mailItem.Subject = "This is the subject";
mailItem.HTMLBody = "<html><body>This is the <strong>funky</strong> message body</body></html>";

//Set a high priority to the message
mailItem.Importance = OlImportance.olImportanceHigh;

And to display the email, simply call the Display method;

mailItem.Display(false);

There are literally dozens of other things you can do with an Outlook email, including adding attachments, business cards, images, recipients, and CC/BCC fields. A couple of these are shown below.
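
The following lines continue on from the mailItem created above; the addresses and file path are purely illustrative.

// Continuing from the mailItem created earlier; values are illustrative only.
mailItem.To = "someone@example.com";
mailItem.CC = "someone.else@example.com";

// olByValue embeds a copy of the file in the email itself.
mailItem.Attachments.Add(@"C:\Temp\report.pdf", OlAttachmentType.olByValue, 1, "Monthly report");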

Summary

To create an Outlook 2013 email from C#, simply add the Microsoft Outlook 15.0 Object Library to your solution, add the appropriate using directives, create a new Application object and a MailItem object, and flesh out your email. When ready, simply call MailItem.Display(false) to show the email to the user.

Please leave a comment below if you found this post useful

How to pass Microsoft Exam 70-480 (HTML 5, CSS3 and JavaScript) in 30 days

Before you continue reading this blog, you need to be aware of the following; This is not a “quick fix” or an “easy solution”.  I have not discovered some secret formula to guarantee you pass with 100% marks every time.  I am not trying to sell you anything.  The exam is genuinely challenging and the only way you are going to pass is by working hard!  You will not find any brain dumps here!  If you’re afraid of working hard to achieve your goals, you best leave now …

Still Here? Congratulations, you are taking your first steps towards passing the exam.  The purpose of this post is to link to all the resources that I used when revising for the exam myself.

So why 30 days? Well, it's important to set yourself a target. Setting yourself a target motivates you. If you are paying for this exam yourself (like I did), you'll really want to make sure you give the exam your best shot (after all, £99 is a hefty amount of money!)

Know the exam objectives

Probably the most important thing you should do before you start studying for any exam is to find out what the exam objectives are. Basically, the exam objectives tell you what to study! There is not much point in learning material that is of no relevance!

You can find the exam objectives under the “Skills Measured” section on the official 70-480 exam page.  You can also find other sites (try GeeksWithBlogs.net/WTFNext/ as an example) that will try and match the objectives with relevant material.

Books

Books aren't for everybody; some people find it hard to sit down and read a book … I get that, but you should at least try. Microsoft really is your friend here, as one of the two books that I recommend reading is provided by them for free!

Programming Windows 8 Apps with HTML, CSS, and JavaScript

This book was written by Kraig Brockschmidt.  It attempts to achieve several goals at the same time.

The book is focused on introducing developers to Windows 8 Store Application development using HTML 5, CSS 3 and JavaScript.  It is very detailed and contains a lot of useful code samples and links to Microsoft resources.

Pros; This book is free, gives lots of good sample code, it's thorough, and it's a great resource for anybody looking to write Windows Store applications. It's also the official Microsoft book for the 70-480 exam.

Cons; Whilst this is the official Microsoft book for the 70-480 exam, it doesn't target the exam itself. What I mean is, there is a lot of generic code in this book, and the book does not target the exam objectives directly (unlike the next book).

Training Guide: Programming in HTML5 with JavaScript and CSS3

This book was authored by Glenn Johnson and was written specifically to help you pass the exam, with hands-on, practical examples targeted at the exam objectives.

Each chapter is divided into manageable sections, complete with hands on exercises (usually one or more per chapter).  This book very much helps you learn by doing, which in my opinion, is the best way to learn.

I genuinely believe that if I hadn't read this book, I wouldn't have passed the exam.

Microsoft Virtual Academy (MVA) JumpStart

Microsoft has provided some great FREE training videos on the Microsoft Virtual Academy website, so it's only polite that you fully exploit these resources.

Depending on what level you are currently at, you may want to start with the HTML 5 & CSS 3 Fundamentals: Development for Absolute Beginners video series.  You will need a free Microsoft account to access the videos.

MVA Website

I also strongly recommend checking out the Developing HTML 5 with JavaScript and CSS3 Jump Start (and the refresher) training courses, brilliantly hosted by Jeremy Foster and Michael Palermo.  By the way, you should also follow the blogs of these people, as they are constantly posting useful information that you may find helpful.

Pluralsight Training Videos

Pluralsight is a subscription (paid for) site offering training material for developers (and now IT professionals as well) at all levels, and in all stages of their careers.  If you don’t already have a subscription (??) you can get a free 10 day trial (up to 200 minutes) to give you a taste.  The subscription starts at a mere $29 (£12.60 ish) a month.

Here are some of the videos I watched whilst preparing for this exam; (Make sure you follow along whilst the presenter is talking!)

And probably the most important video on the site (from your perspective at least) … HTML 5 Advanced Topics.

Summary

It is possible to pass Microsoft exams in 30 days, assuming you have some background knowledge in the subject and are prepared to work (very!) hard.  Microsoft make a lot of training resources available to you for free, and there are online training providers that can help you out as well (for a small fee).  There is no “one size fits all” or “silver bullet”, so you’ll want to try a range of resources to find what works best for you.  Don’t resort to cheating or you will be caught and banned for life!

In case anybody is wondering, I passed the exam with a score of 93% in September 2013.

If you found this article useful, please leave comments below!