Installing Yeoman.io to Make Life Easier

Yeoman.io is a collection of three technologies. The idea behind Yeoman is to help developers accomplish common, tedious tasks such as building, linting, minification, scaffolding, and previewing/running our code. Yeoman isn’t just a tool; it is a companion for every stage of the development phase, from starting through developing, testing, and deploying a project. Earlier I gave a quick dive into the various components of Yeoman.io, and you can check it out at A Quick Intro Into Yeoman.io. Next, I wanted to show you how easy it is to install Yeoman so you can start taking advantage of its power.

First, you are going to need node.js and the node.js package manager (npm) installed. This is easy enough: simply navigate to the node.js download page and follow the installation instructions. By default the node package manager will be installed. Be sure not to omit it from the installation, as we will need it to install Yeoman, Grunt, and Bower.

Once you have installed node.js, open your terminal or command window. To verify you have successfully installed node.js and npm, type the following commands.

node --version
npm --version

If your terminal window spits out a version number then you are good to go. Now that node.js and npm are installed we need to install Yeoman. We are going to use npm to install Yeoman globally, and this will automagically install Grunt and Bower as they are dependencies of Yeoman. With the terminal window still open, run the command below to begin installation.

npm install -g yo

This will take a bit of time to download the app as well as its dependencies, but you only have to do this one time. Once Yeoman (yo) is done installing we can begin using the tool. Yeoman doesn’t come with many generators out of the box; it does, however, allow you to download many generators to meet your needs, so we can start downloading the generators we will commonly use. For the purpose of this post we will use generator-angular to scaffold our next application, and first we must download the generator.

npm install -g generator-angular

This generator provides instructions to yo for scaffolding your next Angular project. There are many more generators available to you; you can find a list on the Yeoman site.

Once the generator is installed you can begin using it. It is just that easy!

mkdir c:/project/projectName
cd c:/project/projectName
yo angular

After a few seconds of downloading your dependencies and scaffolding the application, your new Angular project is good to go. Each generator may ask you a set of questions about your application. These steps help Yeoman create a more customized project to meet your needs. Help Yeoman out so he can help you out.

Enjoy!

Securing an MVC Application

When building an MVC application, authentication is an important part of securing your website. I was recently creating a second application that consumed the authentication ticket from our main application. In my last post I showed how to share the forms authentication ticket between multiple applications, and now that we have the ticket being shared we need to plug it into our site.

If you recall from the last post, the secondary application delegates authentication and credential verification. Instead of using a form for the user to enter credentials, we validate the authentication ticket and establish the principal and identity. This can be done in the Application_AuthenticateRequest method.

// This is the Global.asax.cs file
public class MyApplication : HttpApplication
{
    protected void Application_AuthenticateRequest(object sender, EventArgs e)
    {
        // Pulls the cookie name from the configuration (default .ASPXAUTH)
        string cookieName = FormsAuthentication.FormsCookieName;
        HttpCookie cookie = Context.Request.Cookies[cookieName];

        bool cookieExists = cookie != null;

        if (cookieExists)
        {
            try
            {
                FormsAuthenticationTicket ticket = FormsAuthentication.Decrypt(cookie.Value);
                string[] roles = new string[0]; // get your roles from somewhere
                FormsIdentity identity = new FormsIdentity(ticket);
                GenericPrincipal principal = new GenericPrincipal(identity, roles);

                // Attach the principal so the rest of the pipeline sees the user
                Context.User = principal;
            }
            catch
            {
                // Decryption failed; treat the request as anonymous
            }
        }
    }
}

Once the authentication mechanism is in place we can handle authorization by decorating controllers or actions with the AuthorizeAttribute. By tagging the controller with the AuthorizeAttribute we are saying that any action in this class requires the user to be authenticated. Since we did not provide any roles to the attribute, it simply prevents access by anonymous users. This is all good until you want certain actions in a controller, such as a login action, to be accessible by anonymous users. We can allow anonymous access by tagging those actions with the AllowAnonymousAttribute.

[Authorize]
public class AccountController : Controller
{
    [AllowAnonymous]
    public ActionResult Login()
    {
        return View();
    }

    // Requires authorization
    public ActionResult Index()
    {
        return View();
    }
}

This model for authorization seems pretty good until we want to add another protected controller. When adding another controller we could easily forget to add the AuthorizeAttribute, or another developer might add a controller without being aware of how to protect the code.

public class AdminController : Controller
{
    public ActionResult Index()
    {
        return View();
    }

    public ActionResult ManageUserPasswords()
    {
        return View();
    }
}

You or another developer could easily miss this detail. Now sensitive functionality is exposed to anonymous users and your application is just waiting to be compromised. We can avoid these mistakes by implementing a global filter. Instead of each controller and action opting into requiring authorization, all controllers will require authorization, and actions needing anonymous access must be tagged to opt out.

public class FilterConfig
{
    public static void RegisterGlobalFilters(GlobalFilterCollection filters)
    {
        filters.Add(new HandleErrorAttribute());
        filters.Add(new AuthorizeAttribute());
    }
}

With the AuthorizeAttribute registered as a global filter, every controller and action now rejects anonymous users (the MVC project template calls FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters) from Application_Start). If we want to continue allowing anonymous users on a certain action, we simply decorate it with [AllowAnonymous].

A Quick Intro Into Yeoman.io

When I built my first HTML5 game, Super Space Odyssey, I did things the hard way.  I created everything from scratch and I did not leverage a game engine to do some of the heavy lifting.  I was looking around on the web and found several options, but I decided to go with Crafty.js for my next game.

As it turns out I get distracted easily…  I found, just like with every framework, that setting up dependencies and the structure of the application is tedious and time consuming.  While searching the Crafty site, I found the developers at Crafty recommend a particular structure for games, and they also supply a handy boilerplate to get things rolling (CraftyBoilerplate).

I have wanted to create my own Yeoman generator and now I have a reason.  Yeoman.io is a self-declared “workflow; a collection of tools and best practices working in harmony to make developing for the web even better.”  This sounded pretty Totes McGoats, but I didn’t really know how awesome it could get.  With a little terminal kung fu, I was able to set up a new game faster and a lot easier (first world problems, I know).

Before I can get started, I need to understand essence de Yeoman.  There are three main pieces to Yeoman: Yo, Grunt, and Bower.  Yo is the templating engine that I will use to push out the scaffolding for Crafty games.  Grunt is the JavaScript-based task runner that I use to minify, build, and deploy my game to a local node server.  Bower does all the dependency management so we don’t have to!

To start things off I used the Node Package Manager (npm) to install Yeoman, which installs Grunt and Bower as well.  Both node and npm are required for yo, grunt, and bower.

npm install -g yo

This installs yo (Yeoman) globally so it is accessible through the terminal.  Once yo is installed we can start creating a generator.  Generators are what we want to create; they are the magic that makes starting new projects easier.  Specifically, a generator is what will scaffold our application, set up our local dev environment, and allow us to use Bower dependency management when building our games.  Once the game is created the scaffolding will set up the Gruntfile.js, package.json, and bower.json files.

Generator: This is the definition of the scaffolding.  The generator will be used to start a new project.  It is basically a command line installer.  It will ask you various questions about your project and push out the boilerplate you need so you can just get things done right (GTDR).

Gruntfile.js: This file is created by the generator and used by Grunt to build, deploy, set up a webserver, minify, clean the project, etc.  We will use Grunt during the development phase to help us test our site on a local node server.

Package.json: This file has metadata about the project and can be used by npm to publish the package.  It isn’t needed for much more than that.  This too will be created by the generator, and the generator project will also have one describing itself.

Bower.json: Here is where we describe the dependencies of our project.  We can use Bower to update and pull our project’s dependencies.  No more navigating to sites, downloading JS files, unpacking, and inserting the scripts.  Nope, only magic here.

Ok, so now we need to get started with our generator.  I don’t know about you, but I don’t want to start from scratch.  After all, we are going through this in the first place because we don’t want the tedious work.  So my first step is to download yo’s generator-generator.  This bad boy is all the scaffolding you need to create yo generators.

npm install -g generator-generator

Once the generator to scaffold generators is installed, we need to create a new directory to contain our generator code and run the generator inside it.

mkdir /Users/shawnmeyer/Documents/Projects/Generators/generator-crafty
cd /Users/shawnmeyer/Documents/Projects/Generators/generator-crafty
yo generator

This will spit out various questions.  Fill them in accordingly to have the generator set up your local project, but be sure to name your directory generator-blank, where blank is the name of your generator.  This naming convention is important: the part after the hyphen is the name you will use to tell yo to scaffold using your generator.  In my scenario it would be:

yo crafty

In your new generator project you will have a package.json file.  This file contains the metadata and dependencies of your application.  When the user installs your generator using the Node Package Manager, npm will download and install the dependencies of your generator as well.

I am not going to show you all the steps I took to create the generator-crafty package, but you can check it out at https://github.com/sgmeyer/generator-crafty.  I do have some handy tips, though.  When you are creating your generator you can test it locally a couple of ways.

First, you can run unit tests by entering the command below in the root directory of your generator.  This command looks at your test/*.js files and runs those tests in the terminal.

npm test

Next you can link the generator to npm and test that the generator creates the scaffolding.  This is handy if you don’t want to publish your generator and you just want to hoard and keep all your fancy generators to yourself.  Run this command in the root directory of your generator.

npm link

And then you can use yo to run your generator.  To run your generator and set up the scaffolding, navigate to your project directory.  Since I used generator-crafty as my name I will go to the crafty directory.  The command below tells yo to run generator-crafty, and the generator will set up all the scaffolding we defined in the Index.js file of our generator.  I didn’t show this step, but you can check out my code to see an example (https://github.com/sgmeyer/generator-crafty).

yo crafty

Assuming you added some grunt support to your Gruntfile.js, you could also use the grunt command in this directory to execute your tasks.  In my code I set up a custom grunt task that creates a node server, opens a browser, and runs live reload to detect updates to the JS and HTML to give you a live preview.

grunt server

Grunt also lets you set up a default task in the Gruntfile.js, or you can call tasks by name.  In my example I am calling the server task by name; however, I have it set up to work by calling grunt without a task name as well.

Once the scaffolding is done running you might want to add some more magic to your project, and this is where Bower comes in.  Let’s say we want to add a dependency on underscore.  Bower will let us do this.

bower install --save underscore

This will install the underscore dependency in the bower_components directory of your new project (in my case the crafty directory), and it will update the bower.json file’s dependency list.  If you don’t want to update your bower.json file you can drop the --save flag.  Again, Bower is just a package manager.  It does not replace RequireJS or other front-end dependency loaders.  Bower simply installs and updates your local dependencies.
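For illustration, after running the command with --save, the project’s bower.json would gain an entry along these lines (the project name and version range here are made up):

```json
{
  "name": "crafty-game",
  "dependencies": {
    "underscore": "~1.5.2"
  }
}
```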

When new developers pull your project from source control, they can run the command below to install the project’s dependencies.  This is convenient, and you won’t have to commit your JS dependencies to source control.

bower install

This is just a quick glimpse at the Yeoman workflow.  There is a ton of power available to us by using Yo, Grunt, and Bower.  These tools really can work together and help us be more productive.

Sharing Forms Authentication Tickets Between Applications

Recently, I ran across a fun challenge. I have an application modularized across multiple websites, and I need to allow users to navigate between these applications seamlessly with a single sign on and minimal HTTP redirects. What I have traditionally done is link the two applications using some form of SSO (e.g. encrypted query strings or SAML). My personal opinion is that these solutions are ideal when talking between two separate entities, such as integration between two websites owned by two different organizations. After thinking about this problem I wanted to eliminate redirecting the user for SSO purposes. Consider my typical scenario.

As a user I click a link on my intranet to log into the application. The initial URL sends me to a website that attempts to pull my credentials and builds the SSO request. Once the SSO request is built, the user is returned an HTTP response directing them to POST or GET a new URL with the SSO request. The service provider then verifies the SSO request and redirects the user to their landing page. That is a lot of work to get into the application, but for communication between entities it is pretty standard.

This flow is fine when a user originally authenticates with the application from the Identity Provider. However, once a user has authenticated into your website, the user should not see authentication happening between the multiple websites owned by the service provider. I do not like the idea of exposing the authentication between each application, and that is what multiple redirects do. Instead, I want the applications to share authentication, so when the user goes from one application to the next they are sent directly to the URL and not to an intermediary sign-on handler that eventually redirects them to the requested page. I want one of the applications to handle authentication and allow users to move between applications without requiring re-authentication on each website. After doing some digging I found that ASP.NET provides an easy solution: sharing the forms authentication ticket.

The Authentication Ticket:

The basic purpose of the authentication ticket is to tell your web application who you are and that you are authenticated. When using forms authentication, each authenticated user will have a forms authentication ticket. This ticket is encrypted, signed, and then stored in a cookie (I am ignoring cookieless configurations). It is also important to know that ASP.NET uses the machine key to secure your authentication cookie. According to MSDN, .NET 1.0 and 1.1 handle this differently than .NET 2.0 and later. For the purpose of this article we are going to assume .NET 4.5, but I will mention a configuration for .NET 2.0 SP1. For more information on authentication tickets check out the MSDN article http://support.microsoft.com/kb/910441.

Configuring Forms Authentication

<system.web>
  <machineKey validationKey="validationKeyA" decryptionKey="decryptionKeyA" validation="SHA1" decryption="AES" />
  <authentication mode="Forms">
    <forms name=".ASPXAUTH" domain="shawnmeyer.com"></forms>
  </authentication>
</system.web>

This configuration needs to be present in every application’s Web.config file. The machineKey element is what allows the applications to read the authentication ticket across sites. If one of your applications was upgraded from .NET 2.0 to .NET 4.5 and still uses the legacy mode, you might need to add the compatibilityMode="Framework20SP1" attribute to the machineKey element in all applications. If not, you can ignore this attribute.

The authentication element is where we configure forms authentication. The child forms element is used to set the properties of forms authentication. Each application must use the same name, as this is what the cookie will be named. If your applications live in the same subdomain you can ignore the domain attribute; however, if they live across multiple subdomains of the same domain, you will need to set the domain without a subdomain.

Troubleshooting:

  1. If your browser is not sending the cookie this might be caused by the domain attribute needing to be set. When reading the cookie the domain should read '.shawnmeyer.com' and the web.config attribute for domain should read domain="shawnmeyer.com".
  2. If your browser is sending the cookie, but the cookie does not show up in the server's HttpCookieCollection it means the cookie is probably not passing validation. Ensure the machineKey elements match across each application. If they do match you should ensure they are using the same compatibility mode.

Creating the cookie

In the application that handles authentication you can create and store the authentication ticket. This will set up the cookie and ticket so it can be consumed by each application.

// Gets the cookie
var cookie = FormsAuthentication.GetAuthCookie(username, rememberMe);

// Gets an authentication ticket with the appropriate default and configured values.
var ticket = FormsAuthentication.Decrypt(cookie.Value);
var newTicket = new FormsAuthenticationTicket(
                             ticket.Version,
                             username,
                             createDate,
                             expirationDate,
                             isPersistent,
                             userData);

var encryptedTicket = FormsAuthentication.Encrypt(newTicket);
cookie.Value = encryptedTicket;
Response.Cookies.Add(cookie);

Reading the Authentication Ticket from the Cookie

To read the cookie you can use this code.

// This could throw an exception if it fails the decryption process. Check MachineKeys for consistency.
var authenticationTicket = FormsAuthentication.Decrypt(authCookie.Value);

// Retrieve information from the ticket
var username = authenticationTicket.Name;
var userData = authenticationTicket.UserData;

Magic

Once you have these applications configured to share the authentication ticket, you can handle authentication between sites much more seamlessly. Enjoy!

Using Enterprise Library To Decrypt a Java Encrypted Message

Last week I stumbled upon a tricky problem when tasked with encrypting and decrypting messages sent between a Java and a .NET application.  The Java application was responsible for encrypting a message while the .NET application consumed and decrypted the message.  Ignoring the debate over symmetric versus asymmetric algorithms, I was required to use a symmetric algorithm, specifically Rijndael.

Since we are primarily a .NET shop I was expected to use the Enterprise Library, for good reason, as it simplifies encryption and decryption.  I quickly realized the simplicity gained on the .NET side was a trade-off for additional complexity on the Java side.  To start, let us look at the basic Java code needed to encrypt a string.

public String encrypt(String auth) {
  try {
    byte[] key = Base64.decodeBase64("KEY STRING"); // shared key as Base64
    byte[] iv = Base64.decodeBase64("IV STRING");

    SecretKeySpec spec = new SecretKeySpec(key, "AES");
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");

    cipher.init(Cipher.ENCRYPT_MODE, spec, new IvParameterSpec(iv));
    byte[] unencrypted = StringUtils.getBytesUtf16(auth);
    byte[] encryptedData = cipher.doFinal(unencrypted);
    String encryptedText = Base64.encodeBase64String(encryptedData);

    return encryptedText;
  } catch (Exception e) {
    // Swallowed for brevity; log this in real code.
  }

  return "";
}

From the code above you can see we have a method that performs a pretty basic encryption.  Now let us look at the .NET code we will use to decrypt the message encrypted using the above method.

// "RijndaelManaged" is the name of the symmetric provider configured for the Enterprise Library.
string message = Cryptographer.DecryptSymmetric("RijndaelManaged", encryptedText);

Next, I built the applications and tested out my encryption and decryption methods, and they promptly failed.  As you can see above, the .NET code does not appear to use an IV.  Thankfully, Microsoft published the Enterprise Library code.  After taking a look I noticed something interesting.  Microsoft randomly generates an IV each time the EncryptSymmetric method is called and then prepends the IV to the encrypted message.  Accordingly, the DecryptSymmetric method assumes the IV is prepended to the encrypted message and extracts those bytes before attempting to decrypt the message.  I then went back to the Java application and changed the end of the encrypt method as follows.

// Prepend the IV to the encrypted data so the Enterprise Library can extract it.
byte[] entlib = new byte[iv.length + encryptedData.length];
System.arraycopy(iv, 0, entlib, 0, iv.length);
System.arraycopy(encryptedData, 0, entlib, iv.length, encryptedData.length);

return Base64.encodeBase64String(entlib);

Again, I built the code, and to my frustration it did not work.  However, this time the .NET code actually decrypted the message, but the output resulted in funny characters.  I suspected it was an encoding problem, but I was still confused as I knew the Enterprise Library used UTF-16, or Unicode, encoding.  I decided to open up the code in Visual Studio and realized the Enterprise Library did indeed use Unicode encoding; however, it used little-endian byte order.  So I went back to the StringUtils API and found a static method called getBytesUtf16Le(String string).
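The byte-order difference is easy to see by dumping the same string both ways (sketched here in Node.js for illustration; Buffer only speaks little-endian UTF-16 natively, so the big-endian form is produced by swapping byte pairs, and I am ignoring the BOM that Java’s UTF-16 charset also emits):

```javascript
// 'H' is U+0048 and 'i' is U+0069, so each character is two bytes in UTF-16.
const le = Buffer.from("Hi", "utf16le");  // little-endian: 48 00 69 00
const be = Buffer.from(le).swap16();      // big-endian:    00 48 00 69

console.log(le.toString("hex")); // "48006900" -- the order the Enterprise Library expects
console.log(be.toString("hex")); // "00480069" -- big-endian order, what the .NET side was receiving
```

Feed the big-endian bytes to a little-endian decoder and every character’s bytes land swapped, which is exactly the "funny characters" symptom.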

So I replaced:

byte[] unencrypted = StringUtils.getBytesUtf16(auth);

With:

byte[] unencrypted = StringUtils.getBytesUtf16Le(auth);

After building the code I finally had this little problem solved.  This turned out to be a bit more complex than I initially thought; however, thankfully Microsoft provided the source code needed to solve the problem.