Thursday, June 27, 2013

INotifyPropertyChanged implementation

If you have created any XAML-based application (WPF, Silverlight, Windows Phone, Windows Store) using the MVVM pattern, then you are probably familiar with the INotifyPropertyChanged interface.  However, when implementing the interface, most developers face two problems they should be concerned with.  Consider the following snippet of code:
private string _propertyA;
public string PropertyA
{
    get { return _propertyA; }
    set
    {
        // P1 repetitive code
        if (value == _propertyA) return;
        _propertyA = value;
        // P2 "magic" string
        RaisePropertyChanged("PropertyA");
    }
}

private void OnPropertyChanged(object sender, PropertyChangedEventArgs e)
{
    // P2 "magic" string usage
    if (e.PropertyName == "PropertyA")
        HandlePropertyAChanged();
}

There are two basic problems with this implementation. The first problem is the repetitive code: the 3 lines of code inside the set accessor are pretty much the same for the majority of properties, and in most implementations they are simply copied and pasted everywhere.  The second issue, the magic string, is a slightly more challenging problem to deal with. The big problem with the magic string is that when the code changes and the name of the property changes, the programmer has to change not only the string inside the property's setter but also every place that listens to the PropertyChanged event.

The first problem can be solved by encapsulating the code in a generic method. The second problem can be solved by using a constant member for the string, so the change only needs to occur in one place and we gain refactoring support, or by using reflection. There is also an implementation that leverages a newer .NET feature: an optional method parameter marked with an attribute. In this article we will explore a few different implementations that attempt to address these two issues and compare the performance of each. As a bonus, there is also an implementation that uses a weak event handler to address the possibility of memory leaks.

Implementations

Simple

The simple implementation is included here as a reference. It does use a constant string to represent the property name, for refactoring support.
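The original post does not show the Simple implementation itself; a minimal sketch of what it presumably looks like, with illustrative class and member names, is:

```csharp
// Sketch of the Simple implementation (SimpleModel and the member
// names are illustrative). The TimeProperty constant gives refactoring
// support wherever the property name string is needed.
// Requires: using System; using System.ComponentModel;
public class SimpleModel : INotifyPropertyChanged
{
    public const string TimeProperty = "Time";

    public event PropertyChangedEventHandler PropertyChanged;

    private DateTime _time;
    public DateTime Time
    {
        get { return _time; }
        set
        {
            if (value == _time) return;
            _time = value;
            RaisePropertyChanged(TimeProperty);
        }
    }

    protected void RaisePropertyChanged(string propertyName)
    {
        // copy to a local to avoid a race with unsubscribing handlers
        var handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
    }
}
```

The later implementations all assume a base class with a RaisePropertyChanged method along these lines.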

Setter

The Setter class introduces a helper method that sets the property value and raises the property changed event.
protected bool Set<T>(string propertyName, ref T field, T value)
{
    // EqualityComparer<T>.Default handles both value and reference types
    if (EqualityComparer<T>.Default.Equals(field, value)) { return false; }
    field = value;
    RaisePropertyChanged(propertyName);
    return true;
}
The property setter then simply becomes
private DateTime _time;
public DateTime Time
{
    get { return _time; }
    set { Set(TimeProperty, ref _time, value); }
}
This helps reduce boilerplate code and makes sure the behavior is consistent.

Delegate Setter

The delegate setter implementation uses an Action<T> lambda to update the value, so the field does not have to be passed in by reference. The set helper looks like this
protected bool Set<T>(string propertyName, Action<T> setter, T field, T value)
{
    if (EqualityComparer<T>.Default.Equals(field, value)) { return false; }
    setter(value);
    RaisePropertyChanged(propertyName);
    return true;
}
To use the helper method we need to create an inner class that holds the property values
private class Inner : IDisplayText
{
    public string DisplayText { get; set; }
    public DateTime Time { get; set; }
    public bool Status { get; set; }
    public int Count { get; set; }
}
private readonly Inner _inner = new Inner();
The property setter then calls the helper method like this
public DateTime Time
{
    get { return _inner.Time; }
    set { Set(TimeProperty, (t) => _inner.Time = t, _inner.Time, value); }
}

Lambda

The lambda method uses Expression<Func<T>> to capture the property being updated and does not rely on a constant member for the property name. Again, a helper method is introduced to encapsulate the property changed behavior. See the conclusion for the performance difference between .NET 4.0 and .NET 4.5.
protected bool Set<T>(Expression<Func<T>> expression, ref T field, T value)
{
    if (EqualityComparer<T>.Default.Equals(field, value)) { return false; }
    field = value;
    RaisePropertyChanged(GetPropertyName(expression));
    return true;
}

protected string GetPropertyName<T>(Expression<Func<T>> expression)
{
    MemberExpression memberExpression = (MemberExpression)expression.Body;
    return memberExpression.Member.Name;
}
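The property implementation for the Lambda variant is not shown in the original post; given the helper signature above, it would presumably look something like this:

```csharp
// Illustrative property using the Lambda Set helper; the expression
// captures the property, so no name string or constant is needed
private DateTime _time;
public DateTime Time
{
    get { return _time; }
    set { Set(() => Time, ref _time, value); }
}
```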

Field

The field implementation introduces a helper class that captures the backing field for the property
protected class Field<T>
{
    public string PropertyName;
    public T Value;

    public Field(string propertyName)
    {
        PropertyName = propertyName;
    }

    public static implicit operator T(Field<T> t)
    {
        return t.Value;
    }
}
Again, a helper method is used for setting the property
protected bool Set<T>(Field<T> field, T value)
{
    if (field == null || EqualityComparer<T>.Default.Equals(field.Value, value)) return false;
    field.Value = value;
    RaisePropertyChanged(field.PropertyName);
    return true;
}
The definition of a property looks like this
private readonly Field<DateTime> _time = new Field<DateTime>(TimeProperty);
public DateTime Time
{
    get { return _time; }
    set { Set(_time, value); }
}

Lambda Field

The Lambda Field implementation looks to improve the runtime performance (see the results section) of the Lambda implementation by combining the Lambda and Field implementations. The Field class definition looks like this
protected class Field<T>
{
    public string PropertyName;
    public T Value;

    public Field(Expression<Func<T>> expression)
    {
        PropertyName = GetPropertyName(expression);
    }

    protected string GetPropertyName(Expression<Func<T>> expression)
    {
        MemberExpression memberExpression = (MemberExpression)expression.Body;
        return memberExpression.Member.Name;
    }

    public static implicit operator T(Field<T> t)
    {
        return t.Value;
    }
}
The helper method is the same as in the Field implementation, and the definition of a property is
// a field initializer cannot reference an instance member, so the field
// is assigned in the constructor: _time = new Field<DateTime>(() => Time);
private readonly Field<DateTime> _time;
public string TimeProperty { get { return _time.PropertyName; } }
public DateTime Time
{
    get { return _time; }
    set { Set(_time, value); }
}

Field2

Field2 is a modified Field implementation that removes the constant string for the property name. It borrows the Lambda implementation's method of determining the property name. The constants for property names are changed to static readonly fields of the class and are set once at application startup by the static constructor to reduce overhead
public static readonly string TimeProperty;
public static readonly string StatusProperty;
public static readonly string CountProperty;
public static readonly string DisplayTextProperty;

static Field2Model()
{
    var dummy = new Field2Model(0);

    TimeProperty = GetPropertyName(() => dummy.Time);
    StatusProperty = GetPropertyName(() => dummy.Status);
    CountProperty = GetPropertyName(() => dummy.Count);
    DisplayTextProperty = GetPropertyName(() => dummy.DisplayText);
}

private static string GetPropertyName<T>(Expression<Func<T>> expression)
{
    MemberExpression memberExpression = (MemberExpression)expression.Body;
    return memberExpression.Member.Name;
}
The property implementation is the same as in Field
private readonly Field<DateTime> _time = new Field<DateTime>(TimeProperty);
public DateTime Time
{
    get { return _time; }
    set { Set(_time, value); }
}

CallerMemberName

This implementation uses the .NET 4.5+ attribute CallerMemberNameAttribute.  It simplifies all the Setter-like implementations by adding an optional parameter marked with [CallerMemberName].  The attribute causes the compiler to insert the calling function's or property's name as a string literal at the call site, so the operation is pretty cheap and has about the same speed as Setter.
The setter method looks like this
protected bool Set<T>(ref T field, T value, [CallerMemberName]string propertyName = "")
{
    if (EqualityComparer<T>.Default.Equals(field, value)) { return false; }
    field = value;
    RaisePropertyChanged(propertyName);
    return true;
}
The property implementation is similar to Setter, without the property name as one of the arguments
private DateTime _time;
public DateTime Time
{
    get { return _time; }
    set { Set(ref _time, value); }
}

AOP

The last implementation leverages an aspect-oriented library such as PostSharp or Fody. PostSharp has an example of how to do it here. I used Fody because it was requested and is free.
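With the PropertyChanged.Fody weaver, for example, the model can shrink to plain auto-properties; a sketch (the attribute name varies by version of the library: older releases used [ImplementPropertyChanged], newer ones use [AddINotifyPropertyChangedInterface]):

```csharp
// Sketch using PropertyChanged.Fody; the weaver injects the
// INotifyPropertyChanged plumbing into the compiled assembly,
// so no setter code is written by hand.
using PropertyChanged;
using System;

[ImplementPropertyChanged]
public class FodyModel
{
    public DateTime Time { get; set; }
    public bool Status { get; set; }
    public int Count { get; set; }
    public string DisplayText { get; set; }
}
```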

Test Methodology

A test harness was built to bind models implementing the different property changed strategies. The test application binds a collection of models to a grid and then updates the models a million times to generate property changed events. A timer is used to measure program execution time; this is not CPU time, but the result is sufficient for the comparison. The test harness also creates 1 to 100,000 objects to account for object construction time; as you will see, there are implementations where object construction is somewhat expensive.
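The measurement loop might be sketched as follows (a simplified illustration, not the actual harness; SetterModel and objectCount are assumed names, and the real harness also binds the models to a grid so the PropertyChanged events are consumed):

```csharp
// Simplified sketch of the timing loop; wall-clock time, not CPU time.
// Requires: using System; using System.Diagnostics; using System.Linq;
int objectCount = 1000;
var models = Enumerable.Range(0, objectCount)
                       .Select(i => new SetterModel())   // assumed model type
                       .ToList();

var stopwatch = Stopwatch.StartNew();                    // includes construction? no:
for (int i = 0; i < 1000000; i++)                        // measure updates only here
{
    var model = models[i % models.Count];
    model.Count = i;                                     // raises PropertyChanged
    model.Time = DateTime.Now;
}
stopwatch.Stop();
Console.WriteLine("{0} ms", stopwatch.ElapsedMilliseconds);
```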

Results

The values shown in the chart are in milliseconds and account for total construction time plus property update time.


Object Count | Simple | Setter | Lambda | Field | Lambda Field | Field 2 | Field With WEH | Delegate Setter | CMN Attribute | Fody
1            | 752    | 827    | 1778   | 958   | 1104         | 970     | 955            | 942             | 814           | 815
10           | 730    | 814    | 1751   | 956   | 1105         | 959     | 955            | 936             | 810           | 814
100          | 733    | 818    | 1765   | 955   | 1103         | 961     | 949            | 938             | 810           | 811
1000         | 731    | 842    | 1885   | 959   | 1109         | 956     | 956            | 944             | 814           | 820
10000        | 750    | 874    | 2745   | 984   | 1205         | 979     | 981            | 1022            | 870           | 863
100000       | 837    | 999    | 2837   | 1122  | 1950         | 1095    | 1123           | 1171            | 985           | 998

For each implementation, the execution time is pretty much the same regardless of object count; the spike you see in the 100,000-object test is the construction cost.

Conclusion

Simple: use this if you don't mind typing or copying and pasting.
Setter: a good implementation for simpler maintenance.
Delegate Setter: a good implementation if you are wrapping another class such as an entity or DTO.
Lambda: bad performance, not recommended. An interesting point here is that the runtime went from 7000+ ms to 1700+ ms when the project was converted from .NET 4.0 to .NET 4.5.
Field: the preferred implementation if you need refactoring support in event handlers; slightly more boilerplate code compared to the Setter/CallerMemberName implementations.
Lambda Field: use this one if you really, really want to use Lambda, but it has a performance penalty if your program creates lots of objects. CallerMemberName is preferred over this implementation.
Field 2: the recommended method if you just hate literal strings for some reason but still want refactoring support.
CallerMemberName: simple to implement with good performance; the recommended implementation. However, you lose refactoring support in event handlers.
AOP: no boilerplate code, but you lose refactoring support and gain a dependency on external libraries.
You can get the source code for the implementations and the test harness on GitHub.

Thursday, October 4, 2012

Entity Framework Code Second 0.5.0 released

Version 0.5.0 of the Entity Framework Code Second templates has been released on NuGet.  The new version adds a BulkInsert method to the repositories.  It takes an IEnumerable<Entity> as input and uses SqlBulkCopy to do the bulk insert.  This only works for Microsoft SQL Server for now.
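Usage would presumably look something like this (the context, Orders repository, and Order entity are illustrative names, not part of the package):

```csharp
// Illustrative call to the new repository method; Orders and Order
// are assumed names for a generated repository and its entity
var orders = Enumerable.Range(0, 10000)
                       .Select(i => new Order { Quantity = i })
                       .ToList();
context.Orders.BulkInsert(orders);   // uses SqlBulkCopy under the hood
```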

You can report any bugs or issues through GitHub or email (check the NuGet package for the contact email).

Saturday, August 11, 2012

Entity Framework Migration with Model First

Entity Framework 4.3 Code First's Migrations feature allows developers to write database schema migrations as code rather than using T-SQL to update the schema.  All the examples I have found so far use PowerShell commands to generate the migration classes and apply schema updates.  In this article I will show you how to have your program apply the migration classes automatically at run-time.  You will still need to generate the migration classes using a command, but that's a good thing, because you will want to review the changes before letting your program update the schema.  As a bonus, I will also show you how to use the Migrations feature with the Model First approach through T4.

Assumptions:

  • You are using Entity Framework 4.5/5.0
  • You added Entity Framework to your project using NuGet
  • EF entities are used as database objects and nothing more
  • Fluent Api for entity configuration is used instead of attributes
  • You want to use Repository pattern (aka Table Module)
  • You have basic understanding of Entity Framework migrations feature by reading this blog
I am using the fluent API for schema generation, through EntityTypeConfiguration, because it gives us more flexibility in configuring how the entity is translated to tables, and there are things attributes cannot do, such as specifying the precision of a decimal datatype.  Complex types are supported in the latest version of the templates.

Where to get the templates

You can get the templates either through NuGet (the preferred method) or GitHub.

How to use the templates

Once you have added the templates to your project through NuGet, you will need to run the custom tools manually whenever you update your model.

Under the hood

Context.tt

This template creates the DbContext subclass that contains the repositories for accessing table data, as well as the DbMigrationsConfiguration class.
For the DbContext subclass, the generated constructor initializes the repositories by passing the concrete DbSet objects to the repositories' constructors. See the Repository.tt section for more information on repositories. The subclass also overrides the OnModelCreating method to apply the entity configurations, and calls a partial method, CustomOnModelCreating, that can be implemented to contain additional model creation logic.
The DbMigrationsConfiguration class is used for the DB Migrations feature of Entity Framework v4.3.1 and later. The generated partial class contains instructions, in its comments, on how to create the migration classes.

Entity.tt

This template generates POCOs for the entities defined in the model.

ComplexType.tt

This template generates POCOs for the complex types defined in the model.

Enum.tt

Only applies to .NET Framework 4.5 and up. Generates the enums used in the model.

EntityTypeConfiguration.tt

This template generates the EntityTypeConfiguration classes that configure the model's entities using the fluent API. This template handles cases that weren't well supported by other templates, such as row versions or decimal types.

ComplexTypeConfiguration.tt

This template generates the ComplexTypeConfiguration classes that configure the model's complex types using the fluent API. This template handles cases that weren't well supported by other templates, such as row versions or decimal types.

Repository.tt

This template generates classes that implement a .NET version of the repository pattern (aka the Table Data Gateway pattern from Fowler). The repository implements IDbSet and is a wrapper for the DbSet class. The repository adds additional methods based on the entity; for example, if the entity contains a RowVersion column, it generates a GetMaxRowVersion method on the repository.
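A generated repository might be sketched like this (an illustration only; the real template output is richer, and the Order entity and OrderRepository names are assumed):

```csharp
// Illustrative sketch of a generated repository wrapping DbSet<Order>.
// Requires: using System.Data.Entity; using System.Linq;
public partial class OrderRepository
{
    private readonly DbSet<Order> _set;

    public OrderRepository(DbSet<Order> set)
    {
        _set = set;
    }

    // generated because the Order entity has a RowVersion column
    public byte[] GetMaxRowVersion()
    {
        return _set.Max(o => o.RowVersion);
    }
}
```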

Migration

Once the project is set up with T4, you will find DbMigrationsConfiguration.partial.cs under the project root folder.  Open the file and you will see remarks about how to use this class.  What you want to do is run the Add-Migration command in the Package Manager Console.  The DbMigrationsConfiguration class is set up to generate all the migration classes under the Migrations folder in the {parent}.Migrations namespace.

For example, to create the initial schema, run the following command, assuming your project is named Data and you have a local database server running.

Add-Migration InitialCreate -ProjectName "Data" -StartUpProjectName "Data" -ConfigurationTypeName "Data.Migrations.DbMigrationsConfiguration" -ConnectionString "Server=localhost;Database=EF;Trusted_Connection=True;" -ConnectionProviderName "System.Data.SqlClient"

After you have created the migration classes, you just need to call the static method DbMigrationsConfiguration.UpdateSchema(connectionString).  The method will
  1. Create the database if it does not exist
  2. Update the schema to the latest version
Depending on your upgrade strategy, you might call the method in Program.Main or Application_Start, or you might write a utility program that updates all your databases.
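Internally, a method like UpdateSchema can be built on the DbMigrator API; a minimal sketch, assuming EF 4.3.1+ and the generated Data.Migrations.DbMigrationsConfiguration class (the exact generated code may differ):

```csharp
// Sketch of a run-time schema update built on DbMigrator.
// Requires: using System.Data.Entity.Infrastructure;
//           using System.Data.Entity.Migrations;
public static void UpdateSchema(string connectionString)
{
    var configuration = new DbMigrationsConfiguration
    {
        TargetDatabase = new DbConnectionInfo(connectionString,
                                              "System.Data.SqlClient")
    };
    // DbMigrator creates the database if it does not exist and then
    // applies any pending migrations up to the latest version
    new DbMigrator(configuration).Update();
}
```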

Saturday, October 15, 2011

Setting up secure windows azure web role with GoDaddy ssl certificate

For the current project I am working on, I am using Windows Azure as the platform, and I needed to figure out how to set up SSL for the web apps I am deploying to Windows Azure. The documentation online is sparse and fragmented. All the information I am providing here is available somewhere, but it is not all together in one place.

Now, the reason for this article: you need to create a certificate file with a private key for import, but the SSL certs from GoDaddy are chained and you don't have the private key for the chained certificates.  From my research, VeriSign certificates work without issue when uploaded to Windows Azure, but very few people have gotten GoDaddy's certificates to work properly.  However, VeriSign certificates cost way more than $12.99 (Google for a discount) per year, so hopefully this article will save you some money.

To set up HTTPS on Windows Azure, you need to create a certificate file with a private key and embed the chained certificates so you can upload them to Windows Azure. There are many ways to set up the SSL certificate and export it for later use, but I have found only one way to correctly install and export the SSL certificate so that it contains the chained certificates as well. I am sure the steps here apply to other vendors, but GoDaddy is the only one I tested with, so I am listing that as a pre-condition.

This article assumes you are doing the following:
  • you know how to manage your domain and sub-domain with your domain registrar
  • you purchased an SSL certificate from GoDaddy
  • you are building .Net web applications with Visual Studio
  • you are using Azure web role to host your web application
solution layout
The first step is to create a dummy project and get it running locally. Since we are only discussing setting up SSL, your test app should contain no access to databases or external resources, just to minimize the places where your app might fail; if you have something that's already deployed to Windows Azure and works, you can use that as well. I am going to reuse the Html5 project I created for another article. In the same solution as your sample web project, simply create a Windows Azure project. After the Windows Azure project has been created, right-click on the Roles folder and select Add->Web Role Project in Solution. If you don't have an existing project to work with, simply create a new one now.

Once you have the project up and running locally, you are ready to deploy it to Azure.  We want to make sure the non-secure version works first, before we move on to the main part of the article, to eliminate any issues that are not related to SSL certificates. Log onto the Windows Azure management portal to create a new hosted service and a production deployment of the sample project we created earlier. Once the deployment is ready, navigate to your-host.cloudapp.net. Verify that your web role is running correctly and fix any issues that pop up.

Next we will set up our custom domain to redirect to the Azure web role.

For setup with a sub-domain: in the DNS manager, add a CNAME for sub with the value your-host.cloudapp.net.

For setup without a sub-domain: in your DNS manager, add a CNAME for www with the value your-host.cloudapp.net. I also had to create an A record for @ with the value 127.0.0.1 for the domain to work; I am not sure why, but you can Google it if you are really curious. (You can skip the next step and go on to the next paragraph if you are using a sub-domain.) Once you have the DNS records set up, go back to the Domain Manager, click Forward->"Forward Domain" on the toolbar, and forward your-domain.com to www.your-domain.com. Verify that www.your-domain.com points to the Azure web role you set up previously and fix any issues you encounter.

You can view the result for my setup at Azure generated URL vs custom domain

Now we need to generate an SSL certificate for your-domain.com. If you are using www.your-domain.com, an SSL certificate for your-domain.com will work just fine; you don't need to create a separate certificate for www.your-domain.com. However, if you want to use a sub-domain, you will need to get an SSL cert for sub.your-domain.com. GoDaddy has instructions on how to generate and install an SSL certificate on your IIS server, but I will go through it here so you don't have to read that article. Start IIS Manager and double-click the Server Certificates feature. In the Actions panel on the right, select "Create Certificate Request ...".  In the popup dialog, enter your domain name in the "Common name" field.  The rest of the fields don't matter too much if you are using a standard SSL certificate, so enter whatever information you think is appropriate.  On the next screen, make sure you select "Microsoft RSA SChannel Cryptographic Provider" and a bit length of 2048.  Save the result somewhere.

Next, log into your GoDaddy account, navigate to Secure Certificate Services, select an unused credit, and request a certificate. Copy the CSR content from the saved file and paste it into the input box. I selected Starfield Technologies as the certificate issuing organization; you may do the same if you want to follow the rest of this guide closely.  Wait a few hours and check back; your SSL certificate should be ready. Download the certificate for IIS 7 and make sure to check the "include the intermediate certificates" box.

Now we are ready to install the certificates; first we will install the intermediate certificates. Start Microsoft Management Console by typing mmc in the Run box on the Start menu. Select File->"Add/Remove Snap-ins", then select Certificates and click Add. Pick the "Computer account" option in the popup box, then, with "Local computer" selected on the next screen, click the Finish button and then the OK button. In the tree view on the left, navigate to "Certificates/Intermediate Certification Authorities/Certificates". Right-click->All Tasks->Import and import "sf_iis_intermediates.p7b" (you might have "sf_bundle.crt" if you downloaded the SSL certificate for another server option). Make sure to read through the next part before proceeding.

What we want to do next is install the SSL certificate through IIS. Take note that we are not importing the SSL certificate using mmc; if you do that, you cannot export the certificate with its private key. Start IIS Manager as administrator, select your local machine in the tree view on the left, then double-click the Server Certificates feature. In the Actions panel on the right, click "Complete Certificate Request ..." and browse to your SSL certificate; this is the other certificate, the one not installed in the earlier step. You might need to change the file extension filter in the open file dialog, because GoDaddy's certificate file is saved as *.crt while the open file dialog defaults to *.cer. Now you should see your SSL certificate installed in IIS Manager. Go back to mmc and check under "Certificates/Personal/Certificates"; you should see your SSL certificate listed there (refresh if you don't see it). If that doesn't work, the SSL certificate was not installed correctly; Google online for more help.

So after all that, we are finally ready to export the certificates and upload them to Windows Azure. In mmc, find the SSL certificate and right-click->All Tasks->Export. Make sure you export through mmc: the IIS export doesn't export the complete certificate chain correctly, and you cannot export the intermediate certificates any other way since you don't have their private keys. Click Next on the Welcome screen, and on the Private Key screen make sure "Yes, export the private key" is selected, then click Next. On the Export File Format screen, select "Personal Information Exchange", check the "Include all certificates in the certification path if possible" and "Export all extended properties" options, then click Next. Enter a password and click Next, then select where you want to export the certificates to. Once the export is done, you are ready to upload it to Windows Azure.

Go back to the Windows Azure management console and find the hosted service we created earlier; there should be a Certificates folder under the host. Right-click on that folder, pick the "Add Certificate" option, point it to the certificate file we just exported, enter the password, and click OK. Once the import is done, you should see three certificates. This is important: if any of the certificates is missing, most browsers will not validate your SSL certificate, and your site is not shown as secure even if the communication is protected. We upload the certificates before setting up the web role because it makes the next step slightly easier. Don't close the Windows Azure management console; start up Visual Studio and open your project.

All three certificates

Find your web role and bring up the property view by double-clicking on the web role. Select the Certificates tab and add a new certificate. Name it your-domain.com, select LocalMachine for Store Location and My for Store Name.  Now go back to the Azure console, select the your-domain.com certificate, find the Thumbprint property in the properties panel on the right, and copy its value. Back in Visual Studio, paste the thumbprint value we just copied into the Thumbprint field. Repeat the process for the other two certificates, but set Store Name to Trust.


Add certificate with thumbprint value
Next, click the Endpoints tab in the project property view, add a new endpoint for HTTPS, and select the your-domain.com certificate as the SSL certificate.


Endpoints
Finally, create a new Azure package and upgrade the existing deployment in the Windows Azure portal. If you have done everything correctly up to this point, the new package should upload without any errors.  Once the update is complete, you can test whether your certificates are installed correctly by navigating to https://www.your-domain.com.  The site should load without any errors; if the browser puts up a warning screen about the site not being trusted or the SSL certificate not being verifiable, you will have to search around to figure out what went wrong.  Take a look at my HTTPS results.

Disclaimer: the SSL certificate is good for a year and I don't plan on renewing it, so if you visit the site after 2013/07/11 the certificate will not be valid (but you can go get a free Slurpee).  Also, I might take down the site later when I need to free up Azure computing credit for something else.  Here is a screenshot showing the end result.

Black Magic

Tuesday, December 14, 2010

Minimize JavaScript and CSS with MSBuild: Part 1

This is going to be a two-part post about how to use MSBuild to automate JavaScript and CSS minimization.  Part 1 is mostly introduction, so if you want to jump straight to how it’s done, go to part 2.

For the project I am working on, I am trying to follow best practice by minimizing JavaScript and CSS files to reduce download time.  However, it’s hard to develop JavaScript with no whitespace and meaningless variable names.  What I decided to do is have the build process automatically invoke a tool to minimize CSS and JavaScript for me during compilation.

There are many tools out there that will minimize the files for you.  I checked out the .NET implementation of YUI Compressor, Google Closure Library, and Microsoft Ajax Minifier.  I need a tool that integrates with MSBuild, which eliminates Google’s tool; that’s too bad, since it produces the best results from what I can gather online.  While it is true that I could use the command line task to execute Google’s tool, I would also need to set up the environment for Java and make sure it’s available on the build server.  The YUI Compressor combines all the CSS files and JavaScript files into a single css/js file, which is not the behavior I want for my project.  I want to keep page-specific files separate so they don’t get downloaded until the user visits that page.  In other words, if the user visits the AboutUs page he shouldn’t have to download the bits for the CheckBrowser page.  In the end I decided to use Microsoft Ajax Minifier.

For the rest of this post I will describe the sample project and how it’s set up.  The sample program tests a subset of the browser’s HTML5 functionality support.  The sample program uses ASP.NET MVC 2, jQuery, Modernizr, and SilkSprite.  Here are the results in Chrome 7 and IE 8:

IE 8

Chrome 7


Files
Here is the list of the files in the project; the folders of interest are the ‘uncompressed’ folder and the ‘content’ folder.  Take note of the ‘scripts’ and ‘styles’ folders in both directories.  The only difference is that the files under ‘content’ have ‘min’ before the extension to indicate they are the minimized versions of the files.  The ‘uncompressed’ folder is where your working versions of the files live.  So let’s set up the folder structure first.

Create the ‘uncompressed’ folder and add ‘scripts’ and ‘styles’ directories to it.  Next, add/move the desired files from the ‘Content’ and ‘Scripts’ folders to their appropriate locations under the ‘uncompressed’ folder.  The MVC template automatically adds the ‘Content’ folder to your project; you can rename it to ‘content’ if you are particular like me, and if not, just leave it in place.  Then add ‘images’, ‘scripts’, and ‘styles’ folders to ‘content’.

Next we need to grab the files for the tools we will be using and place them in the project.  Download the Modernizr JavaScript from http://www.modernizr.com/, and SilkSprite from http://www.ajaxbestiary.com/Labs/SilkSprite/.  I will assume you know how to create controllers and views, since that’s not in the scope of this article; if not, you can check out a few tutorials online.  I de-minimized the Modernizr script for testing purposes; you can leave it minimized, just make sure it doesn’t have ‘min’ in the file name.  I also changed the SilkSprite CSS to point to the location of the actual image file.

Finally, we need to set the build action for every file under ‘uncompressed’ to None.  To do this, click on the file and bring up the property window (F4); the first item in the property window should be ‘Build Action’.  Change its value to None, and do this for all the files.
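After the Build Action change, the project file will contain entries like the following (a sketch; the actual file names depend on your project):

```xml
<!-- Illustrative project file entries produced by setting
     Build Action to None; file names are examples only -->
<ItemGroup>
  <None Include="uncompressed\scripts\site.js" />
  <None Include="uncompressed\styles\site.css" />
</ItemGroup>
```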

That’s it for the background explanation and setup of the project.  In the next post we will open up the MSBuild file and modify the internals of the project file to add instructions for minimizing JavaScript and CSS files during compilation.

Download the sample project here.

Tuesday, December 7, 2010

Minimize JavaScript and CSS with MSBuild: Part 2

This is part 2 of a two-part post discussing how to use MSBuild to automate JavaScript and CSS minimization.  Part 1 contains background information on what we are trying to accomplish and the basic project setup.  This post dives into the steps required to set up the minimization process.

With the project structure set up the way we want, the next thing we are going to do is make the project automatically pick up every file under the ‘content’ folder and treat it as content.  With the sample project open, right-click on the project file “Html5” and select “Unload Project”, then right-click on the ghosted project icon and select “Edit Html5.csproj”.  Search for the /Project/ItemGroup/Content XML elements and find all the Content elements with Include attributes referencing files in the ‘content’ folder.  Delete all of them, then add this element in their place, making sure it’s a child of an <ItemGroup> element.

<Content Include="content\**\*.*" />

Before we proceed to the next step, make sure you have Microsoft Ajax Minifier installed.  Check for “C:\Program Files (x86)\MSBuild\Microsoft\MicrosoftAjax\AjaxMin.tasks”.  Back in the Html5.csproj file, find the <Import> element and add the following sibling node.

<Import Project="$(MSBuildExtensionsPath32)\Microsoft\MicrosoftAjax\AjaxMin.tasks" />
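As an optional safeguard (not strictly required for this walkthrough), you can add a Condition to the import so the project still loads on machines where the minifier isn’t installed; on those machines the build would fail later with a missing-task error instead of the project failing to open:

```xml
<!-- Only import the AjaxMin tasks if the file actually exists on this machine -->
<Import Project="$(MSBuildExtensionsPath32)\Microsoft\MicrosoftAjax\AjaxMin.tasks"
        Condition="Exists('$(MSBuildExtensionsPath32)\Microsoft\MicrosoftAjax\AjaxMin.tasks')" />
```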

Find the Target element named “BeforeBuild” (<Target Name=”BeforeBuild”…); it might be commented out, so uncomment it and remove all child elements from it.  If the BeforeBuild target doesn’t exist, add it to the file.  We are doing the customization in BeforeBuild because the BeforeBuild target is invoked by the Build target, and the Build target is called every time before debugging or deployment.  This ensures the minimized files are updated before we launch the program for testing or package it for deployment.

The content inside the BeforeBuild target is pretty simple; it does two things: first it deletes the previously minimized files from the content directory, then it calls two other targets to perform the actual work of minimizing and copying content files.  Here is the XML snippet for the BeforeBuild target:

<Target Name="BeforeBuild">
  <ItemGroup>
    <JsToDelete Include="content\scripts\**\*.*" />
    <CssToDelete Include="content\styles\**\*.*" />
  </ItemGroup>
  <Delete Files="@(JsToDelete)" />
  <Delete Files="@(CssToDelete)" />
  <CallTarget Targets="MinimizeWebContent;CopyWebContent" />
</Target>

Take note of the CallTarget task; it calls the two targets mentioned earlier.  We will look at the CopyWebContent target first.  We use CopyWebContent to copy files as-is for debugging and testing purposes.  We need to make sure this target is only executed when our configuration is set to Debug, so we set a condition on the target: Condition=" '$(Configuration)' == 'Debug' ".  The rest of the target is pretty simple: it copies the files from the uncompressed folder to the content folder and adds min to the filename.  Here is the XML snippet for CopyWebContent:

<Target Name="CopyWebContent" Condition=" '$(Configuration)' == 'Debug' ">
  <ItemGroup>
    <Js Include="uncompressed\scripts\*.js" Exclude="uncompressed\scripts\*.min.js" />
    <Css Include="uncompressed\styles\*.css" Exclude="uncompressed\styles\*.min.css" />
  </ItemGroup>
  <Copy SourceFiles="@(Js)" DestinationFiles="@(Js -> '$(MSBuildProjectDirectory)\content\scripts\%(Filename).min.js')" />
  <Copy SourceFiles="@(Css)" DestinationFiles="@(Css -> '$(MSBuildProjectDirectory)\content\styles\%(Filename).min.css')" />
</Target>

Lastly, let’s take a look at MinimizeWebContent, the meat of this article.  Before we proceed we need to add a conditional statement to MinimizeWebContent, because we only want to minimize the files when we compile the project for a release environment.  The main task that’s going to minimize files is the <AjaxMin> task; it takes six arguments: JsSourceFiles, JsSourceExtensionPattern, JsTargetExtension, CssSourceFiles, CssSourceExtensionPattern, and CssTargetExtension.  We will look at the JavaScript related parameters.

First, we create an ItemGroup for JavaScript files named Js; it includes all files under the uncompressed\scripts folder.  The ItemGroup Js is passed to the JsSourceFiles argument.  Next we need to give JsSourceExtensionPattern a value; the parameter accepts a regular expression that identifies the extension portion of the filename.  Lastly, the value assigned to JsTargetExtension replaces the file extension identified by JsSourceExtensionPattern.  The other three parameters are similar to the ones we just discussed but are used for CSS files.  Once the files are processed by AjaxMin, we need to copy them to the content folder and delete the generated minimized files from the uncompressed folder.  The XML snippet that performs all the actions described above is shown here.

<Target Name="MinimizeWebContent" Condition=" '$(Configuration)' == 'Release' ">
  <ItemGroup>
    <Js Include="uncompressed\scripts\*.js" Exclude="uncompressed\scripts\*.min.js" />
    <Css Include="uncompressed\styles\*.css" Exclude="uncompressed\styles\*.min.css" />
  </ItemGroup>
  <AjaxMin JsSourceFiles="@(Js)" JsSourceExtensionPattern="\.js$" JsTargetExtension=".min.js" CssSourceFiles="@(Css)" CssSourceExtensionPattern="\.css$" CssTargetExtension=".min.css" />
  <ItemGroup>
    <JsMin Include="uncompressed\scripts\**\*.min.js" />
    <CssMin Include="uncompressed\styles\**\*.min.css" />
  </ItemGroup>
  <Copy SourceFiles="@(JsMin)" DestinationFolder="content\scripts" />
  <Copy SourceFiles="@(CssMin)" DestinationFolder="content\styles" />
  <Delete Files="@(JsMin)" />
  <Delete Files="@(CssMin)" />
</Target>

Now, after all this work, we can finally see the result of our labor.  Set the configuration of the project to ‘Debug’ and build the project; the total file size for all the files under content/scripts and content/styles is 60,519 bytes.  Next set the configuration to ‘Release’ and rebuild the project.  Check the total file size again and you should see 57,544 bytes.  That’s 2,975 bytes saved, or 4.9%.  Not very impressive, but remember that the majority of these files are already compressed, so we won’t get much savings from them.  To get a better picture of what our savings could be, let’s take a look at the home_index.js file.
Here is the uncompressed version of the file, home_index.js

$(document).ready(ValidateHtml5Functionality);
function ValidateHtml5Functionality() {
    var supportsAll = true;
    var validationSummary = $('#validationSummary');
    var validIcon = '<span class="ss_sprite ss_accept "> &nbsp;  </span>';
    var invalidIcon = '<span class="ss_sprite ss_cancel "> &nbsp;  </span>';
    validationSummary.append('<li>' + (Modernizr.applicationcache ? validIcon : invalidIcon) + 'Application Cache</li>');
    validationSummary.append('<li>' + (Modernizr.canvas ? validIcon : invalidIcon) + 'Canvas</li>');
    validationSummary.append('<li>' + (Modernizr.canvastext ? validIcon : invalidIcon) + 'Canvas Text</li>');
    validationSummary.append('<li>' + (Modernizr.localstorage ? validIcon : invalidIcon) + 'Local Storage</li>');
    validationSummary.append('<li>' + (Modernizr.sessionstorage ? validIcon : invalidIcon) + 'Session Storage</li>');
    validationSummary.append('<li>' + (Modernizr.webgl ? validIcon : invalidIcon) + 'Web GL</li>');
    validationSummary.append('<li>' + (Modernizr.websqldatabase ? validIcon : invalidIcon) + 'Web Sql Database</li>');
    validationSummary.append('<li>' + (Modernizr.websockets ? validIcon : invalidIcon) + 'Web Socket</li>');
    validationSummary.append('<li>' + (Modernizr.webworkers ? validIcon : invalidIcon) + 'Web Workers</li>');
}

And here is the compressed version, home_index.min.js

$(document).ready(ValidateHtml5Functionality);function ValidateHtml5Functionality(){var d=true,a=$("#validationSummary"),c='<span class="ss_sprite ss_accept "> &nbsp; </span>',b='<span class="ss_sprite ss_cancel "> &nbsp; </span>';a.append("<li>"+(Modernizr.applicationcache?c:b)+"Application Cache</li>");a.append("<li>"+(Modernizr.canvas?c:b)+"Canvas</li>");a.append("<li>"+(Modernizr.canvastext?c:b)+"Canvas Text</li>");a.append("<li>"+(Modernizr.localstorage?c:b)+"Local Storage</li>");a.append("<li>"+(Modernizr.sessionstorage?c:b)+"Session Storage</li>");a.append("<li>"+(Modernizr.webgl?c:b)+"Web GL</li>");a.append("<li>"+(Modernizr.websqldatabase?c:b)+"Web Sql Database</li>");a.append("<li>"+(Modernizr.websockets?c:b)+"Web Socket</li>");a.append("<li>"+(Modernizr.webworkers?c:b)+"Web Workers</li>")};

For this file the original size is 1,341 bytes and the compressed version is 812 bytes.  That’s 529 bytes saved, or 39%!  A much better result.

There you go: with this setup we can now develop and debug our JavaScript using normal formatting and whitespace rules, and deploy minimized versions of the JavaScript files in an automated fashion.  With Ajax-enabled websites so prevalent nowadays, this will help you create a more responsive website by reducing the number of bytes the user has to download.  There are a few more things we can do to further reduce the latency of the website.  One is to combine the non-page-specific files into one file to reduce the number of requests to the server; this could be accomplished by using YUI Compressor for the non-page-specific files and AjaxMin for the page-specific files.  I will leave that to you as an exercise.
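As a rough sketch of the combining idea, MSBuild’s built-in ReadLinesFromFile and WriteLinesToFile tasks can concatenate the shared scripts into a single file before minification.  The target name, item names, and file names below are hypothetical, and for production use a dedicated concatenation tool is more robust:

```xml
<Target Name="CombineSharedScripts" Condition=" '$(Configuration)' == 'Release' ">
  <ItemGroup>
    <!-- Hypothetical list of non-page-specific scripts to merge, in load order -->
    <SharedJs Include="uncompressed\scripts\jquery.js;uncompressed\scripts\modernizr.js" />
  </ItemGroup>
  <!-- Batching over SharedJs appends each file's lines to the CombinedLines item -->
  <ReadLinesFromFile File="%(SharedJs.Identity)">
    <Output TaskParameter="Lines" ItemName="CombinedLines" />
  </ReadLinesFromFile>
  <!-- Write the merged script; it can then be fed to AjaxMin like any other file -->
  <WriteLinesToFile File="uncompressed\scripts\shared.combined.js"
                    Lines="@(CombinedLines)" Overwrite="true" />
</Target>
```

This target would need to run before MinimizeWebContent so the combined file gets minimized along with the rest.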

Download the sample project here.