
Glyn Darkin Posts

ASP.NET vNext – Packing and Publishing

These are my notes from the ASP.NET vNext Community Standup

Packing and Publishing

Packing and Publishing is the process of taking the source and static files as they are represented in my source tree and staging them into a folder structure ready to xcopy onto a server.

The underlying process uses the KPM command line tool, which has a number of switches to control the output.

There are two levels of publishing:

  1. Move the files and folders and pull in the nuget packages ready for deployment
  2. Same as above, but compile all the *.cs files into a dll

The KRE runtime can now run code without it having to be compiled first.  This enables deployment to be a simple xcopy of the source files. If you want to compile all the source files up front then rather than compiling them to a traditional bin folder they will get compiled into a nuget package and get deployed into the packages folder. This is an important point as it means that all executable C# code is modular and managed at the package level.

Your starting folder structure will look similar to the tree below; this is the structure that would get checked in to your source control provider.

  • $/
    • src
      • MyWebApp
        • project.json
        • files.cs
        • wwwroot
          • static files

All external nuget packages are referenced under the dependencies node of your project.json file; however, the packages themselves are pulled down from the remote repository and stored in a local cache in your user profile. When your application runs, the KRE runtime will probe for the packages and use the cached versions.
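As a sketch of what that looks like (the package names and versions here are illustrative, not from the standup), a project.json might declare its dependencies like this:

```json
{
    "dependencies": {
        "Microsoft.AspNet.Mvc": "6.0.0-beta1",
        "Newtonsoft.Json": "6.0.6"
    }
}
```

When the application starts, the KRE resolves each entry against the package cache in your user profile rather than a bin folder.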

Packing the solution using kpm and the pack command will generate an xcopy-able deployable:

kpm pack --out <output file path>

Output directory structure

  • output file path
    •  wwwroot
      • web.config
      • static files
    • approot
      • packages
        • nuget packages
      • src
        • project.json
        • x.cs

This separation of source code and static files means that code can never be served.

IIS or Kestrel anchors to the wwwroot folder.

If you decide to compile your application when you package it, the output folder structure will look similar to the below:

  • output file path
    •  wwwroot
      • web.config
      • static files
    • approot
      • packages
        • nuget packages
        • MyAppPackage

MyAppPackage represents your source code compiled to a nuget package; the pack process also pulls all cached versions of the dependent nuget packages so that they can be xcopy deployed.

Using wildcards in version numbers enables automatic updates to dependencies without having to update each version reference by hand.
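For example (the package name and version are illustrative), a wildcard in the project.json dependencies node lets the runtime pick up the newest matching build:

```json
{
    "dependencies": {
        "Newtonsoft.Json": "6.0.*"
    }
}
```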

KRE is the new runtime; it can be included in the packages folder if the CLR is not already installed on the target machine.

The CLR can either be deployed onto a server and shared, or deployed locally within the packages folder, in which case the local copy takes precedence.

F5 != kpm pack

The Visual Studio publish UI uses this underlying command.




ASP.NET vNext – Entity Framework 7 Design (Notes)

My notes from the ASP.NET Community Standup 4

The main goal of ASP.NET vNext is to shrink down the memory footprint of a request.

Data access is a first-class tenet of web applications, therefore Entity Framework is becoming a first-class tenet of ASP.NET.

EF is becoming portable and is trying to provide a single consistent programming model for all data access, whether on the server, desktop, tablet or mobile.

EF 7 is also a complete rewrite. To make EF lighter the underlying EDMX has been removed, making it more modular and lightweight. Little code is being moved forward; the focus is more on the APIs, designs and conventions that people have adopted.

EF 7 will be a PCL Nuget package, with the objective that it can run on Mono, Xamarin etc.

Providers will include Redis and Azure Table Storage.

No Model First, no designer. Only Code First using annotations and the fluent API; however, it will still include reverse engineering to generate code from an existing database.

EF7 is not recommended for a complex data model; its focus is on providing a common abstraction over simple data models.

Indexes will now become a first class citizen of the mapping infrastructure so that they can be managed via code first. Providers can use this metadata to generate their own interpretation of an index based on their store.
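As a sketch of what index mapping might look like in code (the EF7 pre-release API was still in flux, so the exact names here, e.g. ModelBuilder and Index, are assumptions based on early builds, and the entity type is invented):

```csharp
// Sketch only: EF7 pre-release method names are assumptions,
// not a stable reference.
public class Product
{
    public int Id { get; set; }
    public string Sku { get; set; }
}

public class StoreContext : DbContext
{
    public DbSet<Product> Products { get; set; }

    protected override void OnModelCreating(ModelBuilder builder)
    {
        // The index is declared as mapping metadata; each provider can
        // translate it into its own notion of an index for its store.
        builder.Entity<Product>()
               .Index(p => p.Sku);
    }
}
```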

Providers are created for each type of data store. There are some helpers for provider writers working with Linq, and there are capabilities for evaluating Linq expressions in memory when the underlying data store does not support Linq.


Shipping Providers

  • MS SQL Server
  • SQLite
  • Azure Table Storage
  • Redis Provider

Heavy focus on performance and memory management, including batching, change tracking and the output SQL generation.

If you want the EDMX and designer you should use EF6 which will be fully supported for the foreseeable future.


ASP.NET vNext – Razor, Directives and Tag Helpers (Notes)

These are my notes from the ASP.NET vNext Community Standup 3


Razor Views now support async processing out of the box.

This is particularly important for ASP.NET Web Pages or when using Razor outside the normal MVC lifecycle.

Razor now supports incremental flushing, so that HTML can be streamed to the browser. This is enabled using a new "@await FlushAsync()" razor command that can be called within the page.
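A minimal sketch of what that looks like in a view (the layout markup is illustrative):

```cshtml
<html>
<body>
    <header>@* cheap-to-render header markup *@</header>

    @* Everything rendered so far is flushed to the browser now, so *@
    @* the user sees content while the rest is still being rendered. *@
    @await FlushAsync()

    <main>@* expensive content rendered after the flush *@</main>
</body>
</html>
```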


Dependency injection is now plumbed into Razor.


@inject IFoo Foo

Components can now be injected via IoC into a View
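A small sketch of using an injected component in markup (the IFoo service and its Message property are hypothetical placeholders):

```cshtml
@inject IFoo Foo    @* resolved from the IoC container for this view *@

<p>@Foo.Message</p>
```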


The default helpers have been removed from the base class and are now injected using the same injection mechanism. This means that you can replace or extend them with your own helpers.


Use _ViewStart pages

_ViewStart now supports directives, so we can use the @inject directive to set global properties; _ViewStart files can be cascaded and nested.

Pre-compilation is still around and will be turned on by default.


Available directives

@inject IFoo Foo – injects a component into the view

@using My.Company.Namespace – imports a namespace

@inherits – points to the base class

@model – sets the model type used across the view



  • The model is a C# class
  • Validation is based on attributes on the model
  • A Razor template can generate an AngularJS template, which client-side AngularJS then processes


Tag Helpers

Provides a solution for better HTML manipulation, similar to how AngularJS manipulates HTML.


<input for="FirstName" class="form-field" />

<tag-cloud count="2" />

This extension improves the authoring of Razor files and provides better IntelliSense.

This does not enable a server-side DOM. You can access the view's output string, but you would have to manipulate it yourself; this is not recommended, as it limits buffering and thus the speed of the page.

The intent of the feature is not to become a server-side DOM but to improve HTML helpers with an HTML-style syntax.




ASP.NET vNext – Webroot, static content and client side serving (Notes)

Notes from the ASP.NET vNext community stand up.

Web Root, Static Content and client side serving

A new project sub-folder that is the root folder your webserver hosts, e.g. IIS Express would now point to the WebRoot folder instead of the project root. Within this folder you place all the static content that gets served by the webserver. The purpose of this folder is to provide better security and better support for a front-end build process; therefore this folder is where the output of any front-end build process using Gulp, Grunt or Bower would go. It can be thought of as the bin folder for static files.

If you have moved to using bower to manage your front-end assets, you would have a task runner copy your assets from bower_components into the WebRoot.
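A hypothetical gulpfile sketch of such a copy task (the task name and paths are illustrative, not from the standup):

```javascript
var gulp = require('gulp');

// Copy the assets bower pulled down into the WebRoot, the only
// folder the web server actually serves static files from.
gulp.task('copy-assets', function () {
    return gulp.src('bower_components/**/*.{js,css}')
               .pipe(gulp.dest('wwwroot/lib'));
});
```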

WebRoot would not be checked in to your source control, as it is built as part of the build process.

This is a very common pattern for Ruby & NodeJS developers.

Build orchestration on Windows within Visual Studio will use a new project file type, appname.kproj, which is an MSBuild file for Visual Studio; however, MSBuild is not used for compilation, as compilation is done at run time.


The publishing process within Visual Studio, to compile and build a package for publishing, uses the KPack solution.


  • Build out a front end build process using Grunt, Gulp and Bower.
  • Move minified/transformed content files from bower_components to a separate content folder



ASP.NET vNext, How & why? (Notes)


What is ASP.NET vNext?

  • New flexible and cross-platform runtime
    • Run it on Windows
    • Run it on Mac
      • Using Mono
    • Run it on Linux
      • Using Mono
  • Modular HTTP request pipeline
    • No more System.Web
    • Baremetal performance
    • Supports OWIN
  • Built cloud ready
    • Fully self contained, not coupled to .NET framework
    • Slimmed down to a CoreCLR
    • Use only the components of the base class library that you need, via Nuget
    • Side by side deploy
    • No GAC or binding policies
  • Friendly frameworks
    • All references pulled in using Nuget at runtime
  • Agile development with the tools of your choice
    • Visual Studio 14
      • New project system that removes the "build" step between code & F5
    • Command line – Full Command line support
    • 3rd party editors – Sublime support
  • Open source on GitHub

In an effort to create an experience as close to Ruby & NodeJS as possible, ASP.NET vNext will compile at run time unless you choose to pre-compile.

Inversion of Control

Your apps will be fully componentised and provisioned from the ground up using the built-in "inversion of control" (IoC) container. Providers are not configured in Web.config any more; they are resolved through the IoC container, and you can rip out the provided one and use your own.
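As a sketch of what registration looks like (assuming the IServiceCollection shape from the early vNext betas; IFoo and Foo are placeholder types):

```csharp
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Register your own components with the built-in container.
        services.AddTransient<IFoo, Foo>();

        // Alternatively, plug in your own container (Autofac,
        // Castle Windsor) to replace the built-in one entirely.
    }
}
```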

Take away: if you are already using an IoC container (Castle Windsor, Autofac) you will be in a good position here; if not, the recommendation is to start using one.

Bower & Grunt

Although many people have been publishing front-end frameworks using Nuget, e.g. AngularJS, jQuery etc., Nuget is not great at managing content files. Rather than extend Nuget, the team have taken the decision to implement first-class support for Bower & Grunt.

Take Away: This functionality has already been packaged into a number of Visual Studio extensions so that you can start adopting this work flow with Visual Studio 2013, alternatively you can use the Bower Nuget package.

Using Environment Variables for identifying configuration

Rather than compiling and packaging the application for each environment and running the associated web.config transforms, ASP.NET will now use a preconfigured environment variable to determine which environment it is running in and make the associated configuration changes. We can already do this today by creating our own environment variables and querying them at run time.
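A sketch of doing this by hand today (the variable name ASPNET_ENV was the one used by early vNext builds; treat it as an assumption):

```csharp
// Fall back to Development when the variable is not set.
var environment = Environment.GetEnvironmentVariable("ASPNET_ENV")
                  ?? "Development";

if (environment == "Production")
{
    // load production connection strings / settings here
}
```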

No web.config – it is replaced with your own providers, e.g. a JSON file or environment variables, and IoC for provider configuration.

Deep integration with Roslyn & Nuget

Real-time compilation and debugging enable whole new code-debug-run scenarios, especially when it comes to 3rd party references. Examples of cloning via git, referencing and compiling were given in talks; this represents a much faster and more open developer cycle than the more common use of published, compiled binaries.

Take away: keep on track with understanding Roslyn; no need for a deep understanding, but keep playing with it.

Harmonised frameworks

Web Pages, MVC & WebAPI have all been harmonised together

MVC + WebAPI + Web Pages = ASP.NET MVC 6

Web Pages will now be built on the same basis as MVC & WebAPI, and Web Pages will have a clear upgrade path to MVC.

Moving forward – Compatibility

Web Forms, MVC 5, Web API 2, Web Pages 3, SignalR 2, EF 6

Fully supported on .NET vNext (NOT ASP.NET vNext)

MVC, Web API, Web Pages 6, SignalR 3

Run on new runtime and request pipeline only (no System.Web.dll)

To migrate an MVC app, you will have to create a new project and migrate!!!!

Cloud Optimised RunTime

This is a very slimmed down version of the CLR – it could take a while!!!

API portability scanner to find out if your code will port


RTW Release Q2 2015



Principles for building a Successful Continuous Integration System

Below is a list of the core principles that have defined the CI solutions that I have built over the last few years.

Use the lowest common denominator in technology for scripting

This has meant that the main workflows have been written in PowerShell rather than MSBuild or Nant, as it is more accessible, a terser language, can use the .NET Framework directly, and is debuggable.

I have seen too many systems that need loads of dependencies installed before you can run a build, e.g. using Ruby as a build language.

Everything should be in source control

As a developer I want to install VS & SQL, get latest, build and go.

That means no “installed” dependencies, they all need to be mapped into the source tree or pulled in from Nuget

No build masters

The build should be something that everybody understands, can fix and improve

Team leads should be the custodians of a project build, as they are the people that should have the big picture view of a project.

You broke it you fix it

Exactly what it says on the box. It does not matter if it's a code, unit test or build script failure. If you changed something that means the build has stopped working, then you fix it.

Same build for local as remote

A lot of build systems hold the configuration on the server, I believe this to be wrong.

I want to be able to run the same build that's on the server on my local machine before I commit.
Everything the build server does should be available to me locally.

Dashboards & emails are great but a flashing light is better

The last system I built had a flashing tractor light and a USB Santa drumming Christmas tunes when the build was broken, makes things a little more fun!!!

The below books really shaped my attitude to CI and contain invaluable information.


Designing a Target Architecture for a large scale website built on Amazon Webservices

I am currently helping out a friend of mine with their startup. They wanted me to design an architecture and technology roadmap for their platform so that they can look to get funding to move things forward. They are already hosting their minimum viable product on Amazon Web services so that seemed like the obvious place to start.

To build a tech roadmap you need to start with the target architecture. I used the Amazon Visio Stencils to draw up what I consider to be a pretty simple but scalable architecture for the site. The site is predominantly a read-only site, so there is not a large requirement for a scalable async workflow; as a result, scaling will be provided by adding more web & search nodes and caching, caching, caching.

There is a requirement to process a number of feeds; these would be handled using Elastic Map Reduce workflows rather than traditional ETLs, pushing the results out to Amazon DynamoDB for fast, scalable read access.

I have also added Solr as the search platform; however, this could easily be replaced with ElasticSearch or even the new Amazon CloudSearch (still in beta).

I then plugged all the details into the Amazon calculator to get a monthly run cost of $3617.73.

I find that cloud architectures tend to be quite prescriptive and thus the above architecture could be considered to be rather generic for a large scale webapp.

I would be really interested to hear what people think of it and what could be improved.


A/B Testing – Should you use it and if so, when?

Imagine making a change to your site based on a gut feeling and then moving on to the next thing without actually being able to prove categorically that the change had a positive influence on your business. Perhaps you don’t have to imagine that – you do it every day! So imagine being able to make a change and actually measure whether it has made a positive or negative impact on your KPIs (Acquisition, Activation, Retention, Referral, Revenue). All the big companies (Google, Microsoft, Facebook etc.) are using A/B testing to enable them to measure each change. But when do you use it?

A/B testing presumes that you’ve already bottomed out your problem solution and are now working on scaling, or product market fit. So in essence you know what your product is and now you need to optimise it. Those optimisations could be big or small and we would advocate for being bold in trying out big changes to get some strong measurements back out to learn from. But here’s a presentation that gives examples of what you can do with A/B testing and why a test someone else has run should not necessarily be used to determine what you do.

[gigya src=”” allowfullscreen=”true” allowscriptaccess=”always” width=”500″ bgcolor=”#ffffff” flashvars=”prezi_id=huzarjynislw&lock_to_path=0&color=ffffff&autoplay=no&autohide_ctrls=0″ ]
