Monday, February 1, 2016

Have the ability to rapidly build and run micro-applications

All life is an experiment. The more experiments you make the better.
- Ralph Waldo Emerson

I'm in the process of developing my own application 'stack' (generic application slices with reusable cross-cutting concerns - e.g. Security, Configuration, and Logging).

I have most of it in place at the moment, but still need to work out how I will secure my APIs.  Presently, I am looking at IdentityServer as a possible solution for this.

This is the sort of solution that I want to be able to design/develop/build/package/release quickly:

* Web API Service Bus
* Typescript Client
* Identity Management Server (this probably only needs to be built and deployed once)

I have lots of little ideas that I want to be able to push out and host as an Azure AppService.  One of them is a Garden Maintenance Planning application, another one is a Habit Kicker service.

The Habit Kicker application lets you create a habit that you want to kick - in my case, I am aiming to have alcohol free days (AFDs) and then record success outcomes.  The system will push out an alert asking for feedback about whether you have achieved your goal for the day.

You can see from the user interface that, even for such a simple little idea, there are many problems to be solved:
  • How will notifications be implemented?
  • How will schedules be implemented?
  • What technologies will be used to develop the user interface?
  • How are users identified?

For these types of micro-applications, it would be desirable to have a consistent, repeatable recipe for build/package/release that would scale across the different application types - e.g. Typescript client, DNX Web application, ASP.NET 4.6.

Being able to create the infrastructure components and deployment pipeline rapidly means that you can focus on delivering value from as early in Sprint 1 as possible.

In terms of infrastructure, my plan is to use Visual Studio Team Services for source code hosting, and for Build/Package/Release.  Using Azure AppServices as my hosting platform provides me with a high level of control and flexibility.  Azure AppServices also have useful platform components such as WebJobs that I can take advantage of.

The next major problems that I want to solve are:

  1. Identity and Access Management - I want a low touch solution that is as decoupled from my Application as possible
  2. Build/Package/Release - recipes for different flavors of app, as mentioned above

Please feel free to leave any tips or comments to let me know how you achieve rapid deployment for your little ideas.


Wednesday, January 27, 2016

Configuring .NET Core projects for optimal local development

When developing an ASP.NET Core application, among other decisions, you need to choose which version of the ASP.NET Core and .NET Core packages to consume. When using .NET Core tooling in its default state, you will likely bump up against the following issues when developing many projects locally:

  1. The default Nuget settings are a global configuration (%AppData%\NuGet\NuGet.config).   How do different developers keep their Nuget package source configurations in sync if they are not in source control?
  2. The default package source folder is a global setting (%userprofile%\.dnx\packages).  How can you run separate projects against different .NET Core package versions without getting version conflicts for packages?

Choosing a .NET Core version for your solution

ASP.NET Core source code is developed on GitHub and then pushed to several different Nuget feeds, based on the stability of the code.  The cadence and feed choices are as follows:

  • aspnetvolatiledev – Any package that compiles is pushed here. Only used by contributors dealing with breaking changes between repos.
  • aspnetcidev – A coherent set of packages that compile against each other. Used by contributors when building under normal circumstances.
  • aspnetvnext – A coherent set of signed packages that have passed automated testing. Used by consumers evaluating the latest developments in the stack.
  • nuget.org – Official releases used by general consumers.

The choice will depend on your appetite for risk/change/stability.  Choosing the official release feed means that your packages will be very stable for a long period (e.g. Beta 5, Beta 6, RC1, RC2, etc.) but you will have a lot of catching up to do when new packages are released, due to the high amount of code and API churn that is happening at the moment.

Choosing the aspnetcidev feed means that you avoid those monolithic refactors but, instead, get hit with regular breaking changes that impact your development productivity on a daily basis.

The aspnetvnext feed sits between the official feed and the aspnetcidev feed and provides a trade-off between daily breaking changes and a monolithic set of changes.

Restoring Packages

.NET Core projects define their dependencies in global.json and project.json files.  Global.json identifies the platform version dependency while project.json specifies individual package dependencies.  After you have configured these files in your project, it is simply a matter of running the dotnet CLI tool to restore packages from the feed source to your local machine.  
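As an illustrative sketch, a minimal project.json dependency section might look like this (the package name and version are examples only):

```json
{
  "dependencies": {
    "Microsoft.AspNet.Mvc": "6.0.0-rc1-final"
  }
}
```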

The dotnet restore command uses Nuget configuration to identify which feeds to use when restoring.  The user profile defaults for Nuget are located at %AppData%\NuGet\NuGet.config.  These defaults are updated when you manage package sources via the Nuget configuration tool in Visual Studio.

To add the vNext feed source to your defaults, open the configuration tool and add an entry for the MyGet feed.

Configuring per-project settings

Developers should have the best F5 experience possible - which means they should be able to check out source code and run it without any friction.  Thankfully we can enable this by configuring settings on a per-project basis.

The dotnet restore command will look for and load a local project Nuget configuration before it loads the global configuration from the user profile.

Similarly, the command will look for a packages setting in the global.json solution configuration to identify where to place restored packages before defaulting to %userprofile%\.dnx\packages.

My configuration

The following files show my personal configuration for local projects, making each project self-describing about its dependencies and helping to reduce developer friction.

The first task is to create a local Nuget.config in the root folder of your solution.

The first line of the Nuget.config clears any package sources that might be configured at another level - e.g. the global user setting. This ensures that the local Nuget.config defines all sources that are relevant for the project and that these are checked in to version control with the rest of the source code.
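A minimal sketch of such a Nuget.config (the feed URLs shown are the publicly documented MyGet and Nuget.org v3 endpoints, and the source names are illustrative):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Clear any sources inherited from the global user configuration -->
    <clear />
    <add key="AspNetVNext" value="https://www.myget.org/F/aspnetvnext/api/v3/index.json" />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>
```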

The next step is to configure a separate location for the packages that are restored for the project. This ensures that the project is not impacted by packages in the global package store that might have come from a feed running at a different cadence to the local project.
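A sketch of a global.json that points restores at a solution-local folder (the folder names are illustrative):

```json
{
  "projects": [ "src", "test" ],
  "packages": "packages"
}
```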

At this point, running the dotnet restore command will restore all dependency packages from the vNext feed into a local packages folder in the root folder of the solution.

The final step is to configure Git so that the packages folder is excluded from version control.  This is simply a matter of adding a line to the local .gitignore file for the solution.
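Assuming the solution-local restore folder is named packages, the ignore entry is a single line:

```
packages/
```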

Tuesday, January 12, 2016

Bootstrap tasks for new Typescript web projects

Earlier this year, after attending a Microsoft conference, I blogged about how the momentum of the developer workflow is moving rapidly towards running tasks from the command line.  Since then I have continued to develop skills and knowledge in this area.

This post is a walk-through of my current workflow for bootstrapping new Typescript web projects for development using VS Code.  The high level tasks I execute are:

  1. Create a root folder for the application
  2. Acquire development tools using NPM
  3. Acquire software framework dependencies using Bower
  4. Create the VS Code project definition 
  5. Start the application and launch it in a browser
  6. Add the project assets to source control

Step 1. Create the Project Root Folder

A straightforward step.  You could open Windows Explorer and browse to your root working location and create a new folder.  However, as we are going to be working inside of the command shell, we can avoid the friction of opening Explorer by running DOS commands.

NOTE: Since moving to the command line for my workflow, I have leaned more and more on ConEmu as my tool of choice for running command line tasks as it is only ever a keystroke away.

I launch ConEmu (CTRL+~) and type:

> cd \code
> md myproject
> cd myproject

That gets me a new project folder named myproject in my development working folder and places the location of my command prompt in the new folder.

Step 2. Acquire Development Tools

For the purpose of my bootstrapping I grab the following development tools:
  • Typescript: The tsc compiler will compile our Typescript to Javascript
  • Bower: package manager for managing client side dependencies such as Bootstrap and Angular
  • Browser Sync: Use to serve the app and to provide live updates in the browser during development

Command line tasks for installing development tools using npm:
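A sketch of the equivalent commands, installing each tool locally as a dev dependency (package names as published on npm):

```
> npm init
> npm install typescript --save-dev
> npm install bower --save-dev
> npm install browser-sync --save-dev
```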

Running these commands creates a package.json file that contains the node configuration information.  It also creates a node_modules folder where the downloaded packages are stored.

NOTE: Later the node_modules folder is excluded from source control as the packages can be pulled down on demand using the npm install command - typically either on the build server or on another developer machine.

Step 3. Acquire Software Dependency Packages

My standard software frameworks are Bootstrap and Angular so I install them as part of the project setup.  As with npm, the first step is to initialize the folder for bower and then run commands to pull down and install the packages:

bower init
bower install angular --save
bower install bootstrap --save

Step 4. Create the VS Code Project

For this step, launch VS Code from the current folder using the following command:

code .

Note: This assumes that you have VS Code installed on your machine and that it is configured on your PATH variable.

In the root of the project, add a file named app.cmd and add the following command:

browser-sync start --server --port 3001 --index default.html --files="./*"

This command launches the app using a web server.  Browser Sync is a node package that was installed in the tooling step.  It watches files for changes and then refreshes the browser to show the updates.

Step 5. Start the Application

Update the package.json file by configuring the start command to launch the website using the command that was just created.
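A sketch of the relevant fragment of package.json (other fields omitted), wiring npm start to the app.cmd script:

```json
{
  "scripts": {
    "start": "app.cmd"
  }
}
```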

With this piece of config in place, the site can now be launched in a browser from the command line using either of the following commands:

# launch using npm
> npm start

# launch using app.cmd
> app

NOTE: All of these tasks can be automated using VS Code's task runner but I am not yet as familiar with that task runner as I am with the approach shown in this article. 

Typescript projects require a tsconfig file that defines compiler settings and identifies the Typescript files to be compiled.  Create a file named tsconfig.json in the root of the folder and add the following configuration information.
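A minimal sketch of such a tsconfig.json, where app.ts is a placeholder entry file and the js folder is where compiled Javascript will land:

```json
{
  "compilerOptions": {
    "target": "es5",
    "outDir": "js",
    "sourceMap": true
  },
  "files": [
    "app.ts"
  ]
}
```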

As the project is developed, files get added to the files element and further compilation options added as necessary.

The project will need a suitable default html file and this is the basic template that I have been using.
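A minimal stand-in for that template (the script and stylesheet paths assume the default Bower component folder and the js compilation output folder):

```html
<!DOCTYPE html>
<html lang="en" ng-app="app">
<head>
    <meta charset="utf-8" />
    <title>My Project</title>
    <link rel="stylesheet" href="bower_components/bootstrap/dist/css/bootstrap.min.css" />
</head>
<body>
    <h1>Hello, world</h1>
    <script src="bower_components/angular/angular.min.js"></script>
    <script src="js/app.js"></script>
</body>
</html>
```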

The last task for setting up our VS Code project is to add a build task.  This allows us to press CTRL+SHIFT+B to compile the project.  To create the initial tasks file, press CTRL+SHIFT+B and VS Code will prompt to create the file:

After creating the task runner file, overwrite the default content with the following task definition to compile Typescript assets:
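A sketch of such a task definition, using the version 0.1.0 tasks.json schema that VS Code shipped at the time (the exact schema changed across versions):

```json
{
    "version": "0.1.0",
    "command": "tsc",
    "isShellCommand": true,
    "args": ["-p", "."],
    "showOutput": "silent",
    "problemMatcher": "$tsc"
}
```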

At this point you should be able to press CTRL+SHIFT+B and see that the project builds - later when you have Typescript files, you will see that they get built into the js folder which is what we configured in the compilation options earlier.

You should also be able to run npm start and see the default page load up in a browser.

If that has worked so far... well done!

Step 6.  Add assets to source control

With all of that hard work done, the last thing you want is to lose the content that has been created.  The final step is to configure Git and commit the project to source control.  To start with, create a Git ignore file to exclude Typescript-generated content and packages.

The file should be named .gitignore and contain the following definition.
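A sketch of that definition (folder names assume the default npm and Bower locations and the js compilation output folder used in this setup):

```
node_modules/
bower_components/
js/
```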

NOTE: Read Scott Hanselman's article to learn how to create files that start with a dot in Windows.

With the .gitignore configuration in place, all that is left is to run Git commands to initialise the repository and commit the assets to version control:

> git init
> git add .
> git commit -m "Initial Commit"

Testing Gist embeds in Blogger

Although I've had my blog here for a number of years now, it's largely been inactive.  Now that I've started to stir again, it feels as though there is some unnecessary friction in writing posts simply because I don't enjoy the Blogger platform.

The main gripe I have is the lack of control you have over the HTML that gets generated - it's akin to the old SharePoint platform and what it did to content.

This post is a test post to see how embedded Gists appear in Blogger.

Monday, November 23, 2015

Using Powershell to work with Json

Today I decided to look into working with Json in Powershell.  Json is rapidly overtaking Xml as the preferred format for describing projects and build artifacts, so it makes sense to learn how to integrate it with tools such as AppVeyor scripts, Visual Studio Team Services Build Tasks or Octopus deployment steps.

A quick search online led me to discover the following two Powershell cmdlets that can be used when working with Json:

• ConvertFrom-Json
• ConvertTo-Json

Using cmder, I created a new Powershell tab and started typing:

> cd \temp
> md jsontests
> new-item "testjson.js"
> notepad "testjson.js"

I then added the following content to the file:

    {
        "Name": "Darren Neimke",
        "Age": "42",
        "Gender": "Male"
    }

Flicking back to the console, I typed the following Powershell command to confirm that I could read the content:

Get-Content "testjson.js"

Piping the raw content to ConvertFrom-Json produced the following:

To expand my use of Powershell, I opened the Powershell ISE and created the following script:

$path = ".\testjson.js"
$raw = Get-Content $path -Raw

$obj = ConvertFrom-Json $raw
$obj.Age = 45     # I always lie about my age!

Write-Host $obj   # Dump obj to console

# Convert back to Json before saving - otherwise Set-Content
# writes the object's string representation rather than Json
ConvertTo-Json $obj | Set-Content $path

The ISE amazed me by inferring the schema of the $obj instance and providing me with Intellisense after that!

Running that script updated the value of the Age property and saved it back to the file.

Things I Learned:

  • Using ISE to create a Powershell script
  • How to pass the content of a file to another cmdlet using piping and variables
  • Updating Json content using variables
  • Saving a file


EntityFramework and the challenge of Entity Serialization

Let's take the following couple of entities:

public class Parent
{
    public int Id { get; set; }

    public string Name { get; set; }

    public List<Child> Children { get; set; }
}

public class Child
{
    public int Id { get; set; }

    public string Name { get; set; }

    public int ParentId { get; set; }

    public Parent Parent { get; set; }
}

And pass them through an Entity Framework query that looks like this:

var parent = db.Parent
                 .Include(par => par.Children)
                 .Where(par => par.Name == "Somename")
                 .FirstOrDefault();

It's interesting to see that we can then write the following code to query the result:

var result = parent.Children[0];

Here, the result variable refers to a Child, which has a Parent reference, which has a collection of Children, which each have a Parent ... oh never-mind, I'm sure you see where this ends!

When building a Web API application, we might think of exposing this type of query through a Controller action.  In such a case, how should the serializer deal with the cascading references?

One solution is to use a serialization setting such as the ReferenceLoopHandling option found in the Json.Net library to ignore circular references.  This switch tells the serializer to exclude reference properties after they have been visited the first time.
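As a sketch, assuming classic ASP.NET Web API with Json.Net as the default formatter, the switch can be applied globally during configuration (config here is the HttpConfiguration instance):

```csharp
using Newtonsoft.Json;

// Tell Json.Net to skip properties that would otherwise
// cause a circular reference loop during serialization
config.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling =
    ReferenceLoopHandling.Ignore;
```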

Another solution is to shape the data to return specific fields from the service operation.

var parentView = new
{
    ParentId = parent.Id,
    ParentName = parent.Name,
    ChildCount = parent.Children.Count,
    Children = parent.Children.Select(c =>
        new
        {
            Id = c.Id,
            Name = c.Name
        })
};
This approach helps to control the shape of the data and to have greater certainty over what is being returned.

Taking this one step further, we would create custom Data Contract classes and return those instead of the loosey-goosey approach of returning anonymous types.

var parentDataContract = db.Parent.Include(par => par.Children)
                            .Select(par =>
                                new ParentView
                                {
                                    Id = par.Id,
                                    Name = par.Name,
                                    Children = par.Children.Select(c =>
                                        new ParentView.ChildView
                                        {
                                            Id = c.Id,
                                            Name = c.Name
                                        })
                                });

This approach gives us better static checks across the application, allows for reuse of Data Contracts across separate operations, and allows us to see where different contracts are being used.  From a versioning and maintenance point of view, this would be the gold standard.

What is your approach to designing service endpoints?   Do you mix RESTful with RPC-style design all in the same Controllers or do you separate them out into their own classes of service?

Saturday, November 21, 2015

Developing from the Command Line

As I have mentioned, the developer workflow has changed quite a bit.  In case you haven't heard or kept up, it looks something like this:

  • From the command line, use Yeoman to generate a new project  > yo webapp
  • From the command line, initialize the folder as a new Git repository  > git init
  • From the command line, open the new project using an Editor of your choice  > code .

As you can see, much more is being done from the command line.  New tools such as cmder are being used to gain quick access to command windows for Powershell/Node/etc to assist and speed up this flow.  Cmder is great because it has transparency, allows you to have multiple tabs, and is easily summoned and hidden away using CTRL+`.

For my task today I decided to initialize a Git repository, add a file, make changes to the file, and commit those files to a Git branch all from the command line.  I wanted to use Powershell to create the project folder and the initial file so that I could tick my "One thing per day" goal of using PS for something at least once per day!

I found the New-Item (alias: ni) cmdlet which allows you to create a variety of item types.  To create a new folder, give it the -ItemType of 'directory' and then the name of the folder that you wish to create.  E.g.

> New-Item -ItemType directory myNewDirectory

I then went ahead and used Git to initialize a repo in the new folder:

> git init

New-Item can also be used to create files, just give it the name of the file that you want to create:

> ni "file1.txt"
> notepad "file1.txt"

This adds a new file named file1.txt and opens it in Notepad.

> git add .
> git commit -a -m "Adding file1.txt"

This will commit the changes to your Git repo.  It's easy to visualize what's happening in Git Extensions:

The folder can be opened in VS Code using the code command followed by a dot "." to open the folder that you are currently in:

> code .

After playing around with Git for a while, I wanted to delete my test folder, so I typed Remove- and pressed CTRL+SPACE to find out if Powershell had a Remove-Item command.

And sure enough it did.  So I finished with the following PS command to blow away my test folder:

> rm "\testdir" -Recurse -Force

What I Learned:

  • When using cmder, I can start typing the name of a command and then use CTRL+SPACE to find all matching cmdlets