Monday, 16 December 2013

Chemistry and WPF

I remember many years ago turning to a spoof advert in VIZ Comic for 'Over Your Head Computer Magazine'. It was a brilliant, if coarse, pastiche of your typical home computer mag, except that it set the tone with a front page headline screaming 'IT'S TOO LATE! YOU CAN NEVER HOPE TO CATCH UP!'

I laughed at it at the time, not realising that I'd be laughing at myself a decade or so later. There comes a point in every IT professional's career when they realise that they've fallen behind and have to risk a heart attack running to catch up. So it was with me and Windows Presentation Foundation (WPF).

I'd avoided learning WPF as I spent most of my time mired in the archaic but easy world of Windows Forms (WinForms). WinForms' main weaknesses are also its strengths: you are limited as to what you can do with the user interface elements, so you spend most of your time writing business logic code. A Button will support a handful of properties that you aren't inclined to alter on the whole: you just want to set the caption, put it in a convenient location and wire up a Click event to it so it does something useful.

With WPF it seems like you have to learn a shedload of new concepts before you can even think of doing this kind of coding. Dependency properties, content controls, routed events, styles, themes, templates...all these present hurdles that must be got over before coding can begin in earnest. I picked up Adam Nathan's excellent book on the subject with the intention of working through it, and several years later I'm still working through it.

So, I decided that the only way to teach myself WPF was to actually start working on a project that would benefit from it. I started out my career as a chemist but drifted into programming almost accidentally. I try to bring the two together by heading up the Chemistry for Word project, aided and abetted by a team of thoroughly capable and enthusiastic volunteers. Currently we are integrating the fine ChemDoodle editor to edit chemical structures, and we'll be using it for the foreseeable future, but I just wanted to see whether or not it would be possible to do the same thing in WPF. 

So, why use WPF for editing chemical structures? Well, it has several appealing advantages over other ways of doing this:
  • It uses DirectX and is very quick;
  • It uses high-level concepts - Shapes - to describe objects on screen. Shapes provide their own event handling and styling, so you can manipulate them directly in XAML;
  • You can use your existing C# knowledge to engineer the actual code itself.
Moreover, it makes difficult things a lot easier, even if it is itself difficult to learn. Take, for instance, the drawing of chemical bonds. Chemists use a certain convention, known as the hash-wedge convention, to convey stereochemistry. Here's 1-phenylethanol drawn using this convention:

Drawing the hash bond to the hydrogen atom using conventional Windows GDI calls is a pain in the nether regions, as you have to draw lines perpendicular to the direction of the bond that gradually taper out. But with WPF it's much simpler. You define a Shape object that describes the overall outline of the bond, a Brush object that describes the hash pattern, and then apply the Brush to the Shape, rotating the pattern with a transform on the brush. You get the angle of rotation using a custom ValueConverter object that takes the bond object and spits out the angle at which it's drawn, which you can then bind to the brush's transform using a XAML Style object.

The issue with this approach is the same issue as with every other WPF approach to solving a problem.  It's not that it's difficult - it isn't.  It's that it's difficult to discover.  WPF has gained an unenviable - and justified - reputation for size and complexity and it's hard to pin down the most effective way of performing a task. 

So, I think it's worth my sharing my experiences with WPF as I build this chemical editor control.  I will publish various snippets of code as I deal with interesting problems I encounter during the development process.  I hope that in so doing I'll go some way to demystifying WPF for the uninitiated. 

One final point:  The new Windows Runtime is a separate subsystem altogether from WPF, and there has been a lot of talk of WPF itself being deprecated.  I don't believe that this is currently in the offing as many business users won't be using applications written to run on the new Metro interface.  Besides which, both systems make extensive use of XAML so learning this will stand you in good stead for the future.

Thursday, 12 September 2013

Windows PowerShell for Developers #6: the Package Manager Console

I've been playing about with the Package Manager Console (PMC) that comes with NuGet. This is quite a handy tool, as it exposes the root design-time environment object to PowerShell commands as $dte. I'm currently adapting the substantial script I wrote to prepare a solution for NuGet packaging so it runs directly from this console. One hope is that I can cut the code down by manipulating the solution and projects directly instead of hacking the underlying files.

This has proved to be more difficult than I thought. $dte doesn't support all the interfaces you'd expect. You have to obtain the Solution2 COM interface before you can manipulate the solution directly. This, for example, is how you add a solution folder and then add a file to it:

$vsSolution = Get-Interface $dte.Solution ([EnvDTE80.Solution2])
$vsProject = $vsSolution.AddSolutionFolder("newFolder")
$projectItems = Get-Interface $vsProject.ProjectItems ([EnvDTE.ProjectItems])
$projectItems.AddFromFile("greenery.txt")

It's still nicer than having to parse the solution file all the same. More on this as the project develops.

Tuesday, 3 September 2013

Windows PowerShell for Developers #5: Instrumenting your Projects for NuGet

Just before I went on my yearly summer holiday, I talked about NuGet-related issues and how PowerShell can help to resolve some of these painlessly. Latterly, I covered the housekeeping that comes with creating NuGet packages from your project. I also covered how to run packaging scripts automatically from Visual Studio as a post-build step.

Prepping - or 'instrumenting' - your projects for NuGet can save you a lot of aggro in the long term but can be an almighty pain to do. There are several areas you have to cover for each project you want to automatically package:
  1. Creating a nupack.ps1 file at the solution level which will package a given project
  2. Creating the post-build scripts which invoke the script above
  3. Managing dependencies between packages.
The first step is fairly rudimentary and I've already shown how to do it. Step 2 is a drudge, but is merely a case of cutting and pasting the relevant scripts. Step 3, on the other hand, is one of those jobs given to programmers who have been really evil and have been sent to the Ninth Circle of Hell for their sins. Imagine a complex solution with a lot of dependencies between .NET projects, all of which you want to map to package inter-dependencies. The job can very quickly get out of hand, and you can very easily make mistakes, as I did.

So, being a resourceful (i.e. lazy) type, I decided that it was about time to bring PowerShell to my rescue yet again.  I wanted a script that would do the following to a complete multi-project Visual Studio solution:
  1. Create the nupack.ps1 file at the solution level
  2. Amend each project to call this file after the build has completed
  3. Run nuget spec against every single project in the solution to create the packaging specification
  4. Interrogate each project file for dependencies and map these to package inter-dependencies.
Steps 3 and 4 can be made a bit simpler if we consistently name our packages after their originating projects. If we open up a .nuspec file for a project (generated by nuget spec) in an XML editor, we see something like this:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>$id$</id>
    <version>$version$</version>
    <title>...</title>
    <authors>...</authors>
    <owners>...</owners>
    <licenseUrl>...</licenseUrl>
    <projectUrl>...</projectUrl>
    <iconUrl>...</iconUrl>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>to be supplied</description>
    <releaseNotes>Summary of changes made in this release of the package.</releaseNotes>
    <copyright>...</copyright>
    <tags>Tag1 Tag2</tags>
    <dependencies>
      <dependency id="..." version="..." />
      <dependency id="..." version="..." />
    </dependencies>
  </metadata>
</package>

The bits in ellipses are where you would normally insert information about your package, but the <dependencies> tag allows you to specify the packages which must also be installed. Our job is to make sure that this file is properly constituted. If, while we're at it, we can get the other tags properly populated as well then that would be a bonus. So, let's write a PowerShell script to do just that!
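Step 4 - mapping project references to package dependencies - can be sketched in isolation. Both XML fragments below are minimal stand-ins I've invented for illustration; the real script works on full project files:

```powershell
# Map <ProjectReference> names in a project file to <dependency>
# entries in a .nuspec. Both documents are cut-down examples.
$ns = @{e='http://schemas.microsoft.com/developer/msbuild/2003'}
[xml]$projXml = @'
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <ProjectReference Include="..\CoreLib\CoreLib.vbproj">
      <Name>CoreLib</Name>
    </ProjectReference>
  </ItemGroup>
</Project>
'@
[xml]$nuspec = '<package><metadata><id>$id$</id></metadata></package>'
$deps = $nuspec.CreateElement('dependencies')
foreach ($refName in (Select-Xml -Xml $projXml -Namespace $ns -XPath '//e:ProjectReference/e:Name')) {
    $dep = $nuspec.CreateElement('dependency')
    $dep.SetAttribute('id', $refName.Node.InnerText)
    $dep.SetAttribute('version', '1.0.0.0')   #pinned version, as in the main script
    [void]$deps.AppendChild($dep)
}
[void]$nuspec.package.metadata.AppendChild($deps)
$nuspec.OuterXml
```

Consistent project-to-package naming is what makes this id mapping work: the dependency id is simply the referenced project's name.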

The Script

The PowerShell script to do all this looks like:
#NuGetProjects
#Prepares a set of projects in a solution for NuGet processing.
#Generates a nupack.ps1 file in the solution's root folder,
#then modifies each .vbproj file to run it as part of its post-build event.
#Also generates .nuspec files for each project and updates the dependencies in each of these files.

param(
 [string]$solutionDir = '.'
 )
#transfer the info from the VB project file to the nuspec file where we can
#mangle it to our heart's content
Function transferprojectinfo
{
 param (
  [System.IO.FileInfo]$projectFile,
  [xml]$nuspecxml
  )
 #register the MSBuild namespace so we can search the project file
 $ns = @{e='http://schemas.microsoft.com/developer/msbuild/2003'}
 #load up the project XML
 $xml = New-Object System.Xml.XmlDocument
 $xml.Load($projectFile.FullName)
 [string]$infoFileName = ""
 $descNode = (select-xml -namespace $ns -xml $xml -XPath '//e:AssemblyName').node
 $nuspecXML.package.metadata.description = $descNode.InnerText
 
 Push-Location $projectFile.Directory
 if (Test-Path (Join-Path $projectFile.Directory "\AssemblyInfo.vb"))
 {
  $infoFileName = (join-path $projectFile.Directory  "AssemblyInfo.vb")
 }
 else
 {
  if (Test-Path (Join-Path $projectFile.Directory  "\My Project\AssemblyInfo.vb"))
  {
   $infoFileName = (Join-Path $projectFile.Directory  "\My Project\AssemblyInfo.vb")
  }
 }
 
 if ($infoFileName -ne "")
 {
  Write-Output "AssemblyInfo.VB location ='$infoFileName'"
  [string]$assemblyDesc = (Get-Content $infoFileName|Select-String -Pattern '\<Assembly: AssemblyDescription\("(?<desc>.*)"\)\>'|%{$_.Matches}|%{$_.Groups["desc"].Value})
  [string]$assemblyTitle = (Get-Content $infoFileName|Select-String -Pattern '\<Assembly: AssemblyTitle\("(?<desc>.*)"\)\>'|%{$_.Matches}|%{$_.Groups["desc"].Value})
  [string]$assemblyCompany = (Get-Content $infoFileName|Select-String -Pattern '\<Assembly: AssemblyCompany\("(?<desc>.*)"\)\>'|%{$_.Matches}|%{$_.Groups["desc"].Value})
  [string]$assemblyVersion = (Get-Content $infoFileName|Select-String -Pattern '\<Assembly: AssemblyVersion\("(?<desc>.*)"\)\>'|%{$_.Matches}|%{$_.Groups["desc"].Value})
  if($assemblyDesc -ne '')
  {
   $nuspecxml.package.metadata.description = $assemblyDesc
  }
  else
  {
   $nuspecxml.package.metadata.description = "to be supplied"
  }
  $nuspecxml.package.metadata.title= $assemblyTitle
  $nuspecxml.package.metadata.copyright = $assemblyCompany
  
  
 }
 #now do the dependencies
 $dependencyNodes = (select-xml -namespace $ns -xml $xml -XPath '/e:Project/e:ItemGroup/e:ProjectReference/e:Name')
 #remove any existing dependencies element before rebuilding it
 $oldDeps = $nuspecxml.SelectSingleNode('//dependencies')
 if ($oldDeps -ne $null) { [void]$oldDeps.ParentNode.RemoveChild($oldDeps) }
 $depRoot = $nuspecxml.CreateElement("dependencies")
 foreach($depNode in $dependencyNodes)
 {
   $newDep =  $nuspecXML.CreateElement("dependency")
   $newDep.SetAttribute("id",$depNode.Node.InnerText)
   $newDep.SetAttribute("version", "1.0.0.0")
   $depRoot.AppendChild($newDep)
 }
 $nuspecxml.package.metadata.AppendChild($depRoot)
 Pop-Location 
}

Function makespec
{
 
 param([string]$dirName)
 
 Write-Output "Processing project directory $dirName"
 pushd $dirName

 $projects = get-childitem -Filter *.vbproj

 foreach ($proj in $projects)
 {
  Write-Output "Processing directory $proj"
  $dirName = $proj.DirectoryName
  $nuspecFilename = (Join-Path $dirname *.nuspec)
  $nuspecFiles = (Get-ChildItem $nuspecFilename)
  foreach($nuspecFile in $nuspecFiles)
  {
   if(!(test-path $nuspecFile))
   {
    nuget spec -f $proj
   }
   $nuspecxml = [xml](Get-Content $nuspecFile)
   $metadata = $nuspecxml.package.metadata
   $metadata.authors = 'authorname'
   $metadata.owners = 'authorname'
   $metadata.licenseUrl = 'http://servername/sitename'
   $metadata.projectUrl = 'http://servername/sitename'
   $metadata.iconUrl = 'http://servername/sitename/image.gif'
   transferprojectinfo $proj $nuspecxml
   $nuspecxml.Save($nuspecFile.FullName)
   Write-Output "Fixed up $($nuspecFile.Fullname)"
  }
 }
 popd
}


#constant definitions
[string]$repositoryLocation = "\\servername\packagesharename"

#this is the main processing part of the script
Write-Output '+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++'

Write-Output "Preparing to process projects in directory tree $($solutionDir) "
$buildEventText = 'Powershell.exe $(SolutionDir)nupack.ps1 -projectdir $(ProjectDir) -config $(ConfigurationName) -outdir $(OutDir)'

 
$ErrorActionPreference = "Stop"
Write-Output "Setting Context to $($solutionDir)"
pushd $solutionDir
#extract the leaf folder of the directory
$slnFolder = (join-path $repositoryLocation (Get-Item $solutionDir).Name)
#create the nuget target directory if it doesn't exist
if(!(Test-Path $slnFolder))
{
 mkdir $slnFolder
}
#now generate the post-build script :  note use of the substitution variables
Write-Output "generating nupack.ps1 file"
echo '
#PowerShell script invoked 
param(
 [string]$projectdir,
 [string]$config="Debug",
 [string]$outDir
)
if($config -eq "Release")
{
 write-output "Project Directory = $projectdir"
 pushd $projectdir
 write-output "Deleting the packages..."
 del $projectdir\*.nupkg
 write-output "Repackaging..."
 nuget.exe pack -Properties OutDir=$outDir
 popd' > nupack.ps1
 "xcopy /Y /C `"`$projectdir*.nupkg`" `"$slnFolder`""  >> nupack.ps1
'}' >> nupack.ps1


$projects = get-childitem -Filter *.vbproj -Recurse
foreach ($proj in $projects)
{
 #first, generate the nuget files
 #this assumes that we have nuget in the path
 Write-Output "Preparing to process projects in directory tree '$($proj.DirectoryName)' "
 pushd $proj.DirectoryName
 nuget spec -f
 Write-Output "Processing project $($proj.FullName)"
    $ns = @{e='http://schemas.microsoft.com/developer/msbuild/2003'}
    #$xml = [System.Xml.XmlDocument](get-content $proj.FullName);
 $xml = New-Object System.Xml.XmlDocument
 $xml.Load($proj.FullName)
    $node = (select-xml -namespace $ns -xml $xml -XPath '//e:PostBuildEvent').node

    $outputType =(select-xml -namespace $ns -xml $xml -XPath '//e:OutputType').node.InnerText
 
 if ($outputType -eq 'Library')
    {
  Write-Output "Making the  specification for $($proj.DirectoryName)"
  
  makespec $proj.DirectoryName
        if ($node -ne $null)
  #found that there's a PostBuildEvent node already
        {
   $node.InnerText = $buildEventText
        }
        else
        {
             #create the node
    $projectNode = (select-xml -namespace $ns -xml $xml -XPath '/e:Project').node
    $pgElement = $xml.CreateElement('PropertyGroup', 'http://schemas.microsoft.com/developer/msbuild/2003')
    $pbeElement = $xml.CreateElement('PostBuildEvent', 'http://schemas.microsoft.com/developer/msbuild/2003')
    $pbeElement.InnerText = $buildEventText
    $pgElement.AppendChild($pbeElement)
    $projectNode.AppendChild($pgElement)
        }
  Set-ItemProperty $proj -name IsReadOnly -value $false
  $xml.Save($proj.FullName)
  Write-Output "Updated $($proj.FullName)"
 
    }
}
Write-Output "Finished processing projects in $solutionDir"
popd

It's a hell of a lot simpler than it looks. You simply copy it to the solution directory and run it, or run it from where it currently resides, passing the solution folder name as a parameter. It traverses the solution's folder structure and amends every single .NET project according to the four steps above. It also creates the nupack.ps1 script at the solution level. The line of code

$buildEventText = 'Powershell.exe $(SolutionDir)nupack.ps1 -projectdir $(ProjectDir) -config $(ConfigurationName) -outdir $(OutDir)'
specifies the command that will be run by the post-build event. This is inserted directly into the project file's XML.
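The insertion itself can be shown in isolation. The project document below is a made-up, minimal stand-in for a real .vbproj file:

```powershell
#create a PropertyGroup/PostBuildEvent pair in the MSBuild namespace
#and graft it onto the project's root element
$msbuildNs = 'http://schemas.microsoft.com/developer/msbuild/2003'
[xml]$proj = "<Project xmlns='$msbuildNs'></Project>"
$pgElement  = $proj.CreateElement('PropertyGroup', $msbuildNs)
$pbeElement = $proj.CreateElement('PostBuildEvent', $msbuildNs)
$pbeElement.InnerText = 'Powershell.exe $(SolutionDir)nupack.ps1'
[void]$pgElement.AppendChild($pbeElement)
[void]$proj.DocumentElement.AppendChild($pgElement)
$proj.OuterXml
```

Note that the new elements must be created in the MSBuild namespace, or Visual Studio will ignore them.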

Getting at the information in the project file

Manipulating a VB.NET (or C#) project file from PowerShell is a bit more involved than the XML handling one normally comes across.

The XML resides in its own custom namespace, so the usual PowerShell shortcut of $variable.tagname can't be used. Instead we have to resort to declaring a namespace in an associative array and then using this in our path expressions:


 $ns = @{e='http://schemas.microsoft.com/developer/msbuild/2003'}
    #$xml = [System.Xml.XmlDocument](get-content $proj.FullName);
 $xml = New-Object System.Xml.XmlDocument
 $xml.Load($proj.FullName)
    $node = (select-xml -namespace $ns -xml $xml -XPath '//e:PostBuildEvent').node
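A quick way to convince yourself of this: the same element is invisible to a plain XPath query but found once the prefix is mapped. The fragment below is a made-up example:

```powershell
#MSBuild documents put everything in a default namespace, so a
#prefix mapping is needed before XPath queries will match
[xml]$xml = @'
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <PostBuildEvent>echo done</PostBuildEvent>
  </PropertyGroup>
</Project>
'@
$ns = @{e='http://schemas.microsoft.com/developer/msbuild/2003'}
$hit  = Select-Xml -Xml $xml -Namespace $ns -XPath '//e:PostBuildEvent'
$miss = Select-Xml -Xml $xml -XPath '//PostBuildEvent'
"with mapping: '$($hit.Node.InnerText)'; without mapping, no match: $($miss -eq $null)"
```

The prefix name ('e' here) is arbitrary; it only has to agree between the hashtable and the XPath expression.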
You need to change the line
#constant definitions
[string]$repositoryLocation = "\\servername\packagesharename"
to point to the location where the packages will be stored as well.
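The transferprojectinfo function pulls the description, title and company out of AssemblyInfo.vb using Select-String and a named capture group. Here's a self-contained sketch of that trick (the attribute lines below are invented):

```powershell
#extract the text inside an AssemblyDescription attribute using a
#named capture group called 'desc'
$assemblyInfo = @'
<Assembly: AssemblyTitle("GreeneryLib")>
<Assembly: AssemblyDescription("Utilities for foliage")>
'@
$desc = ($assemblyInfo -split "`n" |
    Select-String -Pattern '\<Assembly: AssemblyDescription\("(?<desc>.*)"\)\>' |
    ForEach-Object { $_.Matches } |
    ForEach-Object { $_.Groups['desc'].Value })
$desc
```

The same pattern, with the attribute name swapped, serves for AssemblyTitle, AssemblyCompany and AssemblyVersion.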

This script is probably too verbose, and a real PowerShell wizard might have pulled off the same trick in half the number of lines. Still, running it will instrument your entire solution so that it packages all its projects and copies them to the specified share once you've built. You use it at your own risk: make a copy of your solution in a new folder structure before you let this loose on it.

I look forward to suggestions as to how this script might be improved.

Thursday, 15 August 2013

Windows PowerShell for Developers #4: Housekeeping

In my last posting, we looked at how to automate the creation and publishing of NuGet packages from Visual Studio. We wrote a simple PowerShell script that was invoked when a project finished building, and this packaged the output and pushed it to a network share.

One of the side effects of this automation is that every time you build a new version of the assembly, a new package will be created and pushed out. NuGet package names follow the convention
<assembly name>.<major version>.<minor version>.<build>.<revision>.nupkg
so the package name will change every time you build a new version of the DLL. Before you know it, you're swamped with package versions, as this screenshot of the repository folder shows:



So, what to do about it?  PowerShell comes to the rescue again.  Here's the contents of a purge.ps1 file I created:

param ( [string]$workingDir = "."
 )

pushd $workingDir
echo "Purging $workingDir"
#regex that splits a package file name into root name and four-part version
[string]$pattern = "(?<rootname>.*)\.(?<revision>\d+\.\d+\.\d+\.\d+)\.nupkg";

#build an object per package file, capturing the name and version separately
$results = dir . | Sort-Object Name -Descending | Where-Object {$_.Name -match $pattern} | ForEach-Object {
    new-object PSObject -Property @{
        rootname = $matches.rootname
        revision = $matches.revision
        File = $_
  }
 }
#keep only the most recent version of each package
$filesToKeep = $results | sort rootname, revision -Descending | group rootname |
ForEach-Object { $_.group | select -first 1 -ExpandProperty File}

#delete everything else
Remove-Item *.nupkg -Exclude $filesToKeep

Write-Output "Files retained are" 
dir .
popd
echo "Purge completed"

Copy this file to your repository folder, fire up a PowerShell console and run it.  It will delete all but the most recent version of the package.

How does it work?  Pretty simply as it happens:
  1. It creates a regex which matches sections of the filenames
  2. It then applies the regex to the current directory, and creates a stream of PowerShell objects that capture the name of the package and the version information separately
  3. It groups these objects by package name and keeps the most recent version in a list called $filesToKeep
  4. It then deletes all packages from the target directory except those in the list $filesToKeep
Imagine doing this in VBScript or, God forbid, the DOS command language...
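The grouping logic at the heart of the script can be tried on a list of made-up file names, without touching the file system. One deliberate change from the script above: casting the captured version to [version] makes 1.0.0.10 sort above 1.0.0.9, which a plain string sort would get wrong:

```powershell
#pick the newest version of each package from a list of names
$names = 'Foo.Bar.1.0.0.0.nupkg', 'Foo.Bar.1.0.0.5.nupkg', 'Other.Lib.2.1.0.0.nupkg'
$pattern = '(?<rootname>.*)\.(?<revision>\d+\.\d+\.\d+\.\d+)\.nupkg'
$keep = $names |
    Where-Object { $_ -match $pattern } |
    ForEach-Object {
        New-Object PSObject -Property @{
            rootname = $matches.rootname
            revision = [version]$matches.revision   #typed sort, not string sort
            Name     = $_
        }
    } |
    Sort-Object rootname, revision -Descending |
    Group-Object rootname |
    ForEach-Object { $_.Group | Select-Object -First 1 -ExpandProperty Name }
$keep
```

$keep here ends up holding the newest package of each root name, ready to feed to Remove-Item's -Exclude parameter.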

PowerShell has a pretty arcane syntax, and it's easy to lose sight of what a script is intended to do.  But it's easy to translate a process expressed in plain English into PowerShell, so my advice is simply to write down what you are trying to achieve in words first before plunging into script.

Happy scripting!

Windows PowerShell for Developers #3: Fun with NuGet


One of the most frequent complaints I've heard from Visual Studio developers moving to a Java development environment such as Eclipse is the amount of time it takes to get everything set up.  I don't happen to think that the .NET Framework is inherently superior or inferior to Java, just built with a different  set of priorities.  However, it is undeniable that Visual Studio works out-of-the-box.  With a Java project you find yourself spending days having to scrounge various bits and pieces of software and then possibly weeks getting it all to work together.

What's the reason for this discrepancy?  Well, Visual Studio is produced and controlled by one corporation which has tight controls over its features.  Java has,  like Topsy, just 'growed'.  The language specification has been under tight central control but everything else has simply sprouted around it, thanks to a vibrant open-source community.

I happen to think that the involvement of the community in building the product is a good thing, and even Microsoft is starting to embrace this philosophy.  Encouragingly, it is, from the outset, supporting the development of a 'distribution infrastructure' so that people can find components and install them easily.  No more scrounging around the trash cans of the Internet, we hope.

Welcome to NuGet

Part of this new infrastructure is NuGet (pronounced 'nugget').  NuGet is a package manager that runs inside Visual Studio (and is an open source project).  NuGet allows easy sourcing and installation for shared binary components.  It has many nice features, such as automatic updating and dependency management.

This next image is taken from the NuGet home page, showing packages that can be downloaded and installed into a Visual Studio project:
Manage NuGet Packages Dialog Window
You can install NuGet from CodePlex. Once you install it, you have access to a wide variety of NuGet packages. When you install these, your Visual Studio project is automatically updated with references to the packaged components, and it's a doddle to keep them up-to-date.

NuGet in the Enterprise

NuGet isn't just for hobbyist developers.  If you work for a company that has  a suite of internally developed components, you can package these with NuGet and deploy them to an internal network share.  This can make your life as a developer a hell of a lot easier. 

I'll share with you some PowerShell tools I have developed to make the process easier.

Automating the Process

If you have a solution with a lot of library projects, these are prime candidates for packaging.  The logical time to generate a new package is immediately after the build, and this is best done automatically for each project.

Let's assume you've created a .nuspec file for each project and populated its tags, that you know how to manually create a package,  and that the projects package smoothly and without errors.

So, let's automate the packaging process:

Now, open the project's Properties page, click the Compile tab, then the Build Events button, and then Edit Post-Build. Enter the following command line:

Powershell.exe $(SolutionDir)nupack.ps1 -projectdir $(ProjectDir) -config $(ConfigurationName) -outdir $(OutDir)
Create a file nupack.ps1 in the solution's root folder using a text editor and containing the following PowerShell code:

param(
 [string]$projectdir,
 [string]$config="Debug",
 [string]$outDir
)
if($config -eq "Release")
{
 write-output "Project Directory = $projectdir"
 pushd $projectdir
 write-output "Deleting the packages..."
 del $projectdir\*.nupkg
 write-output "Repackaging..."
 nuget.exe pack -Properties OutDir=$outDir
 popd
xcopy /Y /C "$projectdir*.nupkg" "\\servername\sharename"
}

Replace '\\servername\sharename' with a locally accessible share where you want to store the package. You can configure the Package Manager to use this location as a repository.

Every time you now build the project in Release configuration, the package will be rebuilt and pushed to the share, ready to be picked up by the Package Manager for installation into other projects.