Nov 29, 2013

Using TeamCity integrated dotCover coverage files with NDepend


For a long time I had wanted to integrate NDepend on our build server, so this week I invested some time here and there to achieve that goal. I had already done it a long time ago; I even wrote the documentation, which you can read on the NDepend website.

This time I wanted to go one step further.

We use a first build which builds each feature branch we are developing. One of the responsibilities of that build is to run unit tests, integration tests and specifications, and to gather code coverage. To achieve that we use the TeamCity-integrated dotCover in each build step running our different tests. This collects artifacts which aren’t directly shown on the Artifacts tab:

Clicking show reveals the file we are interested in, dotCover.snapshot:

Something to note: to be able to use code coverage, NDepend needs the pdb files, which is why we have another artifact named PDBs.zip. And finally, the third artifact is our software itself, with the exe and dlls.

Now that we have a build which generates the coverage file we want to pass to NDepend, let’s create another TeamCity build which defines a Snapshot Dependency and an Artifact Dependency on the previous build:

We extract the exe to an NDepend folder, and all dlls out of the archive’s Libs folder to the same NDepend folder.
We do the same for the pdb files so that NDepend can use the code coverage data.
Finally, we extract dotCover.snapshot to a dotCover folder.
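
The artifact dependency rules therefore look something like this (a sketch: PDBs.zip and dotCover.snapshot come from the artifacts above, while SkyeEditor.zip stands in for whatever your application archive is named):

SkyeEditor.zip!*.exe => NDepend
SkyeEditor.zip!Libs/*.dll => NDepend
PDBs.zip!** => NDepend
dotCover.snapshot => dotCover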

The issue we then had was that the dotCover.snapshot file is not in the format NDepend expects.

So, as a first build step of our NDepend build, we need to convert the dotCover.snapshot file. This is done using a Command Line build step and the dotCover integrated in TeamCity, using the report command with ReportType set to NDependXML:
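
The command line looks roughly like this (a sketch; I believe TeamCity exposes the home directory of its bundled dotCover as %teamcity.dotCover.home%, but verify the parameter name on your server):

%teamcity.dotCover.home%\dotCover.exe report /Source="%teamcity.build.checkoutDir%\dotCover\dotCover.snapshot" /Output="%teamcity.build.checkoutDir%\dotCover\dotCoverNDepend.xml" /ReportType=NDependXML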

So after that first build step we have a new, converted file, dotCoverNDepend.xml, which can be consumed by NDepend.

Then, in the second build step, we use dotCoverNDepend.xml with the new NDepend 5.1 /CoverageFiles command-line parameter:

Here is the full command:

C:\NDepend\NDepend.Console.exe "%teamcity.build.checkoutDir%\skyeEditor.ndproj" /CoverageFiles "%teamcity.build.checkoutDir%\dotCover\dotCoverNDepend.xml" /InDirs "%teamcity.build.checkoutDir%\NDepend" "C:\Windows\Microsoft.NET\Framework\v4.0.30319" "C:\Windows\Microsoft.NET\Framework\v4.0.30319\WPF" /OutDir "C:\NDependOutput"

This will create the NDepend report, which we archive as an artifact on the General Settings page of the build.
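
For example, an artifact path rule along these lines would publish the report (the zip name is my choice):

C:\NDependOutput\** => NDependReport.zip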

Then you need to define that you want to see the NDepend report as a TeamCity Report Tab, which you do by navigating to Administration > Report Tabs, clicking Create new report tab and specifying:
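
Assuming the artifact rule above, the report tab’s Start page would point to something like NDependReport.zip!NDependReport.html; as far as I know, NDependReport.html is the name of the HTML report NDepend.Console generates in the output directory.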

Finally, you will have the following NDepend report with code coverage shown for your builds:

One last thing I struggled with was that the NDepend builds were not starting. I thought it was enough to configure the Dependencies, but you also need to define a Build Trigger with Trigger on changes in snapshot dependencies ticked:

Thanks to Yegor for the discussion, which as always helped greatly! Thanks also to Ruslan, who helped through a post on the JetBrains forum. And finally, thanks to Patrick, who introduced /CoverageFiles in NDepend really quickly; do not hesitate to give feedback using the new NDepend UserVoice.

Oct 17, 2013

Optimizing release build process using JetBrains TeamCity and Atlassian Stash


Automating and optimizing the processes I use every day to work is important, so that I get more productive and spend less time on things a computer is better at.

Previously I had 3 builds defined in TeamCity: one for all feature branches, one for release and one for patch. For the release and patch builds I needed to go to TeamCity to define two Build Parameters: the branch name and the release number.

My goal was to avoid having to go to TeamCity and set those Build Parameters whenever we have a release.

I wanted one TeamCity build which would:

  1. Determine the version number automatically
  2. Determine the branch to use to build that release

The second point was easy! You just need to follow the same principles as defined for feature branches. I defined the following branch naming convention: any release gets a branch name like release/skye-editor-2.26.0 for a release of Skye Editor 2.26.0. Then I defined the branch specification in the VCS Root of my TeamCity build:

+:refs/heads/release/skye-editor-*

and the same for the Build Triggers / VCS Trigger / Branch filter.

Now the first point is a bit more complex!

As we want to determine the version number automatically, we quickly realize that the release number is already contained in the branch name itself, e.g. release/skye-editor-2.26.0.

So why not use it? Yeah, great idea, but how?

The first idea that came to mind was to pass that value as a parameter to the build script and deal with splitting the branch name into the release number inside the build script, since Yegor confirmed that TeamCity currently has no way to parse values. I didn’t really like the idea of passing that parameter! So I continued to think about alternatives and finally asked Yegor:

I had another idea: in TeamCity, when the active builds are displayed for a feature branch build, only the second part is displayed. E.g. if I specify SKYE-* as branch specification and I have a branch named SKYE-1077-blabla, then it will show 1077-blabla. Is there a build parameter which would map to this 1077-blabla?

And the answer was

If you have something like refs/heads/SKYE-(*), then only the part in the brackets is regarded as logic branch name. Logic branch name is available as %teamcity.build.branch%

That’s it! Thanks Yegor. We have a way to get our version without having to change our build script and without writing any code!

So I just used the logical branch name %teamcity.build.branch% as the value of my configuration parameter CurrentRelease, which already existed and which I previously had to update manually:
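
In other words, the parameter definition boils down to a single line; with the branch specification above, for the branch release/skye-editor-2.26.0, %teamcity.build.branch% resolves to 2.26.0:

CurrentRelease = %teamcity.build.branch%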

This replaces the previous two manual configurations:

So now I can easily create a Git release branch, e.g. release/skye-editor-2.26.0, then use Atlassian Stash pull requests to review and merge my feature branches into that release branch, which is automatically built by TeamCity.
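
Concretely, kicking off a new release now amounts to, for example:

git checkout -b release/skye-editor-2.26.0
git push -u origin release/skye-editor-2.26.0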

This is a great improvement and shortens my release checklist. All good!

Sep 20, 2013

Functional Programming Principles in Scala using JetBrains IntelliJ IDEA

This week I started the Coursera course Functional Programming Principles in Scala, with Martin Odersky as instructor.

One of my first steps was to get JetBrains IntelliJ IDEA working with the Scala programming language so that I could work on the course assignments using my preferred IDE.

I have seen that The Guardian published a blog post, Functional Programming Principles in Scala: Setting up IntelliJ, but it focuses on Unix-like operating systems (OS X and Linux). As I work on Windows, I of course installed everything on my machine running Windows 8.1! I also use a different set of tools, so here is the way I did it.

First of all, I am assuming that you have JetBrains IntelliJ IDEA 12 and a JDK installed.

From IntelliJ you need to install the Scala plugin: go to File / Settings, search for plugins in the dialog, click the Browse Repositories button, search for Scala and select Scala v0.19.299 from JetBrains.

Then you also need to install the SBT plugin, also from JetBrains, which is still in development and gets nightly builds.

Add the following URL to the list of custom plugin repositories in Settings / Plugins / Browse Repositories / Manage Repositories:

http://download.jetbrains.com/scala/sbt-nightly-leda.xml

After that, you may install the plugin via Settings | Plugins | Browse Repositories. IDEA will check for the latest plugin builds automatically.

Now extract the assignment zip provided by the Coursera course; e.g. for the example assignment, unpack example.zip to a folder, then in IntelliJ use File / Import Project. IntelliJ already recognizes the folder as a project:

Click Next

Tick Use auto-import and click Finish. IntelliJ will then import your project:

Now you need to edit your project structure using File / Project Structure: choose Modules, then progfun-example.

On the dialog you will see some folders in red under Source Folders and Test Source Folders; just click the X to delete all the red ones until you reach the following state, then finally click OK.

Now navigate to the test file ListsSuite and press ALT+SHIFT+F10 to run the tests.

And if you implemented the assignment you should see these results:

There are still two tests to fix in the first assignment, which is why they are red!

But as you can see, you now have a full Scala development environment based on the great JetBrains IntelliJ IDEA IDE! Now you can code with pleasure!

I really enjoyed the first week’s assignments and implemented the first two as coding katas! I even involved France-Anne in the second one; it’s the first time she has coded something. Very nice!

Sep 19, 2013

Uninstalling a program which doesn't want to on Windows 8.1

I updated to Windows 8.1 RTM a week ago, and since then I have had the issue that AMD Radeon™ RAMDisk was not running. Even worse, I could not uninstall it because the uninstaller was saying that it was the wrong version of the operating system!
I was using it on Windows 8, where it was working great, but after the update it wasn’t anymore.

The solution I found is to use Microsoft Fix It – “Fix problems with programs that can't be installed or uninstalled”. I followed the wizard, could indicate that I had an uninstall issue, then could choose which software was in trouble, and finally could uninstall it!

Great tool! Thanks, Microsoft Fix It!

May 28, 2013
Git // GitHub

Git and GitHub Training

Today I had the pleasure of attending the “Git and GitHub Training” held by Tim Berglund, one of the GitHubbers, so called because he works at GitHub.

In the past I very much enjoyed the videos Tim and Matthew McCullough, another GitHubber, did for O’Reilly.

So it was really nice to meet Tim at one of his courses.

What are the class objectives?

  • Understand how Git works and how to apply that to day to day development.
  • Learn how GitHub makes distributed collaboration both effective and enjoyable.
  • Practice the use of Pull Requests to make contributions to any project.
  • Learn the basic 10 commands that will appear in your every-day use of Git.
  • Know how to “back out” mistakes using Git’s incredible history and ability to revert almost any change.
  • Leverage the features of GitHub for easier collaboration with colleagues.
  • Discover how the offline capabilities of Git work.

Topics

  • Introductions
  • Git and your initial setup
  • Git configuration and its inheritance
  • SSH Authentication and your first repository
  • Understanding and thinking in Git's three stages
  • Adding, committing, and diff-ing code changes
  • The Similarity Index; Moving, Renaming, and Removing files
  • Reviewing Version History in Git
  • Strategies for Efficiency (quick workflows, GitIgnores, etc.)
  • Managing and using Git Remotes
  • GitHub
  • Forking Repos
  • Pull Requests
  • Branching, Tagging, and Stashing
  • Merging, Rebasing, and managing conflicts
  • Undoing your work with Git

The course took place at the Canoo headquarters in Basel; they have really nice offices and gave us a very warm welcome!

We went through all the topics described, and even a bit more, as most of the people in the course already had some experience with Git.

Did I like the course, the format and the way it was presented?

Definitely yes! I enjoyed it and would really recommend it to people who are starting with Git. I especially liked the way Tim presented the inner workings of Git, which I think is important to get right, especially when you come from another VCS like SVN. Also, that the log is a graph and not a list…

Did I learn a lot?

Not really, but that was expected. I spent almost two years working with git svn at Innoveo Solutions and with Git for my personal projects. Now I am also using Git, without the svn part, at work, and I continue to exert my influence so that the whole company migrates to Git. So I had already gained the experience which the course would have given me much quicker, and I guess my colleagues Cédric, Carlos and Christian gained in one day knowledge which took me much more time to grasp. So the course is a great starter; as with quite a few things, you need to practice to really get it!

For people with a good understanding of Git, I would recommend going to the Advanced Git & GitHub course instead.

During the course we had some open discussions on various topics, but here are some which I would like to put some emphasis on.

My personal experience brought me from thinking that I want the tools integrated into my IDE to, after gaining more knowledge about Git, using Git from the command line. This makes me much more efficient as a developer and lets me automate things which I couldn’t do with the IDE. I certainly agree with Tim that having to work with cmd would be rough, but today there is PowerShell, which, combined with ConEmu and posh-git, makes for a very effective environment, even on Windows, which is still seen as a clicking environment!

I learned about

git config --global credential.helper cache

But this wasn’t working on my Windows machine, so I went back to https://github.com/anurse/git-credential-winstore

This was also new to me

git push --delete origin mybranch

because I am used to doing it with the following, which is certainly less expressive about what it does:

git push origin :mybranch

Finally, there were a few things I heard during the course which I want to investigate further.

I also had interesting discussions with some of the Canoo people, especially about Scratch, one way to bring kids to programming, and about the Raspberry Pi.

So that was a great day! And I hope to have some others like this one in the future.

May 4, 2013

Git Diff Margin, display Git Diff on the margin of the current file in Visual Studio 2012

I am happy to announce that I finally released my first Visual Studio 2012 extension today, called Git Diff Margin.

Git Diff Margin

A Visual Studio 2012 extension to display Git Diff on the margin of the current file.

  • Quickly view all current file changes on the left margin: blue rectangle for modification, green rectangle for new lines and grey triangle for deletion
  • Navigate to previous/next change on the file
  • Undo the change
  • Show the diff in external tool
  • Copy the old code into the clipboard
  • Copy a part of the old code by selecting it in the popup
  • Define the colors of the Addition, Modification and Deletion diff margin through Visual Studio Fonts and Colors options
  • Support Visual Studio 2012 Dark, Light and Blue Theme
  • Support the zoom
  • Perfect companion of Visual Studio Tools for Git

Installation

Grab it from inside Visual Studio’s Extension Manager, or via the Extension Gallery link.

Screenshots

Video

Get the code

https://github.com/laurentkempe/GitDiffMargin

Credits

Thanks to Sam Harwell @sharwell for all the improvements!

Dec 7, 2012

Optimizing Skye Editor using JetBrains dotTrace

This post is a transcript of an internal post I did on the Innoveo Solutions internal blog. Thanks to Innoveo for letting me share it here!

Skye Editor is our meta model editor; it is written in C# 4 and WPF, and uses the Model-View-ViewModel design pattern and MVVM Light.

The post shows the usage of JetBrains dotTrace to optimize Skye Editor and the importance of profiling your code. Here it is.

For release 2.20 of our Skye Editor product we had already done some optimizations, like “FindProductValue of ModelProduct to use a dictionary”.

Starting with release 2.21, the goal was to go one step further with “Optimize Loading/Deleting of definition and update to MVVM Light 4 RTM”.

The results are quite awesome!

Here I am comparing the last version of Skye Editor which we shipped, 2.18, to the version currently in development for the next release, 2.21.

The performance measurement scenario is as follows:

  • Starting the application
  • Loading a big definition, BigDefinition.zip (2,743 KB zipped, 19,928 KB of XML)
  • Deleting a brick which has lots of sub-bricks and attributes, value ranges, values...

I used the dotTrace profiler from JetBrains to measure the performance improvements.

Here is a first result for the method ActualizeFromNewArchive, which is used when we load, import or activate a definition. This method is responsible for building all the View Models used in the editor, which we use to display the tree of roots, bricks, attributes, value ranges and values, but also the backend info... and finally the texts. So on a big definition there is a lot to create, especially for the texts.

2.18

2.21

So we went from 9083ms to 944ms, which is around a 9.6x improvement, as we can see in the following picture!

That's quite impressive. But where does it come from? Let's dig deeper into the execution tree.

2.18

2.21

So the first improvement is due to an improvement in MVVM Light 4 RTM, a library we have been using from the beginning, which lets us decouple our View Models / Views using some messaging mechanisms, among other features. I helped its author, Laurent Bugnion, to test and improve the toolkit; he even mentions us on MVVM Light 4 RTM.

We went from 2219ms to 35ms, but across the whole scenario (all usages of the Register method) we went from 3072ms to 130ms, which we can see here:

The improvement there is that CleanupList is no longer run at that moment but only when the application is idle. Clever. And what is really cool is that I mentioned that performance issue and Laurent fixed it in the next release. Thanks Laurent!

But this is not all, because we have only gained 3072ms, which doesn't bring us from 9083ms to 944ms.

The other big improvement is the optimization of the way we find values, which has radically changed.

2.18

2.21

From 5103ms to 13ms !

2.18

        public ProductValue FindProductValue(string uuid)
        {
            // Linear scan: LINQ walks the list until it finds a matching UUID, O(n) per lookup
            return Values.AsBindingQueryable().FirstOrDefault(pv => pv.UUID == uuid);
        }

2.21

        public ProductValue FindProductValue(string uuid)
        {
            // Dictionary lookup by UUID, O(1) per lookup
            return Values.FindBindingByUuid(uuid);
        }

Let's look at this in more detail.

2.18

        private readonly List<TBinding> _bindingList;

        public IQueryable<TBinding> AsBindingQueryable()
        {
            return _bindingList.AsQueryable();
        }

2.21

        public TBinding FindBindingByUuid(string uuid)
        {
            // A single dictionary access via TryGetValue
            Tuple<TBinding, TModel> value;
            _modelDictionary.Value.TryGetValue(uuid, out value);
            return value != null ? value.Item1 : default(TBinding);
        }

The huge difference between these two methods is that 2.18 uses a list and LINQ to find the first value matching the uuid we are searching for, while 2.21 uses a dictionary which indexes all values by uuid.

Another improvement in 2.21 was to go from the following version of the method to the one shown previously:

        public TBinding FindBindingByUuid(string uuid)
        {
            // Two dictionary accesses: one for ContainsKey, one for the indexer
            return _modelDictionary.Value.ContainsKey(uuid) ? 
                   _modelDictionary.Value[uuid].Item1 : default(TBinding);
        }

This one makes two accesses to the dictionary while the other makes only one, which also improved things quite a bit.

Another improvement is that we removed the usage of a ThreadSafeObservableCollection which was dispatching all operations to the UI thread. Basically, you could operate on the collection from a background thread while it was bound to the UI, which normally you cannot do because of thread affinity, except if you dispatch, which of course has a cost.
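
To illustrate the cost, here is a minimal sketch of what such a collection can look like (an assumed shape, not our actual class): every mutation coming from a background thread goes through a blocking Dispatcher.Invoke.

        using System;
        using System.Collections.ObjectModel;
        using System.Windows.Threading;

        // Minimal sketch, assumed shape: an ObservableCollection that marshals
        // every insert to the UI thread through the WPF Dispatcher.
        public class ThreadSafeObservableCollection<T> : ObservableCollection<T>
        {
            // Captured on the thread creating the collection, i.e. the UI thread
            private readonly Dispatcher _dispatcher = Dispatcher.CurrentDispatcher;

            protected override void InsertItem(int index, T item)
            {
                if (_dispatcher.CheckAccess())
                {
                    // Already on the UI thread: insert directly
                    base.InsertItem(index, item);
                }
                else
                {
                    // Marshal to the UI thread and block until done; this is the cost
                    _dispatcher.Invoke(new Action(() => base.InsertItem(index, item)));
                }
            }

            // RemoveItem, SetItem and ClearItems would be overridden the same way
        }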

So that's it for the improvement when we load/import/activate a definition!

Now about deleting.

2.18

2.21

It would be nice to have this gain, but in fact we had to refactor the operation so that one part is executed on the UI thread and the other part on a background thread. So basically, what touches the View Model is executed on the UI thread and what touches the Model on the background thread.
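
The shape of that split is roughly the following (a sketch with invented method names, not the actual code):

        // Model mutation runs on a background thread; the View Model part is then
        // scheduled back onto the UI thread. DeleteFromModel and RemoveViewModels
        // are invented names used for illustration.
        Task.Factory.StartNew(() => DeleteFromModel(brick))
            .ContinueWith(t => RemoveViewModels(brick),
                          TaskScheduler.FromCurrentSynchronizationContext());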

So we also have to count this:

2.18

2.21

So we went from 30720ms to 21666ms, which is again a good improvement.

This can be improved a lot more, because currently we have to traverse the whole tree and count all the relations to the texts we want to delete, which accounts for 19468ms.
With a cache of relations it will be much, much faster. But that's for next time!

I hope you will enjoy the time saving of all those optimizations in Skye Editor!

Nov 20, 2012
.NET // C# // TDD

Using Thread.Sleep() in Unit Test! A good idea?

In my humble opinion it is definitely not a good idea! Why?

  1. It is a brittle test, because it depends on the CPU load of the machine running the test. Maybe it runs fine on your development machine, but it will certainly fail from time to time on your build server because of the load on the server.
  2. It is slower than needed. If you increase the sleep time so that you “ensure” the test passes in all situations, then the test will always take at least that long.

So what can we do about it?

A first approach is to use polling, as NUnit and its Delayed Constraint do. But I am not a big fan of this, because you have to remember its details, and you can easily fall into this trap:

Use of a DelayedConstraint with a value argument makes no sense, since the value will be extracted at the point of call. Its intended use is with delegates and references. If a delegate is used with polling, it may be called multiple times so only methods without side effects should be used in this way.

My personal preferred approach is:

  1. to expose the Task running the background operation as a protected property
  2. to create a SUT class in my test class which inherits from the class with the protected property
  3. to make the Task a public property of the SUT class
  4. to have my test wait on the Task by calling Task.Wait() just before the assertion

For example, here is one of my tests using that approach:
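
The original example was a screenshot; the following is a minimal, self-contained sketch of the same pattern with invented names:

        using System.Threading.Tasks;
        using NUnit.Framework;

        // Production class: exposes the background Task as a protected property (step 1)
        public class Processor
        {
            public bool Done { get; private set; }

            protected Task BackgroundTask { get; private set; }

            public void Start()
            {
                BackgroundTask = Task.Factory.StartNew(() => { Done = true; });
            }
        }

        // SUT class inside the test assembly: inherits the class and publishes
        // its Task (steps 2 and 3)
        public class ProcessorSut : Processor
        {
            public Task Task { get { return BackgroundTask; } }
        }

        [TestFixture]
        public class ProcessorTests
        {
            [Test]
            public void Start_SetsDone_WithoutThreadSleep()
            {
                var sut = new ProcessorSut();

                sut.Start();
                sut.Task.Wait(); // deterministic synchronization (step 4)

                Assert.That(sut.Done, Is.True);
            }
        }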

Sep 27, 2012
C# // TDD // Kata // ReSharper // NCrunch

Test Driven Development Kata - String Calculator

The subject of this article is the String Calculator kata proposed by Roy Osherove. The kata is done in C#, using Visual Studio 2012, JetBrains ReSharper 7 and NCrunch.

But what is a TDD kata? It is an implementation exercise, starting with unit tests and leveraging refactoring, that you practice daily for around 15 to 30 minutes.

The goal of this video is to present one of these sessions I did recently, of a practice that I really find interesting.

The video was also published on Tech Head Brothers, my French portal about .NET.
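
For readers who don't know the kata: its first steps ask for an Add method that returns 0 for an empty string, the number itself for a single number, and the sum for numbers separated by commas or newlines. Here is a minimal sketch of where an early session typically ends up (not the exact code from the video):

        using NUnit.Framework;

        public class StringCalculator
        {
            public int Add(string numbers)
            {
                if (string.IsNullOrEmpty(numbers)) return 0;

                // Numbers can be separated by a comma or a newline
                var sum = 0;
                foreach (var token in numbers.Split(',', '\n'))
                    sum += int.Parse(token);
                return sum;
            }
        }

        [TestFixture]
        public class StringCalculatorTests
        {
            [TestCase("", 0)]
            [TestCase("1", 1)]
            [TestCase("1,2", 3)]
            [TestCase("1\n2,3", 6)]
            public void Add_ReturnsTheSum(string numbers, int expected)
            {
                Assert.That(new StringCalculator().Add(numbers), Is.EqualTo(expected));
            }
        }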

Aug 28, 2012

Microsoft Bluetooth Mobile Keyboard 6000 for developers

I love my Microsoft Bluetooth Mobile Keyboard 6000. I have one at home and one at work.

But there are two things which I don’t really like as a developer:

  1. No contextual menu key
  2. Two-key combinations for Home and End

So I searched for a solution and found one, at least for the contextual menu key. The solution is named SharpKeys.

SharpKeys is a utility that manages a Registry key that allows Windows to remap one key to any other key. Included in the application is a list of common keyboard keys and a Type Key feature to automatically recognize most keyboard keys.

This little tool lets me remap the Caps Lock key (which no one uses, right?) to the contextual menu key, which makes my developer life much easier, as I don’t need to move my hand from the keyboard to the mouse to get to that menu.
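
Under the hood, SharpKeys manages the Scancode Map value under HKLM\SYSTEM\CurrentControlSet\Control\Keyboard Layout. To the best of my understanding, the exported .reg for a Caps Lock to Application (menu) key remap looks like the following, but verify with SharpKeys itself before importing anything by hand:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Keyboard Layout]
"Scancode Map"=hex:00,00,00,00,00,00,00,00,02,00,00,00,5d,e0,3a,00,00,00,00,00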

Great improvement!

At the moment I haven’t found a better way for the Home and End keys, but I am considering remapping the volume up and volume down keys.

About Laurent

Laurent Kempé

Laurent Kempé is the editor, founder, and primary contributor of Tech Head Brothers, a French portal about Microsoft .NET technologies.

He has been employed by Innoveo Solutions since 10/2007 as a Senior Solution Architect, certified Scrum Master and Founding Member.

He is also founder, owner and Managing Partner of Jobping, which provides a unique and efficient platform for connecting Microsoft-skilled job seekers with employers using Microsoft technologies.

Laurent was awarded Most Valuable Professional (MVP) by Microsoft from April 2002 to April 2012.

JetBrains Academy Member
Certified ScrumMaster
