Allen Conway's .NET Weblog

Exploring all things .NET and beyond...

Which Version of TypeScript is Installed and Which Version is Visual Studio Using?

My rating of TypeScript on a scale of 1-10... a solid 10

My rating of finding out which version of TypeScript is installed and being used currently... a 2. 

Sometimes the simplest of things are missed, and I think this is one of those cases. There is a lot of detective work and resulting confusion when trying to figure out which versions of TypeScript are actually installed and, subsequently, which one VS.NET targets to compile against. I'm here to clear up this confusion and shed some light on it for TypeScript users.

The TypeScript compiler (tsc.exe) is installed with VS.NET 2013 Update 2 (or later), VS.NET 2015, or via the TypeScript tools extensions available through the Visual Studio Gallery. Users of one of the aforementioned VS.NET versions are in a good place, because TypeScript, having been created by Microsoft, integrates well into VS.NET. Regardless of installation method, the tooling and compilers are available at the following location:

C:\Program Files (x86)\Microsoft SDKs\TypeScript

As can be seen from the screenshot below, I have folders for versions 1.0 and 1.5:



Before moving further: the way you've probably found to check which version of TypeScript is installed is to pass the -v option to the compiler, which will "Print the compiler's version." You can do this from any location using the command prompt. Doing so on a default installation of, say, VS.NET 2013 with the 1.0 SDK folder present will yield the following:
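
In other words, something along these lines (the output format varies a bit between compiler releases):

C:\>tsc -v
Version 1.0.3.0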



Notice we have an SDK folder for version 1.0, yet the compiler version reported is 1.0.3.0. This is because what really matters is not the folder name, but rather the actual version of the compiler within the folder. In this case the 1.0 folder contains version 1.0.3 of the TypeScript compiler.

As mentioned, you can run the -v option against the compiler from anywhere. If you run this command from within a directory that physically contains the TypeScript compiler (tsc.exe), the resulting version output will be that of the compiler in that SDK directory.

Running the version command against the 1.0 directory:



Running the version command against the 1.5 directory:



OK great, we can run version commands in different spots and find out which compiler versions we have. However, which one are VS.NET and my project using, and where is that compiler version coming from when I run the -v option in a non-TypeScript SDK directory?

Let's address the global version question first. You can run the where tsc command from any command prompt, which will tell you the location of the tsc.exe whose version is returned by the -v option:
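
In my case that came back with something along these lines (representative output):

C:\>where tsc
C:\Program Files (x86)\Microsoft SDKs\TypeScript\1.0\tsc.exe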



OK, so I have SDK tools version 1.0 (compiler version 1.0.3 as we know) and 1.5 installed. Why is it returning only the tsc.exe information from the 1.0 SDK folder? It turns out that this information is actually part of the PATH environment variable in Windows. There is no magic or real installation assessment going on here. It simply reads the TypeScript directory embedded within the PATH variable and finds the tsc.exe within that directory. Here is what was within my Windows PATH variable for TypeScript:

C:\Program Files (x86)\Microsoft SDKs\TypeScript\1.0\



Before we get much further: it turns out that all of these version commands, PATH variable values, and the like have absolutely no bearing on which version of TypeScript VS.NET is using and compiling against. I'll get to that in a minute.

However, every Google search for "which TypeScript version do I have installed" yields a bunch of results saying "run tsc -v," which ultimately reads that value in the PATH variable and confuses the heck out of people. They get concerned that VS.NET is not targeting the newer version of TypeScript they installed.

The fact of the matter is that TypeScript can have multiple side-by-side installations, and it's all about the project's targeted value; what's in the PATH variable doesn't matter. You would think that if the TypeScript team wanted to use the PATH variable, they would update it to the newest version upon installing a newer TypeScript version. Not so. It remains stagnant at the old version, which then reports the older TypeScript compiler version, leaving everyone confused. I found a comment on the following GitHub thread which confirms that folks will have to update the PATH variable manually for the time being:



Before manually changing the PATH variable to point to the newer TypeScript SDK version, let's look at what VS.NET reads to know which compiler to target. Unfortunately it is not available as a nice dropdown in the TypeScript properties for the project (hence adding to my rating of '2' for the version fun with TypeScript). Instead it is in the project's configuration within the .csproj or .vbproj file. I particularly like the EditProj Visual Studio extension, which adds a nice 'Edit Project File' option to the context menu when right-clicking a project within VS.NET. Doing this will bring up the project's configuration, and I can see the TypeScript version targeted and used by the project inside the TypeScriptToolsVersion tag:
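
For reference, in a project created against the 1.0 tools the element looks something like this:

<TypeScriptToolsVersion>1.0</TypeScriptToolsVersion>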



Now VS.NET will append this value to the C:\Program Files (x86)\Microsoft SDKs\TypeScript\ path to get the compiler used for the project. In this case we know from above that the 1.0.3 version of the TypeScript compiler is in that directory.

Let's do a test and write some TypeScript using the spread operator, an ES6 feature that isn't supported until version 1.5 of TypeScript (which can compile it down to ES5 JavaScript). Technically the 1.0.3 compiler can still compile the following to JS, but it will complain in the IDE, which is what we'd expect:
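
The code in question looks something like this (a representative example; the function and variable names are just for illustration):

// Spreading an array into a function call is an ES6 / TypeScript 1.5 feature
function addScores(...scores: number[]): number {
    return scores.reduce(function (total, score) { return total + score; }, 0);
}

var frameScores = [7, 10, 3];
var total = addScores(...frameScores); // the spread usage is what older compilers flag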



Notice how we get the red squiggly under the spread operator usage (three dots ...) and a notice that this is a TypeScript 1.5 feature.

Now we can prove that the tsc -v output and the resulting PATH variable value are not what's being used by VS.NET (it's the tools version I showed above). If we change the PATH variable's TypeScript directory value from:

C:\Program Files (x86)\Microsoft SDKs\TypeScript\1.0\

to:

C:\Program Files (x86)\Microsoft SDKs\TypeScript\1.5\

...in the System variables (the TypeScript directory is embedded within the Path variable, so copy it out to Notepad to locate and modify it):


...and open a new command window, we can see that the tsc -v command now reads the updated PATH variable and outputs the resulting 1.5.0 version.



So cool, right? VS.NET should now reflect version 1.5 of TypeScript being used and our warning should go away. Nope. Save your TypeScript file, rebuild, reopen VS.NET, do whatever you feel, and you will still see the warning above. That's because what matters to VS.NET is the version in its own project configuration, not what the PATH variable returns via the version command.



What we need to do is manually update the project's configuration to point to the 1.5 SDK tools version (remember, this should be the value of the folder that VS.NET appends to the SDK directory, not the actual compiler version). Using the 'Edit Project File' tool I mentioned previously, I can change the 1.0 tools version to 1.5:
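
In the .csproj that amounts to a one-line change, something along these lines:

<TypeScriptToolsVersion>1.5</TypeScriptToolsVersion>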



If I go back to my TypeScript file, I immediately notice the spread operator is understood and accepted, as we are now pointing to version 1.5 of the TypeScript compiler:



So that's TypeScript versioning as of today. If you want to target a newer (or older) version of TypeScript, or just want to see which version your project is currently using, you'll need to take a look in the project's configuration for the TypeScriptToolsVersion value.

As an aside, VS.NET will warn you if you have a newer version of the TypeScript tools installed, but your project is targeting an older version. It will ask if you would like to upgrade your project:



I can say I've had mixed results saying 'Yes' to this dialog, including while writing this post; it did not update the version to 1.5 in my project's properties. I still had to manually modify the version.

To this end, I've made a recommendation on Microsoft Connect that the TypeScript tools version (and corresponding compiler version), be selectable from within the IDE on the 'TypeScript Build' tab within the project's properties. You can read and vote for this if you would like here: Allow switching TypeScript configured version from Project Properties IDE

Using OpenCover and ReportGenerator to get Unit Testing Code Coverage Metrics in .NET

If you are fortunate enough to use VS.NET Ultimate or purchase a license to a product such as dotCover, then you already have access to unit testing code coverage tools. However, there is still an easy and powerful way to get the same type of metrics using a combination of mstest.exe and (2) open source tools: OpenCover and ReportGenerator.



OpenCover will leverage mstest.exe to run your unit tests and analyze the code to determine how much coverage your application has from the unit tests written against it. ReportGenerator will then take those results and display them in a generated .html report. The really cool part of it all is that since it is all scriptable, you could make the output report an artifact of a Continuous Integration (CI) build definition. In this manner you can see how a team or project is doing in reference to the code base and unit testing after each check-in and build.


A quick word on what this post will not get into - What percentage indicates the code has decent coverage? 100%? 75%? Does it matter? The answer is, it depends, and there is no single benchmark to use. The answer lies in the fact that one should strive to create unit tests that are meaningful. Unit testing getters and setters to achieve 100% code coverage might not be a good use of time; making sure a critical workflow's state and behavior are as intended is the kind of unit test worth writing. The output of these tools will just help highlight any holes in unit testing that might exist. I could go into detail on percentages of code coverage and what types of tests to write in another post. The bottom line - just make sure you are at least doing some unit testing. 0% is not acceptable by any means.

Prior to starting, you can also get this script from my 'BowlingSPAService' solution in GitHub within the BowlingSPA repository. You can clone the repository to your machine and inspect or run the script. Run the script as an Administrator and view the output.

BowlingSPA GitHub


1. Download NuGet Packages:


Download and import the following (2) open source packages from NuGet into your test project. If you have multiple test projects, no worries; the packages will be referenced in the script via the solution's 'packages' folder location and not via any specific project. The test projects' output is the target of these packages.

OpenCover - Used for calculating the metrics

Report Generator - Used for displaying the metrics
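
If you prefer the Package Manager Console, the installs are simply the following (with your test project selected as the default project):

PM> Install-Package OpenCover
PM> Install-Package ReportGenerator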

The documentation you'll need to refer to most is for OpenCover. Its Wiki is on GitHub and can be found at the location below. ReportGenerator doesn't need much tweaking, as it really just displays the output metrics generated by OpenCover as an HTML report. This was my guide for creating the batch file commands used in this article.

OpenCover Wiki


2. Create a .bat file script in your solution 


I prefer to place these types of artifacts in a 'Solution Folder' (virtual folder) at the root to be easily accessible.

3. Use the following commands to generate the metrics and report

   a. Run OpenCover using mstest.exe as the target:

Note: Make sure the file versions in this script code are updated to match whatever NuGet package version you have downloaded.

"%~dp0..\packages\OpenCover.4.5.3723\OpenCover.Console.exe" ^
-register:user ^
-target:"%VS120COMNTOOLS%\..\IDE\mstest.exe" ^
-targetargs:"/testcontainer:\"%~dp0..\BowlingSPAService.Tests\bin\Debug\BowlingSPAService.Tests.dll\" /resultsfile:\"%~dp0BowlingSPAService.trx\"" ^
-filter:"+[BowlingSPAService*]* -[BowlingSPAService.Tests]* -[*]BowlingSPAService.RouteConfig" ^
-mergebyhash ^
-skipautoprops ^
-output:"%~dp0\GeneratedReports\BowlingSPAServiceReport.xml"



The main pieces to point out here are the following:
  • Leverages mstest.exe to target 'BowlingSPAService.Tests.dll' and send the test results to an output .trx file. Note: you can chain together as many test .dlls as you have in your solution; you may well have more than 1 test project
  • I've added some filters that include anything in the 'BowlingSPAService' namespace, but exclude code in the 'BowlingSPAService.Tests' namespace, as I don't want metrics on the test code itself or for it to show up on the output report. Note: these filters can have as many or as few conditions as you need for your application. After getting familiar with the report, you will probably want to remove auto-generated classes (i.e. Entity Framework, WCF, etc.) from the results via their namespace.
  • Use 'mergebyhash' to merge results loaded from multiple assemblies
  • Use 'skipautoprops' to skip .NET 'AutoProperties' from being analyzed (basic getters and setters don't require unit tests and thus shouldn't be reported on the output)
  • Output the information for the report (used by ReportGenerator) to 'BowlingSPAServiceReport.xml'

   b. Run Report Generator to create a human readable HTML report

"%~dp0..\packages\ReportGenerator.2.1.5.0\ReportGenerator.exe" ^
-reports:"%~dp0\GeneratedReports\BowlingSPAServiceReport.xml" ^
-targetdir:"%~dp0\GeneratedReports\ReportGenerator Output"

The main pieces to point out here are the following:
  • Calls ReportGenerator.exe from the packages directory (NuGet), providing the output .xml report file generated from #3(a) above, and specifying the target directory in which to generate the index.htm page. 
  • The report creation directory can be anywhere you wish, but I created a folder named 'ReportGenerator Output'

   c. Automatically open the report in the browser

start "report" "%~dp0\GeneratedReports\ReportGenerator Output\index.htm"

The main pieces to point out here are the following:
  • This will open the generated report in the machine's default browser
  • Note: if IE is used, you will be prompted to allow 'Blocked Content.' I usually allow it, as it enables the links on the page that collapse and expand report sections.

4. Stitch together all the sections into a single script to run


REM Create a 'GeneratedReports' folder if it does not exist
if not exist "%~dp0GeneratedReports" mkdir "%~dp0GeneratedReports"

REM Remove any previous test execution files to prevent issues overwriting
IF EXIST "%~dp0BowlingSPAService.trx" del "%~dp0BowlingSPAService.trx"

REM Remove any previously created test output directories
CD %~dp0
FOR /D /R %%X IN (%USERNAME%*) DO RD /S /Q "%%X"

REM Run the tests against the targeted output
call :RunOpenCoverUnitTestMetrics

REM Generate the report output based on the test results
if %errorlevel% equ 0 ( 
 call :RunReportGeneratorOutput 
)

REM Launch the report
if %errorlevel% equ 0 ( 
 call :RunLaunchReport 
)
exit /b %errorlevel%

:RunOpenCoverUnitTestMetrics
"%~dp0..\packages\OpenCover.4.5.3723\OpenCover.Console.exe" ^
-register:user ^
-target:"%VS120COMNTOOLS%\..\IDE\mstest.exe" ^
-targetargs:"/testcontainer:\"%~dp0..\BowlingSPAService.Tests\bin\Debug\BowlingSPAService.Tests.dll\" /resultsfile:\"%~dp0BowlingSPAService.trx\"" ^
-filter:"+[BowlingSPAService*]* -[BowlingSPAService.Tests]* -[*]BowlingSPAService.RouteConfig" ^
-mergebyhash ^
-skipautoprops ^
-output:"%~dp0\GeneratedReports\BowlingSPAServiceReport.xml"
exit /b %errorlevel%

:RunReportGeneratorOutput
"%~dp0..\packages\ReportGenerator.2.1.5.0\ReportGenerator.exe" ^
-reports:"%~dp0\GeneratedReports\BowlingSPAServiceReport.xml" ^
-targetdir:"%~dp0\GeneratedReports\ReportGenerator Output"
exit /b %errorlevel%

:RunLaunchReport
start "report" "%~dp0\GeneratedReports\ReportGenerator Output\index.htm"
exit /b %errorlevel%


This is what the complete script could look like. It adds the following pieces:

    a. Create an output directory for the report if it doesn't already exist
    b. Remove previous test execution files to prevent overwrite issues
    c. Remove previously created test output directories
    d. Run all sections together synchronously, ensuring each step finishes successfully before proceeding


5. Analyze the output



Upon loading the report you can see immediately in percentages how well the projects are covered. As one can see below I have a decent start to the coverage in the pertinent areas of BowlingSPAService, but the report shows I need some additional testing. However, this is exactly the tool that makes me aware of this void. I need to write some more unit tests! (I was eager to get this post completed and published before finishing unit testing) :)

You can expand an individual project to get a visual of each of the class's unit test coverage:


By selecting an individual class, you can see line-level coverage visually with red and green highlighting. Red means there isn't coverage, and green means there is. This report and visual metric highlight well when unit tests have probably only been written for 'happy path' scenarios and there are gaps for required negative or branching scenarios. By inspecting the classes that are not at 100% coverage, you can easily identify gaps and write additional unit tests to increase the coverage where needed.




Upon review you might find individual 'template' classes or namespaces that add 'noise' and are not realistically valid targets for the unit testing metrics. Add filters like the following to the -filter switch in the OpenCover command to remove a namespace or a single class, respectively:
  • -[*]BowlingSPAService.WebAPI.Areas.HelpPage.*
  • -[*]BowlingSPAService.WebApiConfig
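Slotted into the script's -filter switch, the full set would then look something like this:

-filter:"+[BowlingSPAService*]* -[BowlingSPAService.Tests]* -[*]BowlingSPAService.RouteConfig -[*]BowlingSPAService.WebAPI.Areas.HelpPage.* -[*]BowlingSPAService.WebApiConfig" ^
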
After adding or modifying unit tests, you can run the report again and see the updated results. Questions about the particulars of how the metrics are evaluated are better directed toward the source for OpenCover on GitHub.

That's all you should need to get up and running generating unit test metrics for your .NET solutions! As mentioned previously, you might want to add this as an output artifact on your CI build server and provide the report link to a wider audience for general viewing. There are also additional fine-tuning options and customizations you can make, so be sure to check out the OpenCover Wiki I linked previously.

Big News From Microsoft Build 2015 Day 1


As is typical, some big and cool announcements were made on the Microsoft .NET front regarding the framework and surrounding technologies. Today at Build, 3 of the biggest announcements were the release of .NET Core for Mac and Linux, the release of Visual Studio Code, and finally the ability to compile and run Android and iOS app code in a Universal app for Windows.

The .NET Core release announcement will have the most impact for ASP.NET 5 (vNext), allowing one to build web sites and services on Windows, Linux, and Mac using .NET Core. Visual Studio Code will have an impact by being an all-platform code editor, which should give some competition to the likes of Sublime and Notepad++. Finally, the ability to reuse Android and iOS app code to build and create Windows Universal apps will position Microsoft well going forward.

As a good friend and colleague of mine, Anthony Handley, said best: "This really IS a new Microsoft." This huge push for OSS and cross-platform development will certainly strengthen Windows 10 and Windows-based development overall. Since Microsoft appeared to be running a distant 3rd in the mobile space from both a hardware and software perspective, these moves to allow Android (Java / C++) and iOS (Objective-C) code to be compiled on Windows into Windows Universal apps, reusing existing code, are a winner's move in my opinion. Combine this with the ability to run a portion of the .NET Framework on Mac and Linux, and you have a nice 1-2 punch strategy. This will certainly (and hopefully) bring Microsoft back into the picture as they drop the 'proprietary' walls and allow in cross-platform app building.

As a modified version of the old saying goes, "if you can't beat them, join them, become best friends, and conquer the world!"

Microsoft releases .NET Core preview for Mac and Linux

Microsoft Launches Visual Studio Code, A Free Cross-Platform Code Editor For OS X, Linux And Windows

Huge news: Windows 10 can run reworked Android and iOS apps

Orlando Code Camp 2015


It was nice to see everyone that came out to Orlando Code Camp 2015! As I understand it there were 800+ attendees and a ton of great speakers and content. I enjoyed giving my presentation entitled "Single Page Applications for the ASP.NET Developer using AngularJS, TypeScript, and WebAPI." There was a great turnout and I was happy to be there.

For those interested, here is the URL to the GitHub repository with the sample app:
BowlingSPA GitHub

This is the slide deck from the presentation:


I really enjoy speaking at events put on by the Orlando .NET User Group (ONETUG). They are a fantastic group of individuals, and I look forward to hopefully speaking down in Orlando later this year at another user group event!

I'm Speaking at Orlando Code Camp 2015


Orlando Code Camp 2015 is almost here! This Saturday, March 28th, I'll be down at Code Camp presenting my session titled "Single Page Applications for the ASP.NET Developer using AngularJS, TypeScript, and WebAPI." If you are interested in web development, or just curious about what SPAs entail, this should be a great session to attend.

The best part about Code Camp is it is FREE to attend and draws some fantastic speakers and attendees from all around. In all the years I have attended, I really think it's on par with some of the major software conferences and has many of the same speakers too. It's well worth attending to get some great information and have a chance to network with area peers. 

Check out the site to register, and hope to see you there!

Resolve Multiple Interface Bindings With Ninject

The old adage "program to an abstraction, not a concretion" rings true and is applied in most modern application design via the use of Interfaces and concepts like IoC and DI. The polymorphic behavior of Interfaces and their ability to have multiple implementations allows us to do some really cool things, as well as be staged for highly testable code with mocking frameworks like Moq.

However I'd say 99% of the time we typically have 1 Interface bound to a single concrete class using our DI framework. There will be cases where you will want to take advantage of having more than 1 implementation of an Interface and need to configure this.

Ninject is a great DI framework, and you can achieve this through the use of named bindings. Say we have an Interface named ICalculate with 2 (or more) implementations. Upon injecting the Interface into a constructor, how would you dictate which binding to use? Named bindings accomplish this as follows.

1. Provide a name for the bindings using the same Interface to allow them to be unique:
Bind<ICalculate>().To<CalcImpl1>().Named("Calculation1");
Bind<ICalculate>().To<CalcImpl2>().Named("Calculation2");

2. Specify the named binding as an attribute applied on the argument of the Interface being injected:
readonly ICalculate _calculate;
public MathCalculations([Named("Calculation1")] ICalculate calculate){
    _calculate = calculate;
}
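
Putting it all together, a minimal sketch of the wiring might look like this (the CalculationModule and kernel setup here are illustrative, not from the original post):

using Ninject;
using Ninject.Modules;

public class CalculationModule : NinjectModule
{
    public override void Load()
    {
        // Named bindings from step 1
        Bind<ICalculate>().To<CalcImpl1>().Named("Calculation1");
        Bind<ICalculate>().To<CalcImpl2>().Named("Calculation2");
    }
}

public class Program
{
    public static void Main()
    {
        var kernel = new StandardKernel(new CalculationModule());

        // MathCalculations receives CalcImpl1 because its constructor
        // argument is decorated with [Named("Calculation1")]
        var mathCalculations = kernel.Get<MathCalculations>();
    }
}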

It's that easy! For more information, see the Ninject documentation here.

I'm Speaking at Modern Apps LIVE! (LIVE! 360) Las Vegas

I'm excited to be speaking at the upcoming Modern Apps LIVE! conference, co-located with the Visual Studio LIVE! / LIVE! 360 Las Vegas conference, March 16-20. This conference track has a fantastic lineup of speakers including Jason Bock, Brent Edwards, Anthony Handley, Rocky Lhotka, and Kevin Ford, with a variety of topics surrounding Modern Application Development.
There is still time to save $500 using my speaker registration code: LVSPK30. Click on the banner below to go straight to the registration page.


Speaking

Here are the Modern Apps LIVE! sessions I'll be speaking at during the conference:

I hope to see you there and don't miss out on the registration savings above!!