Scott Hanselman

Console2 - A Better Windows Command Prompt

June 8, '11 Comments [78] Posted in Tools

I was working on my Mac today and while I maintain that the OS X Finder is as effective as shooting your hands full of Novocaine, I remain envious of the simplicity of their Terminal. Not much interesting has happened in the command prompt world in Windows since, well, ever. I actually blogged about text mode as a missed opportunity in 2004. That post is still valid today, I think. Text is fast. I spend lots of time there and I will race anyone with a mouse, any day.

I blogged about Console2 as a better prompt for CMD.exe in 2005. Here we are 6 years later and I hopped over there to see if Console2 was still being developed. They were on build 122 then, and they are, magically and to their extreme credit, still around and on build 147. Epic.

Open Source projects may be done, but they are never dead.

I downloaded Console2 at http://sourceforge.net/projects/console/files/ and put it in c:\dev\utils which is in my PATH.

Here's how I set it up for my default awesomeness.

  • Right-click in the main console and click Edit | Settings.
  • Under Console, set your default Startup Directory
  • Under Appearance|More, hide the menu, status bar and toolbar.
  • Under Appearance, set the font to Consolas 15. Not 14, not 16. Black background, Kermit green foreground color.
  • Set Window Transparency to a nice conservative 40 for both Active and Inactive. Not too in your face, but enough glassiness to say "I'm a subtle badass."
  • Under Behavior set "Copy on Select"
  • Under Hotkeys, change the New Tab 1 hotkey to Ctrl-T because that's what it should be. You'll have to click on the hotkey, then in the textbox, then type the hot-key you want AND press Assign for it to stick.
  • Under Hotkeys, change Copy Selection to Ctrl-C and Paste to Ctrl-V then rejoice and wonder why Windows doesn't work like this today. At this point, you may want to decide if you want "Copy on Select" to happen automatically under Behavior. That'll save you the Control-C if you like.
  • Now, the subtlety. Under Tabs, you (if you are me) want two default tabs, one for CMD.EXE and one for PowerShell because you don't like your peas and carrots to touch on your plate.
    • Set your Console|cmd.exe first tab to this shell if you want it to be a Visual Studio command prompt. Be aware of the PATH if you are not on x64 like I am.
    • Then, make another Tab called PowerShell with this path:
      • %SystemRoot%\syswow64\WindowsPowerShell\v1.0\powershell.exe
      • And I used the vspowershell.ico icon 'cause I'm into flair.

You'll have a nice "New Tab" option where you can open either shell. Note the general loveliness of this understated shell. I can open a new Tab with Ctrl-T (or lots of them) and use Ctrl-Tab to move between them. I took the screenshot with the background so you can see the transparency.

One final reason why Console2 rocks? It's freaking resizable in two directions, unlike the Windows CMD.exe console.


Console2 is a great little front-end for your existing shell, no matter what it is. Note that Console2 isn't a shell itself, it's just a face on whatever you are already using. Enjoy.


Hanselminutes Podcast 268 - Personal Systems of Organization - Rey Bango interviews Scott Hanselman

May 31, '11 Comments [16] Posted in Musings | Podcast

When life gives you lemons, organize your closet and install a valet hook

The tables turned this week and Rey Bango interviews Scott on his personal systems of organization. How has Scott synthesized the systems of Stephen Covey, David Allen, J.D. Meier and the Pomodoro Technique into a living system that works for him?


Download: MP3 Full Show

NOTE: If you want to download our complete archives as a feed - that's all 266 shows - subscribe to the Complete MP3 Feed here.

Also, please do take a moment and review the show on iTunes.

Subscribe: Subscribe to Hanselminutes or Subscribe to my Podcast in iTunes or Zune

Do also remember the complete archives are always up and they have PDF Transcripts, a little-known feature that shows up a few weeks after each show.

Telerik is our sponsor for this show.

Building quality software is never easy. It requires skills and imagination. We cannot promise to improve your skills, but when it comes to User Interface and developer tools, we can provide the building blocks to take your application a step closer to your imagination. Explore the leading UI suites for ASP.NET AJAX, MVC, Silverlight, Windows Forms and WPF. Enjoy developer tools like .NET Reporting, ORM, Automated Testing Tools, Agile Project Management Tools, and Content Management Solutions. And now you can increase your productivity with JustCode, Telerik’s new productivity tool for code analysis and refactoring. Visit www.telerik.com.

As I've said before, this show comes to you with the audio expertise and stewardship of Carl Franklin. The name comes from Travis Illig, but the goal of the show is simple: avoid wasting the listener's time (and make the commute less boring).

Enjoy. Who knows what'll happen in the next show?


Hanselminutes Podcast 267 - Before The Show: Off the Cuff Conversation with Jeff Atwood (EXPLICIT)

May 31, '11 Comments [13] Posted in Podcast

Sometimes the most interesting conversations happen before or after the show. Often they happen with Jeff Atwood. I (Scott) called Jeff to get some audio for our other show http://thisdeveloperslife.com and was recording as soon as Jeff and I started chatting. Here's our unedited, random, personal phone call that I thought might be fun.

To be clear: This was a personal conversation before we recorded an episode of This Developer's Life. I thought it'd be interesting to share. I did beep out the swear words. Sorry if that offends.

Download: MP3 Full Show

NOTE: If you want to download our complete archives as a feed - that's all 266 shows - subscribe to the Complete MP3 Feed here.

Also, please do take a moment and review the show on iTunes.

Subscribe: Subscribe to Hanselminutes or Subscribe to my Podcast in iTunes or Zune

Do also remember the complete archives are always up and they have PDF Transcripts, a little-known feature that shows up a few weeks after each show.

Telerik is our sponsor for this show.

Building quality software is never easy. It requires skills and imagination. We cannot promise to improve your skills, but when it comes to User Interface and developer tools, we can provide the building blocks to take your application a step closer to your imagination. Explore the leading UI suites for ASP.NET AJAX, MVC, Silverlight, Windows Forms and WPF. Enjoy developer tools like .NET Reporting, ORM, Automated Testing Tools, Agile Project Management Tools, and Content Management Solutions. And now you can increase your productivity with JustCode, Telerik’s new productivity tool for code analysis and refactoring. Visit www.telerik.com.

As I've said before, this show comes to you with the audio expertise and stewardship of Carl Franklin. The name comes from Travis Illig, but the goal of the show is simple: avoid wasting the listener's time (and make the commute less boring).

Enjoy. Who knows what'll happen in the next show?


Globalization, Internationalization and Localization in ASP.NET MVC 3, JavaScript and jQuery - Part 1

May 26, '11 Comments [35] Posted in ASP.NET | ASP.NET MVC | Internationalization | Javascript

There are several books' worth of information to be said about Internationalization (i18n) out there, so I can't solve it all in a blog post. Even 9 pages of blog posts. I like to call it Iñtërnâtiônàlizætiøn, actually.

There are a couple of basic things to understand, though, before you create a multilingual ASP.NET application. Let's agree on some basic definitions as these terms are often used interchangeably.

  • Internationalization (i18n) - Making your application able to support a range of languages and locales
  • Localization (L10n) - Making your application support a specific language/locale.
  • Globalization - The combination of Internationalization and Localization
  • Language - For example, Spanish generally. ISO code "es"
  • Locale - Mexico. Note that Spanish in Spain is not the same as Spanish in Mexico, e.g. "es-ES" vs. "es-MX" (see the quick sketch after this list)
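
To make the language vs. locale distinction concrete, here's a quick sketch: "es" is a neutral culture (language only), while "es-MX" and "es-ES" are specific cultures whose formatting differs. The exact output strings are approximate and depend on your OS and .NET version.

using System;
using System.Globalization;

class CultureDemo
{
    static void Main()
    {
        var spanish = new CultureInfo("es");      // language only (neutral culture)
        var mexican = new CultureInfo("es-MX");   // language + locale (specific culture)
        var castilian = new CultureInfo("es-ES");

        Console.WriteLine(spanish.IsNeutralCulture);             // True
        Console.WriteLine(123456.78.ToString("c", mexican));     // roughly $123,456.78
        Console.WriteLine(123456.78.ToString("c", castilian));   // roughly 123.456,78 €
    }
}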

Culture and UICulture

The User Interface Culture is a CultureInfo instance from the .NET base class library (BCL). It lives on Thread.CurrentThread.CurrentUICulture and if you felt like it, you could set it manually like this:

Thread.CurrentThread.CurrentUICulture = new CultureInfo("es-MX");

The CurrentCulture is used for Dates, Currency, etc.

Thread.CurrentThread.CurrentCulture = new CultureInfo("es-MX"); 

However, you really ought to avoid doing this kind of stuff unless you know what you're doing and you really have a good reason.

The user's browser will report their language preferences in the Accept-Language HTTP header like this:

GET http://www.hanselman.com HTTP/1.1
Connection: keep-alive
Cache-Control: max-age=0
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8

See how I prefer en-US and then en? I can get ASP.NET to automatically pick up those values and set up the threads with the correct culture. I just need to set my web.config like this:

<system.web>
    <globalization culture="auto" uiCulture="auto" />
    ...snip...
</system.web>

That one line will do the work for me. At this point the current thread's culture and UI culture will be automatically set by ASP.NET.
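
If you want to sanity-check that the auto setting is working, a trivial controller action (just a sketch, the controller name is mine) can read back what ASP.NET decided for the request:

using System.Threading;
using System.Web.Mvc;

public class CultureCheckController : Controller
{
    public ActionResult Index()
    {
        // These were set automatically from the Accept-Language header
        ViewBag.UICulture = Thread.CurrentThread.CurrentUICulture.Name; // e.g. "en-US"
        ViewBag.Culture = Thread.CurrentThread.CurrentCulture.Name;
        return View();
    }
}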

The Importance of Pseudointernationalization

Back in 2005 I updated John Robbins' Pseudoizer (and misspelled it then!) and I've just ported it over to .NET 4 and used it for this application. I find this technique for creating localizable sites really convenient because I'm effectively changing all the strings within my app to another language, which allows me to spot strings I missed without the tedium of actually translating them.

You can download the .NET Pseudoizer here.

UPDATE: I've put the source for Pseudoizer up on GitHub. You are welcome to fork/clone it and send pull requests or make your own versions.

Here's an example from that earlier post before I run it through Pseudointernationalization:

<data name="Accounts.Download.Title">
    <value>Transaction Download</value>
</data>
<data name="Accounts.Statements.Action.ViewStatement">
    <value>View Statement</value>
</data>
<data name="Accounts.Statements.Instructions">
    <value>Select an account below to view or download your available online statements.</value>
</data>

I can convert these resources with the pseudoizer like this:

PsuedoizerConsole examplestrings.en.resx examplestrings.xx.resx

and here's the result:

<data name="Accounts.Download.Title">
    <value>[Ŧřäʼnşäčŧįőʼn Đőŵʼnľőäđ !!! !!!]</value>
</data>
<data name="Accounts.Statements.Action.ViewStatement">
    <value>[Vįęŵ Ŝŧäŧęmęʼnŧ !!! !!!]</value>
</data>
<data name="Accounts.Statements.Instructions">
    <value>[Ŝęľęčŧ äʼn äččőūʼnŧ þęľőŵ ŧő vįęŵ őř đőŵʼnľőäđ yőūř äväįľäþľę őʼnľįʼnę şŧäŧęmęʼnŧş. !!! !!! !!! !!! !!!]</value>
</data>

Cool, eh? If you're working with RESX files a lot, be sure to familiarize yourself with the resgen.exe command-line tool that is included with Visual Studio and the .NET SDK. You have this on your system already. You can move easily between the RESX XML-based file format and a more human- (and translator-) friendly text name=value format like this:

resgen /compile examplestrings.xx.resx,examplestrings.xx.txt

And now they are a nice name=value format, and as I said, I can move between them.

Accounts.Download.Title=[Ŧřäʼnşäčŧįőʼn Đőŵʼnľőäđ !!! !!!]
Accounts.Statements.Action.ViewStatement=[Vįęŵ Ŝŧäŧęmęʼnŧ !!! !!!]
Accounts.Statements.Instructions=[Ŝęľęčŧ äʼn äččőūʼnŧ þęľőŵ ŧő vįęŵ őř đőŵʼnľőäđ yőūř äväįľäþľę őʼnľįʼnę şŧäŧęmęʼnŧş. !!! !!! !!! !!! !!!]

During development time I like to add this Pseudoizer step to my Continuous Integration build or as a pre-build step and assign the resources to a random language I'm NOT going to be creating, like Polish (with all due respect to the Poles), so I'd make examplestrings.pl.resx and then we can test our fake language by changing our browser's UserLanguages to prefer pl-PL over en-US.

Localization Fallback

Different languages take different amounts of space. God bless the Germans but their strings will take an average of 30% more space than English phrases. Chinese will take 30% less. The Pseudoizer pads strings in order to illustrate these differences and encourage you to take them into consideration in your layouts.

Localization within .NET (not specific to ASP.NET Proper or ASP.NET MVC) implements a standard fallback mechanism. That means it will start looking for the most specific string for the required locale, then fall back, continuing to look until it ends up at the neutral language (whatever that is). This fallback is handled by convention-based naming. Here is an older, but still excellent live demo of Resource Fallback at ASPAlliance.

For example, let's say there are three resources. Resources.resx, Resources.es.resx, and Resources.es-MX.resx.

Resources.resx:
HelloString=Hello, what's up?
GoodbyeString=See ya!
DudeString=Duuuude!

Resources.es.resx:
HelloString=¿Cómo está?
GoodbyeString=Adiós!

Resources.es-MX.resx:
HelloString=¿Hola, qué tal?

Consider these three files in a fallback scenario. The user shows up with his browser requesting es-MX. If we ask for HelloString, he'll get the most specific one. If we ask for GoodbyeString, we have no "es-MX" equivalent, so we move up one to just "es." If we ask for DudeString, we have no es strings at all, so we'll fall all the way back to the neutral resource.
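
Here's a minimal sketch of that same lookup done explicitly with ResourceManager (the "MyApp.Resources" base name is just a placeholder for wherever your Resources.resx actually lives):

using System;
using System.Globalization;
using System.Reflection;
using System.Resources;

class FallbackDemo
{
    static void Main()
    {
        var resources = new ResourceManager("MyApp.Resources", Assembly.GetExecutingAssembly());
        var esMX = new CultureInfo("es-MX");

        Console.WriteLine(resources.GetString("HelloString", esMX));   // found in Resources.es-MX.resx
        Console.WriteLine(resources.GetString("GoodbyeString", esMX)); // not in es-MX, falls back to Resources.es.resx
        Console.WriteLine(resources.GetString("DudeString", esMX));    // not in es-MX or es, falls back to the neutral Resources.resx
    }
}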

Using this basic concept of fallback, you can minimize the number of strings you localize and provide users with not only language specific strings (Spanish) but also locale specific (Mexican Spanish) strings. And yes, I realize this is a silly example and isn't really representative of Spaniards or Mexican colloquial language.

Views rather than Resources

If you don't like the idea of resources, while you will still have to deal with some resources, you could also have different views for different languages and locales. You can structure your ~/Views folders like Brian Reiter and others have. It's actually pretty obvious once you have bought into the idea of resource fallback as above. Here's Brian's example:

/Views
    /Globalization
        /ar
            /Home
                /Index.aspx
            /Shared
                /Site.master
                /Navigation.aspx
        /es
            /Home
                /Index.aspx
            /Shared
                /Navigation.aspx
        /fr
            /Home
                /Index.aspx
            /Shared
    /Home
        /Index.aspx
    /Shared
        /Error.aspx
        /Footer.aspx
        /Navigation.aspx
        /Site.master

Just as you can let ASP.NET change the current UI culture based on UserLanguages or a cookie, you can also control the way that Views are selected by a small override of your favorite ViewEngine. Brian includes a few lines to pick views based on a language cookie on his blog.
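
Brian's code is on his blog; here's a rough sketch of the idea (not his exact code, and the class name and paths are my own) against the Razor view engine. The same override works on WebFormViewEngine if you're using the .aspx views shown above. Register it first in Application_Start with ViewEngines.Engines.Insert(0, new GlobalizedRazorViewEngine());

using System.Threading;
using System.Web.Mvc;

public class GlobalizedRazorViewEngine : RazorViewEngine
{
    public override ViewEngineResult FindView(ControllerContext controllerContext, string viewName, string masterName, bool useCache)
    {
        // Check the override cookie first, then fall back to the thread's UI culture
        var cookie = controllerContext.HttpContext.Request.Cookies["language"];
        var lang = (cookie != null && !string.IsNullOrEmpty(cookie.Value))
            ? cookie.Value
            : Thread.CurrentThread.CurrentUICulture.TwoLetterISOLanguageName;

        var controller = controllerContext.RouteData.GetRequiredString("controller");
        var localizedPath = string.Format("~/Views/Globalization/{0}/{1}/{2}.cshtml", lang, controller, viewName);

        // If a culture-specific view exists, use it; otherwise fall back to the normal lookup
        if (FileExists(controllerContext, localizedPath))
        {
            return new ViewEngineResult(CreateView(controllerContext, localizedPath, masterName), this);
        }
        return base.FindView(controllerContext, viewName, masterName, useCache);
    }
}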

He also includes some simple jQuery to allow a user to override their language with a cookie like this:

var mySiteNamespace = {};

mySiteNamespace.switchLanguage = function (lang) {
    $.cookie('language', lang);
    window.location.reload();
};

$(document).ready(function () {
    // attach mySiteNamespace.switchLanguage to click events based on css classes
    $('.lang-english').click(function () { mySiteNamespace.switchLanguage('en'); });
    $('.lang-french').click(function () { mySiteNamespace.switchLanguage('fr'); });
    $('.lang-arabic').click(function () { mySiteNamespace.switchLanguage('ar'); });
    $('.lang-spanish').click(function () { mySiteNamespace.switchLanguage('es'); });
});

I'd probably make this a single client-side event handler and use an HTML5 data-lang attribute (brainstorming) like this:

$(document).ready(function () {
    $('.language').click(function (event) {
        $.cookie('language', $(event.target).data('lang'));
    })
});

But you get the idea. You can set override cookies, check those first, then check the UserLanguages header. It depends on the experience you're looking for, and you need to hook it up between the client and the server.
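
For the server half of that hookup, here's one sketch (in Global.asax.cs): honor the 'language' override cookie when it's present, and otherwise let the web.config "auto" setting win. In real code you'd also validate the cookie value before using it.

using System;
using System.Globalization;
using System.Threading;
using System.Web;

public class MvcApplication : HttpApplication
{
    protected void Application_AcquireRequestState(object sender, EventArgs e)
    {
        var cookie = Request.Cookies["language"];
        if (cookie == null || string.IsNullOrEmpty(cookie.Value)) return;

        // A neutral name like "fr" is fine for the UI culture,
        // but CurrentCulture needs a specific culture for formatting, hence CreateSpecificCulture
        Thread.CurrentThread.CurrentUICulture = new CultureInfo(cookie.Value);
        Thread.CurrentThread.CurrentCulture = CultureInfo.CreateSpecificCulture(cookie.Value);
    }
}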

Globalized JavaScript Validation

If you're doing a lot of client-side work using JavaScript and jQuery, you'll need to get familiar with the jQuery Global plugin. You may also want the localization files for things like the DatePicker and jQuery UI on NuGet via "install-package jQuery.UI.i18n."

It turns out the one thing you can't ask your browser via JavaScript is what languages it prefers. That value is sitting inside an HTTP header called "Accept-Language" and, since it's a weighted list, looks like this:

en-ca,en;q=0.8,en-us;q=0.6,de-de;q=0.4,de;q=0.2
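
On the server, of course, ASP.NET has already parsed that header for us. Here's a quick sketch (the controller is just for illustration) showing the two ways to get at it server-side:

using System.Web.Mvc;

public class LocaleDebugController : Controller
{
    public ActionResult Headers()
    {
        string raw = Request.Headers["Accept-Language"];          // the raw weighted list
        string[] langs = Request.UserLanguages ?? new string[0];  // parsed, e.g. { "en-ca", "en;q=0.8", ... }
        return Content(raw + " | " + string.Join(", ", langs));
    }
}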

We want to tell jQuery and friends about this value, so we need to get at it from the client side in a different way. I propose this.

This is Cheesy - use Ajax

We could do this with a simple controller on the server side:

public class LocaleController : Controller {
    public ActionResult CurrentCulture() {
        return Json(System.Threading.Thread.CurrentThread.CurrentUICulture.ToString(), JsonRequestBehavior.AllowGet);
    }
}

And then call it from the client side. Ask jQuery to figure it out, and be sure you have the client-side globalization libraries you want for the cultures you'll support. I downloaded all 700 jQuery Globs from GitHub. Then I could make a quick Ajax call and get that info dynamically from the server. I also include the locales I want to support as scripts, like /Scripts/globinfo/jquery.glob.fr.js. You could also build a dynamic parser and load these on demand, or load them ALL when they show up on the Google or Microsoft CDNs as a complete blob.

But that is a little cheesy because I have to make that little JSON call. Perhaps this belongs somewhere else, like a custom META tag.

Slightly Less Cheesy - Meta Tag

Why not put the value of this header in a META tag on the page and access it there? It means no extra AJAX call and I can still use jQuery as before. I'll create an HTML helper and use it in my main layout page. Here's the HTML Helper. It uses the current thread, which was automatically set earlier by the setting we added to the web.config.

namespace System.Web.Mvc
{
    public static class LocalizationHelpers
    {
        public static IHtmlString MetaAcceptLanguage(this HtmlHelper html)
        {
            var acceptLanguage = HttpUtility.HtmlAttributeEncode(Threading.Thread.CurrentThread.CurrentUICulture.ToString());
            return new HtmlString(String.Format("<meta name=\"accept-language\" content=\"{0}\" />", acceptLanguage));
        }
    }
}

I use this helper like this, in the <head> of the main layout page:

@Html.MetaAcceptLanguage()

...

And the resulting HTML looks like this:

<meta name="accept-language" content="en-US" />

Note that this made-up META tag is semantically different from the Content-Language header or the lang= attribute; it's the parsed HTTP header value that ASP.NET decided was our current culture, moved into the client.

Now I can access it with similar code from the client side. I hope to improve this and support dynamic loading of the JS; however, preferCulture isn't smart and actually NEEDS the resources loaded in order to make a decision. I would like a method that would tell me the preferred culture so that I might load the resources on-demand.

So what? Now when I am on the client side, my validation and JavaScript is a little smarter. Once jQuery on the client knows about your current preferred culture, you can start being smart with your jQuery. Make sure you are moving around non-culture-specific data values on the wire, then convert them as they become visible to the user.

var price = $.format(123.789, "c");
jQuery("#price").html(price);
var date = $.format(new Date(1972, 2, 5), "D");
jQuery("#date").html(date);
var units = $.format(12345, "n0");
jQuery("#unitsMoved").html(units);

Now, you can apply these concepts to validation within ASP.NET MVC.

Globalized jQuery Unobtrusive Validation 

Adding onto the code above, we can hook the globalization up to validation, so that we'll better understand how to manage values like 5,50, which is 5.50 to the French, for example. There are a number of validation methods you can hook up; here's number parsing.

$(document).ready(function () {
    //Ask ASP.NET what culture we prefer, because we stuck it in a meta tag
    var data = $("meta[name='accept-language']").attr("content");

    //Tell jQuery to figure it out also on the client side.
    $.global.preferCulture(data);

    //Tell the validator, for example,
    // that we want numbers parsed a certain way!
    $.validator.methods.number = function (value, element) {
        if ($.global.parseFloat(value)) {
            return true;
        }
        return false;
    };
});

If I set my User Languages to prefer French (fr-FR) as in this screenshot:

Language Preference Dialog preferring French

Then my validation realizes that and won't allow 5.50 as a value, but will allow 5,50, given this model:

public class Example
{
    public int ID { get; set; }

    [Required]
    [StringLength(30)]
    public string First { get; set; }

    [Required]
    [StringLength(30)]
    public string Last { get; set; }

    [Required]
    public DateTime BirthDate { get; set; }

    [Required]
    [Range(0,100)]
    public float HourlyRate { get; set; }
}

I'll see this validation error, as the client side knows our preference for ',' as a decimal separator.

NOTE: It seems to me that the [Range] attribute that talks to jQuery Validation doesn't support globalization and isn't calling into the localized methods, so it won't work with the ',' vs. '.' decimal problem. I was able to fix this problem by overriding the range method in jQuery like this, forcing it to use the global implementation of parseFloat. Thanks to Kostas in the comments on this post for this info.

jQuery.extend(jQuery.validator.methods, {
    range: function (value, element, param) {
        //Use the Globalization plugin to parse the value
        var val = $.global.parseFloat(value);
        return this.optional(element) || (val >= param[0] && val <= param[1]);
    }
});

Here it is working with validity...

The Value 4.5 is not valid for Hourly Rate

And here it is in a Danish culture working with [range]:

Localized Range

 

I can also set the Required attribute to use specific resources and names and localize them from an ExampleResources.resx file like this:

public class Example
{
    public int ID { get; set; }

    [Required(ErrorMessageResourceType=typeof(ExampleResources),
        ErrorMessageResourceName="RequiredPropertyValue")]
    [StringLength(30)]
    public string First { get; set; }
    ...snip...

And see this:


NOTE: I'm looking into how to set new defaults for all fields, rather than overriding them individually. I've been able to override some with a resource file that has keys called "PropertyValueInvalid" and "PropertyValueRequired" then setting these values in the Global.asax, but something isn't right.

DefaultModelBinder.ResourceClassKey = "ExampleResources";
ValidationExtensions.ResourceClassKey = "ExampleResources";

I'll continue to explore this.

Dynamically Localizing the jQuery DatePicker

Since I know what the current jQuery UI culture is, I can use it to dynamically load the resources I need for the DatePicker. I've installed the "MvcHtml5Templates" NuGet library from Scott Kirkland so my input type is "datetime", and I've added this little bit of JavaScript that says: do we support dates natively? Are we non-English? If so, go get the right DatePicker script and set its info as the default for our DatePicker by getting the regional settings for the current global culture.

//Setup datepickers if we don't support it natively!
if (!Modernizr.inputtypes.date) {
    if ($.global.culture.name != "en-us" && $.global.culture.name != "en") {
        var datepickerScriptFile = "/Scripts/globdatepicker/jquery.ui.datepicker-" + $.global.culture.name + ".js";
        //Now, load the date picker support for this language
        // and set the defaults for a localized calendar
        $.getScript(datepickerScriptFile, function () {
            $.datepicker.setDefaults($.datepicker.regional[$.global.culture.name]);
        });
    }
    $("input[type='datetime']").datepicker();
}

Then we set up all inputs with type=datetime. You could use a CSS class as well if you like.


Now our jQuery DatePicker is French.

Right to Left (body=rtl)

For languages like Arabic and Hebrew that read Right To Left (RTL) you'll need to change the dir= attribute of the elements you want flipped. Most often you'll set dir="rtl" on the root <html> (or <body>) element, or handle it with CSS like:

div {
    direction: rtl;
}

The point is to have a general strategy, whether it be a custom layout file for RTL languages or just flipping your shared layout with either CSS or an HTML Helper. Often folks put the direction in the resources and pull out the value ltr or rtl depending on the culture.
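
If you'd rather not store "ltr"/"rtl" strings in every resource file, here's a sketch of an HTML helper (the name is mine, just for illustration) that derives the direction from the current UI culture instead. Then in the layout you'd write <html @Html.HtmlDirAttribute()>.

using System.Threading;
using System.Web;
using System.Web.Mvc;

public static class DirectionHelpers
{
    public static IHtmlString HtmlDirAttribute(this HtmlHelper html)
    {
        // TextInfo.IsRightToLeft is true for cultures like "ar" and "he"
        bool isRtl = Thread.CurrentThread.CurrentUICulture.TextInfo.IsRightToLeft;
        return new HtmlString(isRtl ? "dir=\"rtl\"" : "dir=\"ltr\"");
    }
}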

Conclusion

Globalization is hard and requires actual thought and analysis. The current JavaScript offerings are in flux, and that's putting it kindly.

A lot of this stuff could be made boilerplate or automatic, but much of it is a moving target. I'm currently exploring either a NuGet package that sets stuff up for you OR a "File | New Project" template with all the best practices already set up and packaged into one super-package. What's your preference, Dear Reader?

The Complete Script

Here's my current "complete" working script that could then be moved into its own file. This is a work in progress, to be sure. Please forgive any obvious mistakes as I'm still learning JavaScript.

    <script>
        $(document).ready(function () {
            //Ask ASP.NET what culture we prefer, because we stuck it in a meta tag
            var data = $("meta[name='accept-language']").attr("content");

            //Tell jQuery to figure it out also on the client side.
            $.global.preferCulture(data);

            //Tell the validator, for example,
            // that we want numbers parsed a certain way!
            $.validator.methods.number = function (value, element) {
                if ($.global.parseFloat(value)) {
                    return true;
                }
                return false;
            }

            //Fix the range to use globalized methods
            jQuery.extend(jQuery.validator.methods, {
                range: function (value, element, param) {
                    //Use the Globalization plugin to parse the value
                    var val = $.global.parseFloat(value);
                    return this.optional(element) || (val >= param[0] && val <= param[1]);
                }
            });

            //Setup datepickers if we don't support it natively!
            if (!Modernizr.inputtypes.date) {
                if ($.global.culture.name != 'en-us' && $.global.culture.name != 'en') {

                    var datepickerScriptFile = "/Scripts/globdatepicker/jquery.ui.datepicker-" + $.global.culture.name + ".js";
                    //Now, load the date picker support for this language
                    // and set the defaults for a localized calendar
                    $.getScript(datepickerScriptFile, function () {
                        $.datepicker.setDefaults($.datepicker.regional[$.global.culture.name]);
                    });
                }
                $("input[type='datetime']").datepicker();
            }

        });
    </script>


NuGet for the Enterprise: NuGet in a Continuous Integration Automated Build System

May 25, '11 Comments [26] Posted in NuGet | Open Source

I had the pleasure of speaking at TechEd 2011 North America last week in Atlanta. You can see ALL the videos of all the sessions on Channel 9. As an aside, you might notice that they are in the process of organizing video archives of ALL Microsoft developer events at http://channel9.msdn.com/Events. You can even see PDC 1999 if you like or see sessions by Speaker at http://channel9.msdn.com/Events/Speakers. Here are all my talks with a horrible headshot that I plan on asking Duncan to swap out ASAP.

My favorite talk was NuGet: Microsoft .NET Package Management for the Enterprise. I talked about NuGet, like I did in The Netherlands a few weeks ago, except the TechEd talk was focused much more on how NuGet fits into the software development lifecycle in a diverse Enterprise (or big boring company, if you prefer) environment.

Here's the video downloads, or you can click the slide at the right.

At my last company, we used Subversion for source control and CruiseControl for Continuous Integration (CI). I thought it'd be nice to set up a similar system using the latest free (and mostly free) tools. Note that you can do all this with TFS as well, for both Source and Build. I'll do a post on that later. For now, I give you:

Setting up NuGet to build using Mercurial for Source Control and JetBrains TeamCity for Continuous Integration while pushing to a local Orchard NuGet Gallery Server

Oh yes, that's a long H3 right there but it's descriptive, right? Here's the general idea.

Progression Diagram: Source, Build, NuGet, Gallery

This, of course, is not unique to NuGet, as NuGet is just a build artifact. At my last company we had several things that popped out of the build. Not just the DLLs, but also a ZIP file, MSI installer and even a completely configured and prepped Virtual Machine for sales people to pick up and give demos with our latest bits. You can set up your Continuous Integration system to be as awesome or as simple as you like. You should be thinking about which libraries and parts will create NuGet packages.

Another thing to think about is daily (or every-build) packages vs. stable or release packages. There are some discussions on the NuGet site around a "-beta" switch and baked-in support for understanding stable vs. volatile builds. For now, consider two locations for your builds, one for every build and one for "blessed" builds. For some, this might mean a folder for dailies while only blessed builds go to a server.

Here's the demo I did. You can change what you like and swap out for your favorite tools. I'll point out some gotchas and issues that hit me and might hit you. It's not perfect, but we're moving in the right direction.

Step 0 - Prep Stuff

Some of these steps are "make sure x is set up" type steps and can happen whenever, so don't take the ordering of the steps as totally crucial. Here's what I used and installed.

Step 1 - Make sure your project builds and you can make a NuGet package (nupkg)

I'm doing all this on one laptop, but you might have things spread out at your company. Do what makes you happy.

I made a trivial little class library called TechEdLibrary and confirmed it builds with MSBuild like this:

C:\dev\techedlibrary\TechEdLibrary>msbuild TechEdLibrary.csproj
Microsoft (R) Build Engine Version 4.0.30319.1
[Microsoft .NET Framework, Version 4.0.30319.225]
Copyright (C) Microsoft Corporation 2007. All rights reserved.

Build started 5/24/2011 4:52:54 PM.
..snip...
Done Building Project "C:\dev\techedlibrary\TechEdLibrary\TechEdLibrary.csproj"

Build succeeded.
0 Warning(s)
0 Error(s)

Time Elapsed 00:00:00.13

The most important part is to make sure that your AssemblyInfo.cs is actually filled out and not just the defaults. This is because we'll want to update the NuGet specification file using the values from the DLL and project. Basically we want the metadata of the project to drive the NuGet package (or we'd have to update it all manually).

I need a spec file. I can do this a few ways. I can do it manually with just "nuget spec," I can build it off the resulting DLL with "nuget spec -assemblyPath bin\debug\techedlibrary.dll" or off the csproj with "nuget spec techedlibrary.csproj."

If I create it off the csproj, the NuGet spec file will be created with some $tokens$ that will be filled out at packaging time:

C:\dev\techedlibrary\TechEdLibrary>nuget spec TechEdLibrary.csproj
Created 'TechEdLibrary.nuspec' successfully.

In the talk at TechEd I mistakenly built it off the DLL and ended up hard-coding the versions. In a continuous integration system you'll want to update the version in your spec as the build versions. You can either do it with tokens like I will in this post, or you can pass in the version (often from an environment variable) into the command line like "NuGet Pack -version 1.0.1.0."

You can also use the UpdateVersion.exe that Matt Griffith and I updated SO many years ago to change your AssemblyInfo.cs, then the NuGet package can pick it up. Again, there is no right answer; the point is to have a strategy. What drives the version and how does NuGet find out about it?
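
As one sketch of such a strategy, here's a tiny pre-build utility that stamps AssemblyVersion from an environment variable the CI server provides (TeamCity exposes BUILD_NUMBER; the file path and the fallback version here are just assumptions for illustration):

using System;
using System.IO;
using System.Text.RegularExpressions;

class StampVersion
{
    static void Main(string[] args)
    {
        // Path to the AssemblyInfo.cs to rewrite (first argument, or a default guess)
        var path = args.Length > 0 ? args[0] : @"Properties\AssemblyInfo.cs";
        // Version supplied by the CI server, e.g. TeamCity's BUILD_NUMBER
        var version = Environment.GetEnvironmentVariable("BUILD_NUMBER") ?? "0.9.0.0";

        var text = File.ReadAllText(path);
        text = Regex.Replace(text,
            @"AssemblyVersion\(""[^""]*""\)",
            String.Format("AssemblyVersion(\"{0}\")", version));
        File.WriteAllText(path, text);
    }
}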

My TechEdLibrary.nuspec I just made looks like this:

<?xml version="1.0"?>
<package xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
  <metadata>
    <id>$id$</id>
    <version>$version$</version>
    <authors>$author$</authors>
    <owners>$author$</owners>
    <iconUrl>http://www.hanselman.com/images/nugeticon.png</iconUrl>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>$description$</description>
  </metadata>
</package>

And my AssemblyInfo.cs (abridged) is like this. The attributes from the DLL will get copied into the $tokens$ and packaged.

[assembly: AssemblyTitle("TechEdLibrary")]
[assembly: AssemblyDescription("This is my cool library")]
[assembly: AssemblyCompany("Scott Hanselman")]
[assembly: AssemblyProduct("TechEdLibrary")]
[assembly: AssemblyVersion("0.9.*")] //The * means I'll get an automatic version bump
//and more...

Now I can pack it up with NuGet Pack TechEdLibrary.csproj. Note how the command line output finds the spec and does the token/metadata replacement:

C:\dev\techedlibrary\TechEdLibrary>nuget pack TechEdLibrary.csproj
Attempting to build package from 'TechEdLibrary.csproj'.
Building project for target framework '.NETFramework,Version=v4.0'.
Packing files from 'C:\dev\techedlibrary\TechEdLibrary\bin\Release'.
Using 'TechEdLibrary.nuspec' for metadata.
Successfully created package 'C:\dev\techedlibrary\TechEdLibrary\TechEdLibrary.0.9.4161.28882.nupkg'.

Now I can just open the .nupkg in the NuGet Package Explorer and see that the ID (from the Title), author, description and version are all there, brought in from the attributes and combined with my NuSpec. I can edit the NuSpec to taste as long as I maintain the $tokens$ I want.

My package in the NuGet Package Explorer

Now if I build the library again I'll get a new version from the .NET build system and that will cause a new NuPkg to be built with a new version. I make a change to my application's source, build again, then pack again. Note the results in my directory after I make a small change, build and pack.

C:\dev\techedlibrary\TechEdLibrary>msbuild TechEdLibrary.csproj 
Microsoft (R) Build Engine Version 4.0.30319.1
Build started 5/24/2011 5:07:58 PM.
blah build blah
Done Building Project "C:\dev\techedlibrary\TechEdLibrary\TechEdLibrary.csproj" (rebuild target(s))

Build succeeded.
0 Warning(s)
0 Error(s)

Time Elapsed 00:00:00.32

C:\dev\techedlibrary\TechEdLibrary>nuget pack TechEdLibrary.csproj
Attempting to build package from 'TechEdLibrary.csproj'.
Building project for target framework '.NETFramework,Version=v4.0'.
Packing files from 'C:\dev\techedlibrary\TechEdLibrary\bin\Release'.
Using 'TechEdLibrary.nuspec' for metadata.
Successfully created package 'C:\dev\techedlibrary\TechEdLibrary\TechEdLibrary.0.9.4161.29041.nupkg'.

C:\dev\techedlibrary\TechEdLibrary>dir *.nupkg

05/24/2011 05:07 PM 4,770 TechEdLibrary.0.9.4161.28882.nupkg
05/24/2011 05:08 PM 4,763 TechEdLibrary.0.9.4161.29041.nupkg
2 File(s) 9,533 bytes

Without really setting anything up but the versioning plan and spec file, and checking that packing works, I've got a little system here and hopefully you can see how it'll work.

WEIRDNESS NOTE: We do have a double-build going on. NuGet.exe, for some weird reason, is running MSBuild again for us. That's lame, and a known bug. Surprisingly in a large number of CI systems in both the .NET and Java worlds "double builds" are common. Weaksauce, but common. Still, no excuse. That'll be fixed in NuGet.exe soon.

Step 2 - Initial Source Check-in

I like using BitBucket for small private projects and CodePlex for public Open Source projects. Both support Hg (the chemical symbol for mercury, as in Mercurial) and CodePlex supports TFS for both Source and Work Items. Since my demo is private and BitBucket supports unlimited private projects it was a good fit.

I cloned my project URL from BitBucket into a folder, then added, committed and pushed my bits to the BitBucket site:

C:\dev\something>hg clone https://shanselman@bitbucket.org/shanselman/techedlibrary
...move my files in...
C:\dev\something>hg add
...yada yada yada...
C:\dev\something>hg commit
...yada yada yada...
C:\dev\something>hg push
...yada yada yada...

It's useful then to make sure I can get my source code into another folder and still build it. It's common to miss a file or two. Since the CI Server will be getting the code into a temporary folder you really need to make sure the source will build as it is, retrieved fresh from your SCM.

Step 3 - Setup your Build Server

I'm running TeamCity locally on http://localhost:8111/ with a "build agent" (there can be many) on the same machine. Visit Administration and make a new project, then a new Build Configuration (like Debug or Release).

You'll need to make a VCS root (Version Control System - there are like 60 different acronyms for version control systems. If you see a TLA (Three Letter Acronym) out there that you don't recognize, it's probably something that means "Source Control.") in TeamCity. Note that I selected Mercurial and set the HG Command Path with the full path to HG.EXE, as the CI will call that to check out the code from BitBucket. Note also that I put the path without my name and password in the "pull changes from" field, as I put the name and password lower down.

Editing a VCS Root in TeamCity

Back in Administration, select Edit under your project. Note the nice lists of steps on the right.

The 7 steps in TeamCity

We want two steps, one for the MSBuild and one "custom command line" for the NuGet package step. The first step is easy: we call MSBuild on our TechEdLibrary.csproj.

The second step is a temporary hack. It's temporary because JetBrains TeamCity is building NuGet support in directly (screenshots of their internal build below!)

TeamCity Custom Scripts

The custom script is basically a BATCH file that looks like this:

del *.nupkg

NuGet pack techedlibrary.csproj
forfiles /m *.nupkg /c "cmd /c NuGet.exe push @FILE yourapikey -source http://localhost:81/"

There's a few interesting things going on here.

First, I delete all the nupkg files in the CI folder as it's all temporary and we don't want to accidentally push old stuff.

Second, I pack up the NuGet package like we saw before.

Finally, I don't know the name of the newly created *.nupkg file, so I cheat by making a DOS BATCH "for loop" that has only one item in it, the newly created .nupkg file! Then I have cmd.exe execute NuGet.exe push with that new file as a parameter. Make sure you have the trailing slash.

NOTE: You can also save the API Key in local storage with NuGet SetApiKey yourapikey -source http://localhost:81 and it'll be saved on a per-source basis.

Because I have an Orchard NuGet Gallery running locally, I have an API key for that server. I'm running my gallery server (it holds the packages and serves the OData Feed) locally on port 81 and the Orchard Site itself is on port 80.

My personal NuGet Gallery

I set the build to check Source Control every 60 seconds; if a change is detected, the source is retrieved, built, then NuGet packed, and published to my local server (or the public one).

TeamCity is something I like to have running on extra hardware I've got lying around. You can have up to 20 projects with their free version, so when combined with BitBucket and CodePlex where I keep my projects I can setup my own Continuous Integration System in just a few hours and have better confidence in my code. You can even use Amazon EC2 images to run your builds or just be a build agent.

Sneak Peek - TeamCity with NuGet Integration

I'm continuing to talk to companies and software vendors who are jazzed about NuGet. If you are one, watch my talk for ideas on what you can do as a commercial software entity and get into the discussion on CodePlex.

Here's some mockups of what JetBrains is planning for TeamCity. These are just ideas, and they aren't available, so don't stress or pressure them. ;)

This is a mockup of NuGet as a possible Build Step within TeamCity.

Here's a mockup of a Build Feature where TeamCity downloads specific packages that will be needed by the build.

Here's a screenshot of some early work that Martin Woodward from the TFS team is doing to make sure NuGet and Team Foundation Server work well together in a Continuous Integration environment. Feel free to contact Martin via his blog and continue the discussion on http://nuget.codeplex.com.

TFS and NuGet

Hope this helps you integrate NuGet into your company, however you like it.

About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.


Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.