Scott Hanselman

NuGet Package of the Week #10 - New Mobile View Engines for ASP.NET MVC 3, spec-compatible with ASP.NET MVC 4

September 5, '11 Posted in ASP.NET | ASP.NET MVC | Mobile | NuGet | NuGetPOW

[Screenshot: a desktop ASP.NET MVC application next to the same application in a mobile browser]

I did some basic mobile view engine work for ASP.NET MVC for Mix in 2009 and then created what I thought was a better ASP.NET MVC Mobile ViewEngine in 2010. Unfortunately, the second one (the "better" one) had a caching bug that only showed itself in Release mode. This last month, Jon, John, Peter and I updated NerdDinner to MVC 3 with Razor and a pile of other new features. One of those new features was jQuery Mobile support, which meant we needed to fix this bad Mobile View Engine. Additionally, ASP.NET MVC 4 will include officially supported mobile views, so the pressure was on.

However, we wanted to make sure any new MVC 3 Mobile View sample was mostly compatible with whatever scheme ASP.NET MVC 4 uses. My original proposed ViewEngine organized mobile views by folder, but the final design uses file names. That means instead of ~/Views/Home/Mobile/Index.cshtml, you'd have ~/Views/Home/Index.Mobile.cshtml. Of course, you can change this yourself if you really want to, but that's the default.

[Screenshots: alternate views organized in subfolders vs. alternate views separated by filename differences, not folders]

Peter Mourfield jumped in and did the updated Mobile View Engines and we've put them on NuGet for you, Dear Reader.

Remember, these are for ASP.NET MVC 3; you won't need them when ASP.NET MVC 4 comes out. The general idea is that you remove the Razor (or WebForms) ViewEngine and replace it with the mobile version, which is a superset of its functionality.

ViewEngines.Engines.Remove(ViewEngines.Engines.OfType<RazorViewEngine>().First());
ViewEngines.Engines.Add(new MobileCapableRazorViewEngine());
ViewEngines.Engines.Remove(ViewEngines.Engines.OfType<WebFormViewEngine>().First());
ViewEngines.Engines.Add(new MobileCapableWebFormViewEngine());

You can do this bit of work in Application_Start, or with WebActivator like the MobileViewEngines.Razor.Samples package does. The sample NuGet package includes both VB and C#, so you'll want to delete the one you won't use. You only need the ViewEngine for the view technology you're actually using, so if you aren't using WebForms, don't bother with those lines.
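
If you go the Application_Start route, the registration is just those lines inside the standard MVC 3 Global.asax. A minimal sketch (assuming the Razor engine only):

// Global.asax.cs - a minimal sketch; the usual route and filter registration is elided.
using System.Linq;
using System.Web.Mvc;

public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // Swap the stock Razor engine for the mobile-capable one (a superset of its behavior).
        ViewEngines.Engines.Remove(ViewEngines.Engines.OfType<RazorViewEngine>().First());
        ViewEngines.Engines.Add(new MobileCapableRazorViewEngine());

        // ...RegisterRoutes, global filters, etc. go here as usual.
    }
}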

The whole ViewEngine that Peter made is only 81 lines of code so you can certainly change it to your taste. Peter and I put the source on BitBucket for changes, forks and fixes.


Just add the word Mobile to your views, like Index.Mobile.cshtml or Details.Mobile.aspx, and those will be used when a mobile browser is detected. Detection uses the standard Browser.IsMobileDevice call from ASP.NET, so consider using a browser database like http://51degrees.mobi (also on CodePlex, and NuGet).
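
The core of the technique is small enough to sketch. This is an illustration of the approach rather than Peter's exact 81 lines (those are on BitBucket): derive from RazorViewEngine and, when the browser reports itself as mobile, look for the ".Mobile" view first.

// A rough sketch of a mobile-capable Razor view engine, assuming the Index.Mobile.cshtml naming convention.
using System.Web.Mvc;

public class MobileCapableRazorViewEngine : RazorViewEngine
{
    public override ViewEngineResult FindView(ControllerContext controllerContext,
                                              string viewName, string masterName, bool useCache)
    {
        if (controllerContext.HttpContext.Request.Browser.IsMobileDevice)
        {
            // Try ~/Views/Home/Index.Mobile.cshtml (and friends) first...
            ViewEngineResult mobileResult =
                base.FindView(controllerContext, viewName + ".Mobile", masterName, useCache);

            if (mobileResult.View != null)
            {
                return mobileResult;
            }
            // (The Release-mode caching bug mentioned above is exactly the kind of thing
            // that hides in how useCache interacts with these two lookups.)
        }

        // ...otherwise fall back to the regular desktop view.
        return base.FindView(controllerContext, viewName, masterName, useCache);
    }
}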

Remember, this is a clean-room implementation (not derived from ASP.NET MVC 4) that has just basic mobile view overrides. I'm glad it doesn't have the Release mode bug my previous ones did, and we are using this implementation live on http://nerddinner.com. Modify the source if you need support for multiple mobile views (like iPhone, BlackBerry, etc.) rather than just "mobile" like this one. There are features that this basic ViewEngine doesn't have that a more sophisticated solution, like ASP.NET MVC 4's or other folks' implementations, could have, such as:

  • Browser Overrides: Forcing or "opting out" of mobile and using desktop
  • Device-specific custom layouts

Still, we've found it to be simple and useful on NerdDinner and we hope it's useful to you.


A basic non-cloud-based personal backup strategy

September 5, '11 Posted in Tools

[Image: a cloud with the international "no" symbol over it]

In 2007 I posted about my Family's Backup Strategy and encouraged you to develop your own, or confirm the one you already have. At the time, my strategy was primarily using Mozy.com as an online backup along with a Windows Home Server for local backup. After Windows Home Server recently removed Drive Extender technology (their version of RAID support in case a disk fails), I switched over to a Synology 1511+ and did a podcast on it.

Here's the summary if you feel the rest of this post is TL;DR and a ramble, which is likely because it is.

  • Use an imaging tool like Acronis True Image to create images of the machines you REALLY care about.
    • You can also use the System Image tool built into Windows, but I prefer the additional options from Acronis.
  • Encrypt your external drives.
  • Back up files (and disk images) to two external drives.
  • Keep one external drive off-site.
  • Test your backups by practicing a restore. The rule of thumb is that backups ALWAYS succeed. It's restores that fail!

I've been experimenting with cloud-based backup, trying nearly everything out there from KeepVault to CrashPlan. I'm currently "between online backup strategies," although I'm leaning towards CrashPlan. Most of these online backup companies are pretty confusing when you factor in someone with a server. For example, I have a main computer in my house but I also have a server. I have a KitchenPC and my wife's laptop, but neither of those matters much as all the data is on the server. However, I really want disk images for my main machine.

I need two things:

  • Continuous reliable image backup
  • My server's files backed up

The whole point here is that if a natural disaster happens, no family videos or photos will be lost. For folks who keep all their files on their local PCs and not a server, any of these online services is great. However, as soon as I centralized my files, things got more complex. KeepVault is the best for Windows Home Server users as it has a custom Home Server client that integrates into the WHS control panel directly.

Now that I've moved over to a Linux-based solution like the Synology, other than using straight rsync or rsync to S3, the best option appears to be CrashPlan Headless. While it's not directly supported on the Synology, there are LOTS of people who want to make this work, so I wouldn't be totally alone, although the instructions are daunting and insane to say the least.

At this point, today, I've got a 4TB SAN with about 1.5TB on it and no cloud storage. And honestly, I wasn't looking forward to waiting two weeks (or longer) to upload 1.5TB to a new service, and I'm not sure what my ISP would say about it. I think that initial seed of large datasets is the Achilles' heel of online backup.

I really don't feel comfortable with my backups unless they are offsite. So I went and bought two Western Digital My Passport Essential SE 1 TB USB 3.0 Drives and labeled them Backup A and Backup B.

I recommend that you encrypt the whole disk. The last thing you want when you're copying your entire life onto one drive is to do it in the great wide open. There are two good ways to do it: BitLocker To Go or TrueCrypt.

  • BitLocker To Go is in Windows 7 Enterprise or Ultimate. It's trivially easy to use and you CAN read BitLocker To Go'ed disks on any Windows 7 edition, or even on Windows XP or Windows Vista with the BitLocker To Go Reader. It supports TPMs if your laptop has that feature, and it's super secure.
  • TrueCrypt is open source and super hardcore. There are dozens of really amazing options for things like plausible deniability, honeypot secret partitions, and many choices of encryption. It's also crazy secure.

You really can't go wrong with either of these choices. For testing, I'm trying each of them, one on each drive. I like TrueCrypt because it's open source, but I like that BitLocker is built into Windows. We'll see. Point is, don't put your life on a disk unencrypted.

NOTE: Both of these encryption tools take forever (hours) to encrypt the whole disk. Be patient. It'll be a while.

I'm using Acronis True Image Home 2011 PC Backup and Recovery for my imaging solution. There are a lot of negative reviews of the 2011 version on Amazon, but I haven't had any issues. Be aware. I'm always open to trying new products if you have any recommendations, Dear Reader.

I use SyncBack to copy files on a schedule through my main machine from my server to the external drive. I could connect the drives directly to the Synology but I want to use my Windows machine for encryption.

My wife takes Drive A to the bank's safety deposit box on her monthly visit, then we just swap drives once a month with new backups. I'm not sure if I'll eventually get around to installing CrashPlan on the Synology (I hope it gets easier), but my current offsite "no cloud" backup strategy is working very nicely and it doesn't cost any bandwidth. In fact, I can transfer 1.5TB in just 10 minutes (of driving)!

* Cloud icon from The Noun Project, CC BY 3.0


Analyze your Web Server Data and be empowered with LogParser and Log Parser Lizard GUI

September 4, '11 Posted in IIS | NuGet | Tools

[Diagram: the Log Parser architecture, showing all the inputs and outputs - there are a lot of choices on both sides]

I've been using LogParser whenever I need to really dig into web server logs since before 2005. It's an amazing tool. I love it. Jeff Atwood loves it, and you should too. It may not be something you use every day, but when you need it, it's there and it's awesome. It's kind of like a really focused sed or awk. A low-level tool with a high-powered focus.

Log Parser hasn't changed that I know of since 2005. I've been working with some folks to try to get it to escape the big house, but we'll see how far we get. Until then, it works fabulously and unchanged after all these years. It's great because while my primary use of LogParser is with IIS log files, it'll query anything you can plug into it, like the file system, Event Logs, the Registry, or just a CSV file. The diagram from their docs (above) shows the range of inputs and outputs.

I did a blog post six years ago, before FeedBurner, where I analyzed traffic to my RSS feed from NewsGator. NewsGator was an RSS reader that would include statistics and information in its User-Agent HTTP header. I was reminded of this post when I was talking to the NuGet team about how they are releasing new versions of NuGet every month or so, but it's not clear how many people are upgrading. It'd also be interesting to find out what other ways folks are hitting the NuGet feed and what they are using to do it. I volunteered, so David Ebbo sent me a day's log file to "figure out."

Log Parser is wonderful because it effectively lets you run SQL queries against text files. Here are a few choice examples from Atwood's post a few years back:

Top 10 Slowest Items

SELECT TOP 10 cs-uri-stem AS Url, MIN(time-taken) as [Min], 
AVG(time-taken) AS [Avg], max(time-taken) AS [Max],
count(time-taken) AS Hits
FROM ex*.log
WHERE time-taken < 120000
GROUP BY Url
ORDER BY [Avg] DESC

HTTP Errors Per Hour

SELECT date, QUANTIZE(time, 3600) AS Hour, 
sc-status AS Status, COUNT(*) AS Errors
FROM ex*.log
WHERE (sc-status >= 400)
GROUP BY date, hour, sc-status
HAVING (Errors > 25)
ORDER BY Errors DESC

Given queries like these, I figured that LogParser would be perfect for me to explore the NuGet web service logs. (Of course, I realize that the service itself could be instrumented, but this is more flexible, and I plan to make these queries run on a schedule and show up on http://stats.nuget.org.)

There are a number of ways to access a NuGet packaging server. You can use the Add Package Dialog, the Command Line, the PowerShell Console within Visual Studio, or the NuGet Package Explorer. There's also some testing data and some "no user agent" stuff in there. I filtered that out by just charting "NuGet" clients.

I started doing the initial work from the command line, but it was slow going. I was having trouble visualizing what I wanted and what was being returned. Here is one of my first attempts; it was pretty hairy and hard to build at the command line.

C:\u_ex110831>LogParser.exe -i:IISW3C "SELECT DISTINCT cs(User-Agent) AS Client, 
count(1) AS NumberOfHits
FROM u_ex110831.log
WHERE Client
LIKE 'NuGet%'
GROUP BY Client
ORDER by count(1) DESC"

Client NumberOfHits
------------------------------------------------------------------------------------------- ------------
NuGet+Add+Package+Dialog/1.4.20701.9038+(Microsoft+Windows+NT+6.1.7601+Service+Pack+1) 38840
NuGet+Command+Line/1.5.20830.9001+(Microsoft+Windows+NT+6.1.7601+Service+Pack+1) 15591
NuGet+Add+Package+Dialog/1.4.20701.9038+(Microsoft+Windows+NT+6.1.7600.0) 13360
NuGet+Command+Line/1.4.20615.182+(Microsoft+Windows+NT+6.1.7600.0) 8562
NuGet+Add+Package+Dialog/1.4.20607.9007+(Microsoft+Windows+NT+6.1.7601+Service+Pack+1) 5531
NuGet+Package+Manager+Console/1.4.20701.9038+(Microsoft+Windows+NT+6.1.7601+Service+Pack+1) 5497
NuGet+Command+Line/1.4.20615.182+(Microsoft+Windows+NT+6.1.7601+Service+Pack+1) 3699
NuGet+Package+Manager+Console/1.4.20701.9038+(Microsoft+Windows+NT+6.1.7600.0) 3654
NuGet+Add+Package+Dialog/1.4.20701.9038+(Microsoft+Windows+NT+5.1.2600+Service+Pack+3) 3558
NuGet+Command+Line/1.4.20615.182+(Microsoft+Windows+NT+5.2.3790+Service+Pack+2) 2539
Press a key...

There were 40 unique user agents in this file, and they include the client and its version as well as the operating system. I first wanted to chop that up to find out what types of NuGet clients were being used. I have broken the lines up to make it clearer in this snippet.

C:\u_ex110831>LogParser.exe 
-i:IISW3C "SELECT DISTINCT SUBSTR(cs(User-Agent),0, index_of(cs(User-Agent),'/')) AS Client,
count(1) AS NumberOfHits FROM u_ex110831.log
WHERE Client LIKE 'NuGet%'
GROUP BY Client
ORDER by count(1) DESC"

Client NumberOfHits
----------------------------- ------------
NuGet+Add+Package+Dialog 74761
NuGet+Command+Line 32284
NuGet+Package+Manager+Console 12637
NuGet+Package+Explorer 943
NuGet+Visual+Studio+Extension 49

Statistics:
-----------
Elements processed: 208235
Elements output: 5
Execution time: 0.79 seconds

Pretty amazing, though. A sub-second query over an almost quarter-million-line log file, with useful results and no database. Reminds me of working on Unix 20 years ago.

After some experimenting and installing the Office Web Components 2003 (discontinued), I was outputting a chart with this MONSTER command line:

C:\u_ex110831>LogParser.exe -i:IISW3C -o:CHART -chartType:PieExploded 
-categories:Off -values:On -view:on
-chartTitle:"NuGet Clients by User Agent"
"SELECT DISTINCT SUBSTR(cs(User-Agent),0,index_of(cs(User-Agent),'/')) AS Client,
count(1) AS NumberOfHits
INTO foo.png
FROM u_ex110831.log
WHERE Client
LIKE 'NuGet%'
GROUP BY Client
ORDER by count(1) DESC"

Which yields me this profoundly 2003-looking chart, but still allows me to cheer a tiny victory inside. I will be able to get this (or a prettier one) to run on a schedule (AT or cron job) and serve it to the interwebs. It'll probably be better to output a CSV or XML file, then process that with the web server and create a proper interactive chart. Regardless, tiny cheer.

[Pie chart: "NuGet Clients by User Agent", written to foo.png]
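
To sketch that scheduled run (this is an illustration, not the eventual stats.nuget.org code; the LogParser path, the log folder, and the output file name are placeholders), a tiny console app can shell out to LogParser and drop a CSV for the web server to pick up:

// Runs the "clients by user agent" query with -o:CSV so a Task Scheduler job can
// regenerate clients.csv nightly. Paths and file names below are placeholders.
using System.Diagnostics;

class ScheduledLogQuery
{
    static void Main()
    {
        string query =
            "SELECT DISTINCT SUBSTR(cs(User-Agent),0,index_of(cs(User-Agent),'/')) AS Client, " +
            "count(1) AS NumberOfHits " +
            "INTO clients.csv " +
            "FROM C:\\logs\\u_ex*.log " +
            "WHERE Client LIKE 'NuGet%' GROUP BY Client ORDER BY count(1) DESC";

        var startInfo = new ProcessStartInfo("LogParser.exe", "-i:IISW3C -o:CSV \"" + query + "\"")
        {
            UseShellExecute = false
        };

        using (var logParser = Process.Start(startInfo))
        {
            logParser.WaitForExit(); // let the scheduled task finish only after the CSV is written
        }
    }
}

The web side then only ever has to read clients.csv (or an XML equivalent) to build a proper interactive chart.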

Still, I'm thinking I'm too old for this crap. Where's my GUI? What's a brother got to do to drag a DataGrid around here? A little Binging with DuckDuckGo (yes, I'm trying DDG this month) and I find - wait for it - LogParser Lizard GUI.

LogParser Lizard GUI

What's this? Oh YES. It's intellisense and tooltips, baby!

[Screenshot: the Log Parser Lizard GUI main screen]

I can't say how much faster this tool made me once I had figured out LogParser. It's funny how you have to suffer at the command line before you can really appreciate a good GUI. At this point I called Jon Galloway for some pair-SQLing and we pounded out a few more queries.

NuGet by Version

I filtered out NuGet Package Explorer because it has its own version scheme. However, I'm not sure about this query, as I wanted to get the Major.Minor versions. I noticed that, by coincidence, the third value z (of x.y.z) always started with .2, so I cheated with the SUB() below because I couldn't figure out how to extract just the x.y values. Any thoughts are appreciated (one possible workaround is sketched after these results).

SELECT DISTINCT SUBSTR( cs(User-Agent), 
ADD(index_of(cs(User-Agent),'/'),1),
SUB(index_of(cs(User-Agent),'.2'),STRLEN(cs(User-Agent))))
AS Client, count(1) AS NumberOfHits
FROM u_ex110831.log
WHERE cs(User-Agent) NOT LIKE '%Explorer%' AND cs(User-Agent) LIKE '%NuGet%'
GROUP BY Client
ORDER by count(1) DESC

Client Hits
------ -----
1.4    98097
1.5    18985
1.3     2524
1.6       69

So then I did the whole version:

SELECT SUBSTR( cs(User-Agent), 
ADD(index_of(cs(User-Agent),'/'),1),
SUB(index_of(cs(User-Agent),'+('),STRLEN(cs(User-Agent))))
AS Client, count(1) AS NumberOfHits
FROM u_ex110831.log
WHERE cs(User-Agent) NOT LIKE '%Explorer%' AND cs(User-Agent) LIKE '%NuGet%'
GROUP BY Client, cs(User-Agent)
ORDER by count(1) DESC

Client Hits
--------------- ------
1.4.20701.9038 38840
1.5.20830.9001 15591
1.4.20701.9038 13360
1.4.20615.182 8562
1.4.20607.9007 5531
1.4.20701.9037 5497
1.4.20615.182 3699
1.4.20701.9038 3654

I was extremely impressed with how quickly (about an hour) I was able to get really substantive, interesting and targeted data out of these log files. The next step will be to get all the logs and run the command line tool to create month-over-month line charts. The goal will be to figure out how many folks are successfully upgrading their NuGet installations, as well as how they are using it. Are they using the right-click menu or are they using the console?

If you've got an application that makes HTTP calls to a service that you own, whether your application is a phone app or a custom client, you can certainly instrument your code on the server side to collect stats, but there's a LOT of information already in your IIS logs. You can use Log Parser Lizard GUI to develop your queries and then schedule runs of the command line tool to generate reports that will really help you improve your product. This technique isn't as sophisticated as a custom 3rd party analytics package, but you can get a surprising amount of information in a short amount of time with LogParser.


The Importance (and Ease) of Minifying your CSS and JavaScript and Optimizing PNGs for your Blog or Website

September 1, '11 Posted in ASP.NET | IIS

Hello Dear Reader. You may feel free to add a comment at the bottom of this post, something like "Um, DUH!" after reading this. It's funny how one gets lazy with their own website. What's the old joke, "those who can't, teach." I show folks how to optimize their websites all the time but never got around to optimizing my own.

It's important (and useful!) to send as few bytes of CSS, JS, and HTML markup down the wire as possible. It's not just about size, though; it's also about the number of requests needed to get the bits. In fact, that's often more of a problem than file size.

First, go run YSlow on your site.

YSlow is such a wonderful tool and it will totally ruin your day and make you feel horrible about yourself and your site. ;) But you can work through that. Eek. First, my images are huge. I've also got 184k of JS, 21k of CSS and 30k of markup. Note my favicon is small. It was a LOT bigger before and even sucked up gigabytes of bandwidth a few years back.

YSlow also tells me that I am making folks make too many HTTP requests:

This page has 33 external JavaScript scripts. Try combining them into one.
This page has 5 external stylesheets. Try combining them into one.

Seems that speeding things up is not just about making things smaller, but also about asking for fewer things and getting more for the asking. I want to make fewer requests that may have larger payloads, but then those payloads will be minified and then compressed with GZip.

Optimize, Minify, Squish and GZip your CSS and JavaScript

CSS can look like this:

body {
line-height: 1;
}
ol, ul {
list-style: none;
}
blockquote, q {
quotes: none;
}

Or like this, and it still works.

body{line-height:1}ol,ul{list-style:none}blockquote,q{quotes:none}

There's lots of ways to "minify" CSS and JavaScript, and fortunately you don't need to care! Think about CSS/JS minifying kind of like the Great Zip File Wars of the early nineties. There's a lot of different choices, they are all within a few percentage points of each other, and everyone thinks theirs is the best one ever.

There are JavaScript-specific compressors you can run your code through before you put your site live, and there are CSS compressors as well.

And some of these integrate nicely into your development workflow. You can put them in your build files, or minify things on the fly.

  • YUICompressor - .NET Port that can compress on the fly or at build time. Also on NuGet.
  • AjaxMin  - Has MSBuild tasks and can be integrated into your project's build.
  • SquishIt - Used in your ASP.NET applications' views and does its magic at runtime.
  • UPDATE: Chirpy - "Mashes, minifies, and validates your javascript, stylesheet, and dotless files."
  • UPDATE: Combres - ".NET library which enables minification, compression, combination, and caching of JavaScript and CSS resources for ASP.NET and ASP.NET MVC web applications."
  • UPDATE: Cassette by Andrew Davey - Does it all, compiles CoffeeScript, script combining, smart about debug- and release-time.

There are plenty of comparisons out there looking at the different choices. Ultimately, since the compression percentages don't differ much, you should focus on two things:

  • compatibility - does it break your CSS? It should never do this
  • workflow - does it fit into your life and how you work?

For me, I have a template language in my blog and I need to compress my CSS and JS when I deploy a new template. A batch file and a command line utility work nicely, so I used AjaxMin (yes, it's made by Microsoft, but it did exactly what I needed).

I created a simple batch file that took the pile of JS from the top of my blog and the pile from the bottom and created a .header.js and a .footer.js. I also squished all the CSS, including the CSS my plugins needed, and put it all in one file while being sure to maintain file order.

I've split these lines up for readability only.

set PATH=%~dp0;"C:\Program Files (x86)\Microsoft\Microsoft Ajax Minifier\"

ajaxmin -clobber
scripts\openid.css
scripts\syntaxhighlighter_3.0.83\styles\shCore.css
scripts\syntaxhighlighter_3.0.83\styles\shThemeDefault.css
scripts\fancybox\jquery.fancybox-1.3.4.css
themes\Hanselman\css\screenv5.css
-o css\hanselman.v5.min.css

ajaxmin -clobber
themes/Hanselman/scripts/activatePlaceholders.js
themes/Hanselman/scripts/convertListToSelect.js
scripts/fancybox/jquery.fancybox-1.3.4.pack.js
-o scripts\hanselman.header.v4.min.js

ajaxmin -clobber
scripts/omni_external_blogs_v2.js
scripts/syntaxhighlighter_3.0.83/scripts/shCore.js
scripts/syntaxhighlighter_3.0.83/scripts/shLegacy.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushCSharp.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushPowershell.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushXml.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushCpp.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushJScript.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushCss.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushRuby.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushVb.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushPython.js
scripts/twitterbloggerv2.js scripts/ga_social_tracker.js
-o scripts\hanselman.footer.v4.min.js

pause

All those files at the bottom support my code highlighter. This looks complex, but it's just making three files (two JS and one CSS) out of my mess of required scripts and stylesheets.

This squished all my CSS down to 26k, and here's the output:

CSS
Original Size: 35667 bytes; reduced size: 26537 bytes (25.6% minification)
Gzip of output approximately 5823 bytes (78.1% compression)

JS
Original Size: 83505 bytes; reduced size: 64515 bytes (22.7% minification)
Gzip of output approximately 34415 bytes (46.7% compression)

That also turned 22 HTTP requests into 3.

Optimize your Images (particularly PNGs)

Looks like my 1600k cold (machine not cached) home page is mostly images, about 1300k. That's because I put a lot of articles on the home page, but also because I use PNGs for the images in most of my blog posts. I could be more thoughtful and:

  • Use JPEGs for photos of people, things that are visually "busy"
  • Use PNGs for charts, screenshots, things that must be "crystal clear"

I can also optimize the size of my PNGs (did you know you can do that?) before I upload them with PNGOUT. For bloggers and for ease I recommend PNGGauntlet, which is a Windows app that calls PNGOut for you. Easier than PowerShell, although I do that also.

If you use Visual Studio 2010, you can use Mad's Beta Image Optimizer Extension that will let you optimize images directly from Visual Studio.

To show you how useful this is, I downloaded the images from the last month or so of posts on this blog totaling 7.29MB and then ran them through PNGOut via PNGGauntlet.

[Screenshot: all my PNGs lined up to be optimized inside PNGGauntlet]

Then I took a few of the PNGs that were too large and saved them as JPGs. All in all, I saved 1359k (that's almost a meg and a half or almost 20%) for minimal extra work.

If you think this kind of optimization is a bad idea, or boring or a waste of time, think about the multipliers. You're saving (or I am) a meg and a half of image downloads, thousands of times. When you're dead and gone your blog will still be saving bytes for your readers! ;)

This is important not just because saving bandwidth is nice, but because perception of speed is important. Give the browser less work to do, especially if, like me, almost 10% of your users are mobile. Don't make these little phones work harder than they need to and remember that not everyone has an unlimited data plan.

Let your browser cache everything, forever

Mads reminded me about this great tip for IIS7 that tells the webserver to set the "Expires" header to a far future date, effectively telling the browser to cache things forever. What's nice about this is that if you or your host is using IIS7, you can change this setting yourself from web.config and don't need to touch IIS settings.

<staticContent>
<clientCache httpExpires="Sun, 29 Mar 2020 00:00:00 GMT" cacheControlMode="UseExpires" />
</staticContent>

You might think this is insane. This is, in fact, insane. Insane like a fox. I built the website, so I want control. I version my CSS and JS files in the filename. Others use query strings with versions, and some use hashes. The point is: are YOU in control, or are you just letting caching happen? Even if you don't use this tip, know how and why things are cached and how you can control it.
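
If you'd rather version with a query string than rename files, a tiny helper along these lines works (this is a hypothetical sketch, not a library API; it assumes an ASP.NET site whose static files live on disk):

// A minimal cache-busting sketch: append the file's last-write ticks as a query string,
// so a far-future Expires header is safe - changing the file changes the URL.
using System.IO;
using System.Web;

public static class StaticUrl
{
    public static string Versioned(string virtualPath)
    {
        string physicalPath = HttpContext.Current.Server.MapPath(virtualPath);
        long version = File.GetLastWriteTimeUtc(physicalPath).Ticks;
        return VirtualPathUtility.ToAbsolute(virtualPath) + "?v=" + version;
    }
}

In a view that's <link rel="stylesheet" href="<%= StaticUrl.Versioned("~/css/hanselman.v5.min.css") %>" /> (or the Razor equivalent), and the URL changes automatically whenever the file does.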

Compress everything

Make sure everything is GZip'ed as it goes out of your Web Server. This is also easy with IIS7 and allowed me to get rid of some old 3rd party libraries. All these settings are in system.webServer.

<urlCompression doDynamicCompression="true" doStaticCompression="true" dynamicCompressionBeforeCache="true"/>

If there is one thing you can do to your website, it's turning on HTTP compression. For average pages, like my 100k of HTML, it can turn into 20k. It downloads faster and the perception of speed by the user from "click to render" will increase.

Certainly this post just scratches the surface of REAL performance optimization and only goes up to the point where the bits hit the browser. You can go nuts trying to get an "A" grade in YSlow, optimizing for # of DOM objects, DNS Lookups, JavaScript ordering, and on and on.

That said, you can get 80% of the benefit for 5% of the effort with these tips. They'll take you no time and you'll reap the benefits hourly:

  • Minifying your JS and CSS
  • Combining CSS and JS into single files to minimize HTTP requests
  • Turning on GZip compression for as much as you can
  • Setting Expires headers on everything you can
  • Compressing your PNGs and JPEGs (but definitely your PNGs)
  • Using CSS sprites if you have lots of tiny images

I still have a "D" on YSlow, but a D is a passing grade. ;) Enjoy, and leave your tips, and your "duh!" in the comments, Dear Reader.

Also, read anything by Steve Souders. He wrote YSlow.


Asynchronous scalable web applications with real-time persistent long-running connections with SignalR

August 29, '11 Posted in ASP.NET | IIS | Javascript | SignalR

I've been spending some time exploring asynchrony and scale recently. You may have seen my post about my explorations with node.js and iisnode running node on Windows.

Every application has different requirements, so rules to "make it scale" don't work for every kind of application. Scaling a web app that gets some data and for-loops over it is different from scaling an app that calls out to a high-latency mainframe, which is different again from scaling an app that needs to maintain a persistent connection to the server.

The old adage "when all you have is a hammer, everything looks like a nail" really holds true in the programming and web space. The more tools - and the knowledge to use them - the better. That's why I'm an advocate not only of polyglot programming but also of going deep with your main languages. When you really learn LINQ, for example, and get really good at dynamics, C# becomes a much more fun and expressive language.

Polling is a common example of hammering a screw. Trying to make a chat program? Poll every 5 seconds. Got a really long running transaction? Throw up an animated GIF and poll until eternity, my friend!

Long polling is another way to get things done. Basically, you open a connection and keep it open, forcing the client (browser) to wait, pretending it's taking a long time to return. If you have enough control over your server-side programming model, this lets you return data as you like over this "open connection." If the connection breaks, it's transparently re-opened and the break is hidden from both sides. In the future, things like WebSockets will be another way to solve this problem, once they're baked.

Persistent Connections in ASP.NET

Doing this kind of persistent connection in a chat application or stock ticker for example hasn't been easy in ASP.NET. There hasn't been a decent abstraction for this on the server or a client library to talk to it.

SignalR is an asynchronous signaling library for ASP.NET that our team is working on to help build real-time, multi-user web applications.

Isn't this just Socket.IO or nowjs?

Socket.IO is a client-side JavaScript library that talks to node.js. NowJS is a library that lets you call the client from the server. All of these and SignalR are similar and related, but different perspectives on the same concepts. Both of these JavaScript libraries expect certain things and conventions on the server side, so it's probably possible to make the server look the way these clients want it to look, if one wanted to.

SignalR is a complete client- and server-side solution, with JS on the client and ASP.NET on the back end, for creating these kinds of applications. You can get it on GitHub.

But can I make a chat application in 12 lines of code?

I like to say

"In code, any sufficient level of abstraction is indistinguishable from magic."

That said, I suppose I could just say, sure!

Chat.DoItBaby()

But that would be a lie. Here's a real chat application in SignalR for example:

Client:

var chat = $.connection.chat;
chat.name = prompt("What's your name?", "");

chat.receive = function(name, message){
    $("#messages").append("<li>" + name + ": " + message + "</li>");
};

$("#send-button").click(function(){
    chat.distribute($("#text-input").val());
});

Server:

public class Chat : Hub {
    public void Distribute(string message) {
        Clients.receive(Caller.name, message);
    }
}

That's maybe 12, could be 9, depends on how you roll.

More details on SignalR

SignalR is broken up into a few packages on NuGet:

  • SignalR - A meta package that brings in SignalR.Server and SignalR.Js (you should install this)
  • SignalR.Server - Server side components needed to build SignalR endpoints
  • SignalR.Js - Javascript client for SignalR
  • SignalR.Client - .NET client for SignalR
  • SignalR.Ninject - Ninject dependency resolver for SignalR

If you just want to play and make a small app, start up Visual Studio 2010.

First, make an Empty ASP.NET application, and install-package SignalR with NuGet, either with the UI or the Package Console.

Second, create a new default.aspx page and add a button, a textbox, and references to jQuery and jQuery.signalR, along with a small script that opens a connection to /echo, appends anything it receives to a list of messages, and sends the textbox contents to the server when the button is clicked.
Low Level Connection

Notice we're calling /echo from the client? That is hooked up in routing in Global.asax:

RouteTable.Routes.MapConnection<MyConnection>("echo", "echo/{*operation}");

At this point, we've got two choices of models with SignalR. Let's look at the low level first.

using SignalR;
using System.Threading.Tasks;

public class MyConnection : PersistentConnection
{
    protected override Task OnReceivedAsync(string clientId, string data)
    {
        // Broadcast data to all clients
        return Connection.Broadcast(data);
    }
}

We derive from PersistentConnection and can basically do whatever we want at this level. There's lots of choices:

public abstract class PersistentConnection : HttpTaskAsyncHandler, IGroupManager
{
    protected ITransport _transport;

    protected PersistentConnection();
    protected PersistentConnection(Signaler signaler, IMessageStore store, IJsonStringifier jsonStringifier);

    public IConnection Connection { get; }
    public override bool IsReusable { get; }

    public void AddToGroup(string clientId, string groupName);
    protected virtual IConnection CreateConnection(string clientId, IEnumerable<string> groups, HttpContextBase context);
    protected virtual void OnConnected(HttpContextBase context, string clientId);
    protected virtual Task OnConnectedAsync(HttpContextBase context, string clientId);
    protected virtual void OnDisconnect(string clientId);
    protected virtual Task OnDisconnectAsync(string clientId);
    protected virtual void OnError(Exception e);
    protected virtual Task OnErrorAsync(Exception e);
    protected virtual void OnReceived(string clientId, string data);
    protected virtual Task OnReceivedAsync(string clientId, string data);
    public override Task ProcessRequestAsync(HttpContext context);
    public void RemoveFromGroup(string clientId, string groupName);
    public void Send(object value);
    public void Send(string clientId, object value);
    public void SendToGroup(string groupName, object value);
}

High Level Hub

Or, we can take it up a level and just do this for our chat client after adding

<script src="/signalr/hubs" type="text/javascript"></script>

to our page.

$(function () {
    // Proxy created on the fly
    var chat = $.connection.chat;

    // Declare a function on the chat hub so the server can invoke it
    chat.addMessage = function (message) {
        $('#messages').append('<li>' + message + '</li>');
    };

    $("#broadcast").click(function () {
        // Call the chat method on the server
        chat.send($('#msg').val());
    });

    // Start the connection
    $.connection.hub.start();
});

Then there is no need for routing; the $.connection.chat proxy will map to this on the server, and the server can then call the client back.

public class Chat : Hub
{
    public void Send(string message)
    {
        // Call the addMessage method on all clients
        Clients.addMessage(message);
    }
}

At this point your brain should have exploded and leaked out of your ears. This is C#, server-side code, and we're telling all the clients to call the addMessage() JavaScript function. We're calling the client back from the server, sending the name of the client method to invoke down over our persistent connection. It's similar to NowJS, but not a lot of people are familiar with this technique.

SignalR will handle all the connection stuff on both client and server, making sure the connection stays open and alive. It'll use the right kind of connection for your browser and will scale on the server with async and await techniques (like I talked about in the node.js post where I showed scalable async evented I/O on ASP.NET).

Want to see this sample running LIVE?

We've got a tiny, tiny chat app running on Azure over at http://jabbr.net/, so go beat on it. There are folks in there now - try "/join aspnet". Try pasting in YouTube links or images!

[Screenshot: the SignalR chat sample running at jabbr.net]

It's early, but it's an interesting new LEGO piece for .NET that didn't completely exist before. Feel free to check it out on GitHub and talk to the authors of SignalR, David Fowler and Damian Edwards. Enjoy.


Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.