Scott Hanselman

The Importance (and Ease) of Minifying your CSS and JavaScript and Optimizing PNGs for your Blog or Website

September 02, 2011 Posted in ASP.NET | IIS

Hello Dear Reader. You may feel free to add a comment at the bottom of this post, something like "Um, DUH!" after reading this. It's funny how one gets lazy with their own website. What's the old joke, "those who can't, teach." I show folks how to optimize their websites all the time but never got around to optimizing my own.

It's important (and useful!) to send as few bytes of CSS and JS and HTML markup down the wire as possible. It's not just about size, though, it's also about the number of requests to get the bits. In fact, that's often more of a problem than file size.

First, go run YSlow on your site.

YSlow is such a wonderful tool, and it will totally ruin your day and make you feel horrible about yourself and your site. ;) But you can work through that. Eek. First, my images are huge. I've also got 184k of JS, 21k of CSS and 30k of markup. Note my favicon is small. It was a LOT bigger before and even sucked up gigabytes of bandwidth a few years back.

YSlow also tells me that I am making folks make too many HTTP requests:

This page has 33 external JavaScript scripts. Try combining them into one.
This page has 5 external stylesheets. Try combining them into one.

Seems that speeding things up is not just about making things smaller, but also asking for fewer things and getting more for the asking. I want to make fewer requests that may have larger payloads, but then those payloads will be minified and then compressed with GZip.

Optimize, Minify, Squish and GZip your CSS and JavaScript

CSS can look like this:

body {
line-height: 1;
}
ol, ul {
list-style: none;
}
blockquote, q {
quotes: none;
}

Or like this, and it still works.

body{line-height:1}ol,ul{list-style:none}blockquote,q{quotes:none}

There's lots of ways to "minify" CSS and JavaScript, and fortunately you don't need to care! Think about CSS/JS minifying kind of like the Great Zip File Wars of the early nineties. There's a lot of different choices, they are all within a few percentage points of each other, and everyone thinks theirs is the best one ever.

There are JavaScript-specific compressors and CSS compressors; you run your code through these before you put your site live. Some of them integrate nicely into your development workflow: you can put them in your build files, or minify things on the fly.

  • YUICompressor - .NET Port that can compress on the fly or at build time. Also on NuGet.
  • AjaxMin  - Has MSBuild tasks and can be integrated into your project's build.
  • SquishIt - Used at runtime in your ASP.NET applications' views and does magic at runtime.
  • UPDATE: Chirpy - "Mashes, minifies, and validates your javascript, stylesheet, and dotless files."
  • UPDATE: Combres - ".NET library which enables minification, compression, combination, and caching of JavaScript and CSS resources for ASP.NET and ASP.NET MVC web applications."
  • UPDATE: Cassette by Andrew Davey - Does it all, compiles CoffeeScript, script combining, smart about debug- and release-time.
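To give you a feel for the runtime style, here's roughly what SquishIt looks like in an ASP.NET view (a sketch from memory; the file names are made up, so check the project site for the current API):

<%= SquishIt.Framework.Bundle.Css()
        .Add("~/css/reset.css")
        .Add("~/css/site.css")
        .Render("~/css/combined_#.css") %>

<%= SquishIt.Framework.Bundle.JavaScript()
        .Add("~/scripts/jquery-1.6.2.js")
        .Add("~/scripts/site.js")
        .Render("~/scripts/combined_#.js") %>

The # in the output name is replaced with a hash of the content, so the URL changes whenever the files do, which plays nicely with the far-future Expires headers discussed below.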

There's plenty of comparisons out there looking at the different choices. Ultimately, since the compression percentages differ so little, you should focus on two things:

  • compatibility - does it break your CSS? It should never do this
  • workflow - does it fit into your life and how you work?

For me, I have a template language in my blog and I need to compress my CSS and JS when I deploy my new template. A batch file and a command-line utility work nicely, so I used AjaxMin (yes, it's made by Microsoft, but it did exactly what I needed).

I created a simple batch file that took the pile of JS from the top of my blog and the pile from the bottom and created a .header.js and a .footer.js. I also squished all the CSS, including my plugins that needed CSS, and put them in one file while being sure to maintain file order.

I've split these lines up for readability only.

set PATH=%~dp0;"C:\Program Files (x86)\Microsoft\Microsoft Ajax Minifier\"

ajaxmin -clobber
scripts\openid.css
scripts\syntaxhighlighter_3.0.83\styles\shCore.css
scripts\syntaxhighlighter_3.0.83\styles\shThemeDefault.css
scripts\fancybox\jquery.fancybox-1.3.4.css
themes\Hanselman\css\screenv5.css
-o css\hanselman.v5.min.css

ajaxmin -clobber
themes/Hanselman/scripts/activatePlaceholders.js
themes/Hanselman/scripts/convertListToSelect.js
scripts/fancybox/jquery.fancybox-1.3.4.pack.js
-o scripts\hanselman.header.v4.min.js

ajaxmin -clobber
scripts/omni_external_blogs_v2.js
scripts/syntaxhighlighter_3.0.83/scripts/shCore.js
scripts/syntaxhighlighter_3.0.83/scripts/shLegacy.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushCSharp.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushPowershell.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushXml.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushCpp.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushJScript.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushCss.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushRuby.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushVb.js
scripts/syntaxhighlighter_3.0.83/scripts/shBrushPython.js
scripts/twitterbloggerv2.js
scripts/ga_social_tracker.js
-o scripts\hanselman.footer.v4.min.js

pause

All those files at the bottom support my code highlighter. This looks complex, but it's just making three files (two JS and one CSS) out of all my mess of required files.

This squished all my CSS down to 26k, and here's the output:

CSS
Original Size: 35667 bytes; reduced size: 26537 bytes (25.6% minification)
Gzip of output approximately 5823 bytes (78.1% compression)

JS
Original Size: 83505 bytes; reduced size: 64515 bytes (22.7% minification)
Gzip of output approximately 34415 bytes (46.7% compression)

That also turned 22 HTTP requests into 3.

Optimize your Images (particularly PNGs)

Looks like my 1600k cold (machine not cached) home page is mostly images, about 1300k. That's because I put a lot of articles on the home page, but I also use PNGs for the images in most of my blog posts. I could be more thoughtful and:

  • Use JPEGs for photos of people, things that are visually "busy"
  • Use PNGs for charts, screenshots, things that must be "crystal clear"

I can also optimize the size of my PNGs (did you know you can do that?) with PNGOut before I upload them. For bloggers and for ease I recommend PNGGauntlet, a Windows app that calls PNGOut for you. Easier than PowerShell, although I do that also.
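If you'd rather script it than click through a GUI, here's a rough batch sketch (assuming pngout.exe is on your PATH; it writes results to an optimized folder so your originals are left alone):

rem Optimize every PNG in the current folder into .\optimized
md optimized
for %%f in (*.png) do pngout "%%f" "optimized\%%f"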

If you use Visual Studio 2010, you can use Mads' beta Image Optimizer Extension that will let you optimize images directly from Visual Studio.

To show you how useful this is, I downloaded the images from the last month or so of posts on this blog totaling 7.29MB and then ran them through PNGOut via PNGGauntlet.

All my PNGs getting in line to be optimized inside of PNGGauntlet

Then I took a few of the PNGs that were too large and saved them as JPGs. All in all, I saved 1359k (that's almost a meg and a half or almost 20%) for minimal extra work.

If you think this kind of optimization is a bad idea, or boring or a waste of time, think about the multipliers. You're saving (or I am) a meg and a half of image downloads, thousands of times. When you're dead and gone your blog will still be saving bytes for your readers! ;)

This is important not just because saving bandwidth is nice, but because perception of speed is important. Give the browser less work to do, especially if, like mine, almost 10% of your users are mobile. Don't make these little phones work harder than they need to, and remember that not everyone has an unlimited data plan.

Let your browser cache everything, forever

Mads reminded me about this great tip for IIS7 that tells the webserver to set the "Expires" header to a far future date, effectively telling the browser to cache things forever. What's nice about this is that if you or your host is using IIS7, you can change this setting yourself from web.config and don't need to touch IIS settings.

<staticContent>
<clientCache httpExpires="Sun, 29 Mar 2020 00:00:00 GMT" cacheControlMode="UseExpires" />
</staticContent>

You might think this is insane. This is, in fact, insane. Insane like a fox. I built the website, so I want control. I version my CSS and JS files in the filename. Others use QueryStrings with versions and some use hashes. The point is: are YOU in control, or are you just letting caching happen? Even if you don't use this tip, know how and why things are cached and how you can control it.
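If renaming files by hand sounds tedious, a tiny helper can stamp a version onto the URL for you. Here's a minimal sketch of the QueryString flavor (the helper name and scheme are mine, not a built-in API; uses System.IO and System.Web):

// Sketch: append the file's last-write time as a cache-buster.
// Because the URL changes when the file does, a far-future Expires
// header is safe: new content always gets a new URL.
public static class StaticUrl
{
    public static string Versioned(string virtualPath)
    {
        string physical = HttpContext.Current.Server.MapPath(virtualPath);
        long v = File.GetLastWriteTimeUtc(physical).Ticks;
        return VirtualPathUtility.ToAbsolute(virtualPath) + "?v=" + v;
    }
}

// In a view: <link rel="stylesheet" href="<%= StaticUrl.Versioned("~/css/site.min.css") %>" />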

Compress everything

Make sure everything is GZip'ed as it goes out of your Web Server. This is also easy with IIS7 and allowed me to get rid of some old 3rd party libraries. All these settings are in system.webServer.

<urlCompression doDynamicCompression="true" doStaticCompression="true" dynamicCompressionBeforeCache="true"/>

If there is one thing you can do for your website, it's turning on HTTP compression. An average page, like my 100k of HTML, can shrink to around 20k. It downloads faster, and the user's perception of speed from "click to render" will improve.
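Here's a quick sanity check that compression is actually on, sketched in C# (watching the response headers in Fiddler or YSlow works just as well; uses System and System.Net):

// Ask for gzip and see whether the server obliges.
var request = (HttpWebRequest)WebRequest.Create("http://example.com/");
request.Headers[HttpRequestHeader.AcceptEncoding] = "gzip";
using (var response = (HttpWebResponse)request.GetResponse())
{
    // Prints "gzip" when compression is enabled, blank otherwise.
    Console.WriteLine(response.Headers[HttpResponseHeader.ContentEncoding]);
}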

Certainly this post just scratches the surface of REAL performance optimization and only goes up to the point where the bits hit the browser. You can go nuts trying to get an "A" grade in YSlow, optimizing for # of DOM objects, DNS Lookups, JavaScript ordering, and on and on.

That said, you can get 80% of the benefit for 5% of the effort with these tips. It'll take you no time and you'll reap the benefits hourly:

  • Minify your JS and CSS
  • Combine CSS and JS into single files to minimize HTTP requests
  • Turn on GZip compression for as much as you can
  • Set Expires headers on everything you can
  • Compress your PNGs and JPEGs (but definitely your PNGs)
  • Use CSS sprites if you have lots of tiny images (see the sketch below)
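A sprite is one image that contains many, with CSS showing just the slice you want, so dozens of little icons cost a single HTTP request. A minimal sketch (the class names and offsets are made up):

.icon { background: url(/images/sprite.png) no-repeat; width: 16px; height: 16px; }
.icon-rss     { background-position: 0 0; }      /* first 16x16 tile */
.icon-twitter { background-position: -16px 0; }  /* next tile, one icon to the right */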

I still have a "D" on YSlow, but a D is a passing grade. ;) Enjoy, and leave your tips, and your "duh!" in the comments, Dear Reader.

Also, read anything by Steve Souders. He wrote YSlow.

About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.

September 02, 2011 2:27
If you also want to gzip AJAX/JSON stuff (you want to), you can sadly only do it globally in the server's ApplicationHost.config using the httpCompression element. It's a real shame you can't set this on a per-application basis using Web.config.
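For the record, something like this (schema from memory, so double-check the element names):

<httpCompression>
  <dynamicTypes>
    <add mimeType="application/json" enabled="true" />
  </dynamicTypes>
</httpCompression>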
September 02, 2011 2:38
jonf - Why can't you?
September 02, 2011 2:51
Regarding gzipping, does that IIS setting above work with WCF RIA services? What about JSON results in MVC? I'm currently using this method:

http://thisthattechnology.blogspot.com/2010/08/enable-iis-compression-on-ria-wcf.html
September 02, 2011 2:51
a suggestion: make your links open another window. Often I am reading along and follow the link you provide for more context...however having to use the back button is the devil....especially on a mobile device
September 02, 2011 3:00
Great post Scott. I just started using (last week) AjaxMin on a new project I'm working on and it works great.

For versioning my css and js files I created an MVC Helper in my project called Url.StaticContent, that works the same way as Url.Content, except it appends the Ticks of the last write time for the requested file to the url as a parameter. I find this better than renaming the files because I have fewer places to "fix" later.

I was thinking about merging the js files (and css too) into one, but then I'll have to change my _layout page, and since I'm using AjaxMin only on the build server, I would be forced to keep all the original files as well in order to be able to test and debug locally. I'm trying to figure out a good way to do this, but couldn't find anything to my liking yet.

Anyway, great post.

Regards,
Kelps
September 02, 2011 3:00
It looks like my site is making too many external calls to other servers. I scored well in the images and CSS but not doing so well on those server calls and cookies!

Overall grade was a D at 69 out of 100.

http://www.flickr.com/photos/adriarichards/6104651958

Grade F on Make fewer HTTP requests
Grade F on Use a Content Delivery Network (CDN)
Grade A on Avoid empty src or href
Grade F on Add Expires headers
Grade D on Compress components with gzip
Grade A on Put CSS at top
Grade F on Put JavaScript at bottom
Grade A on Avoid CSS expressions
Grade n/a on Make JavaScript and CSS external
Grade F on Reduce DNS lookups
Grade A on Minify JavaScript and CSS
Grade A on Avoid URL redirects
Grade A on Remove duplicate JavaScript and CSS
Grade F on Configure entity tags (ETags)
Grade A on Make AJAX cacheable
Grade A on Use GET for AJAX requests
Grade D on Reduce the number of DOM elements
Grade A on Avoid HTTP 404 (Not Found) error
Grade B on Reduce cookie size
Grade A on Use cookie-free domains
Grade A on Avoid AlphaImageLoader filter
Grade C on Do not scale images in HTML
Grade A on Make favicon small and cacheable


I also tried out the beta ySlow for Chrome and that reported the same information https://chrome.google.com/webstore/detail/ninejjcohidippngpapiilnmkgllmakh
September 02, 2011 4:17
@e rolnicki: I'm sure you know how to open links in a new window, so why not use those features of your browser?

Scott, I'm surprised you didn't mention Combres. I'm sure I've seen you mention it before:
http://combres.codeplex.com/

Nuget package that you drop onto your site, configure the css / js to use, and it will handle combining / minifying and gzipping.

I roll it out on all my sites.
September 02, 2011 5:51
Thanks for the info on optimizing PNGs! I previously used Photoshop to compress my PNGs because I didn't know any better, but recently read somewhere that Fireworks does a way better job at compression than PS does. I'm definitely going to try out your suggestions at work.
September 02, 2011 7:46
Typo - The creator of YSlow is Steve Souders not Sounders
September 02, 2011 10:17
Great post Scott, lots of things in there that a lot of people know but sometimes just don't get to. Having a summary like this is very useful as an eye-opener on how little work can make such a big difference.
September 02, 2011 11:06
A wonderful VS add-in to handle combination and compression of CSS and JS on the fly is Chirpy http://chirpy.codeplex.com/.

My entire team use it and we love it.
September 02, 2011 11:26
I have found out that the default gzip compression for IIS is not optimal for most sites. Even the js files hosted at http://ajax.aspnetcdn.com are using the default compression.

Scott Forsyth has a nice post about the settings (http://weblogs.asp.net/owscott/archive/2009/02/22/iis-7-compression-good-bad-how-much.aspx)

If someone could talk the aspnetcdn guys into fixing their settings, I would appreciate it. I tried a few times.
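The gist of Scott Forsyth's advice is turning the level up from the defaults, roughly like this in applicationHost.config (values approximate, see his post):

<httpCompression>
  <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll"
          staticCompressionLevel="9" dynamicCompressionLevel="4" />
</httpCompression>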
September 02, 2011 11:32
Expires shouldn't be set to more than a year in the future.

http://code.google.com/speed/page-speed/docs/caching.html

'Set Expires to a minimum of one month, and preferably up to one year, in the future. (We prefer Expires over Cache-Control: max-age because it is more widely supported.) Do not set it to more than one year in the future, as that violates the RFC guidelines.'
September 02, 2011 11:42
@Scott Hanselman:
Because, for some reason, the MIME type application/json is not compressible using urlCompression. I think you can get it to work if you use text/json instead, but that's a workaround.

Rick Strahl has some information: http://www.west-wind.com/weblog/posts/2011/May/05/Builtin-GZipDeflate-Compression-on-IIS-7x
September 02, 2011 11:47
Just some best practices from me:
For my projects I use Combres (http://combres.codeplex.com/), which I think is similar to SquishIt; it's really easy to use and has gzip, minify and a kind of asset management.
A good starting point to customize your web.config is the web.config from the HTML5BoilerTemplate (http://html5boilerplate.com/) - there are some really nice things in it (GZip, compression, caching).
September 02, 2011 12:07
I'd like to second Jonas' shout for Chirpy.

As an add-in for Visual Studio, it not only converts SASS, LESS and CoffeeScript and offers combining and minifying; handiest of all (IMO), it does this in the background each time you save a file, so if you wish you can develop against the combined-minified files.

On the flip-side, I'd like to find out how to implement versioning in Chirpy (in an automated sense).
September 02, 2011 12:27
@jonf: I also think that the default JS compression settings in IIS are flawed, but changing them is not too complex. See this blog post for the settings you need in your web.config.
September 02, 2011 13:34
Excellent post, this is a topic that is often overlooked by dev teams and yet it is so fundamental to the development process in my opinion.

I'd like to second (third?) the shout out to Chirpy and Combres. These two tools are dead simple to use and improve performance incredibly; our site footprints have decreased to around 1/3 of their previous size. For the amount of effort involved to implement, these two are a massive ROI.

We also do all of our own g-zipping of ActionResults using a custom AttributeFilter that compresses the response OnActionExecuting which gives us complete control over what we compress.

These simple things only take a day or so to setup and you can port the functionality between projects with minimal fuss.
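For the curious, the skeleton of that kind of filter looks roughly like this (simplified sketch, minus our content-type checks; uses System.IO.Compression and System.Web.Mvc):

public class CompressAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        var request = filterContext.HttpContext.Request;
        var response = filterContext.HttpContext.Response;
        string accept = request.Headers["Accept-Encoding"] ?? "";
        if (accept.Contains("gzip"))
        {
            // Wrap the output stream so everything written is gzipped.
            response.AppendHeader("Content-Encoding", "gzip");
            response.Filter = new GZipStream(response.Filter, CompressionMode.Compress);
        }
    }
}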
September 02, 2011 13:36
A lesser-known compactor/compressor is MiniMe, which builds on top of AjaxMin.

can be seen here: https://bitbucket.org/DotNetNerd/minime/overview

Just to let you know :)

September 02, 2011 14:29
Has anyone looked at the size of some page requests in SharePoint 2010, e.g. the default Team Site or My Sites? Shocking. OK, not so bad when you're close to the server, or if you've got a stack of WFEs, but a real issue in WAN or other distributed environments.
September 02, 2011 17:24
+1 for Chirpy. I also have found Knapsack to be quite nice. http://aboutcode.net/knapsack
September 02, 2011 17:39
@Tero
Sure, but that's only for static content, for which <urlCompression /> is just fine. I'm talking about dynamically generated JSON (application/json). doDynamicCompression = true will NOT gzip application/*, only text/*.

But yes, that article raises a good point. I've always used this for IIS7.x in my Web.config, which is pretty much the same as the article you link to proposes (minus the caching).


<system.webServer>
<urlCompression doDynamicCompression="true" doStaticCompression="true" dynamicCompressionBeforeCache="true" />
<staticContent>
<remove fileExtension=".js" />
<mimeMap fileExtension=".js" mimeType="text/javascript" />
<clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="365.00:00:00" />
</staticContent>
</system.webServer>


That MIME-type re-mapping must be done for IIS7.5 to compress .js files. Not sure about IIS7, but you probably want to update the article with this, Scott. If you don't do this remapping, IIS7.5 will, for some reason unbeknownst to me, use an application/x-javascript mimetype and it won't gzip.
September 02, 2011 17:41
Heh, being a web programmer I have all errors turned on in IE9. You may have broken something. Though I'd not put it past me to have broken IE with some setting change. Love the tips; simple and often overlooked, especially in a small shop like ours where we can't just ask for more resources from the VM administrator.

Webpage error details

User Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET4.0C; .NET4.0E; MS-RTC LM 8; InfoPath.3)
Timestamp: Fri, 2 Sep 2011 13:36:53 UTC


Message: Expected identifier
Line: 2
Char: 9
Code: 0
URI: http://www.hanselman.com/blog/TheImportanceAndEaseOfMinifyingYourCSSAndJavaScriptAndOptimizingPNGsForYourBlogOrWebsite.aspx?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+ScottHanselman+%28Scott+Hanselman+-+ComputerZen.com%29

September 02, 2011 17:48
I also use AjaxMin in a post build process for Web Deployment Projects to squish the individual files. I would LOVE to smash them all into one JS or CSS file, but how do you manage this going from Dev to Prod? I suppose I could have some #If DEBUG and #If RELEASE blocks where I put the several scripts in the DEBUG block and the single reference in the RELEASE block. Is this best practice? I just don't want to have to go edit my output files and replace my script blocks before deployment.

Am I missing something blindingly obvious here?

(ugh, this solution reminds me of my If False Then block that I have to put into my MVC3 projects because you don't get intellisense with Url.Content *hint hint*... while I'm at it, it would be neat if "Url.Content(" brought up a nifty "Browse..." helper the way typing "src=" does... is this for The Gu? How did I get here?)
September 02, 2011 19:09
You can also try wro4j for managing your assets and applying different types of compressors.
September 02, 2011 19:21
Is there any way to use a CDN on a secure site? I tried to add a link to a PDF using a URL that was outside the web root, but when you log in you get the message "some content is insecure, display all content?" This does not engender confidence and I don't want to train users to ignore that message. So I have 30 copies of the secure site and I just made a batch file to copy the PDF into all of them, but it grates on my "reuse" nerve.
Do you just have to skip CDN on a secure site?
September 02, 2011 19:58
I think "YSlow" should be called "YYouNoFast?".

Don't you? :)
September 02, 2011 20:07
@Emil

We've been investigating how to improve the compression ratio on the MS Ajax CDN for a while now, but unfortunately it's not straightforward. The CDN is run on the Azure CDN infrastructure, so we have absolutely zero control over these types of settings, and any changes they would make for us affect many, many sites. We're continuing to push to improve it though (we changed the URL to ensure there are no cookies, for example) and hope to get the compression turned up to 11 some day in the future.
September 02, 2011 20:35
@James Frater,

Check out this article:

http://encosia.com/cripple-the-google-cdns-caching-with-a-single-character/
September 02, 2011 21:21
Combining a lot of scripts into one is not necessarily the best way to improve performance. This means that they cannot be cached individually, so changes to any script require every single one of your users to re-download the entire collection. This might make sense for a collection of small, seldom-changing scripts, but it definitely does not make sense to wrap everything in your site this way. You lose the whole advantage of client side caching.

Another way to decrease the footprint of your site that seems to get overlooked a lot is to load scripts only when needed.

A lot of developers just add every single script that gets used somewhere into a template. In practice there may be few that are used on every page (e.g. jQuery). Many site users may never even go to pages or features that need some of your scripts. Set up an architecture that lets you specify what is required on each page, and load only those scripts.

Finally you can load rarely-used scripts on demand. For example, I have a web site that only loads the authentication script (which includes SHA2 encryption code, and so on) when a user actually clicks "login." The actual process of logging in only takes place once per user session - why keep that script on every page? Just use (e.g.) jQuery getScript and load such things on demand. As long as there is not a need for instantaneous response, this will be invisible to the user. That is - logging in requires handshaking with the server anyway, so this doesn't impact the perception of performance.
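In jQuery terms the on-demand load is just this (sketch; the script URL and function name are illustrative):

// Fetch the auth script only when the user actually clicks "login".
$("#login").click(function () {
    $.getScript("/scripts/auth.js", function () {
        showLoginDialog(); // defined by the freshly loaded script
    });
});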





September 03, 2011 1:31
I think the plain-old Firebug (using the NetExport plugin to persist your output to a file) not only gives you a better graph than YSlow, but saves you from having to install YSlow. Basically, in other words, YSlow is just a re-invention of the wheel, and not a very good one at that.
September 03, 2011 1:51
If you didn't know this stuff and you're a "web developer", you probably should fire yourself.
September 03, 2011 8:41
Well done article covering the concepts. I used the YUI minifier for years, then I switched over to the Microsoft Ajax Minifier and then to the Google Closure compiler in VS 2008 (validates your JS and minifies it). My biggest issue was that having both the minified and the original files around caused issues in development and debugging. So about a year ago I came across an article that discussed a way to combine and minify your JS and CSS files from within .NET. The Script Combiner technique uses C# code to combine multiple JS and CSS files, then minify, GZip and cache them on the server. Here is the website where I found the technique: http://atashbahar.com/post/Combine-minify-compress-JavaScript-files-to-load-ASPNET-pages-faster.aspx . We have been using this technique in a production environment for one of our applications for the last six months with great success. The only problem I have is with JS files that are already minified (jQuery and such); however, I usually keep those files separate. One technique that we have been doing lately is to break our larger JS files into smaller files that have related content. This has made it easier for our developers to check code into SVN. The modified Script Combiner loads all of the JS files that are located inside each folder and delivers the content as a single file.

September 04, 2011 15:33
Are dangling sentences advocated as a compression technique too? :D

"Give the browser less work to do, especially if, like me, almost 10% of your users are mobile. Don't make these little phones work hard than they need to and remember that not everyone

Let your browser cache everything, forever"
September 05, 2011 6:11
Nice post.

Another related topic worth mentioning is image spriting. Might check out RequestReduce which sprites background images on the fly to optimized 8 bit PNGs as well as minifies and merges CSS. It brought my blog from a YSLOW C to an A with practically 0 effort.

Matt Wrock
September 05, 2011 12:17
James Frater - Use a protocol-less URL with a CDN to switch between SSL and non-SSL without actually specifying the protocol.
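For example:

<script src="//ajax.aspnetcdn.com/ajax/jQuery/jquery-1.6.2.min.js"></script>

The browser picks http or https to match the page, so you avoid the mixed-content warning.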

Anonymous Fan - Fixed, thanks!

Jon - I think the Chief Performance Engineer at Google would disagree with you. ;)
September 05, 2011 13:19
Hi,

maybe I'm a bit late on this but I still hope to get an answer.

Is it recommended to use Base64 encoded images?

Just thinking out loud: if I have the same image multiple times on my page in Base64 format, a benefit could only be seen when using GZIP compression, is this correct?
Otherwise the image would be transmitted twice (as a string within the markup).

In this case "normal" png's for example are the preferred way I guess, because it gets transmitted only once and then used from cache (the second time).

(My) conclusion: Base64 should only be used in combination with GZIP.

Why I'm asking this is because I want to reduce http requests but find it relatively hard creating and working with sprites.

In what way could the use of Base64 replace the sprite technique?

Additional thoughts? I'd really appreciate it.
September 05, 2011 17:49
Thanks again for sharing your wisdom, Scott!

I agree with the above poster that notes "combining a lot of scripts into one is not necessarily the best way to improve performance."

Nowadays, with HTTP/1.1, we have such a thing as Keep-Alive. That means there is a negligible difference between loading 1 script from your server, and loading 10 (I'm talking latency, not bandwidth).

The benefit of separate scripts, as the above poster notes, is that they can load independently on only the pages they are required, and they're cached independently.

Of course, you should still be versioning, minifying, and zip'ing them!
September 06, 2011 16:24
Great article! This is something I've "been meaning" to do for a while and not for the first time, Mr. Hanselman delivered a swift kick on my lazy ass.

I went with Combres. Very nice!
September 15, 2011 3:37
Scott,

In the case of running an intranet site (let's say 200 users), could one get away with just "caching things forever"? This would seem to make upgrades easier and really only take the performance hit on each user's first page visit.

Thanks good article!
September 15, 2011 20:56
Cassette seems to be the magic bullet when it comes to squishing!
April 05, 2012 10:03
Scott,

I have a couple of questions....

1) Let's say a commercial website has lots of code with plenty of CSS, JavaScript and a few HTML files. After minifying those files, can any exceptions arise once the minified files are deployed on the web server where the code resides, or will there be environment challenges or other issues?

2) Apart from the advantages, is there any disadvantage to minifying a JavaScript, CSS or even an HTML file?
May 17, 2012 5:36
Scott, Really a nice article. Thanks. I had one question.
1. Is there a YSlow-like tool that works with IE? If not, what other tool/process can we follow to analyze a web page in IE?
Thanks
Raghu
May 23, 2012 19:29
This is really interesting, You’re a very skilled blogger.
November 04, 2012 9:58
Hi Scott
Nice article; I just implemented some of the techniques discussed here in my existing website. I used SquishIt.
November 16, 2012 22:16
Hi Scott,
Very good article!

Just curious and don't really expect an answer, but how does PNG compare to GIF? I often use GIF for other purposes (email attachments, etc.) when the picture is suitable for this lossless format (when it has large blocks of solid color, basically).

Does PNG support 'GIF-like' compression with similar performance even if the original is not a vector-based image?

//Sten
November 17, 2012 2:49
Sten - No, PNG doesn't do anything special for large blocks of color. It does provide 24-bit color vs. the palettized 256 that GIF offers.

Comments are closed.

Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.