Scott Hanselman

It's the transparency, stupid!

November 2, '11 Comments [34] Posted in Musings

I have long said it's important to not give bile a permalink, so don't take this as a post that's picking on a specific company. Nearly every company is guilty of withholding information for no apparent reason. Sometimes it's to protect shareholder value, but most often it's motivated by fear, the unknown, and fear of the unknown. This is my opinion.

I really believe there's little reason to not be extremely transparent in business today. Especially when business means releasing software or hardware on a regular cadence. Apple is great about being secretive and announcing "one more thing" that no one expected, but that's not an easy culture to maintain.

I'm a fan of clear roadmaps. It's OK if the roadmap gets blurry long term, but at least tell me where the road is! The thing is, if you don't release a public roadmap, it'll get leaked or someone will make one up for you.

Also, if you aren't transparent with your customers, you take the risk that customers will use your opaqueness against you.

  • "They haven't said anything about Product X, I wonder if they themselves know what they're going to do!"
  • "We've asked for Feature Y for the last 2 years and while they say it's coming, they won't say when or what's taking so long!"

The irony is that the customers who are pounding on you the most, demanding updates and status are your best customers. They care!

I'm not saying my Mom needs to know the technology roadmap or the release notes for her Universal Remote Control. I'm saying I do. Why? Because I'm an enthusiast, and I've likely sold more of these remotes just by being a fan than Best Buy has.

Here's a concrete example. I've got a TiVo (Digital Video Recorder) and I like it. Except when I hate it. It works great and then stops working, and this is a known issue. The TiVo Premiere I have has a dual core processor. Except it's slow, because only one of the processors is enabled. It uses Flash for its UI, and much of the UI is in HiDef with a 16x9 ratio. Except a bunch of the menus are NOT in HiDef. You move in and out of the menus with a jarring leap from HiDef to Standard Def and back. It's been like this for years, plural.

If you search the web or forums where TiVo enthusiasts hang out, you'll hear them complaining. Understand that these are folks that have a TiVo, sure, but they care enough to want the new features. They care enough to participate in an online forum. For every one customer who is complaining about you online, there are like 100 just like them complaining offline.

Online discontent is just the beginning. The spark of discontent can ignite into the fires of rebellion.

So why not just be straight with them? I'll pick on TiVo VP of User Experience Margret Schmidt for a moment. First, to be clear, she's exceedingly helpful on Twitter, positive, kind and has put herself out there as a public face for her company, so kudos and respect for her. I've asked her questions like "when will the second core be enabled" and "when will Flash stop hanging" and "when will all the menus be HD." Unfortunately it's clear that her hands are tied by some higher level mandate. 

@tivodesign (TiVo's Margret Schmidt): "@shanselman No updates I can share, but updates are coming. (Sorry, I know that isn't helpful.)"

It's apparently company policy not to comment on new features or their roadmap, even when those features have been speculated about online for years. Nurture the community you have by entrusting them with your plans. They'll understand if you don't know exact dates. But don't hide the truth.

I would encourage TiVo, Microsoft (I work here and pushing for transparency is part of my job) and companies like them that release new products on a regular cadence and maintain existing ones to just be transparent.

Think of the hundreds if not thousands of forum posts with anger that would be assuaged with a TiVo Release Notes blog post that said something like:

"We know our users have been waiting for an updated that enables the second core in your dual core TiVos. We've had some _______ problem with _____. It's been a sticky issue but our engineers tell me they've got it cracked. Look for an update in the next __ months that enables this exciting feature. Thanks for your patience and most of all, for your enthusiasm! Viva Tivo!"

It's not hard. Just say something.



NuGet Package of Week #11 - ImageResizer enables clean, clear image resizing in ASP.NET

October 31, '11 Comments [7] Posted in ASP.NET | ASP.NET MVC | NuGet | NuGetPOW | Open Source

The Backstory: I was thinking, since the NuGet .NET package management site is starting to fill up, that I should start looking for gems (no pun intended) in there. You know, really useful stuff that folks might otherwise not find. I'll look for mostly open source projects, ones I think are really useful. I'll look at how they built their NuGet packages, whether there's anything interesting about the way they designed the out-of-the-box experience (and anything they could do to make it better), as well as what the package itself does. Today, it's ImageResizer.

Bertrand Le Roy has long been an advocate of doing image resizing correctly on .NET and particularly on ASP.NET. Last week he posted a great post on a new library to choose from; a library that is pure .NET and works in medium trust. It's "imageresizer." What a creative name! ;)

Seriously, though, it couldn't be easier. Here's a nice sample from Bertrand's blog showing how to resize a JPEG as a stream of bytes using the ImageResizer library directly:

var settings = new ResizeSettings {
    MaxWidth = thumbnailSize,
    MaxHeight = thumbnailSize,
    Format = "jpg"
};
settings.Add("quality", quality.ToString());
ImageBuilder.Current.Build(inStream, outStream, settings);
resized = outStream.ToArray();
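
For context, here's a minimal, self-contained sketch of the same call with the plumbing Bertrand's excerpt assumes. The file paths and the thumbnailSize/quality values are made up for illustration, and I'm assuming the library's types live in the ImageResizer namespace:

using System.IO;
using ImageResizer;

class ThumbnailDemo
{
    static void Main()
    {
        const int thumbnailSize = 100; // illustrative values
        const int quality = 90;

        var settings = new ResizeSettings {
            MaxWidth = thumbnailSize,
            MaxHeight = thumbnailSize,
            Format = "jpg"
        };
        settings.Add("quality", quality.ToString());

        // Resize a JPEG on disk into an in-memory byte array.
        using (var inStream = File.OpenRead(@"C:\images\bigphoto.jpg"))      // hypothetical path
        using (var outStream = new MemoryStream())
        {
            ImageBuilder.Current.Build(inStream, outStream, settings);
            byte[] resized = outStream.ToArray();
            File.WriteAllBytes(@"C:\images\bigphoto-thumb.jpg", resized);    // hypothetical path
        }
    }
}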

There's a complete API with lots of flexibility. However, how quickly can I get from File | New Project to something cool?


Well, make a new ASP.NET (MVC or WebForms) project and put an image in a folder.

Their default NuGet package is called ImageResizer, and their ASP.NET preconfigured web.config package is "ImageResizer.WebConfig", which includes a default intercepting module to get you the instant gratification you crave. I used NuGet to Install-Package ImageResizer.WebConfig.

I've got an image of my giant head that I can, of course, visit in any browser.


And now with the intercepting HttpModule installed with imageresizer.webconfig I can add ?width=100 to the end of the query string and I get a nice resized image that fits into the constraints of "100 wide." It's a trivial example, but it's a nice touch to have them do the "figure out how tall this should be" work for me.


Of course, I'm sure you could DoS (Denial of Service) someone's system with resizing requests, but for small sites their intercepting module is a quick fix and a great example. DoS problems aren't unique to CPU-intensive requests, and they're a problem best solved elsewhere.

UPDATE: Please read the comment from the author below. He points out a correction and some useful stuff.

"I'd like to clarify that it's not just for small sites. It's been running large social networking sites for years, and there are at least 6 companies using it with 10-20TB image collections (it powers a lot of photo album systems). It's designed for web farms, Amazon EC2 clusters, and even..... Microsoft Azure."

"Performance-wise, it's just as fast as GDI (despite Bertrand's article, which he'll be updating soon). Default behavior is to favor quality over performance (since it's never more than a 40% difference even with the worst settings), but that IS adjustable."

He also tells me in email:

"All the cropping, flipping, rotation, and format conversion can be done from the URL syntax also. Everything you can do from the Managed API you can also do from the URL."

For more sophisticated use they include a separate API DLL where you can do even more, like cropping, rotating, flipping, watermarking and even format conversion. Bertrand has a chart that explores their speed issues, as they are slower than straight GDI and the Windows Imaging Component, but as I said, they are pure managed code and work in Medium Trust, which is a huge win. Their quality is also top notch.
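
Here's a rough sketch of what that might look like against the managed API. Per Nathanael's note above that the URL syntax and the managed API expose the same operations, I'm passing URL-style commands (rotate, flip) as key/value pairs the same way the quality setting was added earlier; treat the exact key names as illustrative rather than a definitive listing of the API:

using System.IO;
using ImageResizer;

class EditDemo
{
    static void Main()
    {
        var settings = new ResizeSettings {
            MaxWidth = 400,
            Format = "png"
        };
        settings.Add("rotate", "90"); // URL-style command passed as a key/value pair (illustrative)
        settings.Add("flip", "h");    // ditto: horizontal flip

        using (var inStream = File.OpenRead(@"C:\images\photo.jpg"))          // hypothetical path
        using (var outStream = File.Create(@"C:\images\photo-rotated.png"))   // hypothetical path
        {
            ImageBuilder.Current.Build(inStream, outStream, settings);
        }
    }
}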

ImageResizer also includes plugin support that you can buy. Genius, seriously, I tip my hat to these guys. The most popular and useful features are free, and crazy easy to use. If you want to do even more you buy plugins like DiskCache for huge performance wins, S3Reader or AzureReader for Amazon or Azure support, and lots of free plugins for 404 handling, DropShadows and more. So polished. Kudos to Nathanael Jones and team for a really nice use of ASP.NET, .NET, NuGet and a clever open source library with a plugin model for profit.



Embrace Authorship - The importance of rel=me and rel=author on your content's SEO and Google

October 28, '11 Comments [29] Posted in Blogging | Musings

There's a lot of garbage out there on the internet. I know, I've been writing some of it on this blog for almost 10 years. ;) One way to let Google and Friends know that you are a real person and are really the writer of something is to use microformats like the authorship markup rel="author" to mark up your content.

When you write a blog post, make sure that the rel="author" attribute is on the page with a link back to your "About Me" page or to your Google+ profile. The easiest way is to just include a link like this:

by <a rel="author" href="http://profiles.google.com/hanselman.scott?rel=author" title="Scott Hanselman is on Google+">Scott Hanselman</a>

But just linking to a profile isn't proof enough that you wrote something. You have to close the loop by linking back to your site from your profile, indicating that the page is about you. This will end up looking like this, with a rel="me" attribute:

<a href="https://www.hanselman.com/blog/" rel="me" title="Scott Hanselman">Scott Hanselman</a>

While this isn't a perfect way to guarantee to Google that you actually authored and own some content, it's a good start. Presumably if Google trusts your Profile and your website there is an implied chain of trust. If some spammer decides to programmatically steal your entire site, or even just suck it down with RSS and reblog it, it's now possible for Google to downrank those splogs (spam blogs) or delist them, while simultaneously assigning a higher Page Rank - or Author Rank - to your site.

Alternatively you can have an "author page" or About Me page on your site within the same domain and use rel="author" to point to it. You then use rel="me" to mark up links that all point to sites that represent the same person. If you are using rel="author" to point to an About Me page, that page should then include a link with rel="me" that points to your Google Profile.

If this seems confusing, you can use the Rich Snippets Testing Tool to test out your pages and see how they might show up if Google decides to trust you.

[Screenshot: My site as seen by the Google Rich Snippets Testing Tool]

The most important part of the Rich Snippets Testing Tool output is the Extracted Author info. Does Google successfully extract that you are an author and show those links without errors or warnings?

My rel="author" markup is error-free

At this point, you know only that Google doesn't think you suck but you have no idea if they will actually use the data. This appears to be where magic pixie dust comes in. You essentially wait a week or two and if it works, when you start Googling for your articles they will start showing up like this in search results:

[Screenshot: Googling for Scott Hanselman]

Or a specific article, for example:

[Screenshot: Googling for Windows 8 Scott Hanselman]

Note that Google shows that I'm in some Google+ circles and that there are 31 comments on a G+ post about this blog post.

Aside: This starts to seem a little unbalanced to me, as Google could have looked at my RSS feed or RSS Comments Feed and determined how many actual comments there are on that post on my site. Or, I could include microformat metadata on comments to indicate that they are comments vs. original content. I want the discussion to happen on my blog, not on Google+. Or maybe I want the conversation to happen on Disqus, or on Facebook. It's too bad that Google doesn't support a microformat like rel="comments" (I made that up) so that I might take control of the URL where comments should be left. Maybe I want Twitter or Facebook profiles to be used with rel="author." With the addition of Google+ and the "convenience" of using Google+ for rel="author" and the automatic retrieval of comment metadata, again from Google+, the open, markup-based Google the search engine plus Google+ the social network becomes a walled garden without choice, like Facebook.

Commentary on openness put aside, the usefulness of rel="author" in the context of Google users and from the perspective of the content author is obvious.

  • Search results that list pages written by actual humans alongside their smiling faces will be more likely to be clicked on.
  • Folks can +1 results directly from the results AND add you, the author, to their Google+ circles.

Of course, I'm not sure what I think about searching Google for the word "phony" and finding my face show up as the result. ;)

[Screenshot: What a phony]

The task for you, Dear Reader, is to go forth and implement rel="author" for your blogs and content.


I know apps

October 26, '11 Comments [49] Posted in Musings

A very good friend of mine - not a programmer, but a very technical IT professional - sent me their resume to review today, and I noticed how the top part of the resume contained a lot of applications, technologies, keywords and acronyms.

At what point do we as a subculture need to stop doing this?

It's so ironic that the most technical amongst us without jobs are asked to create a resume to be consumed by the least technical so that they might facilitate an introduction to the very technical people with jobs to give. Some larger companies *cough*Nike*cough* are rumored to use high speed scanners and OCR to hunt for keywords and assign a weight value to a resume. This just results in us padding our resumes with every TLA (three letter acronym) we've ever encountered.

And why are we still listing Word and Excel? Has anyone missed out on an opportunity or lost a job when they forgot to add Microsoft Office? At what point in an industry or a level of experience does it become compulsory to know these tools?

Aside: Ever get a resume in Microsoft Word format then press the little Paragraph Mark toolbar button that shows tabs and spaces as characters? Not to sound too judgey or anything, but if you really want to know if someone knows Word or not, explore some of the insane feats that the uninitiated can do with a few thousand ill-conceived tabs or spaces.

I am less interested in whether you know Word or Excel and more interested if you know, for example, about iCal files. Could you subscribe to an iCal feed in a calendaring app? (Any calendaring app, to be clear) Could you write a program that creates a feed like this? Do you understand structured data, the many ways to store it and the many ways to move it from place to place?
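
To make that concrete, here's a minimal sketch of the idea: hand-rolling a tiny .ics feed with a single event. It's illustrative only; a real feed would need proper text escaping, time zones and recurrence handling:

using System;
using System.IO;
using System.Text;

class ICalDemo
{
    static string Stamp(DateTime utc)
    {
        // RFC 5545 UTC timestamp, e.g. 20111026T170000Z
        return utc.ToString("yyyyMMdd'T'HHmmss'Z'");
    }

    static void Main()
    {
        DateTime start = DateTime.UtcNow.AddDays(1);

        // iCalendar wants CRLF line endings.
        var sb = new StringBuilder();
        sb.Append("BEGIN:VCALENDAR\r\n");
        sb.Append("VERSION:2.0\r\n");
        sb.Append("PRODID:-//Example//Demo Feed//EN\r\n");        // illustrative product id
        sb.Append("BEGIN:VEVENT\r\n");
        sb.Append("UID:" + Guid.NewGuid() + "@example.com\r\n");  // hypothetical domain
        sb.Append("DTSTAMP:" + Stamp(DateTime.UtcNow) + "\r\n");
        sb.Append("DTSTART:" + Stamp(start) + "\r\n");
        sb.Append("DTEND:" + Stamp(start.AddHours(1)) + "\r\n");
        sb.Append("SUMMARY:Coffee with a hiring manager\r\n");
        sb.Append("END:VEVENT\r\n");
        sb.Append("END:VCALENDAR\r\n");

        // Serve this text as text/calendar and any calendaring app can subscribe to it.
        File.WriteAllText("demo.ics", sb.ToString());
    }
}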

I am less interested in the fact you have "Mozilla" on your list of Apps you're an "expert" at, and more interested in your understanding of HTTP and what certain headers do, how caching works and how mime-types enable browsers to launch apps. Do you know why bookmarklets are interesting? Why Greasemonkey is useful?
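
As a hedged sketch of the sort of thing I mean, here are a few lines that fetch a page (a made-up URL) and print the headers that drive mime-type handling and caching:

using System;
using System.Net;

class HeaderPeek
{
    static void Main()
    {
        // Hypothetical URL; any public page will do.
        var request = (HttpWebRequest)WebRequest.Create("http://example.com/");

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            // The mime type tells the browser what to do with the body.
            Console.WriteLine("Content-Type:  " + response.ContentType);

            // Caching is negotiated through headers like these.
            Console.WriteLine("Cache-Control: " + response.Headers["Cache-Control"]);
            Console.WriteLine("Expires:       " + response.Headers["Expires"]);
            Console.WriteLine("ETag:          " + response.Headers["ETag"]);
        }
    }
}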

Are you a user? Are you a Real User? Do you actually use the hell out of your applications, your phones, your web sites, the Web itself?

I am less interested in your experience with Basecamp and more interested in how you implemented Agile at your last job. Did you use Scrum or Scrummerfall? What worked and what didn't and more importantly, do you know why?

I blogged years ago how funny it was that folks work for five-plus years to get the privilege of putting ",PhD" at the end of their names, but computer people take a 45 minute test and tack on ",A+,MCSD,MCP,MCSE+I" without a thought.

Why don't we include projects rather than companies on our resumes? How about a little post-mortem with some details about what worked and what didn't and why? Do you have 20 years experience or do you have the same 1 year of experience twenty times?

Do you know how to make text dance? There's a big difference between the XMLs, CSVs, vCards and open text formats of the world and the PSDs and proprietary binary formats of the world. Other than Adobe products that take years to master, I am going to assume you know how to use an application. I'm assuming you've seen a mouse, get the concept behind hotkeys and you can type, although perhaps that's too much to assume.

If you're truly able to make Excel dance or you spent a summer writing a TCP driver, by all means, tell us. If you wrote your own SQL lexer, you're a special person. But instead of a list of applications you know, tell a story about your successes and failures and the applications and technologies that played starring roles in your experiences.

I like what StackOverflow Careers is doing in this space in that a listing emphasizes not just what you've done, but also what you've written and what you've read. The list of technologies only happens in the context of projects you've worked on. Here's an invite if you want to try it. This is not an ad link or an affiliate code. They have advertised on my podcast once before, but I mention them here because their resumes present a more well-rounded picture of an engineer. My profile is at http://careers.stackoverflow.com/shanselman.

Personally, I think on my next resume I'll just put this:

Scott Hanselman
Programmer.
I know apps.


Reading more than ever: An analysis of four lazy years with an Amazon Kindle and no dead trees

October 24, '11 Comments [71] Posted in Musings

It's nice to finally see ebooks going mainstream. By mainstream, I mean that my Mom bought a Kindle Fire with minimal angst and gnashing of teeth. I've been reading ebooks since my first Apple Newton, and I coveted the Sony Reader nearly 5 years ago. I finally bought a Kindle for my birthday in 2008, so come January I'll have had a Kindle as a part of my life for four years.

What's more, I haven't purchased a physical book in that time. In fact, I actually spend more time in my local libraries now than I did before the Kindle. My library is the place where I get dead tree books. Ironically, my local library just announced their ebook lending program.

Interesting Historical Aside: I did architectural consulting with netLibrary around 1999. They were a totally-ahead-of-their-time e-book company. They scanned thousands of books in anticipation of the coming ebook revolution. It's tragic they were at least a half decade if not a full decade ahead of their time. They are gone now.

According to the Amazon Kindle Social Network (yes, they have their own social network! You and I can connect and you can stalk my books as well) I've purchased 141 books since I got my Kindle and read 90 of them. It doesn't see books that you copied to your Kindle from free websites, so that's maybe another dozen or so. I'm in the middle of reading 5 books right now. 

The Kindle keeps my current page synced across whatever other devices I choose to read on. A few pages on the iPhone, a few on the PC, then back to the Kindle.

I read before, but never as much as after I got a Kindle. Why the change? Laziness. It's effortless to get books. The Kindle is literally a one-click link between my wallet and Jeff Bezos' bank account. I see a book and click, I'm reading it. You might think that gets expensive, but for every $8.99 book I get (which is not a lot) there are a lot of really good books I grab for under $5, and some for free or 99 cents.

Remember all the blog posts about how Kindle would never work because it was $359? Well, three years later and the cheap Kindle is $79. That's less than two copies of The Walking Dead Compendium.

It's the reading, stupid.

A recent study showed that it doesn't matter if you read from paper or from an electronic screen. The words make it into your head all the same. Here's a passage from the Mashable article, emphasis mine.

The study was conducted after readers in Germany became skeptical about reading from electronic devices like ereaders and tablet PCs compared to traditional printed books.

Participants in the study read a variety of texts with different levels of understanding on an Amazon Kindle 3, Apple iPad and in print. Their reading behaviors and brain activity were examined using an EEG machine and eye tracking tools.

The study proved that reading from an electronic device instead of print has no negative effects, contradicting the misconception from German readers.

Everywhere I go I take my Kindle with me. And everywhere I go I end up meeting someone who says what all non-Kindle owners say: "I just like the tactile experience...the feel of the paper." At this point I ask them if they've ever read a book on a Kindle or used an ereader. Most never have. They're just down on the idea of change. I get that, and I too mourn the end of the physical book. But at the same time I, for one, welcome our new e-ink overlords.

I issue them the same challenge I'll issue you, Dear Reader. Read one book on a Kindle or small e-ink device. Just one, cover to virtual cover. I'm confident that most folks will never go back. I realize there are advantages to reading from paper just as there are advantages to using photographic film over digital cameras, but they are few and they don't outweigh the overwhelming advantages of a small e-reader.

Some folks swear by iPad or other illuminated LCD reading, but I believe those screens cause eye fatigue. They have a lower resolution than e-ink, they are hard to read - if not impossible - in the sun or outside, they have limited battery life and they just don't look like paper. Each of these reasons is reason enough to go with e-ink. I went overseas for a week and didn't even take my Kindle charger. No need to. It lasts weeks on a full charge; that's almost long enough to pretend it's not an electronic device. Even the 3G wireless worked seamlessly all over Europe without me doing anything special.

I feel awful about it, but I can't count how many times I've been at a small airport bookseller, browsed, looked at a book, then purchased it on my Kindle while standing right there. If only there were a way to give that book seller some money for the referral.

My 4 years with a Kindle have got me reading more than ever because the Kindle has:

  • made it easy to get books
  • made it easy to carry my whole collection
  • made it easy to read even large books with thousands of pages (I'm looking at you, Neal Stephenson)
  • made it easy to finish a book in a series and immediately start the next book.

These are the reasons I've read more in the last 4 years with my Kindle than in previous years. It's removed what little friction physical books had imposed while seamlessly fitting into my life.

Nicholas Negroponte said last year that physical books would be dead within 5 years, and Kindle ebooks surpassed the sales of physical books at Amazon last July. In fact, it's starting to be a multiplier, where for every 100 physical books sold there are perhaps 200 ebooks sold.

Everyone believes that the DTB (dead tree book) is on its last legs. How long do you think it'll take, Dear Reader? 5 years? 10? Will it be Amazon's centralization and DRM (digital rights management) that will hold it back or do you think that Amazon will eventually do what iTunes did, turning off DRM in favor of MP3s and low prices?

How long until physical books die, then are brought back from the dead by future-hipsters just like vinyl records? (Bet you didn't know vinyl sales were up 40%, did you?)

I don't know, but I know physical books will die. Why will physical books die? One reason and one reason only.

Because it's cheaper to move electrons than molecules.

Ultimately that's why ebooks will win.


About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.


Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.