Scott Hanselman

If malware authors ever learn how to spell we're all screwed - the coming HTML5 malware apocalypse

June 29, '12 Comments [84] Posted in HTML5 | Musings
Sponsored By

Forgive the lousy screenshot and transparency in the title bar, but I just got this fake virus popup while searching for an image. I admit for a single moment my heart jumped.

A very scary fake virus popup

Then, after a few seconds, the techie in me kicked in (note that all of these observations happened at once in my head, in no particular order):

  • The dialog is perfectly centered in the browser. I'm not sure why this was my #1 tipoff, but for me, it was the first thing I noticed.
  • This "popup" was the result of a browser navigation. If it were legit I'd expect it to appear a little more asynchronously.
  • The misspelled word "migth" in the popup.
  • The fonts in the column headers are anti-aliased with one technique while the rest of the text doesn't use ClearType, even though my machine does.
  • Poorly phrased English: "You need to clean your computer immediately to prevent the system crash."
  • There's no option other than "Clean computer." No ignore, repair, quarantine.
  • The word "computer" at the end of the first line goes too far to the right of the grid's right margin. It should have wrapped to the next line. Yes, I'm a UI nerd.
  • Their Aero theme color is GRAY and mine is BLUE.
  • Ctrl-Scroll ZOOMs the image. ;)
  • The URL is obvious nonsense.
  • Adware.Win32.Fraud? Seriously?

It's scary just to look at, floating there in your webpage, isn't it?

A scary fake virus popup

How is my Mom supposed to defend against this? Whether you're on Windows or Mac (or a tablet), the bad guys are out there, and one day they will finally learn English and put a little work and attention to detail into these things.

One day these things won't be "selectable" to prove to us that they are HTML:

I selected the virus to make it invert its colors to prove it's fake

As we enable HTML5 with local storage, geolocation, possibly native code, and other features, the bad guys will start doing the same with their malware. If you can write Doom in HTML5 there's nothing (except the skill and the will) to keep you from writing adware/scareware/malware in JavaScript. Not just the standard CSRF/XSS type JS - which is bad, I know, I used to be in banking - but sophisticated duplicates of trusted software accurately recreated entirely in HTML5/CSS3 and today's modern JS.
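To make the point concrete, here's a hypothetical sketch (not taken from any real scareware, every name is made up for illustration) of how cheaply the giveaways listed above disappear. A few lines of JavaScript produce a perfectly centered, unselectable "dialog":

```javascript
// Hypothetical sketch: build the markup a scareware page might inject.
function fakeDialogHtml(title, body) {
  const css =
    "position:fixed;top:50%;left:50%;transform:translate(-50%,-50%);" + // perfectly centered
    "user-select:none;";                                                // defeats the select-to-invert giveaway
  return '<div style="' + css + '">' +
         "<b>" + title + "</b><p>" + body + "</p>" +
         "</div>";
}

console.log(fakeDialogHtml("Windows Security",
  "You need to clean your computer immediately."));
```

Fixing the misspellings and matching the victim's Aero theme color is no harder, which is exactly the worry.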

Google Offline Mail and extensions run in the background in my browser now, what's to say some future malware won't? Should we digitally sign HTML5 apps? Do more Extended Validation SSL Certificates? How do you defend against this?

What do you think, Dear Reader?

About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.

facebook twitter subscribe
About   Newsletter
Sponsored By
Hosting By
Dedicated Windows Server Hosting by SherWeb

NuGet 2.0 (.NET Package Manager) released - GO UPGRADE NOW and here's why

June 27, '12 Comments [30] Posted in NuGet

Before we get started, take a second, head over, and click Install NuGet. Actually, just do it from here. I'll wait.

Install NuGet

It's a 2.5 meg VSIX file and will take just a minute to install. It'll work on Visual Studio 2010 SP1 as well as Visual Studio 2012 RC. If you have both installed at the same time, NuGet will prompt you to install in all of them if you like.

Weird Issues you might hit

If you are an early adopter and are testing Visual Studio 2012 RC, first, thanks. If you see a dialog box on VS2012RC that says "Configuring Extensions" and seems to sit forever, we have a bug that was fixed for RTM that is starting to surface more frequently with the recent NuGet update. The bug is a race condition that occurs intermittently when a user updates his extensions in the RC release.

Workaround for hung at "Configuring Extensions" on Visual Studio 2012 RC

  1. Close all instances of VS
  2. Examine the contents of HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\11.0\ExtensionManager\PendingDeletions
  3. Delete the folders listed for each entry
  4. Delete HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\11.0\ExtensionManager\PendingDeletions

99.9% of upgrades or installs will work just fine. If you see any other issues, run Visual Studio once as Administrator, go to Tools | Extensions Manager and uninstall NuGet, then install again.

The most common installation issue is a certificate mismatch with Visual Studio 2010 SP1. Unfortunately this one isn't NuGet's bug but you can get a Visual Studio hotfix here to fix it once and for all - This should be in Windows Update one day, I hope. Either installing that hotfix or uninstalling/reinstalling as admin will fix the issue for NuGet.

Fixed Issues and New Features

Here's a query to the complete list of the 80 issues that were fixed in Version 2.0 of NuGet. NuGet has seen over 14 MILLION package downloads and there are over 6,000 unique packages in the gallery. You can see the updated stats anytime.

The best fix, and the one I have personally been pushing the most on, was this issue: NuGet PowerShell Tab Completion is SLOW over a slow connection. If you are on a slow connection (I'm talking to you, New Zealand) or just appreciate speed, this is reason enough to upgrade NuGet.

Before, typing Install-Package jQuery.[TAB] would cause an HTTP call to go to OData that would return more data than was required. I'm always advocating for the folks who are not in the US on 35 megabit connections - often because I'm over there too, sucking data through 3G.

With NuGet 2.0, typing Install-Package jQuery.[TAB] will make a quick JSON call like this:

GET /api/v2/package-ids?partialId=jQuery. HTTP/1.0

Which will return, in this case, 603 bytes of JSON, as it should. It's fast.
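For the curious, the shape of that query is easy to poke at yourself. A hedged sketch in Node - only the request path comes from the GET line above; the helper name is mine:

```javascript
// Build the NuGet 2.0 tab-completion query path shown above.
function packageIdsPath(partialId) {
  return "/api/v2/package-ids?partialId=" + encodeURIComponent(partialId);
}

console.log(packageIdsPath("jQuery."));
// → /api/v2/package-ids?partialId=jQuery.
```

Returning a flat list of matching IDs instead of full OData entities is why the payload drops to a few hundred bytes.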


And you'll get nice Intellisense for packages.

Intellisense in NuGet 2.0

New Features

Not only is NuGet 2.0 faster, but there are some new features like dependency grouping by target framework. You can vary your dependencies such that one package can serve both .NET 2 and .NET 4, but each target framework gets a different set of packages. Here's an example:

<dependencies>
  <group>
    <dependency id="RouteMagic" version="1.1.0" />
  </group>

  <group targetFramework="net40">
    <dependency id="jQuery" />
    <dependency id="WebActivator" />
  </group>

  <group targetFramework="sl30">
  </group>
</dependencies>

From the docs:

Note that a group can contain zero dependencies. In the example above, if the package is installed into a project that targets Silverlight 3.0 or later, no dependencies will be installed. If the package is installed into a project that targets .NET 4.0 or later, two dependencies, jQuery and WebActivator, will be installed. If the package is installed into a project that targets an early version of these 2 frameworks, or any other framework, RouteMagic 1.1.0 will be installed. There is no inheritance between groups. If a project's target framework matches the targetFramework attribute of a group, only the dependencies within that group will be installed.
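The matching rule from the docs reads naturally as code. Here's a simplified, hypothetical sketch of it (real NuGet also honors "or later" framework compatibility, which this toy version ignores by doing exact matches):

```javascript
// Toy version of dependency-group selection: pick the group whose
// targetFramework matches the project, else fall back to the ungrouped one.
function dependenciesFor(groups, targetFramework) {
  const match = groups.find(g => g.targetFramework === targetFramework);
  if (match) return match.dependencies;
  const fallback = groups.find(g => !g.targetFramework);
  return fallback ? fallback.dependencies : [];
}

const groups = [
  { dependencies: ["RouteMagic"] },                                      // no targetFramework
  { targetFramework: "net40", dependencies: ["jQuery", "WebActivator"] },
  { targetFramework: "sl30", dependencies: [] },                         // zero dependencies is legal
];

console.log(dependenciesFor(groups, "sl30"));  // → []
console.log(dependenciesFor(groups, "net20")); // → ["RouteMagic"]
```

Note there is no inheritance: the sl30 project gets nothing, not the ungrouped RouteMagic dependency.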

Even better, you can now group your PowerShell scripts as well as your content files by target framework. Specific scripts can run depending on your versions and specific content files can be included. This uses the same directory hierarchy you are already using for dependencies only now it works for /content and /tools as well.

Turn on "Allow NuGet to download missing packages during build" to make your life easier

Finally, do be aware that you have to explicitly give the OK to "restore packages" at least once, in order to enable NuGet to fetch a bunch of dependencies for you. Often you'll get a large project that you want to compile, and perhaps that project includes a packages.config but not the packages themselves (you don't want to check your binary packages into source control, for example), so NuGet will restore missing packages when it's time to build. You only need to do this once to satisfy the lawyers.

Turn on Package Consent in Package Manager | General

Sneak Peek of Future UI Features (thanks Mads!)

You know the new Ctrl-Q "search all commands" feature in Visual Studio 2012? I've seen a daily build of a possible improvement to NuGet on Mads' computer that not only searched Visual Studio local commands, but also the Visual Studio Gallery AND NuGet Packages. Leave a comment if you like this feature and I'll put pressure on Mads. Or, you will. ;)

jQuery searched for in the main Visual Studio CTRL-Q Quick search box

The Great Package Rename

We've changed the names of a BUNCH of packages (and forwarded the old names) so there's a little more logic to the Microsoft ones, at least. For example, here's the autocomplete for Microsoft.AspNet...

Microsoft.AspNet. intellisense

With this RC release, all of the NuGet packages involved in the ASP.NET products were renamed. Internally we called it the "Big Package Rename of 2012." Here is a mapping of old package names to new package names. In this list, the old names refer to prior versions of the products, including the Beta releases that shipped with VS 11 Beta.

Old package name              → New package name
AspNetMvc                     → Microsoft.AspNet.Mvc
AspNetRazor.Core              → Microsoft.AspNet.Razor
AspNetWebApi                  → Microsoft.AspNet.WebApi
AspNetWebApi.Core             → Microsoft.AspNet.WebApi.Core
AspNetWebApi.SelfHost         → Microsoft.AspNet.WebApi.SelfHost
AspNetWebPages.Core           → Microsoft.AspNet.WebPages
AspNetWebPages                → Microsoft.AspNet.WebPages.Administration
jQuery.Ajax.Unobtrusive       → Microsoft.jQuery.Unobtrusive.Ajax
jQuery.Validation.Unobtrusive → Microsoft.jQuery.Unobtrusive.Validation
Microsoft.Web.Optimization    → Microsoft.AspNet.Web.Optimization
SqlServerCompact              → Microsoft.SqlServer.Compact
System.Net.Http               → Microsoft.Net.Http
System.Net.Http.Formatting    → Microsoft.AspNet.WebApi.Client
System.Web.Providers          → Microsoft.AspNet.Providers
System.Web.Providers.Core     → Microsoft.AspNet.Providers.Core
System.Web.Providers.LocalDb  → Microsoft.AspNet.Providers.LocalDb
System.Web.Providers.SqlCE    → Microsoft.AspNet.Providers.SqlCE

We are hoping that other companies (and other teams inside Microsoft) will follow the same standard naming structure.

Hosting your own Feeds (and other NuGet sightings in the community)

If you haven't noticed there's a bunch of cool NuGet-specific sites and applications showing up in the wild.

  • MyGet - Create and host your own NuGet feed in the cloud. Host for your company, add security, privileges and more. Great for companies as well as automated build systems.
  • SymbolSource - "SymbolSource is an integrated solution for hosting and browsing code releases - specifically, but not only, NuGet and OpenWrap packages. Its true power, however, comes from implementing the srcsrv protocol, which allows Visual Studio and other compatible software to download on-demand symbol (PDB) and source files from SymbolSource."
  • TeamCity - TeamCity 7 supports packing and publishing of NuGet packages via a NuGet plugin (thanks Eugene Petrenko!) that is now installed by default!
  • Sonatype Nexus - Supports the Java Maven repository and now NuGet.
    • Allows the customer to have a local copy of the entire NuGet repository
    • Allows the customer to select which software license types they support and only shows them matching NuGet packages
    • Allows the customer to verify that the NuGet packages they are consuming do not contain code copied from other projects
  • NuGet Server written in Java - Eugene created a small NuGet server that you can run under Linux and Java 1.6. It's all part of the larger TeamCity NuGet support and on GitHub as well as NuGet itself (inception!)
  • WebMatrix 2 - WebMatrix not only supports NuGet but it includes the gallery as a toolbar button and uses NuGet to install additional functionality like iPhone and iPad simulators!
  • NuGetFeed - Create a personalized feed of packages you care about and never miss another update!
  • ProGet - A NuGet repository for the enterprise; includes LDAP-based permissions and scoped feeds for multiple teams. Host private NuGet packages, as well as cache and filter other repositories. Free edition available.


Back To Basics: You aren't smarter than the compiler. (plus fun with Microbenchmarks)

June 26, '12 Comments [36] Posted in Back to Basics

Microbenchmarks are evil. Ya, I said it. Folks spend hours in tight loops measuring things, trying to find out the "best way" to do something, and forget that while they're chasing 5ms between two techniques they've missed the 300ms Database Call or the looming N+1 selects issue that has their ORM quietly making even more database calls.

My friend Sam Saffron says we should "take a global approach to optimizations." Sam cautions us to avoid trying to be too clever.

// You think you're slick. You're not.
// faster than .Count()? Stop being clever.
var count = (stuff as ICollection<int>).Count;

All that said, let's argue microbenchmarks, shall we? ;)

I did a blog post a few months back called "Back to Basics: Moving beyond for, if and switch" and, as with all blog posts where one makes a few declarative statements (or shows ANY code at all, for that matter), it inspired some spirited comments. The best of them was from Chris Rigter in defense of LINQ.

I started the post by showing this little bit of counting code:

var biggerThan10 = new List<int>();
for (int i = 0; i < array.Length; i++)
{
    if (array[i] > 10)
        biggerThan10.Add(array[i]);
}

and then changed it into LINQ, which can be either of these one-liners:

var a = from x in array where x > 10 select x; 
var b = array.Where(x => x > 10);

and a few questions came up like this one from Teusje:

"does rewriting code to one line make your code faster or slower or is it not worth talking about these nanoseconds?"

The short answer is, measure it. The longer answer is measure it yourself. You have the power to profile your code. If you don't know what's happening, profile it. There's some interesting discussion on benchmarking small code samples over on this StackOverflow question.
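The "measure it yourself" exercise ports to any language. Here's a hypothetical JavaScript version of such a harness, just to show how little code it takes (the timings it prints will vary wildly by machine; the point is that you measured):

```javascript
// Minimal measure-it-yourself harness: time a plain loop against filter().
function timeIt(label, fn) {
  const start = process.hrtime.bigint();
  const result = fn();
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(label + ": " + ms.toFixed(1) + "ms");
  return result;
}

const array = Array.from({ length: 1000000 }, (_, i) => i);

const loopCount = timeIt("for loop", () => {
  let c = 0;
  for (let i = 0; i < array.length; i++) {
    if (array[i] > 10) c++;
  }
  return c;
});

const filterCount = timeIt("filter", () => array.filter(x => x > 10).length);

// Sanity check: both techniques must agree before the timings mean anything.
console.log(loopCount === filterCount); // → true
```

Note the sanity check at the end; a microbenchmark that compares two techniques producing different answers is measuring nothing.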

Now, with all kinds of code like this, folks go and do microbenchmarks. This usually means doing something trivial a million times in a tight loop. That's lots of fun and I'm going to do JUST that very thing right now with Chris's good work, but it's important to remember that your code is usually NOT doing something trivial a million times in a tight loop. Unless it is.

Knaģis says:

"Unfortunately LINQ has now created a whole generation of coders who completely ignores any perception of writing performant code. for/if are compiled into nice machine code, whereas .Where() creates instances of enumerator class and then iterates through that instance using MoveNext method...Please, please do not advocate for using LINQ to produce shorter, nicer to read etc. code unless it is accompanied by warning that it affects performance"

I think that LINQ above could probably be replaced with "datagrids" or "pants" or "google" or any number of conveniences but I get the point. Some code is shown in the comments where LINQ appears to be 10x slower. I can't reproduce his result.

Let's take Chris's comment and deconstruct it. First, taking an enumerable Range as an array and spinning through it.

var enumerable = Enumerable.Range(0, 9999999);
var sw = new Stopwatch();
int c = 0;

// approach 1

var array = enumerable.ToArray();
for (int i = 0; i < array.Length; i++)
{
    if (array[i] > 10)
        c++;
}

The "ToArray()" part takes 123ms and the for loop takes 9ms on my system. Arrays are super fast.

Starting from the enumerable itself (not the array!) we can try the Count() one liner:

// approach 2
c = enumerable.Count(x => x > 10);

It takes 86ms.

I can try it easily in Parallel over 12 processors but it's not a large enough sample nor is it doing enough work to justify the overhead.

// approach 3
Console.WriteLine("Enumerable.AsParallel() (12 procs)");
c = enumerable.AsParallel().Where(x => x > 10).Count();

It adds overhead and takes 129ms. However, you see how easy it was to try a naïve parallel loop in this case. Now you know how to try it (and measure it!) in your own tests.

Next, let's do something stupid and tell LINQ that everything is an object so we are forced to do a bunch of extra work. You'd be surprised (or maybe you wouldn't) how often you find code like this in production. This is an example of coercing types back and forth and as you can see, you'll pay the price if you're not paying attention. It always seems like a good idea at the time, doesn't it?

//Approach 4 - Type Checking?
Console.WriteLine("Enumerable.OfType(object) ");
var objectEnum = enumerable.OfType<object>().Concat(new[] { "Hello" });
var objectArray = objectEnum.ToArray();
for (int i = 0; i < objectArray.Length; i++)
{
    int outVal;
    var isInt = int.TryParse(objectArray[i].ToString(), out outVal);
    if (isInt && Convert.ToInt32(objectArray[i]) > 10)
        c++;
}

That whole thing cost over 4 seconds. 4146ms in fact. Avoid conversions. Tell the compiler as much as you can up front so it can be more efficient, right?

What if we enumerate over the types with a little hint of extra information?

// approach 5
Console.WriteLine("Enumerable.OfType(int) ");
c = enumerable.OfType<int>().Count(x => x > 10);

Nope, the type check wasn't necessary in this case. It took 230ms and added overhead. What if this was parallel?

// approach 6
Console.WriteLine("Enumerable.AsParallel().OfType(int) ");
c = enumerable.AsParallel().OfType<int>().Where(x => x > 10).Count();

That's 208ms, consistently. Slightly faster, but ultimately I shouldn't be doing unnecessary work.

In this simple example of looping over something simple, my best bet turned out to be either the Array (super fast if it was an Array to start) or a simple Count() with LINQ. I measured, so I would know what was happening, but in this case the simplest thing also performed the best.

What's the moral of this story? Measure and profile and make a good judgment. Microbenchmarks are fun and ALWAYS good for an argument, but ultimately they exist only so you can know your options, try a few, and pick the one that does the least work. More often than not (not always, but usually) the compiler creators aren't idiots, and more often than not the simplest syntax will be the best one for you.

Network access, database access, unnecessary serializations, unneeded marshaling, boxing and unboxing, type coercion - these things all take up time. Avoid doing them, and when you do them, don't just know why you're doing them, but also that you are doing them.

Is it fair to say "LINQ is evil and makes things slow?" No, it's fair to say that code in general can be unintuitive if you don't know what's going on. There can be subtle side-effects whose time can get multiplied inside of a loop. This includes type checking, type conversion, boxing, threads and more.

The Rule of Scale: The less you do, the more you can do of it.


Managing the Cloud from the Command Line

June 20, '12 Comments [16] Posted in Azure | Open Source

I blogged about the Windows Azure cloud a few weeks ago. I'm digging the new stuff and trying different scenarios on Macs, PCs and Linux (I prefer Ubuntu). As a long time PowerShell and Command Line fan I'm always looking for ways to do stuff "in text mode" as well as scripting site creations and deployments.

Turns out there are a mess of ways to access Azure from the command line - more than even I thought. There's a JSON-based Web API that these tools end up talking to. You could certainly call that API directly if you wanted, but the command line tools are damn fun.

You can install the Mac Azure SDK installer to get the tools and more on a Mac, or if you install node.js on Windows or Mac or Linux you can use the Node Package Manager (npm) to install Azure tools like this:

npm install azure-cli --global

You can also use apt-get or other repository commands. After this, you can just run "azure", which gives you commands that you link together in a very intuitive way, "azure <topic> <verb> [options]", so "azure site list" or "azure vm disk create" and the like.

Azure command line format

There's even ASCII art, and who doesn't like that. ;)

Seriously, though, it's slick. Here's a sample interaction I did just now. I trimmed some boring stuff but this is starting from a fresh machine with no tools and ending with me interacting with my Windows Azure account.

scott@hanselmac:~$ npm install azure-cli
npm http GET
...bunch of GETS...
scott@hanselmac:~$ azure
info: _ _____ _ ___ ___
info: /_\ |_ / | | | _ \ __|
info: _ ___/ _ \__/ /| |_| | / _|___ _ _
info: (___ /_/ \_\/___|\___/|_|_\___| _____)
info: (_______ _ _) _ ______ _)_ _
info: (______________ _ ) (___ _ _)
info: Windows Azure: Microsoft's Cloud Platform
info: Tool version 0.6.0
...bunch of help stuff...

scott@hanselmac:~$ azure account download
info: Executing command account download
info: Launching browser to
help: Save the downloaded file, then execute the command
help: account import
info: account download command OK
scott@hanselmac:~$ cd ~
scott@hanselmac:~$ cd Downloads/
scott@hanselmac:~/Downloads$ ls
3-Month Free Trial.publishsettings
scott@hanselmac:~/Downloads$ azure account import 3-Month\ Free\ Trial.publishsettings
info: Executing command account import
info: Setting service endpoint to:
info: Setting service port to: 443
info: Found subscription: 3-Month Free Trial
info: Setting default subscription to: 3-Month Free Trial
warn: Remember to delete it now that it has been imported.
info: Account publish settings imported successfully
info: account import command OK
scott@hanselmac:~/Downloads$ azure site list
info: Executing command site list
+ Enumerating locations
+ Enumerating sites
data: Name State Host names
data: ----------------- ------- -------------------------------------------------------------
data: superawesome Running,
info: site list command OK
scott@hanselmac:~/Downloads$ azure site list --json
{
  "AdminEnabled": "true",
  "AvailabilityState": "Normal",
  "EnabledHostNames": [ ... ],
  "HostNames": [ ... ],
  "State": "Running",
  "UsageState": "Normal",
  "WebSpace": "eastuswebspace"
}

Here's how I can create and start a VM from the command line. First I'll list the available images I can start with, then I create it. I wait for it to get ready, then it's started and ready to remote (RDP, SSH, etc) into.

scott@hanselmac:~$ azure vm image list
info: Executing command vm image list
+ Fetching VM images
data: Name Category OS
data: -------------------------------------------------------------------------- --------- -------
data: CANONICAL__Canonical-Ubuntu-12-04-amd64-server-20120528.1.3-en-us-30GB.vhd Canonical Linux
data: MSFT__Windows-Server-2012-RC-June2012-en-us-30GB.vhd Microsoft Windows
data: MSFT__Sql-Server-11EVAL-11.0.2215.0-05152012-en-us-30GB.vhd Microsoft Windows
data: MSFT__Win2K8R2SP1-120514-1520-141205-01-en-us-30GB.vhd Microsoft Windows
data: OpenLogic__OpenLogic-CentOS-62-20120531-en-us-30GB.vhd OpenLogic Linux
data: SUSE__openSUSE-12-1-20120603-en-us-30GB.vhd SUSE Linux
data: SUSE__SUSE-Linux-Enterprise-Server-11SP2-20120601-en-us-30GB.vhd SUSE Linux
info: vm image list command OK
scott@hanselmac:~$ azure vm create hanselvm MSFT__Windows-Server-2012-RC-June2012-en-us-30GB.vhd scott superpassword --location "West US"
info: Executing command vm create
+ Looking up image
+ Looking up cloud service
+ Creating cloud service
+ Retrieving storage accounts
+ Creating VM
info: vm create command OK
scott@hanselmac:~$ azure vm list
info: Executing command vm list
+ Fetching VMs
data: DNS Name VM Name Status
data: --------------------- -------- ------------
data: hanselvm Provisioning
info: vm list command OK

That's the command line tool for Mac, Linux, and optionally Windows (if you install node and run "npm install azure --global"), and there are PowerShell commands for the Windows admin. It's also worth noting that you can check out all the code for these tools, as they are all open source and up on GitHub. The whole command line app is written in JavaScript, in fact.

Just as the command line version of the management tools has a very specific and comfortable noun/verb/options style, the cmdlets are very "PowerShelly" and will feel comfortable to folks who are used to PowerShell. The documentation and tools are in a Preview mode and are under ongoing development, so you'll find some holes in the documentation.

The PowerShell commands all work together and data is passed between them. Here a new Azure VM configuration is created while the VM image name is pulled from the image list, then a provisioning config object is passed into New-AzureVM.

C:\PS> New-AzureVMConfig -Name "MySUSEVM2" -InstanceSize ExtraSmall -ImageName (Get-AzureVMImage)[7].ImageName `
    | Add-AzureProvisioningConfig -Linux -LinuxUser $lxUser -Password $adminPassword `
    | New-AzureVM

Next, I want to figure out how I can spin up a whole farm of websites from the command line, deploy an app to the new web farm, configure the farm for traffic, then load test it hard, all from the command line. Such fun!

Sponsor: I want to thank the folks at DevExpress for sponsoring this week's feed. Check out their DXperience tools, they are amazing. You can create web-based iPad apps with ASP.NET and Web Forms. I was personally genuinely impressed. Introducing DXperience 12.1 by DevExpress - The technology landscape is changing and new platforms are emerging. New tools by DevExpress deliver next-generation user experiences on the desktop, on the Web or across a broad array of Touch-enabled mobile devices.


The Sad State of Diabetes Technology in 2012

June 17, '12 Comments [118] Posted in Diabetes

I've been diabetic for almost two decades. It's tiring, let me tell you. Here's a video of my routine when I change my insulin pump and continuous meter. I'm not looking for pity, sadness or suggestions for herbs and spices that might help me out. I'd just like a day off. Just a single day out of the last 7000 or the next, I'd like to have a single piece of pie and not chase my blood sugar for hours.

Every time I visit the doctor (I do every 3 months) and every time I talk to someone in industry (I do a few times a year) I'm told that there will be a breakthrough "in the next 5 years." I've been hearing that line - "it's coming soon" - for twenty.

I used to wait a minute for a finger stick test result. Now I wait 5 seconds, but we still have blood sugar strips with ±20% accuracy. That means I can check my sugar via finger stick twice and get a number I'd take action on along with one I wouldn't. Blood sugar strip accuracy is appalling and a dirty little secret in the diabetes community.
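To put ±20% in perspective, a single "true" value maps to a wide band of possible readings, so two back-to-back sticks can legitimately straddle a treatment threshold. A quick hypothetical calculation (the 180 mg/dL example is mine, not from any strip spec):

```javascript
// The band of readings a ±20% strip is allowed to report for one true value.
function accuracyBand(trueValue, tolerance) {
  return [trueValue * (1 - tolerance), trueValue * (1 + tolerance)];
}

// A true blood sugar of 180 mg/dL may read anywhere from 144 to 216:
// one end of that band I'd treat, the other I wouldn't.
console.log(accuracyBand(180, 0.2)); // → [144, 216]
```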

I started with insulin that would reach its peak strength after about 4 hours. Today it takes about an hour. Awesome, but that's not fast enough when a meal can take me to the stratosphere in minutes.

We are hurting here and we can't all wait another five years. Diabetes is the leading cause of blindness, leading cause of kidney failure and leading cause of amputation.

I wrote the first Glucose Management system for the PalmPilot in 1998, called GlucoPilot, and provided in-depth analysis on the go for the first time. The first thing that struck me was that the PalmPilot and the Blood Sugar Meter were the same size. Why did I need two devices with batteries, screens, buttons and a CPU? Why so many devices?

In 2001 I went on a trip across the country with my wife, an insulin pump and 8 PDAs (personal digital assistants, the "iPhones" of the time) and tried to manage my diabetes using all the latest wireless technology. Here's what I had to say 11 years ago:

With Bluetooth coming, why couldn't my [PalmPilot] monitor my newly implanted smart-pump? GlucoPilot could generate charts and graphics from information transmitted wirelessly from the pump. For that matter, the pump, implanted in my abdomen, could constantly transmit information to Bluetooth-enabled devices that surround me. The pump might use my cell phone to call in its data into a central server when I'm not using the phone. If I wander near my home computer, the pump or Visor might take the opportunity to upload its data. During a visit to the doctor, Bluetooth's 30-meter range could provide the doctor with my minute-by-minute medical history as I sat in the waiting room.

Back in 1998, when I was writing and marketing GlucoPilot, I was using a custom cable that connected directly from my PalmPilot to the glucose meter and downloaded my historical glucose data. Fast forward to 2012, and what new technological innovation do we have?

Yes, that's a custom cable to plug in to my PDA. Yes, I'm a frustrated diabetic. This is a 15-year-old solution with no backing standards, no standard interchange format, no central cloud to store the data in. It's vendor lock-in on both sides.

Kudos to the Glooko guys for fighting the good fight and shame on the blood sugar meter manufacturers for making their job hard.


Fifteen years ago we talked about data standards and interoperability. I was even on a standards board for a while to try and pressure the industry to standardize on data interchange formats. I have personally written multiple blood sugar meter data importers from the very simple (CSV) to the very complex (binary packed and purposely obscured to prevent 3rd party data dumps) and I can tell you that the blood sugar meter manufacturers are not interested in making it easy to move our data around. This is a billion dollar industry.
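The "very simple (CSV)" end of that importer spectrum looks roughly like this. A hypothetical sketch, since every vendor's export format differs:

```javascript
// Hypothetical CSV meter export: "timestamp,mg/dL" per line after a header row.
function parseReadings(csv) {
  return csv
    .trim()
    .split("\n")
    .slice(1) // skip the header row
    .map(line => {
      const [when, mgdl] = line.split(",");
      return { when, mgdl: Number(mgdl) };
    });
}

const sample = "time,mgdl\n2012-06-17T08:00,104\n2012-06-17T12:00,181";
console.log(parseReadings(sample).length); // → 2
```

The "very complex" end (binary packed, deliberately obscured) is exactly why a standard interchange format matters: every importer like this has to be reverse-engineered per vendor.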

Today I read an article about the iBGStar (a forgettable name) glucose meter that plugs directly into an iPhone 30 pin port. The article came up on Hacker News and one of the designers said this in a comment:

I'm one of the designers of the iBGStar and we considered Bluetooth. We actually have another FDA cleared product that uses Bluetooth, but cost, battery life, and a bunch of technical issues led us to favor the 30 pin.

This is hugely disappointing, especially since Bluetooth 4.0 is said to offer battery life as long as 10 years on some products. Given all the new iPhones have Bluetooth 4.0 just waiting for devices to connect to, you'd think this is a perfect opportunity for a Bluetooth 4.0 glucose meter.

I appreciate the attempts and the work that is being done in the space, I truly do, but as an end user, when I see products like this that try to push the envelope but fail with fundamental usability issues, I'm saddened. Most diabetics check their blood sugar 10 times a day or more. I can't keep this glucose meter attached to my phone. It'll fall off, get bent, mess up the 30 pin connector. It's simply not reasonable for day to day use coming in and out of pockets.

A more reasonable mode of usage would mirror the FitBit. It's tiny, clips to my belt and automatically notices when I pass by my computer then uploads its data wirelessly. That's how wireless is supposed to work. And the battery lasts at least a week.

Twenty years and no significant moves. We are still wiring our devices together, translating from one format to another, all the while being hamstrung by the FDA and their processes. When we do start to get something working well, it's attacked and we're told that our insulin pumps can be hacked from a mile away and we can be killed in our sleep. This will no doubt slow progress and make the FDA even more paranoid when approving new technology.

I've just this week switched from a Medtronic Continuous Glucose Meter to a DexCom, which is another company. This new CGM gives me more accurate data with less lag time. However, I still have the same insulin pump. This means my meter and pump aren't integrated, so I carry another device on my person. This is because the Animas Vibe, a pump that integrates both the DexCom meter and an insulin pump, as well as other features like being waterproof, is available EVERYWHERE but the US. It's in the FDA process. Maybe ready in 6 months? 18? Who knows. When it shows up, the technology will be years old while the iPhone is on generation 6. We've got 3D TVs to watch crappy movies on, but my insulin pump's firmware hasn't changed in nearly a decade.

The article about the iBGStar is poorly researched and galling. I appreciate what Hacker News commenter lloyd said (emphasis mine), calling out this inane line from the article:

"Could this be the beginning of mobile diabetes monitoring?"

As so many people above have stated, no, you moron. We've been monitoring blood sugar on the go for the past 30 years.

I've got Type 1 diabetes...and my current meter is smaller than the one shown here. I can plug it into my Mac via USB to download and visualize the data (& can control my insulin pump via bluetooth using the meter).

The only benefit with this particular iPhone-compatible meter would be enhanced, immediate visualization of results. Which might be easier to get, and might not, given the inconvenience of having to remove an iPhone case and plug in the meter. (Not to mention other issues - what if my iPhone's batteries are dead? Will it still work?)

Unfortunately, this product reminds me of 5 years ago, when someone would announce a new toaster, and the tech crowd wouldn't be impressed...unless it was a Bluetooth toaster. We're so focused on it being the hot new thing (it's compatible with iOS! Oooh!!) that we ignore the fact that there's nothing revolutionary being presented here.

The way I see it, this doesn't really change anything in terms of treatment. If it's a more accurate meter, great - sell based on that. Not on the bogus "we're taking blood glucose monitoring mobile" claims.

You may feel like technology is amazing and it's moving so very fast, and it surely is. But as a diabetic who relies on technology to stay alive as long as I possibly can, it feels like nothing has changed in 20 years. Maybe something will happen in just 5 more.



Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.