Scott Hanselman

It's WAY too early to call this Insulin Pump an Artificial Pancreas

September 29, '13 Comments [41] Posted in Diabetes
Sponsored By

[Image: JDRF chart of the six steps toward a true artificial pancreas]

The diabetic internet and lots of mainstream news agencies are abuzz about the new insulin pump from Medtronic. Poorly written news articles that are effectively regurgitations of the Medtronic Press Release have exciting headlines like this:

Other news outlets have slightly better headlines like

But then ruin it with vague subtitles that are missing important context:

  • FDA approved the company’s automated insulin delivery system.

This is Step 1, possibly Step 0.

TO BE CLEAR. This new Medtronic 530G pump is NOT an artificial pancreas. It is an insulin pump, similar to the very model I'm wearing right now. It is paired with a revision of Medtronic's CGM (Continuous Glucose Monitor) system and it does one new thing.

This new pump will turn off if you ignore its alarm that you may be having a low blood sugar.

Read it again, I'll wait.

Note the JDRF chart above describing the steps we need to take toward a true artificial pancreas. This new 530G from Medtronic is arguably Step 1 in this six-step process. It's the first step of the first generation.

But wait, doesn't your pump just handle things for you? You don't have to stick your fingers anymore, right? Wrong.

Let's stop and level set for a moment. Here's a generalization of your day if you're not diabetic.

[Image: diagram of a typical day for a non-diabetic]

Here's what a Type 1 diabetic (like me) does:

[Image: diagram of a typical day for a Type 1 diabetic]

If I get this new pump that news outlets are incorrectly calling an artificial pancreas, will anything in this cycle change? No.

There's NOTHING automatic here. I want to make that clear. Today's insulin pumps are NOT automatic. I set them manually, I tell them what to do manually. Yes, they "automatically deliver insulin as I sleep" but only because I told them to. If I eat and do nothing, I WILL get high blood sugar and today's insulin pumps will do exactly NOTHING about it.

If I only make decisions about insulin dosage based on my CGM then I WILL eventually get in trouble because today's CGMs are demonstrably less accurate than finger sticks. And, here's the kicker, finger sticks aren't even that accurate either.
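For the curious, CGM accuracy is usually summarized as MARD (mean absolute relative difference) against paired reference readings; lower is better. Here's a minimal sketch of the calculation, with entirely made-up readings:

```python
def mard(sensor, reference):
    """Mean Absolute Relative Difference (%), the usual headline
    accuracy number for a CGM, measured against reference values."""
    if len(sensor) != len(reference) or not sensor:
        raise ValueError("need equal-length, non-empty series")
    diffs = [abs(s - r) / r for s, r in zip(sensor, reference)]
    return 100.0 * sum(diffs) / len(diffs)

# Illustrative (made-up) paired readings in mg/dL.
finger_sticks = [100, 140, 180, 90, 120]
cgm_readings  = [112, 128, 195, 80, 131]

print(f"MARD: {mard(cgm_readings, finger_sticks):.1f}%")  # lower is better
```

Keep in mind this measures the sensor against finger sticks, which, as noted, aren't that accurate themselves.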

Even more insidious is the issue of lag time. Medtronic's last generation of CGM lagged 20 to 30 minutes BEHIND a finger stick. That meant I was getting "real time values" that in fact represented my blood sugar in the past. It's hard to make reliable altitude changes in your plane if your altimeter shows your altitude from a half hour ago.
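The altimeter analogy is easy to put into numbers. With hypothetical values: if blood sugar is falling 2 mg/dL per minute, a sensor lagging 30 minutes shows a value 60 mg/dL higher than reality:

```python
def sensor_display(trace, now_min, lag_min):
    """What a sensor lagging `lag_min` minutes shows at time `now_min`,
    given the true glucose trace (one mg/dL sample per minute)."""
    return trace[max(0, now_min - lag_min)]

# True glucose falling 2 mg/dL per minute from 180 mg/dL (hypothetical).
true_glucose = [180 - 2 * t for t in range(61)]

actual = true_glucose[60]                     # 60 mg/dL: dangerously low
shown = sensor_display(true_glucose, 60, 30)  # 120 mg/dL: looks fine
print(f"actual: {actual} mg/dL, sensor shows: {shown} mg/dL")
```

In that (made-up but realistic) scenario, the "real time" display says everything is fine while you're actually hypoglycemic.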

The Medtronic Press Release says that this new Enlite Sensor is 31% more accurate. I hope so. I personally continue to use a Medtronic 522 pump (this new one is the 530G) but I have given up on Medtronic's CGM in favor of a Dexcom G4. I am thrilled with it. The G4 has about a 5 minute lag time and is astonishingly accurate.

NOTE: I have no personal or investment relationship with either Dexcom or Medtronic. I am not a doctor or a scientist. I write this blog post with the expertise of someone who has been a Type 1 Diabetic for 20 years, a user of a Medtronic Pump for 15 years, a user of a Medtronic CGM for 4 years, and more recently a user of a Dexcom G4 for a year. My most recent A1C test was 5.5 putting my blood sugars at near non-diabetic levels on average. TL;DR - I'm a very good diabetic who uses the best available technology to keep me alive as long as possible.

I am extremely disappointed in the lack of research, due diligence and basic medical common sense in these articles. If you are a Type 1 Diabetic or have someone in your life who is, do the research and the reading and please spread the word so people can make informed decisions.

Related Reading

About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.

facebook twitter subscribe
About   Newsletter
Sponsored By
Hosting By
Dedicated Windows Server Hosting by SherWeb

The Myth of the Rockstar Programmer

September 29, '13 Comments [94] Posted in Musings

[Image: "There is an I in TEAM"]

The Myth of the Rockstar Programmer is just that, a myth. It's an unfortunate myth for a number of reasons.

  • It sets an unreasonable expectation for regular folks.
  • Calling out rockstars demotivates the team.
  • Telling someone they are a rockstar may cause them to actually believe it.

Reality is a normal distribution curve: lots of good, average senior developers, some amazing outliers, and some junior folks with potential. (And some folks who suck.)

Brooks's law: Adding manpower to a late software project makes it later.

The Rockstar Developer is a Myth

People love to say that a rockstar can do the work of 10 regular engineers. That's just nonsense. Nine women can't make a baby in one month, and 10 "rockstar" developers can't replace 100 regular ones.

I hate Quora so I won't link to them, but here's a modification of a great answer from Nate Waddoups that was taken from some internal engineering paperwork:

  • Junior Engineer - Creates complex solutions to simple problems.
  • Engineer - Creates simple solutions to simple problems.
  • Senior Engineer - Creates simple solutions to complex problems.
  • Rockstar Engineer - Makes complex problems disappear.

Am *I* a rockstar? I'm a competent Senior Engineer who is also loud. I've been on a lot of successful projects in the last 20 years and I was never the smartest guy in the room.

Senior + Loud != Rockstar

In my experience, in fact...

Senior + Thoughtful == Rockstar

That may or may not include being loud. Just because someone has written a blog, or a book, or speaks well doesn't mean they are a good developer. You certainly don't want a diva. Diva Developers do more harm than good.

Are rockstars about lines of code? No, good developers solve problems. More specifically, they make problems go away. They fix problems rather than complaining about them.

The Rockstar Team is Reality

In fact, it's diversity of thought and experience in a team that makes a Rockstar Team - that's what you really want. Put thoughtful and experienced architects with enthusiastic and positive engineers who are learning, and you'll get something. If you insist on calling someone a rockstar, they are likely the team's teacher and mentor.

Jon Galloway says:

Pairing "step back and think" devs with "crank a lot of pretty good code out" devs is a recipe for a good team.

Build smart, diverse teams. Build rockstar teams.

UPDATE: I was just told about this post by shanley on the "10x Engineer." It's a great and detailed piece and you should check it out!


Sit, Stand, Walk, Type - Using a Treadmill Desk

September 25, '13 Comments [46] Posted in Musings
[Image: the treadmill desk]

I've been doing this whole "sitting and thinking for money" thing for over twenty years now. I've written about some of the things that happen to the body after sitting and typing for long periods, and talked about ways we can try to stem the tide.

Almost ten years ago I blogged about The Programmer's Back, The Programmer's Hands, and worse yet (and most recently) The Programmer's Body.

I'm happy with my desk, but since Being a Remote Worker Sucks, I get cabin fever and need to mix it up. Sometimes I sit at my desk, sometimes I stand, sometimes I just escape to a local café. I needed another option.

I noticed that I wasn't getting nearly close enough to the arbitrary goal of 10,000 daily steps per my Fitbit. When I travel I walk obsessively, but here in Oregon running and walking in the rain is really no fun. I started running on the treadmill in the last few months while making my way through my Netflix queue, but quickly realized that this is prime email-deleting time I'm wasting!

It was finally time to make this Treadmill into a desk. Being the immensely handy fix-it type that I am (not) I promptly tried to cut a piece of wood. It was quite the achievement, let me tell you.

The prototype was fine, just a board laid across the treadmill, but it worked. I enlisted my Dad (who is actually Handy) and we iterated. Here's what we came up with. Bonus points to my Dad, who is incapable of letting a piece of wood leave his shop without being sanded or properly stained.

[Image: the finished treadmill desk]

First, we took the original boards and added small supports to keep them from moving laterally. Then I added foam from the inside of a bike helmet to make the fit even tighter against the side supports.

[Image: the boards with side supports and foam padding]

Then, Dad built a small box with a lip to sit on top of the boards. This brings the laptop (my Lenovo X1 Carbon Touch) up to a height that keeps my hands at exactly a 90 degree angle to the keyboard. This has proven very comfortable - not too low and not too high.

[Image: the box that raises the laptop]

If I want to run full out, I just lift the two pieces up and move them aside. It's also worth noting that I'm still using the safety cord in case I trip or fall off the treadmill. I'm considering actually drilling a 1.5" hole through the middle of the box to thread the cord so if I do take a spill, it won't take the box with me.

I've been doing about 2 miles per hour at a slight incline. I don't like super slow walking (1 mph) as I find it actually requires more thinking than normal walking. So far today I've moseyed about 5 miles on the treadmill desk without really feeling it. I'm not sure I'd want to spend a full day doing this, but it's very comfortable and I think I'll use it for at least an hour or so at a time.

This was super easy to do and I recommend it to anyone who has (or can cheaply get) a treadmill, a few pieces of wood, and a laptop. It was so easy and the benefits are so clearly obvious, I'm actually a little disappointed I didn't do this years ago.


Sponsor: Thank you to RedGate for sponsoring the feed this week! Easy release management: Deploy your .NET apps, services and SQL Server databases in a single, repeatable process with Red Gate’s Deployment Manager. There’s a free Starter edition, so get started now!


Create a complete System Image Backup with Windows 8.1 and File History

September 24, '13 Comments [42] Posted in Win8

I feel better when things are backed up. I use the File History feature of Windows 8 to back up files every hour or so. I really encourage folks to use the Computer Backup Rule of Three.

One of the features of Windows 7 that I love is System Image Backup. I used to use 3rd party products to image my system. In Windows 8 (8.0, that is) it's kind of hard to find System Image Backup. While I use File History locally as well as regular cloud backup (using CrashPlan on my Synology) I also like to do a full System Image every month or so.

I've seen a number of tutorials on the web on "how to create a system image backup on windows 8.1" that have folks going to a PowerShell prompt to start a backup. While that's possible, it's certainly not the primary way you want to start a typical backup at home.
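For completeness, the scripted route looks roughly like this. It builds a command line for wbAdmin, the Windows Backup command-line tool; the drive letters below are placeholders you'd adjust for your machine, and the real command must be run from an elevated (Administrator) prompt:

```python
def system_image_command(target="D:", include="C:"):
    """Build a wbAdmin command line for a one-off system image.
    The drive letters are placeholders; adjust them for your machine."""
    return [
        "wbAdmin", "start", "backup",
        f"-backupTarget:{target}",  # where the image goes
        f"-include:{include}",      # the volume(s) to image
        "-allCritical",             # include everything needed to boot
        "-quiet",                   # no interactive confirmation
    ]

if __name__ == "__main__":
    # Print the command; on Windows you'd run it from an elevated prompt,
    # e.g. via subprocess.run(system_image_command(), check=True).
    print(" ".join(system_image_command()))
```

But as I said, most folks at home should just use the UI below.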

In Windows 8.1, go to the Start Menu, type "File History" and run it.

[Image: finding File History in Windows 8.1]

Now, hit System Image Backup in the lower corner there.

[Image: the System Image Backup link in the lower corner of File History]

You can put an image on DVDs or an external hard drive.

Now, to be clear, should this be your primary backup strategy? No. I've got most things in the cloud or automatically backed up to external drives. If I needed to totally reinstall Windows from scratch, I can get back up and working in about an hour without using a complete System Image. However, I'm comforted by having at least one or two System Image backups. It's nice to have options.

Recommended Reading

Here are some other blog posts on the topic of backup. Now, take action.




The New Turbo Button - Balancing Power Management and Performance on Windows Servers

September 17, '13 Comments [20] Posted in Musings

[Image: a Turbo Button]

Do you remember the Turbo Button? I actually thought of it as the "be slow button" because we always kept it on Turbo. Why wouldn't you want a fast computer all the time? The Turbo Button was actually an "underclock" button. When it was off, you were setting your 286 or 386 to XT speeds so older DOS games would work at their designed speed.

Power Management, both software and hardware, seems to be the new Turbo Button. My laptops get way faster when I plug them in - very noticeably faster, to the point where I just don't like using them on battery. For typing documents, it's fine, but for development, compiling, and running VMs, it's unacceptable to me. I'll end up spending more power to get more performance.

It's important to remember that Power Management affects servers as well.

Recently Mike Harder, a development manager, noticed that stuff he does every day was taking longer on the "Balanced" power option than the "High Performance" option. He said:

My naïve belief was that “Balanced” is supposed to save power when your machine is idle, but give full power when needed, so the overall perf hit should be small.

Here's a very basic benchmark Mike did:

Hardware/OS
Hardware: HP z420, Intel Xeon E5 1650 @ 3.2GHz, 32GB RAM, SSD
OS: Windows Server 2012 Standard

(in seconds)                     High Performance   Balanced   Delta
7-Zip, LZMA, 2 Threads                  55             115      109%
7-Zip, LZMA2, 12 Threads                28              49       75%
Build Source Tree, 48 Threads           37              55       49%
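You can reproduce the spirit of Mike's 7-Zip test with a few lines of Python using the built-in lzma module. The thread count, data size, and preset below are arbitrary stand-ins, not Mike's exact setup; run it once under each power plan and compare the elapsed seconds:

```python
import lzma
import os
import time
from concurrent.futures import ThreadPoolExecutor

def compress_chunk(chunk):
    # lzma.compress releases the GIL while it works, so threads do help here.
    return lzma.compress(chunk, preset=6)

def benchmark(threads=2, chunks=4, chunk_size=1 << 19):
    """Time LZMA-compressing `chunks` blobs of random data on a thread pool."""
    data = [os.urandom(chunk_size) for _ in range(chunks)]
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=threads) as pool:
        list(pool.map(compress_chunk, data))
    return time.perf_counter() - start

if __name__ == "__main__":
    # Run once per power plan and compare the elapsed times.
    print(f"elapsed: {benchmark():.2f}s")
```

It won't match Mike's numbers, but on a Balanced-plan machine you should see the same direction of effect.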

This started a fascinating thread on power management and the balance between getting good performance from a system (desktop or laptop or server) and wasting power and heat. Here are the best parts of that internal thread, for all of our education.

Bruce Worthington said:

Depends on the workload.  The full performance of the system is available, but (for example) if the workload is very bursty you will take an initial hit at the beginning of each burst as the power management algorithms determine that more resources need to be brought on line.  Or if it is a low-utilization steady state workload, you will run at a lower CPU frequency throughout.

There is no free lunch, so there is always a tradeoff that is being made.

There is also an excellent thread on this at ServerFault. Jeff Atwood asks:

Our 8-cpu database server has a ton of traffic, but extremely low CPU utilization (just due to the nature of our SQL queries -- lots of them, but really simple queries). It's usually sitting at 10% or less. So I expect it was downclocking even more than the above screenshot. Anyway, when I turned power management to "high performance" I saw my simple SQL query benchmark improve by about 20%, and become very consistent from run to run.

This makes sense to me. The CPU isn't working hard enough, for long enough, for the power management algorithms to give it full power. But if Jeff sets power management to High Performance, he's effectively saying "full speed ahead...always."
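You can see the effect Jeff describes by measuring not just the mean latency of a repeated operation but its run-to-run spread. Here's a sketch with a stand-in for the SQL query (the workload shape, a short CPU burst followed by idle time, is the point):

```python
import statistics
import time

def time_runs(fn, runs=50):
    """Time `fn` repeatedly; return (mean_ms, stdev_ms) so you can compare
    both speed and run-to-run consistency across power plans."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000)
    return statistics.mean(samples), statistics.stdev(samples)

def fake_query():
    # Stand-in for a short, simple SQL query: brief CPU burst, then idle.
    sum(i * i for i in range(20_000))
    time.sleep(0.002)

mean_ms, stdev_ms = time_runs(fake_query)
print(f"mean {mean_ms:.2f} ms, stdev {stdev_ms:.2f} ms")
```

Under Balanced, a bursty workload like this tends to show a higher mean and a noisier spread than under High Performance, which matches Jeff's "more consistent from run to run" observation.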

In the last half-decade, power management in servers has become more of an issue. With high power come heating and cooling costs as well as the power bill itself. As of Windows Server 2008, the default power plan is "Balanced."

Bruce again in an excellent explanation with emphasis mine:

I'll try to give a quick perspective below as to why we use Balanced mode as our default and how we arrive at the tunings for that mode.

As of Windows Server 2008, the default setting of the OS was switched from High Performance to Balanced.  Energy efficiency was becoming a larger factor in the real world, and our ability to balance between the oft-opposing poles of Power and Perf was improving.  That being said, there will always be environments where our recommendation is that the power policy should be switched back to High Performance.  Anything super latency sensitive will clearly fall into that bucket, such as banking, stock markets, etc.

OEMs have the flexibility to add custom tunings onto their factory settings if they want to put in the additional effort to find a balance that works better for their specific customers.  System administrators also have that flexibility. But tuning the power/perf knobs in the OS is a very tricky business, not for the faint of heart. 

<snip…>


Some of us on the Windows "power" teams were performance analysts before we became power analysts, so we are very sensitive to the tradeoffs that are being made and don’t like seeing any perf lost at all. But there is no free lunch to be had, and there are big electric bills being paid (and polar bears falling into the water) that can be helped through sacrificing some level of performance in many environments.

<snip>

We will continue to provide multiple power policies because one size clearly does not fit all servers.

Another great point on why "Balanced" should be the default, from Sean McGrane:

[We're] looking at an industry landscape where servers in data centers are very underutilized, typically somewhere below 20% utilization. By going with balanced mode we saved a lot of energy and cost and improved their carbon footprint more or less for free. There was very strong support from customers to do this.

Virtualization has helped raise the utilization levels and most cloud DCs now operate at higher levels of utilization. However the majority of servers deployed are still running a single workload and that will be the case for a while.

This gets to the point of measuring. Are your servers working hard now? Perhaps they'll perform better on High Performance. Are they often idle or at lower levels of utilization? Then Balanced is likely fine and will save power. Test and see.

As with all things in software development, it's a series of trade-offs. If you blindly switch your servers' power options to High Performance because you read it on a random blog on the Internet, you're of course missing the point.

Change a variable, then measure.

Consider your workloads: how often they leave your CPUs idle, and how hard they work the CPUs when pushed. Are you doing single-threaded, low-CPU work, or massively parallel, CPU-intensive work?

I'm now going to pay more attention to power management profiles when developing, putting machines into production, stress testing and benchmarking. It's nice to have a Turbo Button.




Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.