# Scott Hanselman

## Catch up on all the videos from DotNetConf Spring 2014

July 3, '14 Comments [10] Posted in Learning .NET

Did you miss out on DotNetConf when it streamed LIVE just a few weeks ago? Don't you worry, it's all recorded and online for you to stream or download!

We are happy to announce that we're planning another .NET Conf in a few months, so stay tuned via the .NET Conf Twitter account (@dnetconf) or check the .NET Conf site: http://www.dotnetconf.net. Big thanks to Javier Lozano for all his work with the site and conference coordination.

Everything was recorded and is up here: http://channel9.msdn.com/Events/dotnetConf/2014

### .NET Conf summary and recorded content

The .NET Conf 2014 was a two-day virtual event (June 25th-26th) focused on .NET technologies, covering application development for the desktop, mobile, and cloud/server. It was hosted by the MVP community and Microsoft, bringing top speakers and great topics straight to your PC.

Below you can review all the delivered sessions and follow the links to their recorded content.

### Day 1 – .NET core and .NET in client/devices

• State of .NET (Keynote) - Jay Schmelzer: Opening and overview of the current state of .NET and .NET on the client side.

• New Innovations in .NET Runtime - Andrew Pardoe: We're changing the way we execute code in the .NET Runtime. Hear about .NET Native, RyuJIT, and our modern server strategy.

• The Future of C#: The Microsoft Managed Languages team has been focused on rebuilding the VB and C# compilers and editing experiences as part of Project "Roslyn". This effort has paved the way for these languages to continue evolving for many years to come. But what does that future actually look like? We explore the editing experience, how public APIs may be used to write language-level extensions, as well as new language features.

• Building Universal Windows Apps with XAML and C# in Visual Studio - Larry Lieberman: In April at Build 2014, Microsoft unveiled universal Windows apps, a new approach that enables developers to maximize their ability to deliver outstanding application experiences across Windows PCs, laptops, tablets, and Windows Phones. This means it's now easier than ever to create apps that share most of their code. Code can be shared using the new shared app templates, as well as by creating Portable Class Libraries. This session walks through the development of a shared app and discusses where it still makes sense to implement platform-specific features.

• .NET Native Deep Dive - Andrew Pardoe: Look inside the .NET Native compiler toolchain to understand how we enable .NET Windows Store apps to compile to self-contained native apps.

• Fun with .NET - Windows Phone, LEGO Mindstorms, and Azure - Dan Fernandez: In this demo-packed session, we'll walk through building your first .NET-controlled LEGO Mindstorms robot using Windows Phone. You'll learn about the LEGO EV3 API, how to control motors and read sensor data, and how to batch commands to the robot. Once we have a working, drivable robot, we'll switch to cloud-enabling it so that you can drive the robot remotely via a web site hosted in Microsoft Azure.

• Kinect for Windows - Ben Lower: We'll take a look at what's new in Kinect for Windows v2, including the improvements in core sources like infrared and depth data. We'll also show how the new Kinect Studio enables Kinect development even while travelling via plane, train, or automobile (note: you should not dev and drive), and how Kinect Interactions can be used to add a new input modality to Windows Store applications.

• What's New in XAML Platform & Tooling - Tim Heuer: Tim will do a lap around what's new in the Windows Phone 8.1 platform, as well as a tour of the new XAML tooling in Visual Studio Update 2 for developers and designers.

• Developing Native iOS, Android, and Windows Apps with Xamarin - James Montemagno (Xamarin): Mobile continues to expand and evolve at a rapid pace. Users expect great native experiences in the palm of their hands on each and every platform. A major hurdle for developers today is that each platform has its own programming language and tools to learn and maintain. Even if you tackle the burden of learning Objective-C and Java, you will still have to manage multiple code bases, which can be a nightmare for any development team, large or small. It doesn't have to be this way: you can create Android, iOS, Windows Phone, and Windows Store apps leveraging the .NET Framework and everything you love about C#.

• What's new for WPF Developers - Dmitry Lyalin: Windows Presentation Foundation (WPF) enables .NET developers to build rich and powerful Windows desktop applications using managed languages and XAML. In this session we'll cover the latest innovations available to WPF developers, such as improvements coming from .NET, integration points with the latest cloud technologies, and enhanced tooling & profiling capabilities in Visual Studio.

### Day 2 – .NET in server and cloud

We encourage you to share this content with your colleagues and friends, and remember that .NET Conf and all its content is free!

Sponsor: Thanks to friends at RayGun.io. I use their product and LOVE IT. Get notified of your software’s bugs as they happen! Raygun.io has error tracking solutions for every major programming language and platform - Start a free trial in under a minute!

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.


## Diabetics: It's fun to say Bionic Pancreas but how about a reality check

June 30, '14 Comments [23] Posted in Diabetes

The state of healthcare reporting is just abysmal. It's all link-bait. It's fun to write things like "Random Joe invents cure for diabetes in his garage, saves dying 5 year old." It's surely less fun to read them when you're the one with the disease.

IMPORTANT UPDATE: Scott (me) has now interviewed Dr. Steven Jon Russell, MD, PhD, a member of the Bionic Pancreas Team! Check out their interview at http://hanselminutes.com/431.

It's time for medical journalists to try a little harder and push back against editors who write headlines optimized for pageviews. The thing is, I've met a dozen general practitioners who are themselves confused about how diabetes works, and link-bait journalism just ruins it for the public, too. I've received no fewer than 50 personal emails or FB posts from well-meaning friends this last week: "Have you heard? They've cured your diabetes with a bionic pancreas!"

I have been a Type 1 Diabetic for 20 years, I've worn an insulin pump 24 hours a day for the last 15 years (that's over 130,000 hours, in case you're counting), I'm a diabetes off-label body hacker with an A1C of 5.5%. What's that mean to you? I'm not a doctor, but I'm a hell of a good diabetic.

I know what I'm talking about because I'm living it, and living it well. A doctor may be able to tell me to adjust my insulin every 3 months when I see them, but they aren't up with me at 4 am in a hotel in Germany with jet-lag telling me what to do when I'm having a low. Forgive me this hubris, but it comes from 75,000 finger pricks and yes, it hurts every time, and no, my insulin pump doesn't automatically cure me.

Last year the FDA approved an Insulin Pump that shuts off automatically if it detects the wearer is having a low sugar. The press and the company itself called this new feature an "artificial pancreas." Nonsense. It's WAY too early to call this Insulin Pump an Artificial Pancreas.

Now we are seeing a new "bionic" pancreas, for which the press is writing headlines like "A Father Has Invented a Bionic Organ to Save His Son From Type 1 Diabetes" and "'Bionic Pancreas' Astonishes Diabetes Researchers."

It's a great proof of concept for a closed system based on dual insulin pumps (one with glucagon) and a high-accuracy CGM managed by an iPhone. But that's not a fun headline, is it?

"Boston University biomedical engineer Ed Damiano and a team of other researchers published a study earlier this month detailing a system that could prevent these dangerous situations."

Indeed, in the study in the New England Journal of Medicine, Ed Damiano, Ph.D. is listed alongside Steven J. Russell, M.D., Ph.D., Firas H. El-Khatib, Ph.D., Manasi Sinha, M.D., M.P.H., Kendra L. Magyar, M.S.N., N.P., Katherine McKeon, M.Eng., Laura G. Goergen, B.S.N., R.N., Courtney Balliro, B.S.N., R.N., Mallory A. Hillard, B.S., and David M. Nathan, M.D.

They are clearly all brilliant and of note. Let's break the study down.

"...we compared glycemic control with a wearable, bihormonal, automated, “bionic” pancreas (bionic-pancreas period) with glycemic control with an insulin pump (control period) for 5 days in 20 adults and 32 adolescents with type 1 diabetes mellitus."

They are trying to improve blood sugar control. That means keeping my numbers as "normal" as possible to avoid the nasty side-effects like blindness and amputation in the long-term with highs, and death and coma with lows. The general idea is that since my actual pancreas isn't operating, I'll need another way to get insulin into my system. "Bihormonal" means they are delivering not just insulin, which lowers blood sugar, but also glucagon, which effectively raises blood sugar. They tested this for 5 days on a bunch of people.

"The device consisted of an iPhone 4S (Apple), which ran the control algorithm, and a G4 Platinum continuous glucose monitor (DexCom) connected by a custom hardware interface."

I use a DexCom G4, by the way. It's a lovely device: it gives me an estimate of my blood sugar every 5 minutes by drawing a parallel between what it detects in the interstitial fluid of my own fat and tissues (not my whole blood), and it sends that wirelessly to a handset. I currently make calculations in my head and decide (note that keyword: decide) how much insulin to take. I then manually tell my Medtronic insulin pump how much insulin to deliver. The DexCom must be calibrated at least twice daily with a whole-blood finger stick. Also, it's not too accurate on day 1, and can be wholly inaccurate after its listed 7-day effectiveness range. But it's that keyword this project is trying to help with. Decide. I have to decide, calculate, guess, determine. That's hard for me as an adult. It's near-impossible for an 8-year-old. Or an 80-year-old. Computers are good at calculating; maybe they can do this tedious work for us.

The thing is, with Type 1 Diabetes there's dozens of other factors to consider. How much did I eat? What did I eat? Am I sick? Does my stomach work? Do I digest slowly? Quickly? Do I have any acetaminophen in my system? Am I going jogging afterwards? Is this insulin going bad? Is the insulin pump's cannula bent, and dozens (I'm sure I could come up with a hundred) of other factors. Read Lane Desborough's paper (PPT as a PDF) on "Applying STPA (System Theoretic Process Analysis) to the Artificial Pancreas for People with Type 1 Diabetes" for a taste of what needs to be done.

The brilliance of this system - this "bionic" pancreas - is this...and these are MY words, no one else's:

The two-pump bionic pancreas system gives you rather a LOT of insulin if needed (as if a pilot were taking a plane down quickly and dramatically), then it pulls you up nicely with a bit of glucagon (as if the pilot screamed "pull up!" upon noticing the altitude change).

It's the addition of glucagon to get you out of lows that is interesting. Typically diabetics have a big syringe of glucagon in the fridge for emergencies. If you're super low - dangerously loopy - your partner can get you out of it with a big bolus of glucagon. But if you put glucagon in an insulin pump, you can deliver tiny amounts, and now you are moving the graph in two directions.
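To make the "two directions" idea concrete, here is a deliberately tiny toy simulation. This is NOT the study's control algorithm and is not medical guidance; every threshold and dose effect below is invented purely for illustration, and the real system uses far more sophisticated model-based dosing.

```python
# Toy illustration only: a crude threshold controller that nudges a
# simulated glucose value down with insulin or up with micro-doses of
# glucagon. All numbers here are invented for illustration.

def control_step(glucose, low=80, high=180, insulin_effect=30, glucagon_effect=15):
    """Return (action, new_glucose) for one control cycle."""
    if glucose > high:
        return ("insulin", glucose - insulin_effect)   # push the graph down
    if glucose < low:
        return ("glucagon", glucose + glucagon_effect) # "pull up, pull up"
    return ("none", glucose)                           # in range: do nothing

# Simulate a few cycles starting from a high reading.
glucose = 220
trace = []
for _ in range(6):
    action, glucose = control_step(glucose)
    trace.append((action, glucose))
```

The point of the sketch is only that a bihormonal system has an actuator in both directions, where an insulin-only pump can push the graph down but can never pull it back up.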

Think I'm kidding about the "pull up, pull up" analogy?

Here's a snippet of a graph from page 15 of one of the Appendices (PDF). Note around 19:00, the blue bar going down, that's a lot of insulin. Then the BG numbers come down, FAST. Note the black triangle at around 20:20. That's "pull up, pull up" and a bolus of glucagon in red. And more, and more, in fact, there are many glucagon boluses keeping the numbers up, presumably happening while the subject sleeps. Then around 07:00 the numbers rise, presumably from the Dawn Effect, and another automatic insulin bolus (an overcorrection) and then more glucagon. It's a wonderfully controlled roller-coaster. This isn't using the word roller-coaster as a pejorative - that is the life I lead as a diabetic.

It's also not mentioned in the press that this system uses a lot more insulin than I do today. A lot more, due to its "dose and correct" algorithm's design.

"Among the other 11 patients, the mean total daily dose of insulin was 50% higher during the bionic-pancreas period than during the control period (P=0.001);"

UPDATE: I spoke to Dr. Russell, and I'm not entirely correct that this system uses a lot more insulin. The system didn't use much more insulin in diabetic kids who have very controlled diets, and was 50% higher in only some of the adults, presumably because (anecdotally) many of them were eating a lot more and "testing" the extents of the system.

I use about 40U a day, total. So we're looking at me using perhaps 60U a day with this system. As with any drug, though, insulin use has its side effects. It can cause fat deposits, scarring at injection sites, and we can become resistant to it. It'd be interesting to think about a study where someone's on 50% more insulin for years. Would that cause increases in any of these side effects? I don't know, but it's an interesting question. Should a closed system also optimize for doing its job with the minimum possible insulin? I optimize for that today, on my own, hoping that it will make a difference in the long run.

But glucagon isn't pump-friendly as it is today. An unfortunate note that isn't covered in any of the press is that they have to replace the glucagon every day. Juxtapose that with what I do currently with insulin: I keep my pump filled and swap out its contents and cannula (insertion site) every 4-7 days. Insulin itself can last ~28 days at room temperature, although it's most often refrigerated. Changing one of the pumps daily is a bummer, as they point out.

"...the poor stability of currently available glucagon formulations necessitated daily replacement of the glucagon in the pump with freshly reconstituted material."

It's early, people. It's not integrated, it's a proof of concept. It's impressive, to be sure, but Rube-Goldbergian in its hardware implementation. Two pumps, a Dexcom G4 inside a docking station, receiving BG data over RF from the transmitter, then the Dexcom wirelessly talking to an iPhone within another docking station.

"Since a single device that integrates all the components of a bionic pancreas is not yet available, we had to rely on wireless connectivity to the insulin and glucagon pumps, which was not completely reliable."

I'm not trying to undermine, undercut, or minimize the work - it's super promising - but medical journalists need to seriously understand what's really going on here.

Fast forward a few years, and there will very likely be a bi-hormonal "double" pump with both (more stable) glucagon and insulin, combined with a continuous glucose monitor, that provides the average Type 1 Diabetic with a reasonable solution to keep their numbers out of imminent danger. Great for kids, a relief for many.

But, just as pumps are today, it'll be USD$5000 to USD$10000. It will require insurance, and equipment, it'll require testing and software, it'll require training, and it won't be - it can't be - perfect. This is a move forward, but it's not a cure. Accept it for what it is, a step in the right direction.

Do I want it? Totally. But, journalists and families of diabetics, let's not overreact or get too ahead of ourselves. Does this mean I should eat crap and the machine will take care of it? No. I'm healthy today because I care to be. I work at it. Every day. As I'm typing now, I know my numbers, my trend-line, and my goal: stay alive another day.

Read my article from 2001 - yes, that's 13 years ago - called One Guy, an Insulin Pump, and 8 PDAs:

"I imagine a world of true digital convergence -- assuming that I won't be cured of diabetes by some biological means in my lifetime -- an implanted pump and glucose sensor, an advanced artificial pancreas. A closed system for diabetics that automatically senses sugar levels and delivers insulin has been the diabetics' holy grail for years. But with the advent of wireless technology and the Internet, my already optimistic vision has brightened. If I had an implanted device with wireless capabilities, it could be in constant contact with my doctor. If the pump failed, it could simultaneously alert me, my doctor, and the local emergency room, downloading my health history in preparation for my visit. If it was running low on insulin, the pump could report its status to my insurance company, and I'd have new insulin delivered to my doorstep the next day. But that's not enough. With Bluetooth coming, why couldn't my [PDA] monitor my newly implanted smart-pump?"

Go and educate yourselves about the "We Are Not Waiting" movement. Hear how Scott Leibrand has a "DIY Artificial Pancreas" that's lowered his girlfriend's average blood sugar dramatically using only a DexCom G4 and smart algorithms. You can make a change today - at your own risk, of course.

Read about the DiabetesMine D-Data ExChange and how the non-profit Tidepool is creating open source software and systems to make innovation happen now, rather than waiting for it. Get the code, join the conversation. Exercise, eat better, read, work. You can hack your Diabetes today. #WeAreNotWaiting




## NuGet Package of the Week: ASP.NET Web API Caching with CacheCow and CacheOutput

June 27, '14 Comments [8] Posted in ASP.NET Web API | NuGet | NuGetPOW

You can see other cool NuGet Packages I've mentioned on the blog here. Today's NuGet package is CacheCow, which has possibly the coolest Open Source Library name since Lawnchair.js.

"CacheCow is a library for implementing HTTP caching on both client and server in ASP.NET Web API. It uses message handlers on both client and server to intercept request and response and apply caching logic and rules."

CacheCow was started by Ali Kheyrollahi with help from Tugberk Ugurlu and the community, and is a fantastically useful piece of work. I wouldn't be surprised to see this library start showing up in more places one day.

As an aside, Ali, this would be a great candidate for setting up a free AppVeyor Continuous Integration build along with a badge showing that the project is building and healthy!

CacheCow on the server can manage the cache in a number of ways. You can store it in SQL Server with the EntityTagStore, or implement your own storage handler. You can keep the cache in memcached, Redis, etc.

Consider using a library like CacheCow if you're putting together a Web API and haven't given sufficient thought to caching yet, or if you're already sprinkling cache code throughout your business logic. You might already suspect that this is going to litter your code but perhaps haven't gotten around to tidying it up. Now is a good time to unify your caching.

As a very simple example, here are the HTTP headers from an HTTP GET to a Web API:

```
Cache-Control: no-cache
Content-Length: 19
Content-Type: application/json; charset=utf-8
Date: Fri, 27 Jun 2014 23:22:10 GMT
Expires: -1
Pragma: no-cache
```

Here's the same thing after adding the most basic caching to my ASP.NET application's config:

```csharp
GlobalConfiguration.Configuration.MessageHandlers.Add(
    new CachingHandler(GlobalConfiguration.Configuration));
```

The HTTP headers from the same GET with CacheCow enabled:

```
Cache-Control: no-transform, must-revalidate, max-age=0, private
Content-Length: 19
Content-Type: application/json; charset=utf-8
Date: Fri, 27 Jun 2014 23:24:16 GMT
ETag: W/"e1c5ab4f818f4cde9426c6b0824afe5b"
Last-Modified: Fri, 27 Jun 2014 23:24:16 GMT
```

Notice the Cache-Control header, the Last-Modified, and the ETag. The ETag is weak, as indicated by the "W/" prefix, which means that this response is semantically equivalent to the last response. If I were caching persistently, I could get a strong ETag indicating that the cached response was byte-for-byte identical. Also, if the client were smart about caching and added If-Modified-Since or If-None-Match (for ETags), the response might be a 304 Not Modified rather than a 200 OK. If you're going to add caching to your Web API server, you'll want to make sure your clients respect those headers fully!
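The conditional-request flow those headers enable can be sketched in a few lines. This is a hypothetical helper in Python for brevity, not CacheCow's actual implementation, but the 200-vs-304 decision is the same idea:

```python
# Sketch of server-side conditional GET handling: hash the body into a
# weak ETag, and answer 304 Not Modified when the client already has it.
import hashlib
from typing import Optional

def make_etag(body: bytes) -> str:
    # Weak ETag: semantically-equivalent responses share a tag.
    return 'W/"%s"' % hashlib.md5(body).hexdigest()

def respond(body: bytes, if_none_match: Optional[str]):
    etag = make_etag(body)
    if if_none_match == etag:
        return 304, b"", etag   # Not Modified: client reuses its cached copy
    return 200, body, etag      # Full response, with a fresh ETag

status, payload, etag = respond(b'{"value": 19}', None)   # first request
status2, payload2, _ = respond(b'{"value": 19}', etag)    # revalidation
```

The first call returns a full 200 with the body; the second, carrying the ETag back in If-None-Match, gets an empty 304 and saves the bandwidth.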

From Ali's blog, you can still use HttpClient in your clients, but you use WebRequestHandler as the message handler:

```csharp
HttpClient client = new HttpClient(new WebRequestHandler()
{
    CachePolicy = new RequestCachePolicy(RequestCacheLevel.Default)
});
var httpResponseMessage = await client.GetAsync("http://superpoopy");
```

Really don't want a resource cached? Remember, this is HTTP so, Cache-Control: no-cache from the client!

Of course, one of the most important aspects of caching anything is "when do I invalidate the cache?" CacheCow gives you a lot of control over this, but you really need to be aware of what your actual goal is, or you'll find things cached that you don't want, or things not cached that you do.

• Are you looking for time-based caching? Cache for 5 min after a DB access?
• Are you looking for smart caching that invalidates when it sees what could be a modification? Invalidate a collection after a POST/PUT/DELETE?
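Those two strategies can be sketched side by side. This is a toy model in Python for brevity, not CacheCow's API; the class and method names are invented for illustration:

```python
# Two invalidation strategies: time-based expiry (TTL) vs. invalidating
# on writes (POST/PUT/DELETE). Toy code, not a real HTTP cache.
import time

class TtlCache:
    """Time-based: entries expire ttl seconds after they were stored."""
    def __init__(self, ttl):
        self.ttl, self.store = ttl, {}

    def get(self, key, now=None):
        now = time.time() if now is None else now
        hit = self.store.get(key)
        if hit is not None and now - hit[1] < self.ttl:
            return hit[0]
        return None                      # miss or expired

    def set(self, key, value, now=None):
        self.store[key] = (value, time.time() if now is None else now)

class WriteInvalidatingCache(TtlCache):
    """Smart: a write to a resource drops its cached representation."""
    def on_write(self, key):
        self.store.pop(key, None)

cache = WriteInvalidatingCache(ttl=300)
cache.set("/api/items", ["a", "b"], now=0)
assert cache.get("/api/items", now=100) == ["a", "b"]   # fresh hit
assert cache.get("/api/items", now=400) is None          # expired after 300s
cache.set("/api/items", ["a", "b"], now=500)
cache.on_write("/api/items")                             # e.g. a PUT arrived
assert cache.get("/api/items", now=501) is None
```

Real servers usually combine both: a TTL as a safety net, plus write-triggered invalidation so clients never see a representation that is known to be stale.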

Given that you're likely using REST, you'll want to make sure that the semantics and intent of these caching headers are reflected in your behavior. Last-Modified should reflect reality when possible.

From the CacheCow Wiki, there are great features for both the server side and the client side. Here are the CacheCow.Server features:

• Returning 304 Not Modified and 412 Precondition Failed responses for conditional calls
• Invalidating cache in case of PUT, POST, PATCH and DELETE
• Flexible resource organization. Rules can be defined so invalidation of a resource can invalidate linked resources

and the CacheCow.Client features

• Caching GET responses according to their caching headers
• Verifying cached items for their staleness
• Validating cached items if must-revalidate parameter of Cache-Control header is set to true. It will use ETag or Expires whichever exists
• Making conditional PUT for resources that are cached based on their ETag or expires header, whichever exists
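The staleness check in that client feature list boils down to comparing a response's age against Cache-Control's max-age. A hypothetical sketch (Python for brevity, not CacheCow.Client's actual code):

```python
# A response is "fresh" while its age is below Cache-Control's max-age;
# after that, a client honoring must-revalidate must revalidate (e.g.
# with an If-None-Match conditional GET) before reusing it.
import re

def max_age(cache_control: str) -> int:
    m = re.search(r"max-age=(\d+)", cache_control)
    return int(m.group(1)) if m else 0   # no directive: treat as immediately stale

def is_stale(cache_control: str, age_seconds: float) -> bool:
    return age_seconds >= max_age(cache_control)

header = "no-transform, must-revalidate, max-age=60, private"
assert not is_stale(header, 30)   # still fresh, serve from cache
assert is_stale(header, 61)       # stale: revalidate before reuse
```

Note that the max-age=0 header CacheCow emitted above makes every response immediately stale by this rule, which is exactly what forces clients into the cheap 304 revalidation path.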

Another good ASP.NET caching library to explore is ASP.NET Web API "CacheOutput" by Filip Wojcieszyn. While it doesn't have a fun-to-say name ;) it's got some great features and is super easy to get started with. You can find CacheOutput on NuGet:

Install-Package Strathweb.CacheOutput.WebApi2

And you'll configure your caching options using the intuitive CacheOutput attributes like those you may have seen in ASP.NET MVC:

```csharp
[CacheOutput(ClientTimeSpan = 100, ServerTimeSpan = 100)]
public IEnumerable<string> Get()
{
    return new string[] { "value1", "value2" };
}
```

ASP.NET Web API CacheOutput has great getting-started docs and clear, easy-to-read code.

So, you've got options. Go explore!

You can also pick up the Pro ASP.NET Web API book at Amazon. Go explore CacheCow or CacheOutput and support open source! If you find issues or feel there's work to be done in the documentation, why not do it and submit a pull request? I'm sure any project would appreciate some help with updated samples, quickstarts, or better docs.

Sponsor: Many thanks to our friends at Octopus Deploy for sponsoring the feed this week. Did you know that NuGet.org deploys with Octopus? Using NuGet and powerful conventions, Octopus Deploy makes it easy to automate releases of ASP.NET applications and Windows Services. Say goodbye to remote desktop and start automating today!



## Trying Redis Caching as a Service on Windows Azure

June 25, '14 Comments [15] Posted in Azure

First, if you already have an MSDN subscription (through your work, whatever), make sure to link your MSDN account and an Azure account; otherwise you're throwing money away. MSDN subscribers get between US$50 and US$150 a month in free Azure time, plus a 33% discount on VMs and 25% off Reserved Websites.

Next, log into the Azure Preview Portal at https://portal.azure.com. Then go New | Redis Cache to make a new instance. The Redis Cache is in preview today, and pricing details are here. Both 250 MB and 1 GB caches are free until July 1, 2014, so you've got a week to party hard for free.

Of course, if you're a Redis expert, you can (and always could) run your own VM with Redis on it. There are two "Security Hardened" Ubuntu VMs with Redis at the MS Open Tech VMDepot that you could start with.

I put one Redis Cache in Northwest US where my podcast's website is.  The new Azure Portal knows that these two resources are associated with each other because I put them in the same resource group.

There are Basic and Standard tiers. Similar to Websites' "basic vs. standard," it comes down to this: Standard you can count on; it has an SLA and replication set up. Basic doesn't. Both have SSL, are dedicated, and include auth. I'd think of Standard as "I'm serious about my cache" and Basic as "I'm messing around."

There are multiple caching services (or Cache as a Service) on Azure.

• Redis Cache: Built on the open source Redis cache. This is a dedicated service, currently in Preview.
• Managed Cache Service: Built on AppFabric Cache. This is a dedicated service, currently in General Availability.
• In-Role Cache: Built on AppFabric Cache. This is a self-hosted cache, available via the Azure SDK.

Having Redis available on Azure is nice since my startup MyEcho uses SignalR and SignalR can use Redis as the backplane for scaleout.

Marc Gravell (with a "C") over at StackExchange/StackOverflow has done us all a service with the StackExchange.Redis client for .NET on NuGet. Getting stuff in and out of Redis using .NET is very familiar to anyone who has used a distributed Key Value store before.

• BONUS: There's also ServiceStack.Redis from https://servicestack.net that includes both the native-feeling IRedisNativeClient and the more .NET-like IRedisClient. ServiceStack also supports Redis 2.8's new SCAN operations for cursoring around large data sets.
```csharp
ConnectionMultiplexer connection = ConnectionMultiplexer.Connect(
    "contoso5.redis.cache.windows.net,ssl=true,password=...");
IDatabase cache = connection.GetDatabase();

// Perform cache operations using the cache object...

// Simple put of integral data types into the cache
cache.StringSet("key1", "value");
cache.StringSet("key2", 25);

// Simple get of data types from the cache
string key1 = cache.StringGet("key1");
int key2 = (int)cache.StringGet("key2");
```
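That StackExchange.Redis connection string packs the endpoint and its options into one comma-separated value. A toy parser (Python for brevity; a hypothetical helper, not part of any real Redis client) shows the shape of the format:

```python
# Toy parser for a "host,key=value,key=value" style connection string.
# The bare (no '=') entry is the endpoint; everything else is an option.
def parse_connection_string(s: str):
    host, options = None, {}
    for part in s.split(","):
        if "=" in part:
            key, value = part.split("=", 1)   # split once: values may contain '='
            options[key] = value
        else:
            host = part
    return host, options

host, opts = parse_connection_string(
    "contoso5.redis.cache.windows.net,ssl=true,password=hunter2")
```

Here the password "hunter2" is an invented placeholder; in the real call above the password is elided. Splitting only on the first "=" matters because Redis access keys themselves can end in "=" padding.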

In fact, the ASP.NET team announced just last month the ASP.NET Session State Provider for Redis Preview Release that you may have missed. Also on NuGet (as a -preview) this lets you point the Session State of your existing (perhaps legacy) ASP.NET apps to Redis.

After pushing and pulling data out of Redis for a while, you'll notice how nice the new dashboard is. It gives you a great visual sense of what's going on with your cache. You see CPU and Memory Usage, but more importantly Cache Hits and Misses, Gets and Sets, as well as any extraordinary events you need to know about. As a managed service, though, there's no need to sweat the VM (or whatever) that your cache is running on. It's handled.


Perhaps you're interested in Redis but you don't want to run it on Azure, or perhaps even on Linux. You can run Redis via MSOpenTech's Redis on Windows fork. You can install it from NuGet, Chocolatey or download it directly from the project github repository. If you do get Redis for Windows (super easy with Chocolatey), you can use the redis-cli.exe at the command line to talk to the Azure Redis Cache as well (of course!).

It's easy to run a local Redis server with redis-server.exe, test it out in development, then change your app's Redis connection string when you deploy to Azure.




## Exploring cross-browser math equations using MathML or LaTeX with MathJax

June 23, '14 Comments [12] Posted in Tools

Let me just start by saying that I got a C in Calculus. It just didn't click for me. I'm aces at basic math, but you lost me at limits. That said, I have always found MathML to be fascinating, and I'm surprised that even now, over 15 years after MathML's first release, it has such minimal browser support. Other than Firefox, and surprisingly iOS Safari, there's basically no widely available native support.

Perhaps it's because MathML is, well, XML. You wanna express something simple like 2+2, except stacked up with a line? Here's that:

```xml
<mstack>
  <mn>2</mn>
  <msrow> <mo>+</mo> <none/> <mn>2</mn> </msrow>
  <msline/>
</mstack>
```

That's rather verbose. Most math folks that I've talked to (including a very nice Stanford professor I met on a plane recently) use LaTeX to express Math. The professor I met edited whole academic papers in raw LaTeX and compiled them to PDF. He was surprised when I explained that he was a programmer but perhaps didn't realize it!

LaTeX looks like this, for the 2+2 example:

$2+2$

I'd show you some more complex MathML examples but it would get pretty crazy very quickly.

The landscape for writing equations easily online using either MathML or LaTeX is, to my untrained eye, rather chaotic, link-rotty, and messy. Many Math and MathML online resources are full of link rot, or their apps have been forgotten. While looking for a MathML-to-LaTeX converter I found four different sites that promised to deliver, only to discover that their back-end systems are gone.

It seems that there are two shining examples of online math rendering: MathJax and Math.StackExchange.com. MathJax is "an open source JavaScript display engine for mathematics that works in all browsers," and Math.StackExchange.com is, well, StackOverflow for math people. What's special about Math.SE is that it has integrated MathJax's fantastic library into its main editor.

NOTE: Not all StackExchange sites have the integrated math preview, due to the size of the libraries involved. It's not something you just "throw in just in case."

Here's an example where I'm typing some LaTeX into Math.StackExchange's editor. In this case, I'm using Chrome.

They are using MathJax's library here. I can right-click on the equation, and ask "show math as MathML" and get this popup:

How does MathJax manage all this without actual browser support for MathML (or LaTeX?)

You configure MathJax like this, telling it what to look for when doing inline math. Sometimes inline math is bracketed by $...$ and sometimes by \(...\).

```html
<script type="text/x-mathjax-config">
  MathJax.Hub.Config({
    tex2jax: { inlineMath: [['$','$'], ['\\(','\\)']] }
  });
</script>
<script type='text/javascript' src='http://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML'></script>
```

For Tex as in this example, MathJax is using HTML and CSS, generating the inline Math equations into a series of DOM elements.
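What the tex2jax preprocessor does with that configuration can be sketched in a few lines. This is a hypothetical illustration (Python for brevity), not MathJax's actual implementation: scan the text for each configured delimiter pair and collect the math spans for the renderer.

```python
# Sketch of a tex2jax-style scanner: find inline-math spans between
# configured delimiter pairs. Spans are collected per delimiter pair,
# not in overall document order.
def find_inline_math(text, delimiters=(("$", "$"), ("\\(", "\\)"))):
    spans = []
    for open_d, close_d in delimiters:
        start = 0
        while True:
            i = text.find(open_d, start)
            if i == -1:
                break
            j = text.find(close_d, i + len(open_d))
            if j == -1:
                break                      # unmatched opener: stop scanning
            spans.append(text[i + len(open_d):j])
            start = j + len(close_d)
    return spans

assert find_inline_math("The sum $2+2$ equals \\(4\\).") == ["2+2", "4"]
```

The real library then replaces each found span with rendered DOM elements, which is why no native browser MathML support is required.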

When using MathML proper, you can configure MathJax to use native MathML rendering when it's available; only Firefox really supports that. This page lets you switch between HTML-CSS using web fonts, MathML, or SVG. SVG looks a little rough to me, but HTML-CSS always looks nice.

Bottom line, if you have a need to express mathematical equations of any kind online, you're going to want to use MathJax. Love them, thank them, appreciate them.

NOTE via Wikipedia: The MathJax project was founded by the American Mathematical Society, Design Science, and the Society for Industrial and Applied Mathematics and is supported by numerous sponsors such as the American Institute of Physics and Stack Exchange.

Oh, and if you're English, "Maths."
