When did we stop caring about memory management?
This post is neither a rant nor a complaint, but rather, an observation.
There's some amazing work happening over in the C#-based Kestrel web server. This is a little open source web server that (currently) sits on libuv and lets you run ASP.NET web applications on Windows, Mac, or Linux. It was started by Louis DeJardin, but more recently Ben Adams from Illyriad Games has become a primary committer and an obsessive optimizer.
Kestrel is now doing 1.2 MILLION requests a second on benchmarking hardware (all published at https://github.com/aspnet/benchmarks) and it's written in C#. There's some amazing stuff going on in the code base, with various micro-optimizations that manage memory more intelligently.
".@ben_a_adams I think it's safe to say you're on to something in that PR" pic.twitter.com/ELIyxhYyun (Damian Edwards, @DamianEdwards, December 23, 2015)
Here's my question to you, Dear Reader, and I realize it will differ based on your language of choice:
When did you stop caring about Memory Management, and is that a bad thing?
When I started school, although I had poked around in BASIC a bit, I learned x86 Assembler first, then C, then Java. We were taught intense memory management and learned on things like Minix, writing device drivers, before moving up the stack to garbage collected languages. Many years later I wrote a tiny operating system simulator in C# that simulated virtual memory vs physical memory, page faults, etc.
There's a great reference here at Ravenbrook (within their Memory Pool System docs) that lists popular languages and their memory management strategies. Let me pull this bit out about the C language:
The [C] language is notorious for fostering memory management bugs, including:
- Accessing arrays with indexes that are out of bounds;
- Using stack-allocated structures beyond their lifetimes (see use after free);
- Using heap-allocated structures after freeing them (see use after free);
- Neglecting to free heap-allocated objects when they are no longer required (see memory leak);
- Failing to allocate memory for a pointer before using it;
- Allocating insufficient memory for the intended contents;
- Loading from allocated memory before storing into it;
- Dereferencing non-pointers as if they were pointers.
When was the last time you thought about these things, assuming you're an application developer?
I've met and spoken to a number of application developers who have never thought about memory management in 10- or 15-year careers. Java, C#, and other languages have completely hidden this aspect of software from them.
They have performance issues. They don't profile their applications. And sometimes, just sometimes, they struggle to find out why their application is slow.
My buddy Glenn Condron says you don't have to think about memory management until you totally have to think about memory management. He says, "Time spent sweating memory is time you're not writing your app. The hard part is developing the experience you need to know when you need to care."
I've talked about this a little in podcasts like the This Developer's Life episode on Abstractions with guests like Ward Cunningham, Charles Petzold, and Dan Bricklin as well as this post called Please Learn to Think about Abstractions.
I propose it IS important. Then again, I also think it's important to know how a differential gear works, and that's a "because" argument. What do you think?