Compositing two images into one from the ASP.NET Server Side
Today I had a system that was sending me two base64'ed images in an XML response. The images were the front and back of a check. However, the requirement was to show a single composite check image in the browser, with the front image stacked on top of the back image. Of course, it's got to be secure, so no temp files, blah blah.
Here's the solution, done as an HttpHandler, referenced with something like <img src="checkimage.ashx?whatever=4&something=6">
public class SomeCheckImageHandler : IHttpHandler
{
    //some stuff snipped
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "image/jpeg";
        //some stuff snipped
        GetCheckImageRequest req = new GetCheckImageRequest();
        //some stuff snipped, get the params from the QueryString
        GetCheckImageResponse res = banking.GetCheckImage(req);
        //some stuff snipped
        if (res.BackImageBytes != null)
        {
            //merge them into one image
            using (MemoryStream m = new MemoryStream(res.BackImageBytes))
            using (Image backImage = System.Drawing.Image.FromStream(m))
            using (MemoryStream m2 = new MemoryStream(res.FrontImageBytes))
            using (Image frontImage = System.Drawing.Image.FromStream(m2))
            using (Bitmap compositeImage = new Bitmap(frontImage.Width, frontImage.Height + backImage.Height))
            using (Graphics compositeGraphics = Graphics.FromImage(compositeImage))
            {
                compositeGraphics.CompositingMode = CompositingMode.SourceCopy;
                //front on top, back below
                compositeGraphics.DrawImage(frontImage, 0, 0);
                compositeGraphics.DrawImage(backImage, 0, frontImage.Height);
                compositeImage.Save(context.Response.OutputStream, ImageFormat.Jpeg);
            }
        }
        else //just show the front, we've got no back
        {
            using (MemoryStream m = new MemoryStream(res.FrontImageBytes))
            using (Image image = System.Drawing.Image.FromStream(m))
            {
                image.Save(context.Response.OutputStream, ImageFormat.Jpeg);
            }
        }
    }
}
I love it when .NET makes things this easy.
Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.
All of these built-in win32 wrapper classes (eg, Bitmap) use finalizers internally, so the unmanaged resources behind them will already get GC'ed appropriately no matter what happens-- even if you get an exception.
Using is really an optimization-- the stuff that would get GC'ed in the event of an exception anyhow just gets GC'ed... a tiny bit sooner.
And if you were looking to release this memory as soon as possible, you'd have to explicitly set these objects = Nothing to force them to get cleaned up ASAP. So even as an optimization, Using doesn't buy us a lot.
Dunno. Using isn't wrong, clearly, but I question whether the marginally better memory management benefits of Try..Finally are worth the significant extra noise in this particular example.
Had I called Dispose() explicitly, would that be better? Are you arguing against the explicit guaranteed Dispose provided by the using() pattern, or against the Try..Finally?
The setup for a TRY isn't expensive, the CATCH is expensive.
I argue that using isn't an optimization, it's appropriate in any case, such as this one, where one should clean up sooner and for sure, than later and maybe.
Additionally, in answer to your "noise" comment ;) using 'using' in this context doesn't ADD any extra lines of code to the example. The only 'noise' is the word 'using.' (and all these 'quotes' :)
My 2 cents.
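For readers following along, the equivalence the two are debating is exactly what the C# compiler does under the covers -- a using block is shorthand for a try/finally that calls Dispose. A rough sketch (variable names are mine):

```csharp
// A 'using' statement...
using (Bitmap bmp = new Bitmap(10, 10))
{
    // work with bmp
}

// ...is shorthand for roughly this try/finally:
Bitmap bmp2 = new Bitmap(10, 10);
try
{
    // work with bmp2
}
finally
{
    if (bmp2 != null)
        ((IDisposable)bmp2).Dispose(); // guaranteed, even on exception
}
```

Either form guarantees Dispose runs even if an exception is thrown; using just says it in one line.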
Well, those unmanaged resources get released no matter what (even with an exception), because the Bitmap object (for example) has a finalizer to clean up its internal unmanaged resources. The question is WHEN they get released.
And as you pointed out they fall out of scope almost immediately in the normal case.
> they don't need to be set to Nothing (null) to be GC'ed
Some objects (I'm thinking of DataSets) actually don't behave this way; you can close and dispose them 'til the cows come home but the memory won't be released until you explicitly set them to Nothing.
I asked Brad Abrams about this when he was in town for his talk, and he confirmed: setting an object to Nothing (null) is another kind of optimization you might need sometimes, if you want that memory back as soon as possible.
> you *shouldn't* dispose of streams and GDI objects.
Well, I'm not against it, I just think there's more subtlety to the situation than blindly following the rule "must always have Using". It's certainly more critical when you have file or database handles vs. in-memory objects like bitmaps, etc.
Remember the whole point of managed code is that you don't have to mess around with memory management, and using is a type of memory management. Explicit memory management becomes an *optimization* in the managed .NET world..
I'm talking about *resource* management - not memory management. (Yes, I'm making a distinction between physical resources proper and memory, which, while physical, I'm putting in a category of its own.) I'm not worried about leaking memory; of course the GC will take care of me. I'm worried about wasting HWND/HDCs when under load. As you point out, it's more critical when using objects that are fronts for physical resources.
If there was a SerialPort class or a Socket class that supported the using pattern, would you not agree it'd be important to dispose of those objects' physical resources immediately? Do you not use 'using' when you're working with SqlConnection or XmlTextReader? With both of those classes - particularly Readers that front physical files - deterministic finalization is even more important.
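As a concrete illustration of the SqlConnection case (a sketch - the connection string, database, and query here are hypothetical):

```csharp
using System.Data.SqlClient;

// 'using' returns the pooled connection deterministically,
// instead of leaving it tied up until a later GC/finalizer pass.
using (SqlConnection conn = new SqlConnection("Server=.;Database=Bank;Integrated Security=SSPI"))
using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM Checks", conn))
{
    conn.Open();
    int count = (int)cmd.ExecuteScalar();
}
// conn.Dispose() has already run here; the pool slot is free again.
```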
The reason that Datasets are different is that they were written differently. If you look at the Dispose on System.Data.DataSet, you'll notice that it doesn't set its own memory structures to nothing. If it did, then you wouldn't need to set it to null (Nothing).
And all setting a DataSet or a whatever to nothing does is cause it to leave 'scope' faster in the eyes of the GC. It's just like using - it's doing something explicitly that will happen later regardless.
(note that I don't feel THAT strongly about this, but it's a fun conversation)
Your Bitmap HDCs won't be "wasted"-- they'll be released by the Bitmap object Finalizer, 100% of the time, whether you put a Using there or not. The question is, how fast will they be released and how finite/precious of a resource are we talking about? This gets into optimization and, as you know "premature optimization is the root of all evil."
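The "finalizer as safety net" behavior being described here is the standard .NET Dispose pattern, which sketches out like this (NativeWrapper and its handle field are hypothetical, standing in for classes like Bitmap):

```csharp
class NativeWrapper : IDisposable
{
    private IntPtr handle; // imagine an HDC or HBITMAP

    public void Dispose() // deterministic path: the caller (or 'using') invokes this
    {
        ReleaseHandle();
        GC.SuppressFinalize(this); // the finalizer is no longer needed
    }

    ~NativeWrapper() // safety net: runs at some later GC if Dispose never did
    {
        ReleaseHandle();
    }

    private void ReleaseHandle()
    {
        // free the unmanaged resource exactly once
        handle = IntPtr.Zero;
    }
}
```

Either way the handle is released eventually; Dispose (and therefore using) just controls when.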
> would you not agree it'd be important to dispose of those objects' physical resources immediately?
Well, again, depends on
- how finite/precious those resources are
- how quickly they get released in the "worst case"
- how often the worst case occurs
- how many times per second this could happen
For something like a serial port handle, yeah, it's pretty important. For a bitmap, not so much. Who cares if a bitmap handle (or the memory used by that bitmap) is released 1/1000th of a second later than it normally would have been?
> It's just like using - it's doing something explicitly that will happen later regardless.
Right, it's an OPTIMIZATION not a REQUIREMENT. And that important distinction is lost on many, many .NET developers who somehow "forget" that they don't have to worry about memory management any more.
Programmers spend so much time thinking about edge conditions (to be fair, that's our job) that we delude ourselves into believing that the standard case is an edge condition -- eg, boy I better clean up these HDCs manually or else I'm screwed! Just never lose sight of the fact that memory management is an OPTIMIZATION in the .NET world. Don't do it because you "have" to.
> but it's a fun conversation
Right, and the code you wrote is totally correct... it's just a little distinction that I've tended to forget in my own code. I actually got into trouble recently with a class that implemented IDisposable and *SHOULDN'T* have (it wasn't wrapping anything unmanaged!), which led to some weird COM exceptions under load...
Is there an easy way to map handlers to directories (i.e. like delicious), to eliminate the url / querystring format?
Yes, see here: http://www.interact-sw.co.uk/iangblog/2004/01/12/shinyurl and here: http://www.interact-sw.co.uk/iangblog/2004/01/14/rewritingurls
Good point. Especially relevant, considering your last post.
Any signers of the petition willing to post the equivalent VB6 code?