Brainstorming - Creating a small single self-contained executable out of a .NET Core application
I've been using ILMerge and various hacks to merge/squish executables together for well over 12 years. The .NET community has long toyed with the idea of a single self-contained EXE that would "just work." No need to copy a folder, no need to install anything. Just a single EXE.
While work and thought continue on a CoreCLR single-file EXE solution, there's a nice Rust tool called Warp that creates self-contained single executables. Warp is cross-platform, works with any tech, and is very clever.
The Warp Packer app has a slightly complex command line, like this:
.\warp-packer --arch windows-x64 --input_dir bin/Release/netcoreapp2.1/win10-x64/publish --exec myapp.exe --output myapp.exe
Fortunately Hubert Rybak has created a very nice "dotnet-warp" global tool that wraps this all up into a single command, dotnet-warp.
All you have to do is this:
C:\supertestweb> dotnet tool install -g dotnet-warp
C:\supertestweb> dotnet-warp
O Running Publish...
O Running Pack...
In this example, I just took a Razor web app with "dotnet new razor" and then packed it up with this tool using Warp packer. Now I've got a 40 meg self-contained app. I don't need to install anything, it just works.
Mode LastWriteTime Length Name
---- ------------- ------ ----
d----- 2/6/2019 9:14 AM bin
d----- 2/6/2019 9:14 AM obj
d----- 2/6/2019 9:13 AM Pages
d----- 2/6/2019 9:13 AM Properties
d----- 2/6/2019 9:13 AM wwwroot
-a---- 2/6/2019 9:13 AM 146 appsettings.Development.json
-a---- 2/6/2019 9:13 AM 157 appsettings.json
-a---- 2/6/2019 9:13 AM 767 Program.cs
-a---- 2/6/2019 9:13 AM 2115 Startup.cs
-a---- 2/6/2019 9:13 AM 294 supertestweb.csproj
-a---- 2/6/2019 9:15 AM 40982879 supertestweb.exe
Now here's where it gets interesting. Let's say I have a console app. Hello World, packed with Warp, ends up being about 35 megs. But if I use "dotnet-warp -l aggressive", the tool will add the Mono ILLinker (tree shaker/trimmer) and shake off all the methods that aren't needed. The resulting single executable? Just 9 megs compressed (20 uncompressed).
C:\squishedapp> dotnet-warp -l aggressive
O Running AddLinkerPackage...
O Running Publish...
O Running Pack...
O Running RemoveLinkerPackage...
Mode LastWriteTime Length Name
---- ------------- ------ ----
d----- 2/6/2019 9:32 AM bin
d----- 2/6/2019 9:32 AM obj
-a---- 2/6/2019 9:31 AM 47 global.json
-a---- 2/6/2019 9:31 AM 193 Program.cs
-a---- 2/6/2019 9:32 AM 178 squishedapp.csproj
-a---- 2/6/2019 9:32 AM 9116643 squishedapp.exe
Here is where you come in!
NOTE: The .NET team has planned to have a "single EXE" supported packing solution built into .NET Core 3.0. There are a lot of ways to do this. Do you zip it all up with a header/unzipper? Well, that would hit the disk a lot and be messy. Do you "unzip" into memory? Do you merge into a single assembly? Or do you try to AoT (Ahead of Time) compile and do as much work as possible before you merge things? Is a small size more important than speed?
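The first option, "zip it all up with a header/unzipper," can be sketched in a few lines of shell. This is a toy illustration of the general stub-plus-payload idea only; it is not how Warp or the eventual .NET feature is actually implemented, and all file names here are made up:

```shell
#!/bin/sh
# Toy "header/unzipper" packer: a stub script with an archive appended
# after a marker line. At launch the stub extracts everything past the
# marker into a temp dir and runs the payload.
set -e
work=$(mktemp -d)

# 1. A pretend "publish folder" to pack.
mkdir "$work/publish"
echo 'hello from the packed app' > "$work/publish/app.txt"

# 2. The stub: find the marker line, extract everything after it,
#    then "launch" the payload.
cat > "$work/single.sh" <<'STUB'
#!/bin/sh
marker=$(awk '/^__PAYLOAD__$/ {print NR + 1; exit}' "$0")
dest=$(mktemp -d)
tail -n +"$marker" "$0" | tar -x -C "$dest"
cat "$dest/app.txt"    # stand-in for exec'ing the real app binary
exit 0
__PAYLOAD__
STUB

# 3. "Pack": append the tar'd publish folder to the stub.
tar -C "$work/publish" -c . >> "$work/single.sh"
chmod +x "$work/single.sh"

# 4. One file, self-extracting at run time.
"$work/single.sh"    # prints: hello from the packed app
```

Note the downside called out above: every launch hits the disk to unpack the payload, which is exactly why the "unzip into memory" and cached-extraction variants come up.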
What do you think? How should a built-in feature like this work and what would YOU focus on?
Sponsor: Check out Seq 5 for real-time diagnostics from ASP.NET Core and Serilog, now with faster queries, support for Docker on Linux, and beautiful new dark and light themes.
Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.
.NET apps today are actually quite slow to start, maybe because of the many DLLs to load. I'd expect a single exe to load faster.
That being said, I'd like to be able to do scenarios like having apps A and B in the same folder that share the same library L as a DLL (this dependency might be a combination of L1 and L2 too). In other words: not only self-contained apps, but also self-contained libraries, and self-contained + some externals (could be for licensing reasons).
The key thing is: make it easy to configure complex linking scenarios.
Do you zip it all up with a header/unzipper? Well, that would hit the disk a lot and be messy
Warp is interesting, but isn't this exactly how it works?
Do you "unzip" into memory?
If we can't rely on a shared CLR being installed, presumably this would require a native 'packer' executable that contained the compressed binaries?
Do you merge into a single assembly?
Can't ILRepack and/or ILMerge do this today?
Or do you try to AoT (Ahead of Time) compile and do as much work as possible before you merge things?
Is that what IL Linker and Mono Linker do? I think this is a really interesting avenue to explore, but would it play nicely with reflection?
It's been working well for me.
Just wanted to add some information about using "dotnet-warp -l aggressive" and .NET Core 3.0.
Scott already explains that this uses the Mono ILLinker, and currently this doesn't work with .NET Core 3.0. Read and follow the issue here: https://github.com/mono/linker/issues/428
I would hope to choose at build time between uncompressed for speed and various levels of compression.
I would hope to choose at startup between "uncompress to memory", "uncompress to disk and keep for later" and "uncompress to disk/tmp every time". Also with option to do it all at startup or just in time.
If you run from a DVD it would be nice to have the whole assembly load linearly into memory. Yes we still use DVDs in the places I frequent because USB devices are not allowed.
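The "uncompress to disk and keep for later" option above can be sketched as a small cache-aware launcher. This is a hypothetical illustration (Warp does something broadly similar with a cache folder, but the names myapp and 1.0.0 here are invented):

```shell
#!/bin/sh
# Toy "extract once, reuse later" launcher: unpack the bundled payload
# into a per-version cache directory on first launch; skip straight to
# the cached files on every later launch.
set -e
CACHE_DIR="${TMPDIR:-/tmp}/myapp-cache/1.0.0"

# Stand-in for the compressed payload carried inside the single EXE.
payload=$(mktemp -d)
echo 'app bits' > "$payload/app.bin"
tar -C "$payload" -czf "$payload/bundle.tgz" app.bin

if [ ! -f "$CACHE_DIR/.extracted" ]; then
    # Cold start: pay the extraction cost once.
    mkdir -p "$CACHE_DIR"
    tar -xzf "$payload/bundle.tgz" -C "$CACHE_DIR"
    touch "$CACHE_DIR/.extracted"   # marker: cache is complete
    echo "cold start: extracted payload"
else
    # Warm start: go straight to the cached files.
    echo "warm start: reusing cache"
fi
# Either way, the app now runs from $CACHE_DIR/app.bin.
```

Keying the cache on the version keeps an upgraded EXE from running stale bits, and the marker file guards against a half-finished extraction being reused.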
We have over 100 services running on one server, and hundreds of small applications across our network.
We don't want hundreds of copies of the same .NET framework.
We need a way to load .NET framework on server startup and that needs to remain in memory as long as the server (or PC) is running.
Make .NET (CORE) smaller and faster instead of trying to install it with every application we deliver to our customers.
Maybe an option would be to pack the .NET framework in 1 library and leave the exe just the way it is. When deploying we can put just 1 .NET library on a server (PC) and share it between all applications.
I think we need something similar to .NET Framework 4 (which installs at a system/computer level), downloaded either through Windows Update or separately, which offers the possibility of keeping EXE sizes small.
Like Johan, we also have so many small EXEs (and DLLs) which are used throughout the company servers in so many different ways. Yes, size DOES matter.
For the ASP.NET Core side, I notice that you already have something called a Runtime & Hosting Bundle. Why not just create a .NET Core Framework Bundle 3.x, and make it installable the same way as you do today with .NET Framework?
Look at other tools; Delphi, for example, does it in 10 MB or even less.
And yes, it works on virtually every 32/64-bit Windows version without the need for any additional file at all.
Here are the problems:
- GitHub-based packages ensure DLL hell for a large application with package-inside-package dependencies
- Transitioning from one .NET Framework per computer to several GitHub packages ensures DLL hell
- Shifting version maintenance from the OS level (MS) to individual GitHub packages ensures a large C# project lasting 5+ years will have man-months spent upgrading GitHub packages, debugging problems, and dealing with DLL hell
This forces man-years of wasted effort and expense in moving existing C# .NET Framework applications to .NET Core. You can run the old .NET Framework for a long time, but many of the GitHub packages or third-party libraries you use will have moved to .NET Core. Your options are to stick with the old packages and third-party libraries, with all their bugs and limitations, or to spend many man-months doing a full release cycle just to switch to .NET Core.
Simply put: .NET Core should ship the entire framework on Windows 10 machines and later, with no need to add many .NET Core GitHub packages to get small bits of the framework.
.NET Core is an OS-level component and should be on every Windows 10 machine by default.
Linux users will have to install the entire .NET Core framework before using your application. This worked well for the regular .NET Framework when v2.0 was released.
I write with a fair level of frustration: I spent the last 3 weeks in GitHub .NET Framework and third-party package DLL hell for a 1-million-line .NET application. Keeping it as-is with the old packages did not help, as the serious bug fixes we needed were only in the newer GitHub packages.
MS common business practice is to shift the high maintenance and configuration cost of the Framework side of an application from MS to the paying corporate customers.
How can the .NET dev team make such business decisions without regard to TCO for corporate customers?
The SQL Server team could not make these type of decisions and force huge costly upgrades on the customers every 3 years.
No large company would buy SQL Server if the terms and conditions stated that you'd have to replace SQL Server, reconfigure your data center, and spend lots of money every 3 years to get back to the level of reliability and security you had when initially installing SQL Server.
Half-maintained GitHub packages should not be the norm. Development should get more productive over time for very large systems, and not just for the click-drag-click, bam, 2-screen demo app in 5 minutes.
Microsoft has community responsibility to push for technologies making development easier, cleaner, more secure and lower cost for long term large projects.
Microsoft has been doing the opposite: breaking the framework into many small pieces, using more soon-to-be-maintenance-problem third-party tools in VS, and shifting more and more of the configuration onto corporate developers.
First should come good MS effort towards large scale solution developers which have been ignored by MS for 10 years; secondly should come good MS effort towards developers of small solutions.
My large employer has stopped adopting most MS .NET/Azure-related technologies outside of core C#, ASP.NET MVC, and SQL Server on premises or hosted in a basic Azure web site. That's after several 6-month, X-person projects just to upgrade existing critical business systems to the latest VS and .NET Framework versions.
The large system I work on has ~40 projects each with up to 25 GitHub packages. Budget for adding new business functionality for the large system comes once every 2 to 3 years. Hotfixes every 3 to 4 months. Budget for syncing up and upgrading just GitHub packages is non-existent.
It's ridiculous - Windows 3.1 (I know, a long time ago) was only 11 MB, for a complete graphical OS. So I think that 9 MB for a Hello World console application is simply ridiculous.
Hard to come up with use cases for this. If you can deliver a single exe you can usually deliver a single folder full of files.
Many people have commented on the GitHub issue that the ability to do that is not important to them. The team seems stubbornly set on delivering this.
- native: compile to a single native instruction set
- all: all the referenced DLLs in the same package
- reference: the referenced DLLs stay as references (but the app may be natively compiled)
- shake: perform tree shaking and include only the code that can be statically reached
- hint: also include explicitly indicated parts of code in the package
Different apps may have different needs; as always, having options may save the day :)
Captain Obvious :P
IMHO, I see a natural progression:
- Binaries with static and dynamic shared libraries - this will come and go with different methods of API communication
- Simple roll it all in binary - easy to share and publish, but big
- Pruned binary - I like the idea of trimming the unused binary fat
- Optimized binary - Make grandpa Borland proud, self-managed, self-contained, lean, fast, small
- 'highlander' binary - only one, runs anywhere, immortal but cut off the head and the body dies (future replacing 'containers')
I feel like decompressing to run might come with its own set of problems. Although, instead of decompressing to memory or the FS, why not decompress to a Windows container (à la UWP) and ask for permission to do anything beyond what a UWP app can do, but keep it a possibility.
I think some are missing out on where the single binary could be going and that the shared lib option will probably never go away. If you are scared of the single binary, are you also looking into using Docker to scoop up the mess in development and dump it into production? You might want to meditate on what is different. In the long term I would like to see compiling to a single exe the future of containers: Small, Fast, Sandboxed. Maybe future binaries have a completely different extension because ... Container as Executable
I am "still" writing my command line utilities in C++. Compiled 32-bit against the "old" Win7 SDK and linked statically against the VS runtime, I end up with an exe of 1.2 MB that runs blazing fast and is completely standalone from WinXP up to the latest Win10.
And with all the new C++11/C++17 features available, C++ has become much less cumbersome than it used to be.
My cmd utilities are often called in tight loops (scripts iterating over file/folder trees). So startup time is important; much more important than for fat desktop UI apps.
If runtime lightness is more important than dev time, I use C++. If dev time is more important, I use a managed language such as C# or Java.
Having both at the same time is like wanting to have your cake and eat it too. For me this is a solution looking for a problem. But of course, others' mileage may vary.
I apologize in advance for posting this comment on the wrong blog entry - it is intended for your blog post of Jan 16, 2019 - Installing .NET Core 2.x SDK on a Raspberry Pi; but comments are closed on that post. If possible, I would love the content of this comment to be moved to that post - as it will hopefully help people who are trying to follow that blog (like I was).
I followed your excellent instructions for installing .NET Core on a Raspberry Pi 3. My install was a little tweaked, because I was installing 2.2.103 - but your instructions explain how to find the link for newer versions of the framework.
Note 1) You explain how to also download and install the ASP.NET Runtime; but I wonder if that is no longer needed. It seemed to me that the ASP.NET Runtime is included in the .NET Core SDK with version 2.2.103.
Note 2) This is the reason I wanted to add this comment: After cloning my .NET Core project via Git to my Raspberry Pi, I tried running "dotnet restore" to restore NuGet packages. This caused a ton of very weird SSL errors while talking to the NuGet API host. I spent hours trying to diagnose this, my Internet connection, the packages I used in my project, etc. In the end, what I figured out is that "dotnet restore" seems to try to download too many packages at once from api.nuget.org, and this overwhelms some part of SSL handling (OpenSSL?) on the Raspberry Pi. After hours of wild goose chases, the simple fix was to run "dotnet restore --disable-parallel". Wow, I wish I had figured out to do that earlier; hopefully that saves others some pain.
Note 3) There was some discussion in the comments about running WinForms and other GUIs on .NET Core on the Raspberry Pi. I am having some luck with using AvaloniaUI on the Raspberry Pi. There are specific notes/discussion for running "Avalonia on Raspbian" - so interested people should make sure they Google-Bing that phrase. (I had some trouble with libSkiaSharp - but the correct ARM version is in the latest NuGet package of Avalonia.Skia.Linux.Natives.)
Thank you for your excellent blog!