Scott Hanselman

Fixing System.Core 2.0.5 FileLoadException, Portable Libraries and Windows XP support

May 8, '14 Comments [17] Posted in Bugs | Learning .NET | WPF

Installing Windows XP to test

My buddy Greg and I are getting ready to launch our little side startup, and I was going through our product backlog. Our app consists of a global cloud service built with SignalR, an iPhone app made with Xamarin tools, and a WPF desktop app.

One of the items in our Trello backlog was "Support Windows XP. Gasp!"

I hadn't given this item much thought, but I figured it was worth a few hours' look. If it was easy, why not, right?

Our WPF desktop application was written for .NET 4.5, which isn't supported on Windows XP. I want my app to run on as basic and mainstream a .NET 4 installation as possible.

Could I change my app to target .NET 4 directly? I use the new async and await features extensively.

Well, of course, I remembered Microsoft released the Async Targeting Pack (Microsoft.Bcl.Async) through NuGet to do just this. In fact, if I were targeting .NET 3.5 I could use Omer Mor's AsyncBridge, so it's good to have choices.

I changed my project to target .NET 4, rather than 4.5, installed these NuGets, and recompiled. No problem, right?
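
To make that concrete, here's a rough sketch of the kind of code that keeps compiling and running after the retarget, assuming the Microsoft.Bcl.Async NuGet package is installed. The class and method names are made up for illustration; the notable difference on .NET 4 is that the static Task helpers (Run, Delay, WhenAll) are provided on TaskEx instead of Task.

using System.Threading.Tasks;

// A rough sketch, not our real code: async/await compiled for .NET 4 with the
// Microsoft.Bcl.Async targeting pack referenced.
public class BackgroundWork
{
    public async Task<string> FetchGreetingAsync(string name)
    {
        // Pretend we're waiting on a slow service. On .NET 4 the static helper
        // is TaskEx.Delay rather than Task.Delay.
        await TaskEx.Delay(250);

        // Offload a bit of work to the thread pool, again via TaskEx.
        return await TaskEx.Run(() => "Hello, " + name);
    }
}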

However, when I run my application on Windows XP it crashes immediately. Fortunately I've instrumented it with Raygun.io, so all my crashes go to the cloud for analysis. It gives me this nice summary:

raygun.io is amazing 

Here's the important part:

FileLoadException: Could not load file or assembly 
'System.Core, Version=2.0.5.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e, Retargetable=Yes'
or one of its dependencies. The given assembly name or codebase was invalid.
(Exception from HRESULT: 0x80131047)

That's weird; I'm using .NET 4, which includes System.Core version 4.0. I can confirm what's in the GAC (Global Assembly Cache) with this command at the command line. Remember, your computer isn't a black box.

C:\>gacutil /l | find /i "system.core"
System.Core, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089, processorArchitecture=MSIL
System.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089, processorArchitecture=MSIL

OK, so there isn't even a System.Core version 2.0.5 in the GAC. It turns out that System.Core 2.0.5 is the Portable Libraries version, meant to be used everywhere (that means Silverlight, etc., everywhere), so they made the version number compatible across platforms.

Because we're building our iPhone app with Xamarin tools and we anticipate supporting other platforms, we use a Portable Library to share code. But it seems that support for Portable Libraries was enabled on vanilla .NET 4 by the KB2468871 update.

I don't want to require any specific patch level or hotfixes. While this .NET 4 framework update was pushed to machines via Windows Update, for now I want to support the most basic install I can. So if the issue is Portable Libraries (which I still want to use elsewhere), then I'll need to bring those shared files in another way.

You can LINK source code in Visual Studio: when you add an existing file, click the little dropdown on the Add button and choose Add as Link:

Adding source code as a Link within Visual Studio

Now my Messages.cs file is a link. See the little shortcut overlay in blue?

A linked file as a little overlay on the icon

I removed the project reference to the Portable Library from this WPF application and brought the code in this way instead. I'm still sharing the core code, just not as a binary for this one application.
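
Under the hood, a linked file is just a Compile item in the .csproj whose Include path points outside the project folder. Here's roughly what the entry looks like; the paths below are hypothetical, not our actual project layout.

<!-- A linked source file: the Include path points at the shared file in the
     (hypothetical) portable project's folder, and the Link element controls
     where the file shows up in Solution Explorer. -->
<ItemGroup>
  <Compile Include="..\MyStartup.Core\Messages.cs">
    <Link>Shared\Messages.cs</Link>
  </Compile>
</ItemGroup>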

Recompile and redeploy, and magically my .NET 4 WPF application with async/await and MahApps.Metro styling starts up and runs wonderfully on this 12-year-old OS with just .NET 4 installed.

For our application this means that my market just got opened up a little and now I can sell my product to the millions of pirated and forever unpatched Windows XP machines in the world. Which is a good thing.



FREE Pluralsight video: "Get Involved" in community!

May 7, '14 Comments [17] Posted in Musings | Open Source

I'm absolutely thrilled to announce that my feature-length collaboration with Rob Conery called "Get Involved" is now available FREE from Pluralsight!

You can watch the whole movie RIGHT now for FREE. No sign up or subscription needed! Please tweet and tell your friends!

http://getinvolved.hanselman.com

We really poured our hearts into this production and we really hope you enjoy it.

In this feature-length production, Scott Hanselman and Rob Conery offer suggestions and advice on how you can get out there and get involved. Blogging, Twitter, GitHub, StackOverflow, User Groups and Conferences: all of this can make you a happier, more productive developer and inspire you to take your career to the next level.

You certainly don't have to be social to be better at writing code - but sharpening your skills this way helps you when it comes time for a job interview, a yearly review where a promotion is on the line, or when you want to start running an Open Source project.

If you're a fan of This Developer's Life you know how tightly Rob and I like to produce things - this video is no exception.

Filmed on the streets of Portland and at a Portland user group, we talk about Blogging, Twitter, GitHub, StackOverflow, Open Source, Speaking, User Groups and Conferences - all of it in the hope of making you a happier, more productive, more connected developer. We want to inspire you, and perhaps help you take your career to the next level.

Additionally, we stretched far beyond Portland to seek out other people who are active in the social technology space!

  • Jon Skeet joins us to talk about what a Good Question is on StackOverflow - and also how you can gain reputation by providing Good Answers - and edits to Good Questions!
  • We venture out to the Portland Area DotNet Users Group (PADNUG) and meet a few developers who have just started going - as well as people who have been there for years.
  • While we were there, I gave a 10-minute lightning talk on Azure - a great way to get started speaking if you're not a fan of public speaking. Rob filmed the whole thing.

By the way, if you do have a Pluralsight Subscription, you've got access to thousands of hours of technical video training, like my other video on technical presenting!


The Art of Speaking - Become a better technical public speaker

Have you thought about speaking on technology publicly? Maybe you want to start by talking at local user groups and then work your way up to larger regional code camps? "The Art of Speaking" is an 80-minute Pluralsight course that Rob and I created to help you do just that!

You'll learn all about:

  • The Speaker Mindset
    • Not Wasting Time
    • What to do, What Not to Do
    • Defining a Perfect Talk
  • Preparation
    • Choosing Your Demo
    • Choosing Your Delivery Style
    • Handling Pressure
    • Engaging the Audience
    • Code vs. Slides
    • Creating an Outline
    • Creating the Story
    • Building a Demo
    • Building a More Complex Demo
    • Creating Your Slides
  • Execution
    • The Unexpected
    • The Tech Check
    • The Delivery

Keep an eye out for my next video with Rob, coming very soon exclusively to Pluralsight Subscribers!

Rob has a lot of great videos as well like these and many more!

Rob and I also have a one-hour video where we move the This Developer's Life podcast website to Microsoft Azure LIVE, so check that out.

Many, many, thanks to Pluralsight for giving this video to the community for free! If you're on Twitter, go thank them now @pluralsight.



Cloud Power: How to scale Azure Websites globally with Traffic Manager

May 5, '14 Comments [26] Posted in Azure

The "cloud" is one of those things that I totally get and totally intellectualize, but it still consistently blows me away. And I work on a cloud, too, which is a little ironic that I should be impressed.

I guess part of it is historical context. Today's engineers get mad if a deployment takes 10 minutes or if a scale-out operation has them waiting five. I used to have multi-hour builds, and a scale-out operation involved a drive over to PC Micro Center - or worse yet, having a Cisco engineer fly in to configure a load balancer. Certainly engineers in the generation before mine could lose hours to a single punch card mistake.

It's the power that impresses me.

And I don't mean CPU power, I mean the power to build, to create, to achieve, in minutes, globally. My, that's a lot of comma faults.

Someone told me once that the average middle-class person is more powerful than a 15th-century king. You eat on a regular basis, you can fly across the country in a few hours, and you have antibiotics and probably won't die from a scratch.

Cloud power is that. Here's what I did last weekend that blew me away.

Here's how I did it.

Scaling an Azure Website globally in minutes, plus adding SSL

I'm working on a little startup with my friend Greg, and I recently deployed our backend service to a small Azure website in "North Central US." I bought a domain name for $8 and set up a CNAME to point to this new Azure website. Setting up custom DNS takes just minutes, of course.

CNAME Hub DNS

Adding SSL to Azure Websites

I want to run my service traffic over SSL, so I headed over to DNSimple, where I host my DNS, and bought a wildcard SSL certificate for *.mydomain.com for only $100!

Active SSL Certs

Adding the SSL certificate to Azure is easy: you upload it from the Configure tab of your Azure Website, then bind it to your site.

SSL Bindings

Most SSL certificates are issued as a *.crt file, but Azure and IIS prefer *.pfx. I just downloaded OpenSSL for Windows and ran:

openssl pkcs12 -export -out mysslcert.pfx -inkey myprivate.key -in myoriginalcert.crt

Then I uploaded mysslcert.pfx to Azure. If you have intermediate certificates, you may need to include those in the .pfx as well (openssl's -certfile option can bundle them in).

This gets me a secure connection to my single web server, but I need multiple servers, as my beta testers in Asia and Europe have complained that my service is slow for them.

Adding multiple global Azure Website locations

It's easy to add more websites, so I made two more, spreading them out a bit.

Multiple locations

I use Git deployment for my websites, so I added two extra named remotes in Git. That way I can deploy like this:

>git push azure-NorthCentral master
>git push azure-SoutheastAsia master
>git push azure-WestEurope master

At this point, I've got three websites in three locations, but they aren't associated with each other in any way.

I also added a "Location" configuration name/value pair for each website so I could put the location at the bottom of the site to confirm when global load balancing is working just by pulling it out like this:

location = ConfigurationManager.AppSettings["Location"];

I could also potentially glean my location from environment variables like WEBSITE_SITE_NAME, which holds my application name; I made each application name match its site's location.
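
Here's a rough sketch of how that lookup could work; the "Location" app setting is the one described above, and WEBSITE_SITE_NAME is the environment variable Azure Websites sets with the site's name. The class and fallback logic below are just illustrative.

using System;
using System.Configuration;

// A rough sketch of pulling the per-site location to show in the page footer.
// Requires a reference to System.Configuration.
public static class SiteLocation
{
    public static string Current()
    {
        // Prefer the explicit "Location" app setting configured on each website.
        var location = ConfigurationManager.AppSettings["Location"];
        if (!string.IsNullOrEmpty(location))
            return location;

        // Fall back to the site name that Azure Websites exposes as an environment variable.
        return Environment.GetEnvironmentVariable("WEBSITE_SITE_NAME") ?? "unknown";
    }
}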

Now I bring these all together by setting up a Traffic Manager in Azure.

Traffic Manager

I change my DNS CNAME to point to the Traffic Manager, NOT the original website. Then I make sure the traffic manager knows about each of the Azure Website endpoints.

Then I make sure that my main CNAME is set up in my Azure Website, along with the Traffic Manager domain. Here's my DNSimple record:


And here's my Azure website configuration:

Azure Website Configuration

Important Note: You may be thinking, hang on, I thought there was already load balancing built into Azure Websites? It's important to remember that there's the load balancing that selects which data center, and there's the load balancing that selects an actual web server within a data center.
Also, you can choose between straight Round Robin, Failover (fall back to sites in other data centers), or Performance, which sends users to the "closest" data center when you have sites in multiple geographic locations. Performance is what I chose. It's all automatic, which is nice.

Azure Traffic Manager

Since the Traffic Manager is just going to resolve to a specific endpoint and all my endpoints already have a wildcard SSL, it all literally just works.

When I run nslookup against my hub's hostname, I get something like this:

>nslookup hub.mystartup.com
Server: ROUTER
Address: 10.71.1.1

Non-authoritative answer:
Name: ssl.mystartup-northcentralus.azurewebsites.net
Address: 23.96.211.345
Aliases: hub.mystartup.com
mystartup.trafficmanager.net
mystartup-northcentralus.azurewebsites.net

As I'm in Oregon, I get the closest data center. I asked friends via Skype in Australia, Germany, and Ireland to test and they each got one of the other data centers.

I can test for myself by using https://www.whatsmydns.net and seeing the different IPs from different locations.

Global DNS

This whole operation took about 45 minutes, and about 15 minutes of that was waiting for DNS to propagate.

In less than an hour I went from a small prototype in a data center in Chicago to a service scaled out to data centers globally, with SSL added.

Magical power.




"It's just a software issue"- Edge.js brings Node and .NET together on three platforms

April 30, '14 Comments [44] Posted in Learning .NET | nodejs | Open Source
.NET and node together on three platforms

There was an engineer I used to work with who always said "That's just a software issue." No matter how complex the issue, no matter how daunting, they were confident it could be solved with software.

.NET, C#, NuGet, and the community have been making some amazing stuff in the last few years, like ScriptCS, Chocolatey, and Boxstarter. Azure Websites now supports ASP.NET, sure, but also PHP, Python, Java (Tomcat or Jetty or your own container), and node.js. Getting these things to work together has been an interesting software issue. Apps can run side by side, but they can't really talk to each other in-process. (Mostly one just moves data between universes over JSON and HTTP as needed.)

However, Tomasz Janczuk has been working on Edge.js (on GitHub) for a while now. I showed his work at jQuery Portland last year, but this week he's taking it to the next level. He is creating a wormhole between software universes.

Edge.js now lets you run node.js and .NET code in-process on Windows, Mac, and Linux.

The name is great. An edge connects two nodes, and Edge.js is that edge.

node and .NET connected by edge.js

Here's a hello world node app calling .NET. Don't sweat the fact that the .NET code is tunneled inside a comment; this is the Hello World proof of concept.

var edge = require('edge');

var helloWorld = edge.func(function () {/*
    async (input) => {
        return ".NET Welcomes " + input.ToString();
    }
*/});

helloWorld('JavaScript', function (error, result) {
    if (error) throw error;
    console.log(result);
});

Perhaps you have a bunch of CPU-intensive work or algorithms in C#, but you've also got a node.js app that needs the result of that work. Edge can help with that.

You can bring a CS or CSX file into node like this:

var myCSharpCode = edge.func(require('path').join(__dirname, 'myCSharpCode.csx'));

You can bring code from a compiled .NET DLL into node.js as well.

var clrMethod = edge.func({
    assemblyFile: 'My.Edge.Samples.dll',
    typeName: 'Samples.FooBar.MyType',
    methodName: 'MyMethod'
});
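
For reference, here's a minimal sketch of what the .NET side of that DLL could look like. Edge.js expects the methods it calls to have the shape Func<object, Task<object>>; the namespace, type, and method names below just mirror the hypothetical sample above.

using System.Threading.Tasks;

namespace Samples.FooBar
{
    // A minimal sketch of the .NET side of the hypothetical sample above.
    // Edge.js marshals calls as Func<object, Task<object>>: take an object in,
    // hand a Task<object> back.
    public class MyType
    {
        public Task<object> MyMethod(object input)
        {
            // Do the .NET work here; the result comes back to node as the second
            // argument of the JavaScript callback.
            return Task.FromResult<object>(".NET says: " + input);
        }
    }
}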

It's not a hack; it's a clean way to marshal between CLR threads and the V8 (the node JavaScript engine) thread. It's also interesting from a comp-sci perspective, as the CLR can have many threads while V8 has just the one.


Here are Tomasz's own words:

Edge.js provides an asynchronous, in-process mechanism for interoperability between Node.js and .NET.

You can use this mechanism to:

  • access MS SQL from Node.js using ADO.NET more...
  • use CLR multi-threading from Node.js for CPU intensive work more...
  • write native extensions to Node.js in C# instead of C/C++
  • integrate existing .NET components into Node.js applications

Read more about the background and motivations of the project here.

Now, you might ask yourself, what problem does Edge.js solve? The answer is in the Edge.js FAQ.

Go explore what you can do. Edge goes much further than just C# and Node. It works on Windows, OS X, and Ubuntu, and you can just "npm install edge" since there's a node package available.

Have fun! You have a lot more power and flexibility than you think. It's just a software problem.


Review: The Linksys WRT1900AC Dual-Wireless Router is the second coming of the WRT54G

April 24, '14 Comments [34] Posted in Reviews

Linksys WRT1900AC Router

I just blogged about how I simplified my home network with a MoCA/Ethernet bridge. As part of my home network rebuild, I swapped out my Netgear N600 for a shiny new Linksys WRT1900AC Wireless Router.

I've been a Linksys WRT54G fan for almost a decade. I ran HyperWRT for a while and then ended up with DD-WRT. Having a reliable, hackable router was a joy back in the day.

The Hardware

The new Linksys WRT1900AC has a design that is clearly meant to evoke the WRT54G, but it's a whole new beast. My first WRT54G was a Broadcom BCM4702 running at 125MHz, although later models went to 240MHz. It had 16 megs of RAM and 4 megs of Flash. I was thrilled that the WRT54G had "Fast Ethernet."

Compare that to the WRT1900AC with its dual-core 1.2GHz ARM processor and 256 megs of DDR3. It's a PC, frankly, and I appreciate the power and flexibility.

This router is clearly a little spendy, and I was initially wondering if US$249 is worth the money. However, after using it for a week I can say yes. Let's say it only lasts a year; that's less than $1 a day. If it lasts 5 years like previous routers, it's pennies. Considering that I work from home and need consistent and reliable connectivity, I'm willing to pay a premium for a premium device.

First, this is an 802.11a/b/g/n router and supports all devices, including the newer 802.11ac spec. It covers the full spectrum, pun intended, with both 2.4GHz and 5.0GHz support. It's got 4 large adjustable antennas, and the whole device is the size of a medium pizza. They even warn you not to put stuff on top of it so you don't block the heat sink.

I was also pleasantly surprised that the WRT1900AC has a USB 3.0 port and an eSATA port where you can plug in external storage, then access it as a file share. I was just talking to a neighbor who was considering a $600 NAS (Network Attached Storage) device, and I see now that the WRT1900AC could be that basic NAS for him. It supports FAT, NTFS, and HFS+ filesystems.

It's also super fast. Here's a large file copy, for example: it's fast and rock solid at 100+ megabytes a second. I'm getting between 40 and 60 megabytes a second over wireless, and I've been able to get 20-40 megs a second off an attached hard drive. It's a competent simple NAS.


It's been consistently faster than my previous router in basically everything that I do. I haven't done formal tests, but it's looking like a 20-30% improvement just on the wireless side.

The Software

The WRT1900AC will also support OpenWRT later this year, and Linksys is encouraging folks like the DD-WRT, OpenWRT, and Tomato projects to target this device. It's nice when a company creates hardware and doesn't freak out when the community wants to hack on it.

The installation was a breeze and I was impressed that they included a non-standard default password for out of the box security.

Their initial release of the built-in software is a little lacking, IMHO, in a few areas - most notably QoS (Quality of Service) - and is a little bit of a step back from my previous routers. I'd like more absolute control over my traffic, but that's me. To compensate, I marked my Xbox and my work PC as needing preferred packets, so rather than prioritizing specific traffic, the router will prioritize these machines by MAC address.


While it's lacking in some places, it makes up for it in others. The interface is fast and easy to use.


You can access lots of logs, diagnostics, and stats for everything. However, I have spent most of my time in the Network Map.


Not to harp on this feature, but I really like this real-time, filterable network map. From here I can see who's on which wireless channel, reserve DHCP leases, and filter devices by type. It's a gimmick, but it's a gimmick that works and works well.


I also registered my router with the LinksysSmartWifi.com site. This allows me to remotely manage the router from anywhere (without a dyndns.org account or opening the firewall) as well as from my iPhone. This also potentially means I could debug those network issues that only pop up when I'm travelling and my wife is trying to get on the internet. ;)

All in all, I'm very satisfied with this new router.

  • I've got greater wireless coverage than ever before.
  • I've got good management tools, inside, outside, and while mobile.
  • The speed is as good as anything I've ever used.
  • It has 90% of the features I need, and I'm confident I'll get more advanced features with updates or via open source projects.

For now, the Linksys WRT1900AC Wireless Router is sold only at Best Buy or on Linksys.com direct. It's worth the money if you want the fastest router out there.

* Disclaimer: I use affiliate links to buy gadgets and tacos. Click them and you support me, my lunch, and my blog.



Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.