Scott Hanselman

Project-less scripted C# with ScriptCS and Roslyn

April 24, '13 Posted in NuGet | NuGetPOW | Open Source | VS2012
ScriptCS inside Sublime Text 2, with the ScriptCS package providing syntax highlighting

Glenn Block is working on something interesting that combines C#, NuGet, Roslyn (the new "compiler as a service"), and his love of text editors and scripts. Now, with help from Justin Rusbatch (@jrusbatch) and Filip Wojcieszyn (@filip_woj), they are having all kinds of fun...using C# as a scripting language.

Every few years someone tries to turn C# into a competent scripting language, myself included. Often this has involved batch files and MacGyver magic, file associations and hacks. Clearly the .NET community wants something like this, but we are collectively still trying to figure out what it should look like. PowerShell aficionados - and I count myself amongst them - might look at such efforts as a threat or a pale reinvention of PowerShell, but the fact remains that C# at the command line, be it as a script or a REPL, is an attractive concept.

Simply put by example, ScriptCS lets me do this:

C:\temp>copy con hello.csx
Console.WriteLine("Pants");
^Z
1 file(s) copied.

C:\temp>scriptcs hello.csx
Pants

That's Hello World. There's no namespace, no class, just some C# in a .csx file. Roslyn takes care of the compilation, and the resulting code never hits the disk as an .exe.
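
Since a .csx file is just statements, any ordinary C# works at the top level. Here's a quick illustrative sketch (the file name and contents are mine, not from the ScriptCS docs); note you still add using directives for namespaces you need:

// math.csx - ordinary C#, no namespace, class, or Main required
using System.Linq;

var squares = Enumerable.Range(1, 5).Select(x => x * x);
Console.WriteLine(string.Join(", ", squares)); // prints: 1, 4, 9, 16, 25

Then run it with "scriptcs math.csx".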

Self-hosting Web APIs

So that's interesting, but what about bootstrapping a web server using NancyFX to host a Web API?

Go and clone this repo:

git clone https://github.com/scriptcs/scriptcs-samples.git

Look in the Nancy folder. There's a packages.config. Just like a node.js application has a package.json file listing its dependencies, a .NET app usually has a packages.config listing them by name. In node, you type npm install to restore those packages from the main repository. Here I'll type scriptcs -install...

C:\temp\scriptcs-samples\nancy>scriptcs -install
Installing packages...
Installed: Nancy.Hosting.Self 0.16.1.0
Installed: Nancy.Bootstrappers.Autofac 0.16.1.0
Installed: Autofac 2.6.3.862
Installation successful.

Now, running start.csx fires up an instance of Nancy listening on localhost:1234. There's no IIS, no ASP.NET.

C:\temp\scriptcs-samples\nancy>scriptcs start.csx
Found assembly reference: Autofac.Configuration.dll
Found assembly reference: Autofac.dll
Found assembly reference: Nancy.Bootstrappers.Autofac.dll
Found assembly reference: Nancy.dll
Found assembly reference: Nancy.Hosting.Self.dll
Nancy is running at http://localhost:1234/
Press any key to end
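
For reference, the heart of a script like that start.csx is plain Nancy self-hosting code. Here's a minimal sketch of what such a script might contain (not necessarily the sample's exact contents; the module name and route are illustrative):

using Nancy;
using Nancy.Hosting.Self;

// A NancyModule declared right in the script; ScriptCS compiles it like any other class.
public class SampleModule : NancyModule
{
    public SampleModule()
    {
        Get["/"] = _ => "Hello from Nancy!";
    }
}

// Self-host Nancy on port 1234 - no IIS, no ASP.NET pipeline.
var host = new NancyHost(new Uri("http://localhost:1234"));
host.Start();
Console.WriteLine("Nancy is running at http://localhost:1234/");
Console.WriteLine("Press any key to end");
Console.ReadKey();
host.Stop();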

There is also the notion of a "ScriptPack" such that you can Require<T> a library and hide a lot of the bootstrapping and complexity. For example, I could start up Web API after installing a Web API package that includes some starter code. Note this is all from the command line; I'm using "copy con filename" to create the files.

C:\temp\foo>scriptcs -install ScriptCs.WebApi
Installing packages...
Installed: ScriptCs.WebApi
Installation completed successfully.
...snip...
Added ScriptCs.WebApi, Version 0.1.0, .NET 4.5
Packages.config successfully created!

C:\temp\foo>copy con start.csx
public class TestController : ApiController {
    public string Get() {
        return "Hello world!";
    }
}

var webApi = Require<WebApi>();
var server = webApi.CreateServer("http://localhost:8080");
server.OpenAsync().Wait();

Console.WriteLine("Listening...");
Console.ReadKey();
server.CloseAsync().Wait();
^Z
1 file(s) copied.

C:\temp\foo>scriptcs start.csx
Found assembly reference: Newtonsoft.Json.dll
...snip...
Listening...

Pretty slick. Add in a little Live Reload-style action and we could have a very node-ish experience, all from the command line and from within your text editor of choice, except using C#.

Note that this is all using the same CLR and .NET that you've already got, running at full speed. Only the compilation is handled differently to give this script-like feel.

Installing ScriptCS

The easiest way to install and use ScriptCS is with Chocolatey (a system-wide NuGet-based application/component installer; "Chocolatey NuGet," get it?). And yes, it's Chocolatey spelled incorrectly with an "-ey."

You can use Chocolatey to do things like "cinst 7zip" or "cinst git," but we'll be using it just to get ScriptCS set up. It's also easily removed if it freaks you out; it installs no services and won't change anything major save your PATH.

First paste this into a cmd.exe prompt:

@powershell -NoProfile -ExecutionPolicy unrestricted -Command "iex ((new-object net.webclient).DownloadString('https://chocolatey.org/install.ps1'))" && SET PATH=%PATH%;%systemdrive%\chocolatey\bin

This will launch PowerShell, run https://chocolatey.org/install.ps1, and add Chocolatey to your path.

Then, run

cinst ScriptCS

which will put ScriptCS in a path like C:\Chocolatey\lib\ScriptCs.0.0.0 while Chocolatey makes it available in your PATH.

Sublime Text or Visual Studio

You can get syntax highlighting for your CSX files inside Sublime Text 2 with the "ScriptCS" package, which you can install from Package Control. If you're using Visual Studio, you can install the Roslyn CTP to turn on CSX syntax highlighting.

You can use Package Control in Sublime Text 2 to install the ScriptCS package

You can even debug your running ScriptCS scripts by opening ScriptCs.exe as a project. (Did you know you can open an EXE as a project?) Add the .csx script to the command line via Project Properties, drag in the scripts you're working on, and debug away.

Debugging requires the Roslyn SDK, although personally, I've been doing just fine with scripts at the command line which requires nothing more than the basic install and a text editor.

It's not clear where ScriptCS is going, but it'll be interesting to see! Go get involved at scriptcs.net. This kind of stuff gets me excited about the prospect of a compiler as a service, and also cements my appreciation of C# as my enabling language of choice. Between C# and JavaScript, you can really get a lot done, pretty much anywhere.

I'll have a video walkthrough on how this works as I explain it to Rob Conery up on Tekpub soon! (Here's a referral coupon for 20% off of Tekpub!)

What do you think?


Exposed: A Blog Comment Spammer's Source Template

April 22, '13 Posted in Musings

I've been getting a LOT of Blog Comment Spam lately, just in the last two weeks. I run all my comments through the Akismet service, and I pay for it. However, this particular flavor of spam has been making it through consistently. It has a pattern, though, and I'd been trying to figure it out when this LARGE comment showed up.

Apparently while they were messing about trying to spam me, they posted their entire source template.

I'm embedding it below as a Gist, rather than copy/pasting it into my blog engine. It's so spammy, I'd hate to get delisted from Google for looking like a splog.

Note the comments for the Gist as well.

One fellow says:

"I used to do comment spam and this is not the most advanced one."

Really? Does one put Comment Spammer on their resume?

Another comment says that we're hating on spammers. We should embrace them because:

"Sure for the 1% of super popular blogs out there this might be unnecessary, but in a world filled with bloggers blogging blogs most people never read, the fake recognition and pleasantry might be just what these writers need."

I'm pretty sure that fake comment spam isn't as emotionally uplifting as you think.

Start scrolling down! If you are viewing this in an RSS reader, you MAY need to visit this post directly to see it.

Your comments, Dear Reader? Cue spam comment-related jokes...now.




Penny Pinching Video: Moving my Website's Images to the Azure CDN (and using a custom domain)

April 22, '13 Posted in Azure

I talked about Pinching pennies when scaling in The Cloud last week when I added jQuery lazy loading to my podcast's website. Next, I moved my website to the same data center as my SQL database (in fact, they should have always been together!). Now, I'm moving all my show images to the Azure CDN. There have been ~370 shows, and if someone visits the archives page and scrolls around, that's about 8 megs of pics.

CDN metrics are looking great

Additionally, I have a very international audience listening to my podcast, so by moving the images to the CDN I'll get load balancing and edge caching as well. Listeners in Asia will get the images served from an Asian data center, etc.

Now, to be clear, Azure bandwidth is pretty cheap, with even 100 GB costing around $11, but I also wanted to learn how to use Blob Storage and the CDN. I've only used it to store Virtual Machines, and it's been hidden from me. Bandwidth from the Azure CDN is about the same price, but I get the geo-replication for free. Remember also that "ingress," that is, incoming traffic, is free. When you're estimating your bandwidth costs, you only worry about outgoing traffic.
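
To make that concrete, here's a back-of-the-envelope estimate as a C# snippet (the monthly traffic number is a made-up example, not my real stats):

// Rough egress cost: ~$11 per 100 GB works out to about $0.11/GB.
double gbPerVisit = 8.0 / 1024.0;  // ~8 MB of archive images per full visit
int visitsPerMonth = 10000;        // hypothetical traffic
double costPerGb = 0.11;
Console.WriteLine("~${0:F2}/month", gbPerVisit * visitsPerMonth * costPerGb);
// prints: ~$8.59/month - real money, but hardly bank-breaking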

I also thought it would be cool to have a custom Hanselminutes subdomain, so I wanted the images to be served from http://images.hanselminutes.com, because it's cool. Plus, it means I could switch CDNs in the future and not change my URLs.

Here are the steps I used:

  • I created a new storage account called 'hanselminutescdn,' then a container called 'images.'
  • I went to 'Configure' and set up the images.hanselminutes.com CNAME to point to the new storage account. I went to my DNS provider (DNSimple.com) and added a CNAME from images.hanselminutes.com to point to hanselminutescdn.blob.core.windows.net, and then verified the domain per the Azure portal instructions. This involved adding another verification-specific CNAME.

A custom Azure CNAME for my Domain

  • I downloaded the Azure SDK for .NET and used Visual Studio to upload the images. (A rough code sketch of this step follows this list.)
  • Later, after the video, I got CloudBerry Explorer for Windows Azure and used its "sync" option to keep my local show-images folder and Azure in sync. Very slick. The app also lets me easily set HTTP headers on my images if I want. I may pay for the Pro version.
    CloudBerry Explorer
  • I changed the paths in my HTML to point to images.hanselminutes.com instead of just /images on the website's own domain.
  • I discovered that CDNs are case sensitive, then ran this PowerShell script to make all my show image files lowercase.
dir | Rename-Item -NewName { $_.Name.ToLowerInvariant() }
  • I cleared out the files and reuploaded the new lower-case ones.
  • PROFIT.

OK, I haven't figured out that last step, but soon...very soon. ;)
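
If you'd rather script the upload than click through Visual Studio, something like this covers the same steps, including the lowercasing. This is a hypothetical sketch using the WindowsAzure.Storage client library (the account key and local folder are placeholders):

using System;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class UploadShowImages
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=hanselminutescdn;AccountKey=<your-key>");
        var container = account.CreateCloudBlobClient().GetContainerReference("images");
        container.CreateIfNotExists();

        // Blobs must be publicly readable for the CDN (and browsers) to fetch them.
        container.SetPermissions(new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });

        foreach (var path in Directory.GetFiles(@"C:\shows\images", "*.jpg"))
        {
            // CDN URLs are case sensitive, so normalize names to lowercase on the way up.
            var name = Path.GetFileName(path).ToLowerInvariant();
            using (var stream = File.OpenRead(path))
            {
                container.GetBlockBlobReference(name).UploadFromStream(stream);
            }
            Console.WriteLine("Uploaded " + name);
        }
    }
}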

IMPORTANT UPDATE: Commenter Nate Jackson points out that while I've set up access to blob storage, I haven't yet actually enabled the edge caching abilities of the CDN itself. For this, it seems I have to visit the old management portal (which explains why I missed it completely; it's not intuitive!).

You get to the old portal from your account dropdown:

Switching over to the Previous Portal

There are instructions on the Azure docs site that say I need to create a "CDN endpoint" that associates the CDN with my now-public blob storage container.

I visit the Hosted Accounts + CDN button, then click New Endpoint.

Creating a CDN Endpoint

The URL template looks like this:

http://<CDNNamespace>.vo.msecnd.net/<myPublicContainer>/<BlobName>

So after clicking new CDN endpoint and enabling the CDN, I'm told my CDN endpoint name is "az415467" per the old portal:

Making a new CDN endpoint I'm given a unique URL

Given the URL template, a URL from my container would be:

http://az415467.vo.msecnd.net/images/255.jpg

Then (if you want) click Add Custom Domain:

Associating my custom domain with my new CDN endpoint

Then point the domain to the CDN, not to public blob storage:

Pointing my custom domain to the CDN

At this point the old management portal looks like this:

Now my blob is associated with my custom domain name

And I can confirm it works when I visit

http://az415467.vo.msecnd.net/images/255.jpg  as well as http://images.hanselminutes.com/images/255.jpg

So the idea is that there's storage in general, there's blob storage made public (which is what I did) which can be geo-replicated, and then there's the formal geo-load-balanced Content Delivery Network, which I failed to configure!

CDN -> pulls from -> Public Blob Endpoint -> hosted in -> Your Storage Container itself

I will update the YouTube video soon. Big thanks to Nate Jackson for catching this oversight and educating me!



Setting up Two-Factor Authentication for your Google account AND Microsoft account

April 19, '13 Posted in Tools

Two-factor auth for Microsoft and Google within the Google Authenticator app

I use Two-Factor Authentication for my Google Apps account and I use the Google Authenticator application on my iPhone to generate the second factor.

Microsoft Accounts (formerly Live Accounts) just launched Two-Factor Auth, and you should set it up now. That means SkyDrive, Outlook.com/Hotmail, as well as the Windows Azure Dashboard, can now be fronted by two-factor auth.

If you already use two-factor for Google, you can ADD your Microsoft account to the Google Authenticator application on your Android or iPhone. That means I can use one Authenticator application for all my accounts, which is extremely convenient.

The process for setting up two-step authentication on a Microsoft account is:

  1. Get an Authenticator app.
  2. Head over to https://account.live.com/proofs/Manage and log in to your Microsoft account.
  3. Run your Authenticator app and scan the barcode with your phone's camera.
  4. Enter the number you're given and click Pair.

Microsoft accounts can scan a bar code to set up their two-factor auth.
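
For the curious: "pairing" just means the app stored a shared secret from that barcode. The six-digit codes are then computed locally from the secret and the current time using the TOTP algorithm (RFC 6238); nothing goes over the network. Here's a sketch of the math in C# (the demo secret is made up; a real one comes from the QR code):

using System;
using System.Security.Cryptography;

class Totp
{
    // Compute the 6-digit code for a shared secret at a given UTC time (RFC 6238).
    static string Compute(byte[] secret, DateTime utcNow)
    {
        // Count 30-second intervals since the Unix epoch, as a big-endian counter.
        long counter = (long)(utcNow - new DateTime(1970, 1, 1)).TotalSeconds / 30;
        byte[] message = BitConverter.GetBytes(counter);
        if (BitConverter.IsLittleEndian) Array.Reverse(message);

        using (var hmac = new HMACSHA1(secret))
        {
            byte[] hash = hmac.ComputeHash(message);
            int offset = hash[hash.Length - 1] & 0x0F; // "dynamic truncation" per RFC 4226
            int binary = ((hash[offset] & 0x7F) << 24)
                       | (hash[offset + 1] << 16)
                       | (hash[offset + 2] << 8)
                       | hash[offset + 3];
            return (binary % 1000000).ToString("D6");
        }
    }

    static void Main()
    {
        var secret = new byte[] { 0xDE, 0xAD, 0xBE, 0xEF, 0xCA, 0xFE, 0xBA, 0xBE, 0x01, 0x02 };
        Console.WriteLine(Compute(secret, DateTime.UtcNow));
    }
}

Because both your phone and the server derive the code from the same secret and the same clock, the server can verify it with no round-trip to the app.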

PRO TIP: If you have two factor auth turned on for BOTH Microsoft Accounts and Google Accounts, make sure you click Edit and change the display name of your accounts so you can tell them apart! I appended [MS] and [GOOG].

You can also set this up and use the same app for Dropbox, LastPass, and more; additional sites add support every day.

The process for Google is similar. Get the app installed, and go to the Google 2-step verification page. I've been running two-step since it came out and the annoyance is minor compared to the comfort of a little extra security.

Note that some apps (like the mail app on your phone) may not support two-factor auth, so you'll need to create an application-specific password for those apps. It's a one-time password just for the apps that need them and you can revoke those passwords anytime.

Have fun and be secure!


Penny Pinching Video: Moving an Azure Website between data centers

April 19, '13 Posted in Azure

I talked about Pinching pennies when scaling in The Cloud last week when I added jQuery lazy loading to my podcast's website. I wanted to avoid paying any unnecessary bandwidth costs. The result was great, and I'll be under my bandwidth allotment this month.

I'm continuing to look for ways to optimize and pinch pennies in the cloud. I realized recently that while my website was running in the West US Azure data center, the database (managed by Carl Franklin's podcasting company) was running in North Central US. This meant I was paying for the bandwidth of my database calls. It was also slower and generally not the best idea, since I was calling into a SQL Server over the open internet (although I had opened the firewall to do so).

It's unusual to have a website and SQL database so far apart, of course, as you'll usually create your site and database at the same time in the same place. Azure also goes out of its way to keep these linked resources together as you build them.

However, Carl had set up the database and the original website a while back, and I only just redesigned the site and moved it to Azure recently. Additionally, the administrative backend for the Hanselminutes podcast was in North Central, so we found ourselves in this position.

Azure Websites capacity opened up in the North Central data center, so I took lunch to move my site. You can't just click "move," but it's actually very easy to redeploy. The whole process, including DNS changes, took less than 15 minutes, as you can see in the YouTube video above.

Here are the steps I used:

  • I made a new site in the new Data Center
  • I made it Shared so I could use a custom domain (or you can use Reserved)
  • I took the domain names off the West US site, and moved them within the Portal to the North Central one
    • If this site was super important I would have had a load balancer and kept both sites up while I waited, but total downtime was like 5 min so I didn't sweat it for this.
  • I ensured the database within North Central was a "Linked Resource" within my Website
  • I made sure my new website had the right connection strings in configuration.
  • I downloaded the new website's publish profile and imported it anew into WebMatrix (or Visual Studio, etc)
  • Published the site using the new publish profile.
  • Cleared DNS and visited the site and confirmed it worked.
  • Deleted the old site.

It worked well and I'm happy with the result. My next penny-pinching step (and a nice geo-load-balanced optimization) will be to move all the images to the CDN so that folks overseas get edge caching...that means that Australians will get the images for the site served from a nearby data center. I'll get this extra benefit for less than I am paying for website bandwidth.


About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.


Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.