Scott Hanselman

Upgrading existing .NET project files to the lean new CSPROJ format from .NET Core

August 17, '18 Comments [6] Posted in DotNetCore

If you've looked at csproj (C# project) files in a text editor in the past, you probably looked away quickly. They are effectively MSBuild files that orchestrate the build process. Phrased differently, a csproj file is an instance of an MSBuild file.

In Visual Studio 2017 and .NET Core 2 (and beyond) the csproj format is MUCH MUCH leaner. There are a lot of smart defaults, support for "globbing" like **/*.cs, and so on, so you don't need to state a bunch of obvious stuff. Truly you can take earlier msbuild/csproj files and get them down to a dozen lines of XML, plus package references. PackageReferences (references to NuGet packages) should be moved out of packages.config and into the csproj. This lets you manage all project dependencies in one place and gives you an uncluttered view of your top-level dependencies.
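
For a sense of scale, here's roughly what an entire class library project file can look like in the new SDK-style format. This is a minimal sketch, not anyone's actual project - the target framework and the package name/version are just examples:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <!-- NuGet references live right here now, not in packages.config -->
    <PackageReference Include="Newtonsoft.Json" Version="11.0.2" />
  </ItemGroup>

</Project>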

However, upgrading isn't as simple as "open the old project file and have VS automatically migrate you."

You have some options when migrating to .NET Core and the .NET Standard.

First, and above all, run the .NET Portability Analyzer and find out how much of your code is portable. Then you have two choices.

  • Create a new project file with something like "dotnet new classlib" and then manually get your projects building from the top (most common ancestor) project down (see the commands just after this list)
  • Try to use an open source 3rd party migration tool
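
The fresh-project route looks something like this at the command line (the project names here are made up for illustration; you'd then copy your .cs files over and add back your NuGet references with "dotnet add package"):

dotnet new classlib -n MyLibrary
dotnet new xunit -n MyLibrary.Tests
dotnet add MyLibrary.Tests/MyLibrary.Tests.csproj reference MyLibrary/MyLibrary.csproj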

Damian on my team recommends option one - a fresh project - as you'll learn more and avoid bringing cruft over. I agree, until there are dozens of projects; then I recommend trying a migration tool AND comparing its output to a fresh project file to avoid adding cruft. Every project/solution is different, so expect to spend some time on this.

The best way to learn this might be by watching it happen for real. Wade from Salesforce was tasked with upgrading his 4+ year old .NET Framework (Windows) based SDK to portable and open source .NET Core. He had some experience building for older versions of Mono and was thoughtful about not calling Windows-specific APIs, so he knows the code is portable. However, he needs to migrate the project files and structure AND get the unit tests running with "dotnet test" at the command line.

I figured I'd give him a head start by actually doing part of the work. It's useful to do this because, frankly, things go wrong and it's not pretty!

I started with Hans van Bakel's excellent CsProjToVS2017 global tool. It does an excellent job of getting your project 85% of the way there. To be clear, don't assume anything - not every warning will apply to you, and you WILL need to go over every line of your project files - but it is an extraordinarily useful tool. If you have .NET Core 2.1, install it globally like this:

dotnet tool install Project2015To2017.Cli --global

Then it's invoked (somewhat unfortunately) with a different command name, "csproj-to-2017", and you can pass in a solution or an individual csproj.
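
For example, either of these should work (the paths are placeholders for your own solution or project):

csproj-to-2017 .\MySolution.sln
csproj-to-2017 .\src\MyProject\MyProject.csproj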

After you've done the administrivia of the actual project conversion, you'll also want to make educated decisions about the 3rd party libraries you pull in. For example, if you want to make your project cross-platform BUT you depend on some library that is Windows-only, why bother trying to port? Well, many of your favorite libraries DO have "netstandard" or ".NET Standard" versions. You'll see in the video below how I pull Wade's project references forward with a new version of JSON.NET and a new NUnit. By the end we are building at the command line and running tests as well, with code coverage.
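
If you're doing the same, pulling a reference forward to the latest (.NET Standard-friendly) version is just a "dotnet add package" away from the project folder - for example:

dotnet add package Newtonsoft.Json
dotnet add package NUnit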

Please head over to my YouTube channel and check it out. Note this happened live and spontaneously, plus I had a YouTube audience giving me helpful comments, so I'll address them occasionally.

LIVE: Upgrading an older .NET SDK to .NET Core and .NET Standard

If you find things like this useful, let me know in the comments and maybe I'll do more of them. Also, do you think things like this belong on the Visual Studio Twitch Channel? Go follow my favs on Twitch CSharpFritz and Noopkat for more live coding fun!




Developing locally with ASP.NET Core under HTTPS, SSL, and Self-Signed Certs

August 2, '18 Comments [10] Posted in ASP.NET | DotNetCore

Last week on Twitter @getify started an excellent thread pointing out that we should be using HTTPS even on our local machines. Why?

You want your local web development set up to reflect your production reality as much as possible. URL parsing, routing, redirects, avoiding mixed-content warnings, etc. It's very easy to accidentally find oneself on http:// when everything in 2018 should be under https://.

I'm using ASP.NET Core 2.1 which makes local SSL super easy. After installing from http://dot.net I'll "dotnet new razor" in an empty folder to make a quick web app.
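
If you want to follow along, that's just this (the folder name is whatever you like):

C:\Users\scott\Desktop> mkdir localsslweb
C:\Users\scott\Desktop> cd localsslweb
C:\Users\scott\Desktop\localsslweb> dotnet new razor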

Then, when I "dotnet run" I see two URLs serving pages:

C:\Users\scott\Desktop\localsslweb> dotnet run
Hosting environment: Development
Content root path: C:\Users\scott\Desktop\localsslweb
Now listening on: https://localhost:5001
Now listening on: http://localhost:5000
Application started. Press Ctrl+C to shut down.

One is HTTP over port 5000 and the other is HTTPS over 5001. However, if I hit https://localhost:5001, I may see an error:

Your connection to this site is not secure

That's because this is an untrusted SSL cert that was generated locally:

Untrusted cert

There's a dotnet global tool built into .NET Core 2.1 to help with certs at dev time, called "dev-certs."

C:\Users\scott> dotnet dev-certs https --help

Usage: dotnet dev-certs https [options]

Options:
-ep|--export-path Full path to the exported certificate
-p|--password Password to use when exporting the certificate with the private key into a pfx file
-c|--check Check for the existence of the certificate but do not perform any action
--clean Cleans all HTTPS development certificates from the machine.
-t|--trust Trust the certificate on the current platform
-v|--verbose Display more debug information.
-q|--quiet Display warnings and errors only.
-h|--help Show help information

I just need to run "dotnet dev-certs https --trust" and I'll get a pop-up asking if I want to trust this localhost cert.

You want to trust this local cert?

On Windows it'll get added to the certificate store and on Mac it'll get added to the keychain. On Linux there isn't a standard way across distros to trust the certificate, so you'll need to follow the distro-specific guidance for trusting the development certificate.

Close your browser and open it up again at https://localhost:5001 and you'll see a trusted "Secure" badge in your browser.

Secure

Note also that by default HTTPS redirection is included in ASP.NET Core, and in Production it'll use HTTP Strict Transport Security (HSTS) as well, avoiding any initial insecure calls.

public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }
    else
    {
        app.UseExceptionHandler("/Error");
        app.UseHsts();
    }

    app.UseHttpsRedirection();
    app.UseStaticFiles();
    app.UseCookiePolicy();

    app.UseMvc();
}
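
If you want to tweak how those behave, there are options for both in ConfigureServices. This is just a sketch with made-up values (and note that the HSTS middleware skips localhost by default, so it won't bite you in development):

public void ConfigureServices(IServiceCollection services)
{
    services.AddHsts(options =>
    {
        options.MaxAge = TimeSpan.FromDays(60);   // how long browsers should remember to use HTTPS
        options.IncludeSubDomains = true;
    });

    services.AddHttpsRedirection(options =>
    {
        options.HttpsPort = 5001;                 // the local HTTPS port from the output above
    });

    services.AddMvc();
}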

That's it. What's historically been a huge hassle for local development is essentially handled for you. Given that Chrome is marking http:// sites as "Not Secure" as of Chrome 68, you'll want to consider making ALL your sites Secure by Default. I wrote up how to get certs for free with Azure and Let's Encrypt.




Example Code - Opinionated ContosoUniversity on ASP.NET Core 2.0's Razor Pages

July 25, '18 Comments [13] Posted in ASP.NET | DotNetCore | Open Source

The best way to learn about code isn't just writing more code - it's reading code! Not all of it will be great code and much of it won't be the way you would do it, but it's a great way to expand your horizons.

In fact, I'd argue that most people aren't reading enough code. Perhaps there aren't enough clean code bases to check out and learn from.

I was pleased to stumble on this code base from Jimmy Bogard called Contoso University at https://github.com/jbogard/ContosoUniversityDotNetCore-Pages.

There's a LOT of good stuff to read in this repo, so I won't claim to have read it all or as deeply as I could. In fact, there's a good solid day of reading and absorbing here. However, here are some of the things I noticed and that I appreciate. Some of this is very "Jimmy" code, since it was written for and by Jimmy. This is a good thing and not a dig. We all collect patterns, make libraries, and develop our own spins on architectural styles. I love that Jimmy has collected a bunch of things he's created or contributed to over the years and put them into a nice clear sample for us to read. As Jimmy points out, there's a lot in https://github.com/jbogard/ContosoUniversityDotNetCore-Pages to explore:

Clone and Build just works

A low bar, right? You'd be surprised how often I git clone someone's repository and they haven't tested it elsewhere. Bonus points for a build.ps1 that bootstraps whatever needs to be done. I had .NET Core 2.x on my system already and this build.ps1 got the packages I needed and built the code cleanly.

It's an opinionated project with some opinions. ;) And that's great, because it means I'll learn about techniques and tools that I may not have used before. If someone uses a tool that's not the "default," it may mean that the defaults are lacking!

  • Build.ps1 uses a build script style taken from psake, a PowerShell build automation tool.
  • It builds to a folder called ./artifacts, as a convention.
  • Inside build.ps1, it uses Roundhouse, a database migration utility for .NET that uses SQL files and versioning based on source control: http://projectroundhouse.org
  • It's set up for Continuous Integration in AppVeyor, a lovely CI/CD system I use myself.
  • It uses the Octo.exe tool from OctopusDeploy to package up the artifacts.

Organized and Easy to Read

I'm finding the code easy to read for the most part. I started at Startup.cs to just get a sense of what middleware is being brought in.

public void ConfigureServices(IServiceCollection services)
{
    services.AddMiniProfiler().AddEntityFramework();

    services.AddDbContext<SchoolContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));

    services.AddAutoMapper(typeof(Startup));

    services.AddMediatR(typeof(Startup));

    services.AddHtmlTags(new TagConventions());

    services.AddMvc(opt =>
        {
            opt.Filters.Add(typeof(DbContextTransactionPageFilter));
            opt.Filters.Add(typeof(ValidatorPageFilter));
            opt.ModelBinderProviders.Insert(0, new EntityModelBinderProvider());
        })
        .SetCompatibilityVersion(CompatibilityVersion.Version_2_1)
        .AddFluentValidation(cfg => { cfg.RegisterValidatorsFromAssemblyContaining<Startup>(); });
}

Here I can see what libraries and helpers are being brought in, like AutoMapper, MediatR, and HtmlTags. Then I can go follow up and learn about each one.
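
If MediatR is new to you, the shape this app uses is basically "one request class plus one handler per use case." This isn't code from Jimmy's repo - just a minimal, made-up sketch of what a query and its handler look like:

using System.Threading;
using System.Threading.Tasks;
using MediatR;

public class GetStudentName
{
    // The "question" you send through MediatR...
    public class Query : IRequest<string>
    {
        public int Id { get; set; }
    }

    // ...and the single handler that answers it.
    public class Handler : IRequestHandler<Query, string>
    {
        public Task<string> Handle(Query request, CancellationToken cancellationToken)
        {
            // A real handler would query the DbContext here; hard-coded to keep the sketch self-contained.
            return Task.FromResult("Joe Schmoe");
        }
    }
}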

MiniProfiler

I've always loved MiniProfiler. It's a hidden gem of .NET and it's been around being awesome forever. I blogged about it back in 2011! It sits in the corner of your web page and gives you REAL actionable details on how your site behaves and what the important perf timings are.

MiniProfiler is the profiler you didn't know you needed

It's even better with EF Core in that it'll show you the generated SQL as well! Again, all inline in your web site as you develop it.

inline SQL in MiniProfiler

Very nice.

Clean Unit Tests

Jimmy is using xUnit and has an IntegrationTestBase here with some stuff I don't understand, like SliceFixture. I'm marking this as something I need to read up on and research. I can't tell if this is the start of a new testing helper library, as it feels too generic and important to be in this sample.

He's using the CQRS (Command Query Responsibility Segregation) pattern. Here he starts with a Create command, sends it, then does a Query to confirm the results. It's very clean and he's got a very isolated test.

[Fact]
public async Task Should_get_edit_details()
{
    var cmd = new Create.Command
    {
        FirstMidName = "Joe",
        LastName = "Schmoe",
        EnrollmentDate = DateTime.Today
    };

    var studentId = await SendAsync(cmd);

    var query = new Edit.Query
    {
        Id = studentId
    };

    var result = await SendAsync(query);

    result.FirstMidName.ShouldBe(cmd.FirstMidName);
    result.LastName.ShouldBe(cmd.LastName);
    result.EnrollmentDate.ShouldBe(cmd.EnrollmentDate);
}

FluentValidation

https://fluentvalidation.net is a helper library for creating clear strongly-typed validation rules. Jimmy uses it throughout and it makes for very clean validation code.

public class Validator : AbstractValidator<Command>
{
    public Validator()
    {
        RuleFor(m => m.Name).NotNull().Length(3, 50);
        RuleFor(m => m.Budget).NotNull();
        RuleFor(m => m.StartDate).NotNull();
        RuleFor(m => m.Administrator).NotNull();
    }
}

Useful Extensions

Looking at a project's C# extension methods is a great way to determine what the author feels are gaps in the underlying included functionality. These are useful for returning JSON from Razor Pages!

public static class PageModelExtensions
{
    public static ActionResult RedirectToPageJson<TPage>(this TPage controller, string pageName)
        where TPage : PageModel
    {
        return controller.JsonNet(new
            {
                redirect = controller.Url.Page(pageName)
            }
        );
    }

    public static ContentResult JsonNet(this PageModel controller, object model)
    {
        var serialized = JsonConvert.SerializeObject(model, new JsonSerializerSettings
        {
            ReferenceLoopHandling = ReferenceLoopHandling.Ignore
        });

        return new ContentResult
        {
            Content = serialized,
            ContentType = "application/json"
        };
    }
}

PaginatedList

I've always wondered what to do with helper classes like PaginatedList. Too small for a package, too specific to be built-in? What do you think?

public class PaginatedList<T> : List<T>
{
    public int PageIndex { get; private set; }
    public int TotalPages { get; private set; }

    public PaginatedList(List<T> items, int count, int pageIndex, int pageSize)
    {
        PageIndex = pageIndex;
        TotalPages = (int)Math.Ceiling(count / (double)pageSize);

        this.AddRange(items);
    }

    public bool HasPreviousPage
    {
        get
        {
            return (PageIndex > 1);
        }
    }

    public bool HasNextPage
    {
        get
        {
            return (PageIndex < TotalPages);
        }
    }

    public static async Task<PaginatedList<T>> CreateAsync(IQueryable<T> source, int pageIndex, int pageSize)
    {
        var count = await source.CountAsync();
        var items = await source.Skip((pageIndex - 1) * pageSize).Take(pageSize).ToListAsync();
        return new PaginatedList<T>(items, count, pageIndex, pageSize);
    }
}
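
For what it's worth, the usage ends up pleasantly boring - something like this (the Student entity and db context are placeholders):

var page = await PaginatedList<Student>.CreateAsync(
    db.Students.AsNoTracking().OrderBy(s => s.LastName),
    pageIndex: 1,
    pageSize: 10);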

I'm still reading all the source I can. Absorbing what resonates with me, considering what I don't know or understand and creating a queue of topics to read about. I'd encourage you to do the same! Thanks Jimmy for writing this large sample and for giving us some code to read and learn from!




AltCover and ReportGenerator give amazing code coverage on .NET Core

July 20, '18 Comments [10] Posted in DotNetCore | Open Source

I'm continuing to explore testing and code coverage on open source .NET Core. Earlier this week I checked out coverlet. There is also the venerable OpenCover and there's some cool work being done to get OpenCover working with .NET Core, but it's Windows only.

Today, I'm exploring AltCover by Steve Gilham. While other coverage tools hook the .NET Profiling API at run-time, AltCover instead weaves extra IL into the assemblies of interest to record coverage.

As the name suggests, it's an alternative coverage approach. Rather than working by hooking the .net profiling API at run-time, it works by weaving the same sort of extra IL into the assemblies of interest ahead of execution. This means that it should work pretty much everywhere, whatever your platform, so long as the executing process has write access to the results file. You can even mix-and-match between platforms used to instrument and those under test.

AltCover is a NuGet package, but it's also available as a .NET Core Global Tool, which is awesome.

dotnet tool install --global altcover.global

This makes "altcover" a command that's available everywhere without adding it to my project.

That said, I'm going to follow the AltCover Quick Start and see how quickly I can get it set up!

I'll install it into my test project, hanselminutes.core.tests:

dotnet add package AltCover

and then run

dotnet test /p:AltCover=true

90.1% Line Coverage, 71.4% Branch Coverage

Cool. My tests run as usual, but now I've got a coverage.xml in my test folder. I could also generate LCov or Cobertura reports if I'd like. At this point my coverage.xml is nearly half a meg! That's a lot of good information, but how do I see the results in a human-readable format?

This is the OpenCover XML format, and I can run ReportGenerator on the coverage file and get a whole bunch of HTML files - basically an entire coverage mini-website!

I downloaded ReportGenerator and put it in its own folder (this would be ideal as a .NET Core global tool).

c:\ReportGenerator\ReportGenerator.exe -reports:coverage.xml -targetdir:./coverage

Make sure you use a decent -targetdir, otherwise you might end up with dozens of HTML files littered in your project folder. You might also consider .gitignoring the resulting folder and the coverage file. Open up index.htm and check out all this great information!
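
Something like this in your .gitignore covers it (the names match the commands above):

coverage.xml
coverage/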

Coverage Report says 90.1% Line Coverage

Note the Risk Hotspots at the top there! I've got a CustomPageHandler with significant NPath complexity and two Views with significant cyclomatic complexity.

Also check out the excellent branch coverage as expressed here in the results of the coverage report. You can see that EnableAutoLinks was always true, so I only ever tested one branch. I might want to add a negative test here and explore whether there are any side effects when EnableAutoLinks is false.

Branch Coverage

Be sure to explore AltCover and its Full Usage Guide. There are a number of ways to run it: global tools, dotnet test, MSBuild tasks, and PowerShell integration!

There's a lot of great work here and it took me literally 10 minutes to get a great coverage report with AltCover and .NET Core. Kudos to Steve on AltCover! Head over to https://github.com/SteveGilham/altcover and give it a STAR, file issues (be kind) or perhaps offer to help out! And most of all, share cool Open Source projects like this with your friends and colleagues.




.NET Core Code Coverage as a Global Tool with coverlet

July 18, '18 Comments [5] Posted in DotNetCore | Open Source

Last week I blogged about "dotnet outdated," an essential .NET Core "global tool" that helps you find out what NuGet package reference you need to update.

.NET Core Global Tools are really taking off right now. They are meant for devs - this isn't a replacement for Chocolatey or apt-get - this is more like npm's global developer tools. They're putting together a better way to find and identify global tools, but for now Nate McMaster has a list of some great .NET Core Global Tools on his GitHub. Feel free to add to that list!

.NET tools can be installed like this:

dotnet tool install -g <package id>

So for example:

C:\Users\scott> dotnet tool install -g dotnetsay
You can invoke the tool using the following command: dotnetsay
Tool 'dotnetsay' (version '2.1.4') was successfully installed.
C:\Users\scott> dotnetsay

Welcome to using a .NET Core global tool!

You know, all the important tools. Seriously, some are super fun. ;)

Coverlet is a cross-platform code coverage tool that's in active development. In fact, I automated my build with code coverage for my podcast site back in March, combining VS Code, Coverlet, and xUnit, plus a few Visual Studio Code extensions, for a pretty nice experience! All free and open source.

I had to write a little PowerShell script because the "dotnet test" command for testing my podcast site with coverlet got somewhat unruly. Coverlet.msbuild was added as a package reference for my project.

dotnet test /p:CollectCoverage=true /p:CoverletOutputFormat=lcov /p:CoverletOutput=./lcov .\hanselminutes.core.tests

I heard last week that coverlet had initial support for running as a .NET Core Global Tool, which I think would be convenient since I could use it anywhere on any project without adding references.

dotnet tool install --global coverlet.console

At this point I can type "coverlet" and it's available anywhere.

I'm told this is an initial build as a ".NET Global Tool" so there's always room for constructive feedback.

From what I can tell, I run it like this:

coverlet .\bin\Debug\netcoreapp2.1\hanselminutes.core.tests.dll --target "dotnet" --targetargs "test --no-build"

Note that I have to pass in the already-built test assembly, since coverlet instruments that binary, and I need to say "--no-build" since we don't want to accidentally rebuild the assemblies and lose the instrumentation.

Coverlet can generate lots of coverage formats like opencover or lcov, and by default gives a nice ASCII table:

88.1% Line Coverage in Hanselminutes.core

I think my initial feedback (I'm not sure if this is possible) is smarter defaults. I'd like to "fall into the Pit of Success." That means even if I mess up and don't read the docs, I still end up successful.

For example, if I type "coverlet test" while the current directory is a test project, it'd be nice if it implied all of this, as these are reasonable defaults:

.\bin\Debug\whatever\whatever.dll --target "dotnet" --targetargs "test --no-build"

It's great that there is this much control, but I think assuming "dotnet test" is a fair assumption, so ideally I could go into any folder with a test project and type "coverlet test" and get that nice ASCII table. Later I'd be more sophisticated and read the excellent docs as there's lots of great options like setting coverage thresholds and the like.
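
There's also a threshold story if you want the build to fail when coverage drops below a bar. If I'm reading the coverlet.msbuild docs right, it's something like this (double-check the exact property names against the current docs):

dotnet test /p:CollectCoverage=true /p:Threshold=80 /p:ThresholdType=line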

I think the future is bright with .NET Global Tools. This is just one example! What's your favorite?




Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.