
Dependabot for .NET Core dependency tracking in GitHub

October 22, '18 - Posted in DotNetCore

I've been exploring automated dependency tracking lately. I usually use my podcast's ASP.NET Core website that I host on GitHub as a guinea pig. I tried NuKeeper and the dotnet outdated global tool - both of which are fantastic and worth exploring.

This week I'm trying Dependabot. I have no relationship with this company. Public repos and personal account repos are free, the pricing is very clear, and organization accounts start at just $15 with a free trial.

I'm really impressed with how clever Dependabot is. It's almost like a person in its behavior. Yes, I realize that's kind of the point, but it's no less surprising to see. A well-written bot is a joy to behold.

For example, here is a PR (Pull Request) where Dependabot says "Bumps Microsoft.ApplicationInsights.AspNetCore from 2.5.0-beta1 to 2.5.0-beta2."

Basic stuff, right? But that's not all.

It not only does the basics where it noticed that a version bump occurred in a NuGet package, but it also copied the release notes from that NuGet package's release on GitHub! It included links to what was fixed between versions, links to the change logs, AND a complete linked commit list. I mean, that's just lovely.

A few days later, Dependabot went and closed the PR because the dependency had updated again (I was slow), then it commented telling me this PR was superseded by another.

Superseded by #20

Dependabot, like any good bot, also includes commands you can send to it via comments on GitHub PRs. You can tell it to use specific labels or to set milestones. You can also control behavior in the Dependabot Dashboard and have it automerge things like minor versions, or just lock things down to security-only updates.
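For example, you can comment on one of its PRs with commands like these (just a sampling from memory, not the full list - their docs cover the rest):

@dependabot rebase
@dependabot merge
@dependabot ignore this minor version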

All in all, it's a very smart bot that supports basically all the languages. .NET support is in Beta, but I haven't had any issues with it. You should definitely check it out. And let me tell you, once you've got everything automated you'll wonder how you ever managed before.



Customer Notes: Diagnosing issues under load of Web API app migrated to ASP.NET Core on Linux

October 16, '18 - Posted in ASP.NET | DotNetCore

When the engineers on the ASP.NET/.NET Core team talk to real customers about actual production problems they have, interesting stuff comes up. I've tried to capture a real customer interaction here without giving away their name or details.

The team recently had the opportunity to help a large .NET customer investigate performance issues they'd been having with a newly-ported ASP.NET Core 2.1 app under load. The customer's developers are experienced with ASP.NET on Windows, but in this case they needed help getting started with performance investigations of ASP.NET Core apps in Linux containers.

As with many performance investigations, there were a variety of issues contributing to the slowdowns, but the largest contributors were time spent garbage collecting (due to unnecessary large object allocations) and blocking calls that could be made asynchronous.

After resolving the technical and architectural issues detailed below, the customer's Web API went from only being able to handle several hundred concurrent users during load testing to being able to easily handle 3,000 and they are now running the new ASP.NET Core version of their backend web API in production.

Problem Statement

The customer recently migrated their .NET Framework 4.x ASP.NET-based backend Web API to ASP.NET Core 2.1. The migration was broad in scope and included a variety of tech changes.

Their previous Web API (we'll call it version 1) ran as an ASP.NET application (targeting .NET Framework 4.7.1) under IIS on Windows Server and used SQL Server databases (via Entity Framework) to persist data. The new (2.0) version of the application runs as an ASP.NET Core 2.1 app in Linux Docker containers with PostgreSQL backend databases (via Entity Framework Core). They used Nginx to load balance between multiple containers on a server and HAProxy load balancers between their two main servers. The Docker containers are managed manually or via Ansible integration for CI/CD (using Bamboo).

Although the new Web API worked well functionally, load tests began failing with only a few hundred concurrent users. Based on current user load and projected growth, they wanted the web API to support at least 2,000 concurrent users. Load testing was done using Visual Studio Team Services load tests running a combination of web tests mimicking users logging in, doing the stuff of their business, activating tasks in their application, as well as pings that the Mobile App's client makes regularly to check for backend connectivity. This customer also uses New Relic for application telemetry and, until recently, New Relic agents did not work with .NET Core 2.1. Because of this, there was unfortunately no app diagnostic information to help pinpoint sources of slowdowns.

Lessons Learned

Cross-Platform Investigations

One of the most interesting takeaways for me was not the specific performance issues encountered but, instead, the challenges this customer had working in a Linux environment. The team's developers are experienced with ASP.NET on Windows and comfortable debugging in Visual Studio. Despite this, the move to Linux containers has been challenging for them.

Because the engineers were unfamiliar with Linux, they hired a consultant to help deploy their Docker containers on Linux servers. This model worked to get the site deployed and running, but became a problem when the main backend began exhibiting performance issues. The performance problems only manifested themselves under a fairly heavy load, such that they could not be reproduced on a dev machine. Up until this investigation, the developers had never debugged on Linux or inside of a Docker container except when launching in a local container from Visual Studio with F5. They had no idea how to even begin diagnosing issues that only reproduced in their staging or production environments. Similarly, their dev-ops consultant was knowledgeable about Linux infrastructure but not familiar with application debugging or profiling tools like Visual Studio.

The ASP.NET team has some documentation on using PerfCollect and PerfView to gather cross-platform diagnostics, but the customer's devs did not manage to find these docs until they were pointed out. Once an ASP.NET Core team engineer spent a morning showing them how to use PerfCollect, LLDB, and other cross-platform debugging and performance profiling tools, they were able to make some serious headway debugging on their own. We want to make sure everyone can debug .NET Core on Linux with LLDB/SOS or remotely with Visual Studio as easily as possible.
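If you've never used PerfCollect, the basic flow on the Linux box looks roughly like this (a sketch - see the official .NET Core performance tracing docs for the details):

curl -OL https://aka.ms/perfcollect
chmod +x perfcollect
sudo ./perfcollect install

export COMPlus_PerfMapEnabled=1    # set in the app's environment before the process starts

sudo ./perfcollect collect sampleTrace    # Ctrl+C to stop; open the .trace.zip in PerfView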

The ASP.NET Core team now believes they need more documentation on how to diagnose issues in non-Windows environments (including Docker) and the documentation that already exists needs to be more discoverable. Important topics to make discoverable include PerfCollect, PerfView, debugging on Linux using LLDB and SOS, and possibly remote debugging with Visual Studio over SSH.

Issues in Web API Code

Once we gathered diagnostics, most of the perf issues ended up being common problems in the customer’s code. 

  1. The largest contributor to the app’s slowdown was frequent Generation 2 (Gen 2) GCs (Garbage Collections), which were happening because a commonly-used code path was downloading a lot of images (product images), converting those bytes into base64 strings, responding to the client with those strings, and then discarding the byte[] and string. The images were fairly large (>100 KB), so every time one was downloaded a large byte[] and string had to be allocated. Because many of the images were shared between multiple clients, we solved the issue by caching the base64 strings for a short period of time using IMemoryCache (see the first sketch after this list).
  2. HttpClient Pooling with HttpClientFactory
    1. When calling out to Web APIs there was a pattern of creating new HttpClient instances rather than using IHttpClientFactory to pool the clients (see the second sketch after this list).
    2. Despite implementing IDisposable, it is not a best practice to dispose HttpClient instances as soon as they’re out of scope as they will leave their socket connection in a TIME_WAIT state for some time after being disposed. Instead, HttpClient instances should be re-used.
  3. Additional investigation showed that much of the application’s time was spent querying PostgreSQL for data (as is common). There were several underlying issues here.
    1. Database queries were being made in a blocking way instead of being asynchronous. We helped address the most common call-sites and pointed the customer at the AsyncUsageAnalyzer to identify other async cleanup that could help.
    2. Database connection pooling was not enabled. It is enabled by default for SQL Server, but not for PostgreSQL.
      1. We re-enabled database connection pooling. It was necessary to have different pooling settings for the common database (used by all requests) and the individual shard databases, which are used less frequently. While the common database needs a large pool, the shard connection pools need to be small to avoid having too many open, idle connections (see the connection-string sketch after this list).
    3. The Web API had a fairly ‘chatty’ interface with the database and made a lot of small queries. We re-worked this interface to make fewer calls (by querying more data at once or by caching for short periods of time).
  4. There was also some impact from having other background worker containers on the web API’s servers consuming large amounts of CPU. This led to a ‘noisy neighbor’ problem where the web API containers didn’t have enough CPU time for their work. We showed the customer how to address this with Docker resource constraints.
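To make the caching fix from item 1 concrete, here's a minimal sketch (the download helper and the five-minute expiration are my own stand-ins, not the customer's actual code):

using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

public class ProductImageService
{
    private readonly IMemoryCache _cache;

    public ProductImageService(IMemoryCache cache) => _cache = cache;

    public Task<string> GetImageBase64Async(string imageId) =>
        // Cache the converted string briefly so a shared image is
        // allocated (large byte[] + string) once, not once per request.
        _cache.GetOrCreateAsync("img:" + imageId, async entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            byte[] bytes = await DownloadImageAsync(imageId);
            return Convert.ToBase64String(bytes);
        });

    // Hypothetical stand-in for the real image download
    private static Task<byte[]> DownloadImageAsync(string imageId) =>
        Task.FromResult(Array.Empty<byte>());
}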
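For item 2, the fix is the standard ASP.NET Core 2.1 pattern: register a named client once and let the factory pool the underlying handlers. A sketch (the "backend" name and URL are made up):

// Requires: using System; using System.Net.Http; using System.Threading.Tasks;

// In Startup.ConfigureServices:
services.AddHttpClient("backend", client =>
{
    client.BaseAddress = new Uri("https://backend.example.com/");
});

// In a consumer, inject IHttpClientFactory instead of new-ing up HttpClient:
public class BackendGateway
{
    private readonly IHttpClientFactory _clientFactory;

    public BackendGateway(IHttpClientFactory clientFactory) => _clientFactory = clientFactory;

    public Task<string> GetStatusAsync()
    {
        // Cheap to create; the pooled handler underneath avoids the TIME_WAIT pileup
        HttpClient client = _clientFactory.CreateClient("backend");
        return client.GetStringAsync("api/status");
    }
}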
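And for item 3's pooling settings, Npgsql takes those in the connection string - something along these lines (illustrative numbers, not the customer's actual settings):

// Common database: hit by every request, so allow a big pool
"Host=common-db;Database=app;Username=app;Password=...;Maximum Pool Size=100"

// Shard databases: touched rarely, so keep the pools small
"Host=shard-db-7;Database=tenant7;Username=app;Password=...;Maximum Pool Size=5"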

Wrap Up

As shown in the graph below, at the end of our performance tuning, their backend was easily able to handle 3,000 concurrent users and they are now using their ASP.NET Core solution in production. The performance issues they saw overlapped a lot with those we’ve seen from other customers (especially the need for caching and for async calls), but proved to be extra challenging for the developers to diagnose due to the lack of familiarity with Linux and Docker environments.

[Charts: performance, errors, throughput, and test results all look good - up and to the right]

Some key areas of focus uncovered by this investigation were:

  • Being mindful of memory allocations to minimize GC pause times

  • Keeping long-running calls non-blocking/asynchronous

  • Minimizing calls to external resources (such as other web services or the database) with caching and grouping of requests

Hope you find this useful! Big thanks to Mike Rousos from the ASP.NET Core team for his work and analysis!



New prescriptive guidance for Open Source .NET Library Authors

October 16, '18 - Posted in DotNetCore | Open Source

There's a great new bunch of guidance just published representing Best Practices for creating .NET Libraries. Best of all, it was shepherded by JSON.NET's James Newton-King. Who better to help explain the best way to build and publish a .NET library than the author of the world's most popular open source .NET library?

Perhaps you've got an open source (OSS) .NET Library on your GitHub, GitLab, or Bitbucket. Go check out the open-source library guidance.

These are the identified aspects of high-quality open-source .NET libraries:

  • Inclusive - Good .NET libraries strive to support many platforms and applications.
  • Stable - Good .NET libraries coexist in the .NET ecosystem, running in applications built with many libraries.
  • Designed to evolve - .NET libraries should improve and evolve over time, while supporting existing users.
  • Debuggable - .NET libraries should use the latest tools to create a great debugging experience for users.
  • Trusted - .NET libraries have developers' trust by publishing to NuGet using security best practices.

The guidance is deep but also preliminary. As with all Microsoft Documentation these days it's open source in Markdown and on GitHub. If you've got suggestions or thoughts, share them! Be sure to sound off in the Feedback Section at the bottom of the guidance. James and the Team will be actively incorporating your thoughts.

Cross-platform targeting

Since the whole point of .NET Core and the .NET Standard is reuse, this section covers how and why to make reusable code but also how to access platform-specific APIs when needed with multi-targeting.
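For example, multi-targeting a library is one csproj property (note the plural, TargetFrameworks) plus conditional ItemGroups for platform-specific references. A minimal sketch:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFrameworks>netstandard2.0;net471</TargetFrameworks>
  </PropertyGroup>
  <!-- References that only apply when building the .NET Framework target -->
  <ItemGroup Condition="'$(TargetFramework)' == 'net471'">
    <Reference Include="System.Drawing" />
  </ItemGroup>
</Project>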

Strong naming

Strong naming seemed like a good idea, but you should know WHY and WHEN to strong name. It all depends on your use case! Are you publishing internally or publicly? What are your dependencies and who depends on you?

NuGet

When publishing on the NuGet public repository (or your own private/internal one) what do you need to know about SemVer 2.0.0? What about pre-release packages? Should you embed PDBs for easier debugging? Consider things like Dependencies, SourceLink, how and where to Publish and how Versioning applies to you and when (or if) you cause Breaking changes.
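Much of that ends up as metadata in your csproj. A hedged sketch (the SourceLink package name is real, but double-check the current version before copying this):

<PropertyGroup>
  <!-- A SemVer 2.0.0 pre-release version -->
  <Version>1.2.0-beta.1</Version>
  <!-- Stamp the repository URL into the package for SourceLink -->
  <PublishRepositoryUrl>true</PublishRepositoryUrl>
</PropertyGroup>
<ItemGroup>
  <PackageReference Include="Microsoft.SourceLink.GitHub" Version="1.0.0" PrivateAssets="All" />
</ItemGroup>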

Also be sure to check out Immo's video on "Building Great Libraries with .NET Standard" on YouTube!



C# and .NET Core scripting with the "dotnet-script" global tool

October 12, '18 - Posted in DotNetCore | Open Source

You likely know that open source .NET Core is cross platform and it's super easy to do "Hello World" and start writing some code.

You just install .NET Core, then "dotnet new console" will generate a project file and a basic app, and "dotnet run" will compile and run it. The 'new' command creates all the supporting code, obj, and bin folders, etc. When you do "dotnet run" it's actually a combination of "dotnet build" and "dotnet exec whatever.dll".
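Spelled out, the whole flow is just this (nothing exotic - standard dotnet CLI):

dotnet new console -o hello
cd hello
dotnet run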

What could be easier?

What about .NET Core as scripting?

Check out dotnet script:

C:\Users\scott\Desktop\scriptie> dotnet tool install -g dotnet-script
You can invoke the tool using the following command: dotnet-script
C:\Users\scott\Desktop\scriptie>copy con helloworld.csx
Console.WriteLine("Hello world!");
^Z
1 file(s) copied.
C:\Users\scott\Desktop\scriptie>dotnet script helloworld.csx
Hello world!

NOTE: I was a little tricky there in step two. I did a "copy con filename" to copy from the console to the destination file, then used Ctrl-Z to finish the copy. Feel free to just use notepad or vim. That's not dotnet-script-specific, that's Hanselman-specific.

Pretty cool eh? If you're doing this on Linux or OSX you'll need to include a "shebang" as the first line of the script. This is a standard thing for scripting files like bash, python, etc.

#!/usr/bin/env dotnet-script
Console.WriteLine("Hello world");

This lets the operating system know what scripting engine handles this file.

If you want to refer to a NuGet package within a script (*.csx) file, you'll use the Roslyn #r syntax:

#r "nuget: AutoMapper, 6.1.0"
Console.WriteLine("whatever);

Even better! Once you have "dotnet-script" installed as a global tool as above:

dotnet tool install -g dotnet-script

You can use it as a REPL! Finally, the C# REPL (Read Evaluate Print Loop) I've been asking for, for only a decade! ;)

C:\Users\scott\Desktop\scriptie>dotnet script
> 2+2
4
> var x = "scott hanselman";
> x.ToUpper()
"SCOTT HANSELMAN"

This is super useful as a learning tool if you're teaching C# in a lab/workshop situation. Of course you could also learn using http://try.dot.net in the browser as well.

In the past you may have used ScriptCS for C# scripting. There are a number of cool C#/F# scripting options; this is certainly not a new thing.

In this case, I was very impressed with the ease and simplicity of dotnet-script as a global tool. Go check out https://github.com/filipw/dotnet-script and try it out today!



Headless CMS and Decoupled CMS in .NET Core

October 3, '18 - Posted in DotNetCore

I'm sure I'll miss some, so if I do, please sound off in the comments and I'll update this post over the next week or so!

Lately I've been noticing a lot of "Headless" CMSs (Content Management System). A ton, in fact. I wanted to explore this concept and see if it's a fad or if it's really something useful.

With the rise of clean RESTful APIs has come the rise of headless CMS systems. We've all evaluated CMS systems (ones that included both front- and back-ends) and found the front-end wanting. Perhaps it lacks flexibility OR it's way too flexible and overwhelming. In fact, when I wrote my podcast website I considered a CMS but decided it felt too heavy for just a small site.

A Headless CMS is a back-end only content management system (CMS) built from the ground up as a content repository that makes content accessible via a RESTful API for display on any device.

I could start with a database, but what if I started with a CMS that was just a backend - a headless CMS? I'll handle the front end, and it'll handle the persistence.

Here's what I found when exploring .NET Core-based Headless CMSs. One thing worth noting is that, given Docker containers and the ease with which we can deploy hybrid systems, some of these solutions have .NET Core front-ends and "who cares, it returns JSON" back-ends!

Lynicon

Lynicon is literally implemented as a NuGet library! It stores its data as structured JSON. It's built on top of ASP.NET Core and uses MVC concepts and architecture.

It does include a front-end for administration but it's not required. It will return HTML or JSON depending on what HTTP headers are sent in. This means you can easily use it as the back-end for your Angular or existing SPA apps.

Lynicon is largely open source at https://github.com/jamesej/lyniconanc. If you want to take it to the next level, there's a small fee that gives you updated searching, publishing, and caching modules.

ButterCMS

ButterCMS is an API-based CMS that seamlessly integrates with ASP.NET applications. It has an SDK that drops into ASP.NET Core and also returns data as JSON. Pulling the data out and showing it in a view is easy.

// Assumes: using System.Threading.Tasks; using Microsoft.AspNetCore.Mvc;
//          using ButterCMS; using Newtonsoft.Json;
public class CaseStudyController : Controller
{
    private ButterCMSClient Client;

    private static string _apiToken = "";

    public CaseStudyController()
    {
        Client = new ButterCMSClient(_apiToken);
    }

    [Route("customers/{slug}")]
    public async Task<ActionResult> ShowCaseStudy(string slug)
    {
        var json = await Client.ListPageAsync("customer_case_study", slug);
        dynamic page = ((dynamic)JsonConvert.DeserializeObject(json)).data.fields;
        ViewBag.SeoTitle = page.seo_title;
        ViewBag.FacebookTitle = page.facebook_open_graph_title;
        ViewBag.Headline = page.headline;
        ViewBag.CustomerLogo = page.customer_logo;
        ViewBag.Testimonial = page.testimonial;
        return View("Location");
    } 
}

Then of course output into Razor (or putting all of this into a RazorPage) is simple:

<html>
  <head>
    <title>@ViewBag.SeoTitle</title>
    <meta property="og:title" content="@ViewBag.FacebookTitle" /> 
  </head>
  <body>
    <h1>@ViewBag.Headline</h1>
    <img width="100%" src="@ViewBag.CustomerLogo">
    <p>@ViewBag.Testimonial</p>
  </body>
</html>

Butter is a little different (and somewhat unusual) in that their backend API is a SaaS (Software as a Service) and they host it. They then have SDKs for lots of platforms including .NET Core. The backend is not open source, while the front-end is: https://github.com/ButterCMS/buttercms-csharp.

Piranha CMS

Piranha CMS is built on ASP.NET Core and is open source on GitHub. It's also totally package-based using NuGet and can be easily started up with a dotnet new template like this:

dotnet new -i Piranha.BasicWeb.CSharp
dotnet new piranha
dotnet restore
dotnet run

It even includes a new Blog template that includes Bootstrap 4.0 and is all set for customization. It includes an optional lightweight front-end, but you can also use that as a guideline for creating your own client code. One nice touch is that Piranha also includes image resizing and cropping.

Umbraco Headless

The main ASP.NET website currently uses Umbraco as its CMS. Umbraco is a well-known open source CMS that will soon include a Headless option for more flexibility. The open source code for Umbraco is up here https://github.com/umbraco.

Orchard Core

Orchard is a CMS with a very strong community and fantastic documentation. Orchard Core is a redevelopment of Orchard using open source ASP.NET Core. While it's not "headless" it is using a Decoupled Architecture. Nothing would prevent you from removing the UI and presenting the content with your own front-end. It's also cross-platform and container friendly.

Squidex

"Squidex is an open source headless CMS and content management hub. In contrast to a traditional CMS Squidex provides a rich API with OData filter and Swagger definitions." Squidex is build with ASP.NET Core and the CQRS pattern and works with both Windows and Linux on today's browsers.

Squidex is open source with excellent docs at https://docs.squidex.io. They are also working on a hosted version you can play with at https://cloud.squidex.io. Samples on how to consume it are at https://github.com/Squidex/squidex-samples.

The consumption is super clean:

[Route("/{slug},{id}/")]
public async Task<IActionResult> Post(string slug, string id)
{
    var post = await apiClient.GetBlogPostAsync(id);

    var vm = new PostVM
    {
        Post = post
    };

    return View(vm);
}

And then the View:

@model PostVM
@{
    ViewData["Title"] = Model.Post.Data.Title;
}

<div>
    <h2>@Model.Post.Data.Title</h2>

    @Html.Raw(Model.Post.Data.Text)
</div>

What .NET Core Headless CMSs did I miss? Let me know.

This definitely isn't a fad. It makes a lot of sense to me architecturally. Given the proliferation of "backend as a service" systems and document DBs like Cosmos and Mongo, it follows that a headless CMS could easily fit into my systems. One less DB schema to think about, and no need to roll my own auth/auth.

*Photo "headless" by Wendy used under CC https://flic.kr/p/HkESxW

