Scott Hanselman

Automatically Signing a Windows EXE with Azure Trusted Signing, dotnet sign, and GitHub Actions

November 28, 2025 Posted in Azure | DotNetCore
Sponsored By

WindowsEdgeLight on a Surface

macOS Tahoe (in beta as of this writing) has a new feature called Edge Light that puts a bright border around your screen, using the power of OLED to give you a virtual ring light. So I was like, why can't we also have nice things? I wrote (vibed, with GitHub Copilot and Claude Sonnet 4.5) a Windows Edge Light app. The source code is at https://github.com/shanselman/WindowsEdgeLight and you can get the latest release at https://github.com/shanselman/WindowsEdgeLight/releases, or the app will check for new releases and autoupdate with Updatum.

However, as with all suss loose executables on the internet, when you run random stuff you'll often get the scary Windows Defender SmartScreen 'new phone, who dis' warning. After enough downloads and no viruses or complaints, my executable will eventually gain reputation with the Windows Defender SmartScreen service, but having a Code Signing Certificate is said to help with that. The catch: code signing certs are expensive and a hassle to manage and renew.

Someone told me that Azure Trusted Signing was somewhat less of a hassle - it's less, but it's still non-trivial. I read this post from Rick (his blog is gold and has been for years) earlier in the year and some of it was super useful and other stuff has been made simpler over time.

I wrote 80% of this blog post, but since I just spent an hour getting code signing to work and GitHub Copilot was going through and logging everything I did, I did use Claude 4.5 to help organize some of this. I have reviewed it all and re-written parts I didn't like, so any mistakes are mine.

Azure Trusted Signing is Microsoft's cloud-based code signing service. The highlights:

  • No hardware tokens - Everything happens in the cloud
  • Automatic certificate management - Certificates are issued and renewed automatically
  • GitHub Actions integration - Sign during your CI/CD pipeline. I used GH Actions.
  • Kinda Affordable - About $10/month for small projects. I would like it if this were $10 a year. This is cheaper than a yearly cert, but it'll add up after a while, so I'm always looking for cheaper/easier options.
  • Trusted by Windows - Uses the same certificate authority as Microsoft's own apps, so you should get your EXE trusted faster

Prerequisites

Before starting, you'll need:

  1. Azure subscription
  2. Azure CLI - Install from here
  3. Identity validation documents - Driver's license or passport for individual developers. Note that I'm in the US, so your mileage may vary but I basically set up the account, scanned a QR code, took a picture of my license, then did a selfie, then waited.
  4. Windows PC - For local signing (optional). I ended up using the dotnet sign tool.
  5. GitHub repository - For automated signing (optional)

Part 1: Setting Up Azure Trusted Signing

Step 1: Register the Resource Provider

First, I need to enable the Azure Trusted Signing service in my subscription. This can be done in the Portal, or at the CLI.

# Login to Azure
az login

# Register the Microsoft.CodeSigning resource provider
az provider register --namespace Microsoft.CodeSigning

# Wait for registration to complete (takes 2-3 minutes)
az provider show --namespace Microsoft.CodeSigning --query "registrationState"

Wait until the output shows "Registered".
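If you're scripting the setup end to end, that wait is easy to automate with a small polling loop. A hedged sketch: `wait_until_registered` is my own helper name, and it takes the state-reporting command as a string, so you'd pass it the real `az provider show` query from above.

```shell
# Poll a state-reporting command until it prints "Registered".
# Real usage (after `az login`):
#   wait_until_registered 'az provider show --namespace Microsoft.CodeSigning --query registrationState -o tsv'
wait_until_registered() {
  local query="$1" state=""
  while [ "$state" != "Registered" ]; do
    state="$(eval "$query")"
    [ "$state" = "Registered" ] || sleep 10
  done
  echo "Registered"
}
```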

Step 2: Create a Trusted Signing Account

Now create the actual signing account. You can do this via Azure Portal or CLI.

Option A: Azure Portal (Easier for first-timers)

  1. Go to Azure Portal
  2. Search for "Trusted Signing Accounts"
  3. Click Create
  4. Fill in:
    • Subscription: Your subscription
    • Resource Group: Create new or use existing (e.g., "MyAppSigning")
    • Account Name: A unique name (e.g., "myapp-signing")
    • Region: Choose closest to you (e.g., "West US 2")
    • SKU: Basic (sufficient for most apps)
  5. Click Review + Create, then Create

Option B: Azure CLI (Faster if you are a CLI person or like to drive stick shift)

# Create a resource group
az group create --name MyAppSigning --location westus2

# Create the Trusted Signing account
az trustedsigning create \
  --resource-group MyAppSigning \
  --account-name myapp-signing \
  --location westus2 \
  --sku-name Basic

Important: Note your region endpoint. Common ones are:

  • East US: https://eus.codesigning.azure.net/
  • West US 2: https://wus2.codesigning.azure.net/
  • Your specific region: Check in Azure Portal under your account's Overview page

I totally flaked on this and messed around for 10 min before I realized that this URL matters and is specific to your account. Remember this endpoint.
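The pattern, as far as I can tell, is `https://<region-code>.codesigning.azure.net/`. Assuming that pattern holds for your region (double-check against your account's Overview page), you can derive yours from the short region code:

```shell
# Assumption: endpoints follow https://<region-code>.codesigning.azure.net/
# Verify against the Overview page of your Trusted Signing account.
region_code="wus2"   # e.g. eus for East US, wus2 for West US 2
endpoint="https://${region_code}.codesigning.azure.net/"
echo "$endpoint"
```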

Step 3: Complete Identity Validation

This is the most important step. Microsoft needs to verify you're a real person/organization.

  1. In Azure Portal, go to your Trusted Signing Account
  2. Click Identity validation in the left menu
  3. Click Add identity validation
  4. Choose validation type:
    • Individual: For solo developers (uses driver's license/passport)
    • Organization: For companies (uses business registration documents)
  5. For Individual validation:
    • Upload a clear photo of your government-issued ID
    • Provide your full legal name (must match ID exactly)
    • Provide your email address
  6. Submit and wait for approval

Approval Time:

  • Individual: Usually 1-3 business days
  • Organization: 3-5 business days
  • Me: This took about 4 hours, so again, YMMV. I used my personal account and my personal Azure (don't trust MSFT folks with unlimited Azure credits, I pay for my own) so they didn't know it was me. I went through the regular line, not the Pre-check line LOL.

You'll receive an email when approved. You cannot sign any code until this is approved.

Step 4: Create a Certificate Profile

Once your identity is validated, create a certificate profile. This is what actually issues the signing certificates.

  1. In your Trusted Signing Account, click Certificate profiles
  2. Click Add certificate profile
  3. Fill in:
    • Profile name: Descriptive name (e.g., "MyAppProfile")
    • Profile type: Choose Public Trust (required to avoid SmartScreen warnings)
    • Identity validation: Select your approved identity
    • Certificate type: Code Signing
  4. Click Add

Important: Only "Public Trust" profiles prevent SmartScreen warnings. "Private Trust" is for internal apps only. This also took me a second to realize, as it's not an intuitive name.

Step 5: Verify Your Setup

# List your Trusted Signing accounts
az trustedsigning show \
  --resource-group MyAppSigning \
  --account-name myapp-signing

# Should show status: "Succeeded"

Write down these values - you'll need them later:

  • Account Name: myapp-signing
  • Certificate Profile Name: MyAppProfile
  • Endpoint URL: https://wus2.codesigning.azure.net/ (or your region)
  • Subscription ID: Found in Azure Portal
  • Resource Group: MyAppSigning
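Since I kept retyping these, one option is to stash them in environment variables for the later commands. The variable names here are my own convention, not anything the tools require:

```shell
# My own naming convention for the values collected above.
export TS_ACCOUNT="myapp-signing"
export TS_PROFILE="MyAppProfile"
export TS_ENDPOINT="https://wus2.codesigning.azure.net"   # your region's endpoint
export TS_RESOURCE_GROUP="MyAppSigning"
echo "$TS_ACCOUNT / $TS_PROFILE / $TS_ENDPOINT"
```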

Part 2: Local Code Signing

Now let's sign an executable on your local machine. You don't NEED to do this, but I wanted to try it locally to avoid a bunch of CI/CD runs, and I wanted to right-click the EXE and see the cert in Properties before I took it all to the cloud. The nice part was that I didn't need to mess with any certificate files.

Step 1: Assign Yourself the Signing Role

You need permission to actually use the signing service.

Option A: Azure Portal

  1. Go to your Trusted Signing Account
  2. Click Access control (IAM)
  3. Click Add → Add role assignment
  4. Search for and select Trusted Signing Certificate Profile Signer. This is important: I searched for "code" and found nothing. Search for "Trusted".
  5. Click Next
  6. Click Select members and find your user account
  7. Click Select, then Review + assign

Option B: Azure CLI

# Get your user object ID
$userId = az ad signed-in-user show --query id -o tsv

# Assign the role
az role assignment create \
  --role "Trusted Signing Certificate Profile Signer" \
  --assignee-object-id $userId \
  --scope /subscriptions/YOUR_SUBSCRIPTION_ID/resourceGroups/MyAppSigning/providers/Microsoft.CodeSigning/codeSigningAccounts/myapp-signing

Replace YOUR_SUBSCRIPTION_ID with your actual subscription ID.
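That `--scope` string is a magnet for typos, so here's a small sketch that assembles it from its parts; the resource type segment `Microsoft.CodeSigning/codeSigningAccounts` comes straight from the CLI example above:

```shell
# Compose the role-assignment scope; replace the placeholder subscription ID.
subscription_id="YOUR_SUBSCRIPTION_ID"
resource_group="MyAppSigning"
account_name="myapp-signing"
scope="/subscriptions/${subscription_id}/resourceGroups/${resource_group}/providers/Microsoft.CodeSigning/codeSigningAccounts/${account_name}"
echo "$scope"
```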

Step 2: Login with the Correct Scope

This is crucial - you need to login with the specific codesigning scope.

# Logout first to clear old tokens
az logout

# Login with codesigning scope
az login --use-device-code --scope "https://codesigning.azure.net/.default"

This will give you a code to enter at https://microsoft.com/devicelogin. Follow the prompts.

Why device code flow? Because Azure CLI's default authentication can conflict with Visual Studio credentials in my experience. Device code flow is more reliable for code signing.

Step 3: Download the Sign Tool

Option A: Install Globally (Recommended for regular use)

# Install as a global tool (available everywhere)
dotnet tool install --global --prerelease sign

# Verify installation
sign --version

Option B: Install Locally (Project-specific)

# Install to current directory
dotnet tool install --tool-path . --prerelease sign

# Use with .\sign.exe

Which should I use?

  • Global: If you'll sign multiple projects or sign frequently
  • Local: If you want to keep the tool with a specific project or don't want it in your PATH

Step 4: Sign Your Executable

Note again that the code signing URL is specific to your account. The tscp is your Trusted Signing Certificate Profile name and the tsa is your Trusted Signing Account name. I used *.exe to sign all the EXEs in the folder, and note that the -b base directory is an absolute path, not a relative one. For me it was d:\github\WindowsEdgeLight\publish; your mileage will vary.

# Navigate to your project folder
cd C:\MyProject

# Sign the executable
.\sign.exe code trusted-signing `
  -b "C:\MyProject\publish" `
  -tse "https://wus2.codesigning.azure.net" `
  -tscp "MyAppProfile" `
  -tsa "myapp-signing" `
  *.exe `
  -v Information

Parameters explained:

  • -b: Base directory containing files to sign
  • -tse: Trusted Signing endpoint (your region)
  • -tscp: Certificate profile name
  • -tsa: Trusted Signing account name
  • *.exe: Pattern to match files to sign
  • -v: Verbosity level (Trace, Information, Warning, Error)

Expected output:

info: Signing WindowsEdgeLight.exe succeeded.
Completed in 2743 ms.
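If you sign often, it's worth wrapping those flags in a small function. This is a hedged sketch in plain sh syntax (the PowerShell example above uses backticks instead of backslashes); `sign` stands in for the installed sign tool, and the endpoint/account/profile values are the examples from this post:

```shell
# Wrap the repetitive flags; pass the absolute path to your publish folder.
# usage: sign_all_exes "/d/github/WindowsEdgeLight/publish"
sign_all_exes() {
  sign code trusted-signing \
    -b "$1" \
    -tse "https://wus2.codesigning.azure.net" \
    -tscp "MyAppProfile" \
    -tsa "myapp-signing" \
    '*.exe' \
    -v Information
}
```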

Step 5: Verify the Signature

You can do this in PowerShell:

# Check the signature
Get-AuthenticodeSignature ".\publish\MyApp.exe" | Format-List

# Look for:
# Status: Valid
# SignerCertificate: CN=Your Name, O=Your Name, ...
# TimeStamperCertificate: Should be present

Right-click the EXE → Properties → Digital Signatures tab:

  • You should see your signature
  • "This digital signature is OK"

Common Local Signing Issues

I hit all of these lol

Issue: "Please run 'az login' to set up account"

  • Cause: Not logged in with the right scope
  • Fix: Run az logout then az login --use-device-code --scope "https://codesigning.azure.net/.default"

Issue: "403 Forbidden"

  • Cause: Wrong endpoint, account name, or missing permissions
  • Fix:
    • Verify endpoint matches your region (wus2, eus, etc.)
    • Verify account name is exact (case-sensitive)
    • Verify you have "Trusted Signing Certificate Profile Signer" role

Issue: "User account does not exist in tenant"

  • Cause: Azure CLI trying to use Visual Studio credentials
  • Fix: Use device code flow (see Step 2)

Part 3: Automated Signing with GitHub Actions

This is where the magic happens. I want to automatically sign every release. I'm using GitVersion so I just need to tag a commit and GitHub Actions will kick off a run. You can go look at a real run in detail at https://github.com/shanselman/WindowsEdgeLight/actions/runs/19775054123

Step 1: Create a Service Principal

GitHub Actions needs its own identity to sign code. We'll create a service principal (like a robot account). This is VERY different than your local signing setup.

Important: You need Owner or User Access Administrator role on your subscription to do this. If you don't have it, ask your Azure admin or a friend.

# Create service principal with signing permissions
az ad sp create-for-rbac \
  --name "MyAppGitHubActions" \
  --role "Trusted Signing Certificate Profile Signer" \
  --scopes /subscriptions/YOUR_SUBSCRIPTION_ID/resourceGroups/MyAppSigning/providers/Microsoft.CodeSigning/codeSigningAccounts/myapp-signing \
  --json-auth

This outputs JSON like this:

{
  "clientId": "12345678-1234-1234-1234-123456789abc",
  "clientSecret": "super-secret-value-abc123",
  "tenantId": "87654321-4321-4321-4321-cba987654321",
  "subscriptionId": "abcdef12-3456-7890-abcd-ef1234567890"
}

SAVE THESE VALUES IMMEDIATELY! You can't retrieve the clientSecret again. This is super important.

Alternative: Azure Portal Method

If CLI doesn't work:

  1. Azure Portal → App registrations → New registration
  2. Name: "MyAppGitHubActions"
  3. Click Register
  4. Copy the Application (client) ID - this is AZURE_CLIENT_ID
  5. Copy the Directory (tenant) ID - this is AZURE_TENANT_ID
  6. Go to Certificates & secrets → New client secret
  7. Description: "GitHub Actions"
  8. Expiration: 24 months (max)
  9. Click Add and immediately copy the Value - this is AZURE_CLIENT_SECRET
  10. Go to your Trusted Signing Account → Access control (IAM)
  11. Add role assignment → Trusted Signing Certificate Profile Signer
  12. Select members → Search for "MyAppGitHubActions"
  13. Review + assign

Step 2: Add GitHub Secrets

Go to your GitHub repository:

  1. Settings → Secrets and variables → Actions
  2. Click New repository secret for each:
  • AZURE_CLIENT_ID - From service principal output or App registration
  • AZURE_CLIENT_SECRET - From service principal output or Certificates & secrets
  • AZURE_TENANT_ID - From service principal output or App registration
  • AZURE_SUBSCRIPTION_ID - Azure Portal → Subscriptions

Security Note: These secrets are encrypted and never visible in logs. Only your workflow can access them. You'll never see them again.
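You can also set these from the terminal with the GitHub CLI instead of clicking through the web UI. `gh secret set` is real gh syntax; the tiny helper below is mine, and you'd paste in the real values from the service principal JSON:

```shell
# Set one repository secret; run inside a clone of your repo with gh logged in.
set_secret() { gh secret set "$1" --body "$2"; }

# Example calls (placeholder values - use your service principal output):
#   set_secret AZURE_CLIENT_ID       "12345678-1234-1234-1234-123456789abc"
#   set_secret AZURE_CLIENT_SECRET   "super-secret-value-abc123"
#   set_secret AZURE_TENANT_ID       "87654321-4321-4321-4321-cba987654321"
#   set_secret AZURE_SUBSCRIPTION_ID "abcdef12-3456-7890-abcd-ef1234567890"
```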

Step 3: Update Your GitHub Workflow

This is a little confusing as it's YAML, which is Satan's markup, but it's what we have sunk to as a society.

Note the dotnet-version below; yours might be 8 or 9, etc. I'm building both x64 and ARM versions and using GitVersion, so if you want a more complete build.yml, you can grab mine here: https://github.com/shanselman/WindowsEdgeLight/blob/master/.github/workflows/build.yml. I'm also zipping things up and prepping my releases, so my loose EXE lives in a ZIP file.

Add signing steps to your .github/workflows/build.yml:

name: Build and Sign

on:
  push:
    tags:
      - 'v*'
  workflow_dispatch:

permissions:
  contents: write

jobs:
  build:
    runs-on: windows-latest
    
    steps:
    - name: Checkout code
      uses: actions/checkout@v4
      with:
        fetch-depth: 0
      
    - name: Setup .NET
      uses: actions/setup-dotnet@v4
      with:
        dotnet-version: '10.0.x'
        
    - name: Restore dependencies
      run: dotnet restore MyApp/MyApp.csproj

    - name: Build
      run: |
        dotnet publish MyApp/MyApp.csproj `
          -c Release `
          -r win-x64 `
          --self-contained

    # === SIGNING STEPS START HERE ===
    
    - name: Azure Login
      uses: azure/login@v2
      with:
        creds: '{"clientId":"${{ secrets.AZURE_CLIENT_ID }}","clientSecret":"${{ secrets.AZURE_CLIENT_SECRET }}","subscriptionId":"${{ secrets.AZURE_SUBSCRIPTION_ID }}","tenantId":"${{ secrets.AZURE_TENANT_ID }}"}'

    - name: Sign executables with Trusted Signing
      uses: azure/trusted-signing-action@v0
      with:
        azure-tenant-id: ${{ secrets.AZURE_TENANT_ID }}
        azure-client-id: ${{ secrets.AZURE_CLIENT_ID }}
        azure-client-secret: ${{ secrets.AZURE_CLIENT_SECRET }}
        endpoint: https://wus2.codesigning.azure.net/
        trusted-signing-account-name: myapp-signing
        certificate-profile-name: MyAppProfile
        files-folder: ${{ github.workspace }}\MyApp\bin\Release\net10.0-windows\win-x64\publish
        files-folder-filter: exe
        files-folder-recurse: true
        file-digest: SHA256
        timestamp-rfc3161: http://timestamp.acs.microsoft.com
        timestamp-digest: SHA256
    
    # === SIGNING STEPS END HERE ===
        
    - name: Create Release
      if: startsWith(github.ref, 'refs/tags/')
      uses: softprops/action-gh-release@v2
      with:
        files: MyApp/bin/Release/net10.0-windows/win-x64/publish/MyApp.exe
      env:
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

Key points:

  • endpoint: Use YOUR region's endpoint (wus2, eus, etc.)
  • trusted-signing-account-name: Your account name (exact, case-sensitive)
  • certificate-profile-name: Your certificate profile name (exact, case-sensitive)
  • files-folder: Path to your compiled executables
  • files-folder-filter: File types to sign (exe, dll, etc.)
  • files-folder-recurse: Sign files in subfolders

Step 4: Test the Workflow

Now trigger the workflow. You have two options:

Option A: Manual Trigger (Safest for testing)

Since the workflow includes workflow_dispatch:, you can trigger it manually without creating a tag:

# Trigger manually via GitHub CLI
gh workflow run build.yml

# Or go to GitHub web UI:
# Actions tab → "Build and Sign" workflow → "Run workflow" button

This is ideal for testing because:

  • No tag required
  • Won't create a release
  • Can test multiple times
  • Easy to debug issues

Option B: Create a Tag (For actual releases)

# Make sure you're on your main branch with no uncommitted changes
git status

# Create and push a tag
git tag v1.0.0
git push origin v1.0.0

Use this when you're ready to create an actual release with signed binaries. This is what I am doing on my side.
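If a tagged run fails and you want to rebuild under the same version number, one hedged option is to move the tag: delete it locally and on origin, then re-create and push it. A small helper (my own convenience, using standard git commands):

```shell
# Delete a tag locally and on origin, then re-create and push it.
# usage: retag v1.0.0
retag() {
  local tag="$1"
  git tag -d "$tag"
  git push origin ":refs/tags/$tag"
  git tag "$tag"
  git push origin "$tag"
}
```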

Step 5: Monitor the Build

Watch the progress with GitHub CLI:

# See latest runs
gh run list --limit 5

# Watch a specific run
gh run watch

# View detailed status
gh run view --log

Or visit: https://github.com/YOUR_USERNAME/YOUR_REPO/actions

Look for:

  • Azure Login - Should complete in ~5 seconds
  • Sign executables with Trusted Signing - Should complete in ~10-30 seconds
  • Create Release - Your signed executable is now available in /releases in your GitHub project

Common GitHub Actions Issues

I hit a few of these, natch.

Issue: "403 Forbidden" during signing

  • Cause: Service principal doesn't have permissions
  • Fix:
    1. Go to Azure Portal → Trusted Signing Account → Access control (IAM)
    2. Verify "MyAppGitHubActions" has "Trusted Signing Certificate Profile Signer" role
    3. If not, add it manually

Issue: "No files matched the pattern"

  • Cause: Wrong files-folder path or build artifacts in wrong location
  • Fix:
    1. Add a debug step before signing: - run: Get-ChildItem -Recurse
    2. Find where your EXE is actually located
    3. Update files-folder to match

Issue: Secrets not working

  • Cause: Typo in secret name or value not saved
  • Fix:
    1. Verify secret names EXACTLY match (case-sensitive)
    2. Re-create secrets if unsure
    3. Make sure no extra spaces in values

Issue: "DefaultAzureCredential authentication failed"

  • Cause: Usually wrong tenant ID or client ID
  • Fix: Verify all 4 secrets are correct from service principal output

Part 4: Understanding the Certificate

Certificate Lifecycle

Azure Trusted Signing uses short-lived certificates (typically 3 days). This freaked me out but they say this is actually a security feature:

  • If a certificate is compromised, it expires quickly
  • You never manage certificate files or passwords
  • Automatic renewal - you don't have to do anything

But won't my signature break after 3 days?

No, it seems that's what timestamping is for. When you sign a file:

  1. Azure issues a 3-day certificate
  2. The file is signed with that certificate
  3. A timestamp server records "this file was signed on DATE"
  4. Even after the certificate expires, the signature remains valid because the timestamp proves it was signed when the certificate was valid

That's why both local and GitHub Actions signing include:

timestamp-rfc3161: http://timestamp.acs.microsoft.com

What the Certificate Contains

Your signed executable has a certificate with:

  • Subject: Your name (e.g., "CN=John Doe, O=John Doe, L=Seattle, S=Washington, C=US")
  • Issuer: Microsoft ID Verified CS EOC CA 01
  • Valid Dates: 3-day window
  • Key Size: 3072-bit RSA (very secure)
  • Enhanced Key Usage: Code Signing

Verify Certificate on Any Machine

# Using PowerShell
Get-AuthenticodeSignature "MyApp.exe" | Select-Object -ExpandProperty SignerCertificate | Format-List

# Using Windows UI
# Right-click EXE → Properties → Digital Signatures tab → Details → View Certificate

This whole thing took me about an hour to 75 minutes. It was detailed, but not deeply difficult. Misspellings, case-sensitivity, and a few account issues with Role-Based Access Control did slow me down. Hope this helps!

Used Resources

Written in November 2025 based on real-world implementation for WindowsEdgeLight. Your setup might vary slightly depending on Azure region and account type. Things change, be stoic.

About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.

facebook bluesky subscribe
About   Newsletter
Hosting By
Hosted on Linux using .NET in an Azure App Service

Webcam randomly pausing in OBS, Discord, and websites - LSVCam and TikTok Studio

October 09, 2024 Posted in Bugs

I use my webcam constantly for streaming and I'm pretty familiar with all the internals and how the camera model on Windows works. I also use OBS extensively, so I regularly use the OBS virtual camera and flow everything through Open Broadcasting Studio.

For my podcast, I use Zencastr which is a web-based app that talks to the webcam via the browser APIs. For YouTubes, I'll use Riverside or StreamYard, also webapps.

I've done this reliably for the last several years without any trouble. Yesterday, I started seeing the most weird thing and it was absolutely perplexing and almost destroyed the day. I started seeing regular pauses in my webcam stream but only in two instances.

  • The webcam would pause for 10-15 seconds every 90 or so seconds when accessing the webcam in a browser
  • I would see a long pause/hang in OBS when double clicking on my Video Source (Webcam) to view its properties

Micah initially said USB, but my USB bus and hubs have worked reliably for years. I thought something might have changed in my Elgato capture device, but that has also been rock solid for half a decade. Then I started exploring virtual cameras and looked in the Windows camera dialog under Settings for a list of all virtual cameras.

Interestingly, virtual cameras don't get listed under Cameras in Settings in Windows:

List of Cameras in Windows

From what I can tell, there's no user interface in Windows to list out all of your cameras, virtual or otherwise.

Here's a quick PowerShell script you can run to list anything 'connected' whose device name includes the string "Cam":

Get-CimInstance -Namespace root\cimv2 -ClassName Win32_PnPEntity |
    Where-Object { $_.Name -match 'Cam' } |
    Select-Object Name, Manufacturer, PNPDeviceID

and my output

Name                                       Manufacturer  PNPDeviceID
----                                       ------------  -----------
Cam Link 4K                                Microsoft     USB\VID_0FD9&PID_0066&MI_00\7&3768531A&0&0000
Digital Audio Interface (2- Cam Link 4K)   Microsoft     SWD\MMDEVAPI\{0.0.1.00000000}.{AF1690B6-CA2A-4AD3-AAFD-8DDEBB83DD4A}
Logitech StreamCam WinUSB                  Logitech      USB\VID_046D&PID_0893&MI_04\7&E36D0CF&0&0004
Logitech StreamCam (Generic USB Audio)                   USB\VID_046D&PID_0893&MI_02\7&E36D0CF&0&0002
Logitech StreamCam                         Logitech      USB\VID_046D&PID_0893&MI_00\7&E36D0CF&0&0000
Remote Desktop Camera Bus                  Microsoft     UMB\UMB\1&841921D&0&RDCAMERA_BUS
Cam Link 4K (Generic USB Audio)                          USB\VID_0FD9&PID_0066&MI_03\7&3768531A&0&0003
Windows Virtual Camera Device              Microsoft     SWD\VCAMDEVAPI\B486E21F1D4BC97087EA831093E840AD2177E046699EFBF62B27304F5CCAEF57

However, when I list out my cameras using JavaScript enumerateDevices() like this

// List all connected video input devices via the MediaDevices API.
async function listWebcams() {
  try {
    const devices = await navigator.mediaDevices.enumerateDevices();
    const webcams = devices.filter(device => device.kind === 'videoinput');

    if (webcams.length > 0) {
      console.log("Connected webcams:");
      webcams.forEach((webcam, index) => {
        console.log(`${index + 1}. ${webcam.label || `Camera ${index + 1}`}`);
      });
    } else {
      console.log("No webcams found.");
    }
  } catch (error) {
    console.error("Error accessing media devices:", error);
  }
}
listWebcams();

I would get:

Connected webcams:
test.html:11 1. Logitech StreamCam (046d:0893)
test.html:11 2. OBS Virtual Camera (Windows Virtual Camera)
test.html:11 3. Cam Link 4K (0fd9:0066)
test.html:11 4. LSVCam
test.html:11 5. OBS Virtual Camera

So, wait, what's LSVCam? And depending on how I called it, I'd get the pause and

getUserMedia error: NotReadableError NotReadableError: Could not start video source

Some apps could see this LSVCam and others couldn't. OBS really dislikes it, browsers really dislike it and it seemed to HANG on enumeration of cameras. Why can parts of Windows see this camera and others can't?

I don't know. Do you?

Regardless, it turns out that it appears once in my registry, here (this is a dump of the key; you just care about the registry path):

Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\CLSID\{860BB310-5D01-11d0-BD3B-00A0C911CE86}\Instance\LSVCam]
"FriendlyName"="LSVCam"
"CLSID"="{BA80C4AD-8AED-4A61-B434-481D46216E45}"
"FilterData"=hex:02,00,00,00,00,00,20,00,01,00,00,00,00,00,00,00,30,70,69,33,\
08,00,00,00,00,00,00,00,01,00,00,00,00,00,00,00,00,00,00,00,30,74,79,33,00,\
00,00,00,38,00,00,00,48,00,00,00,76,69,64,73,00,00,10,00,80,00,00,aa,00,38,\
9b,71,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00

If you want to get rid of it, delete HKEY_CLASSES_ROOT\CLSID\{860BB310-5D01-11d0-BD3B-00A0C911CE86}\Instance\LSVCam

WARNING: DO NOT delete the \Instance, just the LSVCam and below. I am a random person on the internet and you got here by googling, so if you mess up your machine by going into RegEdit.exe, I'm sorry to this man, but it's above me now.
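If you'd rather script the deletion than click through RegEdit each time, reg.exe can do it (a Windows-only sketch; the helper name is mine, the `reg delete ... /f` syntax is real, and you may need an elevated prompt). The same warning applies: delete only the LSVCam subkey, never the parent \Instance key.

```shell
# Windows-only: delete the stray LSVCam DirectShow filter registration.
# /f suppresses the confirmation prompt.
delete_lsvcam() {
  reg delete 'HKCR\CLSID\{860BB310-5D01-11d0-BD3B-00A0C911CE86}\Instance\LSVCam' /f
}
# usage: delete_lsvcam
```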

Where did LSVCam.dll come from, you may ask? TikTok Live Studio, baby. Live Studio Video/Virtual Cam, I am guessing.

Directory of C:\Program Files\TikTok LIVE Studio\0.67.2\resources\app\electron\sdk\lib\MediaSDK_V1

09/18/2024 09:20 PM 218,984 LSVCam.dll
1 File(s) 218,984 bytes

This is a regression that started recently for me, so my theory is that they're installing a virtual camera for their game streaming feature but doing it poorly. It's either not completely installed or hangs on enumeration, but the result is that you'll see hangs on camera enumeration in your apps, especially browser apps that poll for camera changes or check on a timer.

Nothing bad will happen if you delete the registry key BUT it'll show back up when you run TikTok Studio again. I still stream to TikTok, I just delete this key each time until someone on the TikTok Studio development team sees this blog post.

Hope this helps!


Open Sourcing DOS 4

April 25, 2024 Posted in Open Source

Beta DOS Disks

See the canonical version of this blog post at the Microsoft Open Source Blog!

Ten years ago, Microsoft released the source for MS-DOS 1.25 and 2.0 to the Computer History Museum, and then later republished them for reference purposes. This code holds an important place in history and is a fascinating read of an operating system that was written entirely in 8086 assembly code nearly 45 years ago.

Today, in partnership with IBM and in the spirit of open innovation, we're releasing the source code to MS-DOS 4.00 under the MIT license. There's a somewhat complex and fascinating history behind the 4.0 versions of DOS, as Microsoft partnered with IBM for portions of the code but also created a branch of DOS called Multitasking DOS that did not see a wide release.

https://github.com/microsoft/MS-DOS

A young English researcher named Connor "Starfrost" Hyde recently corresponded with former Microsoft Chief Technical Officer Ray Ozzie about some of the software in his collection. Amongst the floppies, Ray found unreleased beta binaries of DOS 4.0 that he was sent while he was at Lotus. Starfrost reached out to the Microsoft Open Source Programs Office (OSPO) to explore releasing DOS 4 source, as he is working on documenting the relationship between DOS 4, MT-DOS, and what would eventually become OS/2. Some later versions of these Multitasking DOS binaries can be found around the internet, but these new Ozzie beta binaries appear to be much earlier, unreleased, and also include the ibmbio.com source. 

Scott Hanselman, with the help of internet archivist and enthusiast Jeff Sponaugle, has imaged these original disks and carefully scanned the original printed documents from this "Ozzie Drop". Microsoft, along with our friends at IBM, think this is a fascinating piece of operating system history worth sharing. 

Jeff Wilcox and OSPO went to the Microsoft Archives, and while they were unable to find the full source code for MT-DOS, they did find MS DOS 4.00, which we're releasing today, alongside these additional beta binaries, PDFs of the documentation, and disk images. We will continue to explore the archives and may update this release if more is discovered. 

Thank you to Ray Ozzie, Starfrost, Jeff Sponaugle, Larry Osterman, our friends at the IBM OSPO, as well as the makers of digital archeology software including, but not limited to, Greaseweazle, Fluxengine, Aaru Data Preservation Suite, and the HxC Floppy Emulator. Above all, thank you to the original authors of this code, some of whom still work at Microsoft and IBM today!

If you'd like to run this software yourself and explore, we have successfully run it directly on an original IBM PC XT, a newer Pentium, and within the open source PCem and 86box emulators. 


Updating to .NET 8, updating to IHostBuilder, and running Playwright Tests within NUnit headless or headed on any OS

March 07, 2024 Posted in ASP.NET | DotNetCore

All the Unit Tests pass

I've been doing not just unit testing for my sites but full-on integration testing and browser automation testing since as early as 2007 with Selenium. Lately, however, I've been using the faster and generally more compatible Playwright. It has one API and can test on Windows, Linux, and Mac, locally, in a container (headless), in my CI/CD pipeline, on Azure DevOps, or in GitHub Actions.

For me, it's that last moment of truth to make sure that the site runs completely from end to end.

I can write those Playwright tests in something like TypeScript, and I could launch them with node, but I like running them as unit tests, using that test runner and test harness as my jumping-off point for my .NET applications. I'm used to right-clicking and "run unit tests" or, even better, right-clicking and "debug unit tests" in Visual Studio or VS Code. This gets me the benefit of all of the assertions of a full unit testing framework, and all the benefits of using something like Playwright to automate my browser.

In 2018 I was using WebApplicationFactory and some tricky hacks to basically spin up ASP.NET Core (2.1 at the time) within the unit tests and then launch Selenium. This was kind of janky and required me to manually start a separate process and manage its lifecycle. However, I kept on with this hack for a number of years, basically trying to get the Kestrel Web Server to spin up inside of my unit tests.

I've recently upgraded my main site and podcast site to .NET 8. Keep in mind that I've been moving my websites forward from early early versions of .NET to the most recent versions. The blog is happily running on Linux in a container on .NET 8, but its original code started in 2002 on .NET 1.1.

Now that I'm on .NET 8, I scandalously discovered (as my unit tests stopped working) that the rest of the world had moved from IWebHostBuilder to IHostBuilder five versions of .NET ago. Gulp. Say what you will, but the backward compatibility is impressive.

As such, my code for Program.cs changed from this:

public static void Main(string[] args)
{
    CreateWebHostBuilder(args).Build().Run();
}

public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        .UseStartup<Startup>();

to this:

public static void Main(string[] args)
{
    CreateHostBuilder(args).Build().Run();
}

public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webHostBuilder => webHostBuilder.UseStartup<Startup>());

Not a major change on the outside but tidies things up on the inside and sets me up with a more flexible generic host for my web app.
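As a hedged aside (this is not code from my site): .NET 6 and later also offer the minimal hosting model, which collapses Program.cs and Startup.cs into a single file entirely. I'm keeping my Startup class since the generic host supports it fine, but a rough sketch of that shape looks like this:

```csharp
var builder = WebApplication.CreateBuilder(args);

// service registration, formerly Startup.ConfigureServices
builder.Services.AddControllersWithViews();

var app = builder.Build();

// middleware pipeline, formerly Startup.Configure
app.UseStaticFiles();
app.MapDefaultControllerRoute();

app.Run();
```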

My unit tests stopped working because my Kestrel Web Server hack was no longer firing up my server.

Here is an example of my goal from a Playwright perspective within a .NET NUnit test.

[Test]
public async Task DoesSearchWork()
{
    await Page.GotoAsync(Url);

    await Page.Locator("#topbar").GetByRole(AriaRole.Link, new() { Name = "episodes" }).ClickAsync();

    await Page.GetByPlaceholder("search and filter").ClickAsync();
    await Page.GetByPlaceholder("search and filter").TypeAsync("wife");

    const string visibleCards = ".showCard:visible";
    var waiting = await Page.WaitForSelectorAsync(visibleCards, new PageWaitForSelectorOptions() { Timeout = 500 });

    await Expect(Page.Locator(visibleCards).First).ToBeVisibleAsync();
    await Expect(Page.Locator(visibleCards)).ToHaveCountAsync(5);
}

I love this. Nice and clean. Certainly here we are assuming that we have a URL in that first line, which will be localhost something, and then we assume that our web application has started up on its own.

Here is the setup code that starts my new "web application test builder factory." Yeah, the name is stupid, but it's descriptive. Note the OneTimeSetUp and the OneTimeTearDown. This starts my web app within the context of my TestHost. Note that the :0 makes the app find a free port, which I then, sadly, have to dig out and put into the Url private field for use within my unit tests. Note that <Startup> is in fact my Startup class within Startup.cs, which hosts my app's pipeline; Configure and ConfigureServices get set up here so routing all works.

private string Url;
private WebApplication? _app = null;

[OneTimeSetUp]
public void Setup()
{
    var builder = WebApplicationTestBuilderFactory.CreateBuilder<Startup>();

    var startup = new Startup(builder.Environment);
    // listen on any local port (hence the 0)
    builder.WebHost.ConfigureKestrel(o => o.Listen(IPAddress.Loopback, 0));
    startup.ConfigureServices(builder.Services);
    _app = builder.Build();

    startup.Configure(_app, _app.Configuration);
    _app.Start();

    //you are kidding me
    Url = _app.Services.GetRequiredService<IServer>().Features.GetRequiredFeature<IServerAddressesFeature>().Addresses.Last();
}

[OneTimeTearDown]
public async Task TearDown()
{
    await _app.DisposeAsync();
}

So what horrors are buried in WebApplicationTestBuilderFactory? The first bit is bad and we should fix it for .NET 9. The rest is actually very nice, with a hat tip to David Fowler for his help and guidance! This is the magic and the ick in one small helper class.

public class WebApplicationTestBuilderFactory
{
    public static WebApplicationBuilder CreateBuilder<T>() where T : class
    {
        // This ungodly code requires an unused reference to the MvcTesting package that hooks up
        // MSBuild to create the manifest file that is read here.
        var testLocation = Path.Combine(AppContext.BaseDirectory, "MvcTestingAppManifest.json");
        var json = JsonObject.Parse(File.ReadAllText(testLocation));
        var asmFullName = typeof(T).Assembly.FullName ?? throw new InvalidOperationException("Assembly Full Name is null");
        var contentRootPath = json?[asmFullName]?.GetValue<string>();

        // spin up a real live web application inside TestHost.exe
        var builder = WebApplication.CreateBuilder(
            new WebApplicationOptions()
            {
                ContentRootPath = contentRootPath,
                ApplicationName = asmFullName
            });
        return builder;
    }
}

The first four lines are nasty. Because the test runs in the context of a different directory, and my website needs to run within the context of its own content root path, I have to force the content root path to be correct, and the only way to do that is by reading it from a manifest file generated within MSBuild by the (aging) MvcTesting package. The package isn't otherwise used, but referencing it pulls it into the build and generates the file that I then use to pull out the directory.

If we can get rid of that "hack" and pull the directory from context elsewhere, then this helper function turns into a single line and .NET 9 gets WAY WAY more testable!
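To be concrete about that wish (this is a hypothetical sketch, not a real .NET 9 API): if WebApplication.CreateBuilder could infer the content root from the target assembly, the helper would collapse to roughly:

```csharp
public static WebApplicationBuilder CreateBuilder<T>() where T : class =>
    WebApplication.CreateBuilder(new WebApplicationOptions()
    {
        // hypothetical: the runtime resolves ContentRootPath from typeof(T).Assembly,
        // so the MvcTestingAppManifest.json lookup disappears entirely
        ApplicationName = typeof(T).Assembly.FullName
    });
```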

Now I can run my Unit Tests AND Playwright Browser Integration Tests across all OS's, headed or headless, in docker or on the metal. The site is updated to .NET 8 and all is right with my code. Well, it runs at least. ;)


Using WSL and Let's Encrypt to create Azure App Service SSL Wildcard Certificates

June 27, 2023 Comment on this post [3] Posted in Azure
Sponsored By

There are many Let's Encrypt automation tools for Azure, but I also wanted to see if I could use certbot in WSL to generate a wildcard certificate for the Azure Friday website and then upload the resulting certificates to Azure App Service.

Azure App Service ultimately needs a specific format, a .PFX file, that includes the full certificate chain and all intermediates.

Per the docs, App Service private certificates must meet the following requirements:

  • Exported as a password-protected PFX file, encrypted using triple DES.
  • Contains a private key at least 2048 bits long.
  • Contains all intermediate certificates and the root certificate in the certificate chain.

If you have a PFX that doesn't meet all these requirements, you can have Windows re-encrypt the file.
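As an alternative to the Windows import/export dance below, openssl can also inspect and re-encrypt a PFX directly. This is a sketch with placeholder file names and passwords, assuming openssl is on your path:

```shell
# Inspect the PFX: shows the key, certs, and the encryption algorithms in use
openssl pkcs12 -in mycert.pfx -info -noout -passin pass:PASSWORDHERE

# Unpack to PEM, then re-export forcing triple DES for the key and certs
openssl pkcs12 -in mycert.pfx -nodes -out temp.pem -passin pass:PASSWORDHERE
openssl pkcs12 -export -in temp.pem -out mycert-3des.pfx \
    -keypbe PBE-SHA1-3DES -certpbe PBE-SHA1-3DES -macalg sha1 \
    -passout pass:PASSWORDHERE
rm temp.pem
```

The -keypbe/-certpbe flags matter on newer openssl builds, which default to AES encryption that doesn't meet the triple-DES requirement above.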

I use WSL and certbot to create the cert, then I import/export in Windows and upload the resulting PFX.

Within WSL, install certbot:

sudo apt update
sudo apt install python3 python3-venv libaugeas0
sudo python3 -m venv /opt/certbot/
sudo /opt/certbot/bin/pip install --upgrade pip
sudo /opt/certbot/bin/pip install certbot

Then I generate the cert. You'll get a nice text UI from certbot and you'll update your DNS with a TXT record as a verification challenge. Note that these are two separate commands; make sure your email, domains, subdomains, and paths are correct before you run them.

sudo certbot certonly --manual --preferred-challenges=dns --email YOUR@EMAIL.COM \
    --server https://acme-v02.api.letsencrypt.org/directory \
    --agree-tos --manual-public-ip-logging-ok -d "azurefriday.com" -d "*.azurefriday.com"

sudo openssl pkcs12 -export -out AzureFriday2023.pfx \
    -inkey /etc/letsencrypt/live/azurefriday.com/privkey.pem \
    -in /etc/letsencrypt/live/azurefriday.com/fullchain.pem

I then copy the resulting file to my desktop (check your desktop path) so it's now in the Windows world.

sudo cp AzureFriday2023.pfx /mnt/c/Users/Scott/OneDrive/Desktop

Now from Windows, import the PFX, note the thumbprint and export that cert.

Import-PfxCertificate -FilePath "AzureFriday2023.pfx" -CertStoreLocation Cert:\LocalMachine\My `
    -Password (ConvertTo-SecureString -String 'PASSWORDHERE' -AsPlainText -Force) -Exportable

Export-PfxCertificate -Cert Microsoft.PowerShell.Security\Certificate::LocalMachine\My\597THISISTHETHUMBNAILCF1157B8CEBB7CA1 `
    -FilePath 'AzureFriday2023-fixed.pfx' -Password (ConvertTo-SecureString -String 'PASSWORDHERE' -AsPlainText -Force)

Then upload the cert to the Certificates section of your App Service, under Bring Your Own Cert.

Custom Domains in Azure App Service

Then under Custom Domains, click Update Binding and select the new cert (with the latest expiration date).


The next step is to make this even more automatic or select a more automated solution, but for now I'll worry about it again in September. It solved my expensive wildcard certificate issue.
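For the "more automatic" version, the Azure CLI can handle the upload and binding steps; this is a sketch, where the app name and resource group are placeholders:

```shell
# Upload the PFX and capture its thumbprint
thumbprint=$(az webapp config ssl upload \
    --name azurefriday --resource-group MyResourceGroup \
    --certificate-file AzureFriday2023-fixed.pfx \
    --certificate-password 'PASSWORDHERE' \
    --query thumbprint --output tsv)

# Bind the uploaded cert to the custom domain with SNI
az webapp config ssl bind \
    --name azurefriday --resource-group MyResourceGroup \
    --certificate-thumbprint "$thumbprint" --ssl-type SNI
```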


Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.