Async/await in Desktop Applications

async, WinForms, WPF, multi-threading

This is a transcript of a demonstration I gave internally in our company about async/await "challenges" in UI programming. You can find the accompanying repository on GitHub.

Demo App

The form contains three buttons which are intended to download content asynchronously and show it in a TextBox.

Cross-Thread issue 😦

private async void button1_Click(object sender, EventArgs e)
{
    // await a completed task => will continue synchronously
    await SomeFastAsyncOperation().ConfigureAwait(false);

    // await a slow task => will continue in another thread
    var t = await SomeSlowAsyncOperation().ConfigureAwait(false);

    // write text to text box
    textBox1.Text = t;
}

When clicking the first button, an InvalidOperationException will be thrown with the message

Cross-thread operation not valid: Control 'textBox1' accessed from a thread other than the thread it was created on.

That's because in Win32 UIs you must access controls from the same thread that created them. Because the method continues after awaiting SomeSlowAsyncOperation not on the main thread but on a background thread, accessing textBox1 is forbidden.

When you debug button1_Click, pay attention to the Threads tool window.


  1. Entering the method, you'll be on thread #1, the main thread.
  2. After calling SomeFastAsyncOperation, the code continues on thread #1. That's because the method returns an already completed task, so the code can continue synchronously.
  3. In contrast, SomeSlowAsyncOperation returns a not-yet-completed task, therefore the succeeding code will continue on another thread. (The UI thread is released here and continues pumping the Win32 message queue.)
  4. The property textBox1.Text will be set on said background thread and fail, because Win32 controls must be accessed from the same thread that created them.

Blocking 😦

private async void button2_Click(object sender, EventArgs e)
{
    // await a completed task => will continue synchronously
    await SomeFastAsyncOperation().ConfigureAwait(false);

    // block until the slow task completes => deadlock
    var t = SomeSlowAsyncOperation().ConfigureAwait(false).GetAwaiter().GetResult();

    // write text to text box
    textBox1.Text = t;
}

Clicking the second button will freeze the application. The GetAwaiter().GetResult() invocation blocks the main thread, while the task's continuation tries to re-enter that very thread, so we run into a deadlock.

Run smoothly 😃

private async void button3_Click(object sender, EventArgs e)
{
    // force switch to threadpool thread
    await TaskScheduler.Default;

    // await a completed task => will continue synchronously
    await SomeFastAsyncOperation().ConfigureAwait(false);

    // await a slow task => will continue in another thread
    var t = await SomeSlowAsyncOperation().ConfigureAwait(false);

    // switch to main thread
    await _joinableTaskFactory.SwitchToMainThreadAsync();

    // write text to text box
    textBox1.Text = t;
}

Using the JoinableTaskFactory from Microsoft.VisualStudio.Threading (NuGet package here), we are able to "switch" back to the UI thread.

As the namespace implies, Microsoft.VisualStudio.Threading originates from the Visual Studio team. I stumbled over this library while reading the documentation for Visual Studio extensibility. Visual Studio is quite a complex application, and there are myriad extensions available. To improve the start-up time, Microsoft strongly recommends making use of asynchronous programming (see How to: Manage multiple threads in managed code and How to: Use AsyncPackage to load VSPackages in the background).

I won't go into details of how async/await works. Basically, the compiler generates a state machine, which Dixin explains pretty well in his blog series Understanding C# async / await (1) Compilation (Part 2, Part 3).

In our case, two calls are interesting:

  1. await TaskScheduler.Default; will continue the succeeding code on a threadpool thread
    (actually, the library provides an extension method GetAwaiter(this TaskScheduler scheduler). This works because the compiler uses a naming convention instead of requiring an interface implementation.)
  2. await _joinableTaskFactory.SwitchToMainThreadAsync(); will continue the succeeding code on the main thread
    (more precisely, on the thread on which _joinableTaskFactory was instantiated).

As you can see, Microsoft.VisualStudio.Threading makes asynchronous programming in desktop applications, both WinForms and WPF, much simpler.

BTW, I've learned a lot reading the code of that library. It provides many more async helpers, like AsyncEventHandler.

Here are some more links if you want to learn more about async programming:

Generating workflow diagrams for TFS work items

TFS, Graphviz, Powershell

In my current position as the Technical Lead of Product Development I have several responsibilities. One of them is the definition and implementation of our development processes. We're using Team Foundation Server, which supports rich customization of the process configuration, especially the workflow of work items.

To document the workflow of our work items, I wanted to create perspicuous charts. However, being a nerd, I don't want to use PowerPoint or Visio to create high-gloss charts. Instead, I like to automate the creation of the charts.

Fortunately, the XML format of work item template definitions (WITD) is well-documented, see All WORKFLOW XML elements reference. To get the XML file of a WITD, you can use either the Visual Studio Add-in TFS Process Template Editor or use witadmin:

witadmin exportwitd /collection:CollectionURL /p:Project /n:TypeName [/f:FileName]

On the other end, the Graphviz suite includes a nice small tool named dot to draw directed graphs as PNGs, reading the definition of the graph from a text file.

The challenge now was to convert the XML of the WITD to the DOT language, which is quite easy to accomplish using PowerShell. But before I show the script, first a picture of the default workflow for bugs from the Scrum process template:

(Chart: default workflow for the Bug work item from the Scrum process template)
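For reference, the DOT source behind such a chart looks roughly like this (a hand-written sketch; the state names are those of the default Scrum bug workflow, and the exact transitions in the template may differ):

```dot
digraph BugWorkflow {
  rankdir=LR;
  "New" -> "Approved";
  "New" -> "Removed";
  "Approved" -> "Committed";
  "Committed" -> "Done";
  "Removed" -> "New";
}
```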

And here's the script:

(the embedded script is not rendered in this transcript)
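The original script is PowerShell and doesn't appear above, but the conversion idea can be sketched in a few lines (shown here in Python for brevity; the TRANSITION attribute names follow the WORKFLOW XML reference, and using the `for` attribute for restricted transitions is an assumption):

```python
import xml.etree.ElementTree as ET

def witd_to_dot(witd_xml: str) -> str:
    """Emit a DOT digraph for all TRANSITION elements of a WITD definition."""
    root = ET.fromstring(witd_xml)
    lines = ["digraph workflow {", "  rankdir=LR;"]
    for t in root.iter("TRANSITION"):
        src = t.get("from") or "[new]"  # an empty 'from' marks the initial transition
        dst = t.get("to")
        who = t.get("for", "")          # optional: group allowed to perform the transition
        attrs = f' [label="{who}"]' if who else ""
        lines.append(f'  "{src}" -> "{dst}"{attrs};')
    lines.append("}")
    return "\n".join(lines)
```

Piping the result into dot (e.g. dot -Tpng -o workflow.png) then produces the chart.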

If you pay close attention, you may notice that if only certain users or groups are permitted to change a work item to a specific state, the graph will show this too. E.g., if only members of QA are allowed to move a bug out of the Done state, the graph will look like this:

(Chart: bug workflow with the transition out of Done restricted to QA)

Admittedly, the script was written quickly and does what it should without any error handling. Nevertheless, it suits my needs. Maybe yours as well.

ResourceLib, PE Format, and WiX

WiX, ResourceLib

Some time ago I reported a bug and provided a pull request to resourcelib, a managed library to read and write Win32 resources in executables or DLLs. And unexpectedly, the next morning I was a maintainer of that library.

This blog post is about an issue we received: someone tried to patch the Win32 resources of a setup.exe, an executable installer created with WiX. However, after changing the resources with resourcelib, the setup didn't work anymore.

I've spent some time investigating this issue using dumpbin and reading the PE format specification. Because I don't know how good Google is at indexing GitHub issues, I'll also post my analysis here in my blog for reference. The original thread is here.

TL;DR: to me it looks like WiX is doing it wrong.

According to the output of dumpbin, there are 7 sections in the executable the issue reporter provided:

# Name Range
1 .text 0x00000400 to 0x00049FFF
2 .rdata 0x0004A000 to 0x00068DFF
3 .data 0x00068E00 to 0x000697FF
4 .wixburn 0x00069800 to 0x000699FF
5 .tls 0x00069A00 to 0x00069BFF
6 .rsrc 0x00069C00 to 0x0006D7FF
7 .reloc 0x0006D800 to 0x000715FF

In a hex viewer you can see that after the last section, the file continues for another 104,205 bytes, starting with 0x4D 0x53 0x43 0x46 (MSCF, the magic number of a cabinet file).
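You can confirm such trailing data without a hex viewer by slicing the file at the end of the last section (a Python sketch; the 0x00071600 offset is where .reloc ends in this particular file):

```python
from pathlib import Path

def read_overlay(path: str, last_section_end: int) -> bytes:
    """Return everything the file contains beyond the end of the last PE section."""
    data = Path(path).read_bytes()
    return data[last_section_end:]

# For the setup.exe discussed here, read_overlay("setup.exe", 0x00071600)
# would return 104,205 bytes starting with b"MSCF".
```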

I patched the StringFileInfo resource using resourcelib, which changed the content of the .rsrc section only. Afterwards, the file ended at 0x000715FF, i.e. the following 104,205 bytes were missing.

By the way, the .wixburn section contains the following bytes:

  0046C000: 00 43 F1 00 02 00 00 00 04 12 28 81 2C 64 40 48  .C±.......(.,d@H
  0046C010: B2 B1 34 64 EC 08 65 64 00 16 07 00 00 00 00 00  ▓▒4d∞.ed........
  0046C020: 00 00 00 00 00 00 00 00 01 00 00 00 02 00 00 00  ................
  0046C030: 92 8E 01 00 7B 08 00 00                          ....{...

which means

Field Bytes Value
magic number 0-3 0x00f14300
Version 4-7 0x00000002
Bundled GUID 8-23 {81281204-642c-4840-b2b1-3464ec086564}
Engine (stub) size 24-27 0x00071600
Original checksum 28-31 0x00000000
Original signature offset 32-35 0x00000000
Original signature size 36-39 0x00000000
Container Type (1 = CAB) 40-43 1
Container Count 44-47 2
Byte count of manifest + UX container 48-51 102,034
Byte count of attached container 52-55 2,171
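You can verify that interpretation by parsing the 56 header bytes from the dump above (a Python sketch; the field layout is taken from the table):

```python
import struct
import uuid

# the .wixburn header bytes from the hex dump above
data = bytes.fromhex(
    "00 43 f1 00 02 00 00 00 04 12 28 81 2c 64 40 48"
    "b2 b1 34 64 ec 08 65 64 00 16 07 00 00 00 00 00"
    "00 00 00 00 00 00 00 00 01 00 00 00 02 00 00 00"
    "92 8e 01 00 7b 08 00 00"
)

magic, version = struct.unpack_from("<II", data, 0)
bundle_guid = uuid.UUID(bytes_le=data[8:24])  # GUIDs use a mixed-endian layout
engine_size, checksum, sig_offset, sig_size = struct.unpack_from("<IIII", data, 24)
container_type, container_count = struct.unpack_from("<II", data, 40)
manifest_size, attached_size = struct.unpack_from("<II", data, 48)

print(f"{bundle_guid} engine=0x{engine_size:08X} "
      f"payload={manifest_size + attached_size}")
# → 81281204-642c-4840-b2b1-3464ec086564 engine=0x00071600 payload=104205
```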

Intermediate result

  • the .wixburn section points to 104,205 bytes (102,034 + 2,171), starting at 0x00071600.
  • the last PE section ends at 0x000715ff.
  • after using the official Win32 API to edit resources, the file ends at 0x000715ff, and the following 104,205 bytes are gone.

So after editing the resources, the exact payload WiX is referring to is eliminated.

Therefore my conclusion is that WiX

  • adds a (small) section .wixburn pointing beyond the last section, and
  • appends the payload (read: the cabinet file) at that location.

As far as I understand the specification, this procedure is not compliant with the PE format. That might be the reason why EndUpdateResource cuts the file after the last section when writing the changed resources.

Cleaning NuGet's cache

NuGet, PowerShell

From the beginning, NuGet used a per-solution folder packages to store all packages for the projects in a solution. (Does anyone else remember the numerous discussions whether that folder belongs in the VCS or not?)

That changed with NuGet 3 and project.json-based projects:

Global Packages Folder

With Project.JSON managed projects, there is now a packages folder that is shared for all projects that you work with. Packages are downloaded and stored in the %userprofile%\.nuget\packages folder. This means that if you are working on multiple UWP projects on your workstation, you only have one copy of the EntityFramework package and its dependencies on your machine. All .NET projects will acquire package references from this global shared folder. This also means that when you need to configure a new project, your project will not take time starting so that it can download a fresh copy of EntityFramework.nupkg. Instead, it will simply and quickly reference the files you have already downloaded. ASP.NET 5 uses the %userprofile%\.dnx\packages folder and as that framework nears completion it will use the %userprofile%\.nuget\packages folder as well.

Well, I didn't pay much attention to that change until I ran out of disk space last week and used WinDirTree to find the culprit. Indeed, my packages folder had grown to more than 6 GB.

Therefore I wrote a small PowerShell script which deletes all packages which haven't been accessed for a configurable number of days (150 by default):

(the embedded script is not rendered in this transcript)
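The original script is PowerShell and doesn't appear above, but the same logic can be sketched in Python (the 150-day default and the %userprofile%\.nuget\packages location are from the post; everything else is illustrative):

```python
import shutil
import time
from pathlib import Path

def clear_nuget_cache(cutoff_days: int = 150, what_if: bool = False,
                      packages_dir: Path = Path.home() / ".nuget" / "packages") -> None:
    """Delete package folders that haven't been accessed for `cutoff_days` days."""
    cutoff = time.time() - cutoff_days * 86400
    for pkg in packages_dir.iterdir():
        if not pkg.is_dir():
            continue
        # use the newest access time found anywhere below the package folder
        newest = max((p.stat().st_atime for p in pkg.rglob("*")),
                     default=pkg.stat().st_atime)
        if newest < cutoff:
            if what_if:
                print(f"WhatIf: would remove {pkg}")  # mirrors PowerShell's -WhatIf
            else:
                shutil.rmtree(pkg)
```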

Don't worry about deleting a package that you might need later: NuGet will just download the missing package again.

The script supports the -WhatIf parameter, so calling

.\Clear-NuGetCache.ps1 -CutOffDays 90 -WhatIf

wouldn't delete a single byte, but only log which folders the script would remove.

Pretzel and Kudu on Azure

Pretzel

I've published a couple of posts about the fact that I'm using Pretzel to generate the HTML pages of my blog. However, I didn't talk about the hosting yet.

Actually, it's quite simple: The source files for the site are hosted in a git repository on GitHub. The generated site is hosted in Azure. Whenever I push changes to the git repository, the web site will be updated automatically.

Pretzel and Azure

The setup is a two-stage process: first, you have to create an Azure App Service and connect it to your git repository. The steps involved are documented very well in Continuous Deployment to Azure App Service.

The second step is to execute Pretzel on the Azure side. Enter Kudu. Kudu is the engine behind git deployments in Azure. It's well documented in its wiki on GitHub. By default, Kudu will locate the relevant csproj file, compile it, and copy the artifacts to wwwroot. That's why many web sites running on Azure contain an empty "shim project".

However, you can simplify the setup by customizing Kudu's behavior. In my case I want Kudu to run pretzel.exe to generate the static HTML files from my sources:

  1. Add pretzel.exe (and all its dependencies) to your git repository (I've used a subfolder named _pretzel)

  2. Add a batch file deploy.cmd to execute pretzel.exe:

    @echo off
    echo Running Pretzel...
    _pretzel\pretzel.exe bake --destination=%DEPLOYMENT_TARGET%

    bake is Pretzel's command to generate the files, and the destination folder is %DEPLOYMENT_TARGET%, which is the wwwroot folder.

  3. Instruct Kudu to execute that deploy.cmd by creating a file .deployment with following content:

    command = deploy.cmd

That's all. Whenever I push changes to the git repository, Kudu will get the current files, execute Pretzel, and the updated web site is public. The whole process takes less than a minute.

Of course this can be adapted to any other static site generator too, e.g. Jekyll.