NuGet package for 7Zip4Powershell

7-zip, Powershell, NuGet


A few days ago I mentioned 7-Zip for Powershell. I’ve now created a NuGet package and published it at NuGet.org.

It took me a while to figure out, but it’s now a “tools only” package, i.e. it adds no assembly references to your project.

To use the new commands, just add the package to your solution and import it in your Powershell script:

$SolutionDir = split-path -parent $PSCommandPath
Import-Module (Join-Path $SolutionDir "packages\7Zip4Powershell.1.0\tools\7Zip4Powershell.dll")

Fun with RavenDB and ASP.NET MVC: part I

RavenDB, aspnetmvc

I’m working on a small pet project with ASP.NET MVC, where hierarchically structured documents are stored in RavenDB. These documents can be retrieved by their unique URL, which is also stored in the document. Because there are different kinds of document classes, they all implement the common interface IRoutable. This interface defines a property Path, by which the document can be accessed.

public interface IRoutable {
    string Id { get; set; }
    string Path { get; set; }
}

public class Document : IRoutable {
    public string Id { get; set; }
    public string Path { get; set; }
}

using (var session = _store.OpenSession()) {
    session.Store(new Document { Path = "a" });
    session.Store(new Document { Path = "a/b" });
    session.Store(new Document { Path = "a/b/c" });
    session.Store(new Document { Path = "a/d" });
    session.Store(new Document { Path = "a/d/e" });
    session.Store(new Document { Path = "a/f" });
    session.SaveChanges();
}

Additionally, there’s the requirement that the incoming URL may consist of more parts than the document’s path, carrying some additional information about the request. Here are some examples of possible requests and which document should match:

| Request | Found document |
|-----------|----------------|
| a/x | a |
| a/b/c/y/z | a/b/c |

So, given the path, how can you find the correct document?

The solution to this consists of three parts:

  1. Identify documents in the database which can be accessed via their path
  2. Index those documents
  3. Find the document which best matches a given URL

Marking routable documents

Because there’s not a single class for pages stored in the database, I mark all documents implementing IRoutable by adding a flag IsRoutable to the document’s metadata. This is done by implementing IDocumentStoreListener, so the code is called by RavenDB whenever a document is stored:

public class DocumentStoreListener : IDocumentStoreListener {
    public const string IS_ROUTABLE = "IsRoutable";

    public bool BeforeStore(string key, object entityInstance, RavenJObject metadata, RavenJObject original) {
        var document = entityInstance as IRoutable;
        if (document == null) {
            // not a routable document, nothing to do
            return false;
        }
        if (metadata.ContainsKey(IS_ROUTABLE) && metadata.Value<bool>(IS_ROUTABLE)) {
            // flag is already set, metadata unchanged
            return false;
        }
        metadata.Add(IS_ROUTABLE, true);
        // returning true tells RavenDB that the metadata has been modified
        return true;
    }

    public void AfterStore(string key, object entityInstance, RavenJObject metadata) {
    }
}
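
For RavenDB to invoke the listener, it has to be registered with the document store. A minimal sketch, assuming the store is created somewhere during application startup (the URL is just a placeholder):

var store = new DocumentStore { Url = "http://localhost:8080" };
store.Initialize();
// register the listener so BeforeStore is called whenever an entity is stored
store.RegisterListener(new DocumentStoreListener());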

Indexing routable documents

The next step is to create an index for all documents with the proper flag in their metadata:

public class IRoutable_ByPath : AbstractIndexCreationTask {
    public override IndexDefinition CreateIndexDefinition() {
        return new IndexDefinition {
            Map = @"from doc in docs where doc[""@metadata""][""" + DocumentStoreListener.IS_ROUTABLE + @"""].ToString() == ""True"" select new { doc.Path }"
        };
    }
}
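
The index class alone doesn’t do anything yet; it still has to be created on the server. A typical way is to let RavenDB scan the assembly for index creation tasks, for example during startup (the _store field is an assumption here):

// creates IRoutable_ByPath (and any other AbstractIndexCreationTask in the assembly) on the server
IndexCreation.CreateIndexes(typeof(IRoutable_ByPath).Assembly, _store);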

Searching for documents

Ok, so much for the preparation. The interesting part starts when a request comes in. Here RavenDB’s boosting feature is quite handy: the more parts of the path match, the higher the score the document will get. E.g. if the requested path is a/b/c/d/e, the following RavenDB search will be issued:

| search term | boost |
|-------------|-------|
| a/b/c/d/e | 5 |
| a/b/c/d | 4 |
| a/b/c | 3 |
| a/b | 2 |
| a | 1 |

The code to create such a query looks like this:

public IRoutable GetRoutable(string path) {
    var query = _documentSession
        .Query<IRoutable, IRoutable_ByPath>();

    if (!String.IsNullOrEmpty(path)) {
        // add one search clause per path prefix; longer prefixes get a higher boost
        var pathParts = path.Split('/');
        for (var i = 1; i <= pathParts.Length; ++i) {
            var shortenedPath = String.Join("/", pathParts, startIndex: 0, count: i);
            query = query.Search(doc => doc.Path, shortenedPath, boost: i, options: SearchOptions.Or);
        }
    } else {
        query = query.Where(doc => doc.Path == String.Empty);
    }

    // the best scored document, i.e. the one with the longest matching path
    var document = query.Take(1).FirstOrDefault();
    return document;
}

This method will finally return the document with the longest matching path.
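
To illustrate with the sample documents stored above, here’s how the lookup behaves. RouteResolver is just a made-up name for whatever class hosts GetRoutable and the _documentSession field:

var resolver = new RouteResolver(session);
var doc1 = resolver.GetRoutable("a/x");        // returns the document with Path "a"
var doc2 = resolver.GetRoutable("a/b/c/y/z");  // returns the document with Path "a/b/c"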

Based on the incoming request we have found a matching document. What we can do with the remaining part of the URL I’ll leave for the next installment.

I published the complete code with unit tests in this Github repository.

7-Zip for Powershell

7-zip, Powershell


At work we deal with several big databases, and by big I mean between 3.5 and 8 GB. Those databases are zipped with 7-Zip and stored on a server. Depending on the scenario we unzip one of these databases to a local folder and attach it to the SQL Server. To simplify these steps we have a couple of Powershell scripts, among other things invoking the command line version of 7-Zip.

However, extracting an archive of several gigabytes over the network might take some time, and we’d like to see the progress in the Powershell console. Powershell provides a standard way to report progress by calling Cmdlet.WriteProgress, which is then displayed appropriately by the current host (e.g. the command line).

Therefore, with some support from a co-worker, I’ve written two cmdlets for extracting and compressing archives. The syntax is as simple as this:

Expand-7Zip
    [-ArchiveFileName] <string>
    [-TargetPath] <string>
    [<CommonParameters>]

Compress-7Zip 
    [-ArchiveFileName] <string> 
    [-Path] <string>
    [[-Filter] <string>]
    [-Format <OutArchiveFormat> {SevenZip | Zip | GZip | BZip2 | Tar | XZ}]
    [-CompressionLevel <CompressionLevel> {None | Fast | Low | Normal | High | Ultra}]
    [-CompressionMethod <CompressionMethod> {Copy | Deflate | Deflate64 | BZip2 | Lzma | Lzma2 | Ppmd | Default}]
    [<CommonParameters>]

It works on both x86 and x64 and uses SevenZipSharp as a wrapper around 7-Zip’s API.

[Screenshot: 7Zip4Powershell]

You can download the cmdlets with the required libraries here or grab the sources at GitHub.

Don’t copy my referenced assemblies

msbuild

There are cases where you don’t want referenced assemblies to be copied to your output folder.

E.g. you write a plugin for an existing application, which provides a library declaring its contracts. Since the application will load that contract assembly itself, there’s no need for a second copy of the assembly in your plugin folder.

“But you can configure Visual Studio not to copy a reference to the output folder,” you may say. Well, that’s right. In the properties of a referenced assembly you can simply set Copy Local to False.

But…

If you’re an avid user of NuGet like I am, every time you update a package the old references are removed from your project, the new ones are added, and Copy Local is set to True again. Do you think you will always remember to change that property back to False whenever you update a package? I don’t know about you, but I won’t.

Therefore, here’s a little trick I use in such cases: a little MSBuild file which prevents any referenced assembly from being copied to the output folder:

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <!-- make all references non-private, so they won't be copied to the output folder -->
  <Target Name="ClearReferenceCopyLocalPaths" AfterTargets="ResolveAssemblyReferences">
    <ItemGroup>
      <ReferenceCopyLocalPaths Remove="@(ReferenceCopyLocalPaths)" />
    </ItemGroup>
  </Target>

</Project>

Save that snippet to a file, e.g. PreBuild.targets, and include it in your project file like this:

  <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
  <Import Project="$(SolutionDir)\.nuget\nuget.targets" />
  <Import Project="PreBuild.targets" />

That’s it. From now on the output folder will only contain your assembly (along with other build artifacts such as the .pdb file, of course).

Creating Dynamic Windows 7 Taskbar Overlay Icons, the MVVM Way

WPF, MVVM

[Screenshot: MetroTwit]

Since Windows 7, the taskbar icon of an application can get an overlay bitmap. You can use that to indicate some state of the application or, like MetroTwit, to show the number of unread items.

Overlay Icon in WPF

In WPF, the API is pretty simple:

<Window.TaskbarItemInfo>
    <TaskbarItemInfo 
        Overlay="{StaticResource ResourceKey=MyOverlayImage}" />
</Window.TaskbarItemInfo>

However, in one of my projects I have to display dynamic text in the overlay, similar to MetroTwit, but the above example only shows a static resource.

While searching the internet I found Pete Brown’s article Creating Dynamic Windows 7 Taskbar Overlay Icons. He uses a WPF DataTemplate to define the content of the overlay, and in his code-behind he takes that template, renders it to a bitmap, and assigns it to the TaskbarItemInfo’s Overlay property. See his article for the detailed steps.

Though I think Pete’s solution is pretty clever, it lacks separation of logic and presentation. In my application I don’t want to create images in the code-behind, code-aside, whatever. It follows the MVVM pattern, so the creation of the overlay image shouldn’t be the concern of my viewmodel.

Solution

Extending TaskbarItemInfo doesn’t work because it is sealed. Therefore I took the same route as in my previous post and used attached dependency properties:

public class TaskbarItemOverlay  {
    public static readonly DependencyProperty ContentProperty =
        DependencyProperty.RegisterAttached(
            "Content", 
            typeof(object), 
            typeof(TaskbarItemOverlay), 
            new PropertyMetadata(OnPropertyChanged));

    public static readonly DependencyProperty TemplateProperty =
        DependencyProperty.RegisterAttached(
        "Template", 
        typeof(DataTemplate), 
        typeof(TaskbarItemOverlay), 
        new PropertyMetadata(OnPropertyChanged));


    public static object GetContent(DependencyObject dependencyObject) {
        return dependencyObject.GetValue(ContentProperty);
    }

    public static void SetContent(DependencyObject dependencyObject, object content) {
        dependencyObject.SetValue(ContentProperty, content);
    }

    public static DataTemplate GetTemplate(DependencyObject dependencyObject) {
        return (DataTemplate)dependencyObject.GetValue(TemplateProperty);
    }

    public static void SetTemplate(DependencyObject dependencyObject, DataTemplate template) {
        dependencyObject.SetValue(TemplateProperty, template);
    }

    private static void OnPropertyChanged(DependencyObject dependencyObject, DependencyPropertyChangedEventArgs e) {
        var taskbarItemInfo = (TaskbarItemInfo) dependencyObject;
        var content = GetContent(taskbarItemInfo);
        var template = GetTemplate(taskbarItemInfo);

        if (template == null || content == null) {
            // without both a template and content there's nothing to render
            taskbarItemInfo.Overlay = null;
            return;
        }

        const int ICON_WIDTH = 16;
        const int ICON_HEIGHT = 16;

        // render the templated content off-screen into a 16x16 bitmap
        var bmp =
            new RenderTargetBitmap(ICON_WIDTH, ICON_HEIGHT, 96, 96, PixelFormats.Default);
        var root = new ContentControl {
            ContentTemplate = template, 
            Content = content
        };
        root.Arrange(new Rect(0, 0, ICON_WIDTH, ICON_HEIGHT));
        bmp.Render(root);

        // ...and use that bitmap as the taskbar overlay
        taskbarItemInfo.Overlay = bmp;
    }
}

The first lines are boilerplate code to define the attached properties. There are two of them, Content and Template. The former defines, well, the content we’re going to bind to our model. The latter defines the template used to render the content.

The actual work is done in the method OnPropertyChanged. It takes the template together with the content, renders it, and assigns the resulting bitmap to the Overlay property of the TaskbarItemInfo element.

Usage

I have created a small application to demonstrate the use of the attached properties. The XAML of the window is this:

<Window 
    x:Class="WpfApplication1.MainWindow"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:src="clr-namespace:WpfApplication1" 
    Title="MainWindow" Height="350" Width="525">
    <Window.Resources>
        <DataTemplate x:Key="OverlayIcon">
            <Grid Width="16" Height="16">
                <Ellipse 
                    Fill="Red" 
                    Stroke="White" 
                    StrokeThickness=".5" />
                <TextBlock 
                    Text="{Binding}" 
                    TextAlignment="Center" 
                    Foreground="White" 
                    FontWeight="Bold" 
                    Height="16" 
                    VerticalAlignment="Center" 
                    FontSize="10"/>
            </Grid>
        </DataTemplate>
    </Window.Resources>
    <Window.TaskbarItemInfo>
        <TaskbarItemInfo 
            src:TaskbarItemOverlay.Content="{Binding Count}" 
            src:TaskbarItemOverlay.Template="{StaticResource OverlayIcon}" />
    </Window.TaskbarItemInfo>
    <Viewbox>
        <TextBlock Text="{Binding Count}" />
    </Viewbox>
</Window>

[Screenshot: TaskbarItemOverlay]

In the window’s resources we define the template for the overlay. Notice that the Text is bound! Later you can see the TaskbarItemInfo with the attached properties in action: Content binds to the Count property of my viewmodel, and Template references the DataTemplate defined in the resources.

The code-behind is straightforward. I won’t repeat it here, but you can see it at GitHub. Basically it increments the Count property of the viewmodel every second in a background thread. You can see the result in the image to the left.
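
In case you don’t want to look it up, here is a rough sketch of what such a viewmodel could look like. This is my own reconstruction, not the code from the repository, and it uses a DispatcherTimer instead of a background thread so that the property change is raised on the UI thread:

using System;
using System.ComponentModel;
using System.Windows.Threading;

public class MainViewModel : INotifyPropertyChanged {
    private int _count;

    public event PropertyChangedEventHandler PropertyChanged;

    public int Count {
        get { return _count; }
        private set {
            _count = value;
            var handler = PropertyChanged;
            if (handler != null) {
                handler(this, new PropertyChangedEventArgs("Count"));
            }
        }
    }

    public MainViewModel() {
        // increment Count once per second; the Tick event fires on the UI thread,
        // so both the window content and the taskbar overlay bindings update safely
        var timer = new DispatcherTimer { Interval = TimeSpan.FromSeconds(1) };
        timer.Tick += (sender, args) => Count++;
        timer.Start();
    }
}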

The source code is attached, but also available at GitHub.