Archive for the ‘.NET’ Category

SignedXml.CheckSignature() can’t find the certificate for verification


The System.Security.Cryptography.Xml namespace has been there since the dawn of time, and the look and feel of the API is rough at best. To top it off, the MSDN documentation isn’t really something to lean on either, and it’s been a long, long time since I last tried to get acquainted with it.

I’ve done some work to integrate Windows Identity Foundation (WIF) with the ESB we’re using at work, and I wanted to implement message security (encryption and signing). Since the messages are XML-based, and since I really didn’t want any homegrown security solutions, I found it natural to use EncryptedXml and SignedXml from the aforementioned namespace to do the job. My weapons of choice were X.509 certificates and the proof-of-possession key (AES-256-CBC) from the SAML 1.1 token issued by the STS, and by tweaking the EncryptedXml configuration, I managed to massage the information about the X.509 key down to a bare minimum (to keep the message size down).

Now, creating the detached signature on the publisher side was a no-brainer as well, but on the other end of the “cable” I wasn’t able to verify the signature. I tried “everything”, but to no avail. I bugged Barry Dorrans (@blowdart) on Twitter, since he’s a guy who knows “a thing or two” about .NET security. Barry had a couple of pointers, but none that resolved my issue.

So, armed with .NET Reflector and some moral support from Barry via email, I started to dig into the System.Security assembly. It reeks of all sorts of code smells, and it took a while before I found the piece of information I was looking for:

Using SignedXml.CheckSignature() without passing in an X509Certificate2 instance explicitly will only look for the certificate in Current User\Other People or Local Machine\Other People. Be aware that Other People is just the friendly name for the AddressBook store.
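In other words: hand it the certificate yourself. A minimal sketch of verifying the signature with an explicit certificate could look like this (the helper and variable names are mine, not from the ESB integration):

using System.Security.Cryptography.X509Certificates;
using System.Security.Cryptography.Xml;
using System.Xml;

public static class SignatureVerifier
{
    public static bool Verify(XmlDocument signedDocument, X509Certificate2 certificate)
    {
        var signedXml = new SignedXml(signedDocument);

        // Locate the ds:Signature element and feed it to SignedXml.
        var signatureElement = (XmlElement)signedDocument
            .GetElementsByTagName("Signature", SignedXml.XmlDsigNamespaceUrl)[0];
        signedXml.LoadXml(signatureElement);

        // The parameterless CheckSignature() only probes the Other People
        // (AddressBook) stores, so pass the certificate explicitly instead.
        // The second argument (true) verifies the signature only, skipping
        // the certificate chain build.
        return signedXml.CheckSignature(certificate, true);
    }
}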


Queued WCF Services, MSMQ, IIS7 and NetMsmqActivator. Oh My.


First of all, credit for this solution goes to a former colleague, now a contractor working in my department: Erlend Rotvik of WebStep Fokus.

The title might be a bit cryptic, so let me elaborate a bit on the subject: until last week or so, it was “common knowledge” that using the NetMsmqActivator together with queued WCF services hosted in different IIS7 sites was a no-go. The reason for this “truth” was that there didn’t seem to be a way to specify how the NetMsmqActivator should dispatch messages to the different IIS sites.

If you experience unexpected exceptions from your queued services, or messages that simply never reach the service they were meant for, you’ve probably been bitten by the NetMsmqActivator dispatching with the wrong binding information set.

When you set up a net.msmq binding on your IIS7 site, the binding information you enter is normally just localhost. Erlend played a bit with the syntax and found out that by using an asterisk (*) he could filter on specific queues. Since we prefix all MSMQ queues based on the project / services they belong to, he tried to put the prefix before the asterisk – and lo and behold: now the NetMsmqActivator only dispatched messages from the queues that matched the prefix! As far as I know, this isn’t documented, so head over to a local IIS7 installation of yours and try it out!

Example binding information:

localhost/MySite.*

would tell the NetMsmqActivator to only dispatch messages in queues named MySite.* to the given IIS web site.

 


TaskWsdlImportExtension – a hidden gem in the C# vNext async CTP samples


When I first heard about the new async functionality in C# at PDC 2010 last week, my immediate reaction was: how can this be used to ease the development of asynchronous WCF/WF services (and clients)?

Well, it turns out that someone at Microsoft has thought about the same thing; in the samples that accompany the recently released C# async CTP, there’s a sample named (C# WCF) Stock Quotes. This piqued my interest, and lo and behold: when I opened up the solution, I immediately noticed the TaskWsdlImportExtension.

Basically, the project contains an extension to WCF that plugs into the WSDL import pipeline and customizes the generated code you get when you use the Add Service Reference (ASR) functionality in Visual Studio. Now, I have to admit that I’m not the greatest fan of ASR and I normally write my own clients instead, but you can get the import extension to work with svcutil.exe on the command line as well, by pointing to an app.config that wires up the extension with the /SvcutilConfig:<file> parameter (or by creating a file named exactly Svcutil.exe.config in the same directory you are doing the import in). More information can be found in the WSDL Import section of this MSDN article.

With this extension wired up, the generated C# client proxy gets Task-returning operations layered on top of the classic Begin/End (APM) pair.
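Roughly, it boils down to something like the following sketch – the contract and operation names are made up for illustration and are not the exact output of the Stock Quotes sample:

using System;
using System.ServiceModel;
using System.Threading.Tasks;

[ServiceContract]
public interface IStockQuoteService
{
    // The classic Begin/End (APM) pair that the importer still emits...
    [OperationContract(AsyncPattern = true)]
    IAsyncResult BeginGetQuote(string symbol, AsyncCallback callback, object state);
    decimal EndGetQuote(IAsyncResult result);
}

public partial class StockQuoteServiceClient : ClientBase<IStockQuoteService>, IStockQuoteService
{
    public IAsyncResult BeginGetQuote(string symbol, AsyncCallback callback, object state)
    {
        return Channel.BeginGetQuote(symbol, callback, state);
    }

    public decimal EndGetQuote(IAsyncResult result)
    {
        return Channel.EndGetQuote(result);
    }

    // ...plus a Task-returning wrapper on top of it, built with nothing
    // more exotic than the .NET 4.0 Task.Factory.FromAsync bridge.
    public Task<decimal> GetQuoteAsync(string symbol)
    {
        return Task<decimal>.Factory.FromAsync(BeginGetQuote, EndGetQuote, symbol, null);
    }
}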

Now, that looks kinda complicated – and it is, but you don’t really need to understand what it does to start using it. Since System.Threading.Tasks is already part of .NET 4.0, there is really nothing stopping you from using the extension right away – it doesn’t rely on the async/await keywords that we’ll see in C# vNext!
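For instance, on plain .NET 4.0 (no await yet) the Task-returning operation can be consumed with a continuation – again, the names follow the sketch above and are illustrative only:

var client = new StockQuoteServiceClient();
client.GetQuoteAsync("MSFT").ContinueWith(task =>
{
    // Runs when the service call has completed.
    Console.WriteLine("Latest quote: {0}", task.Result);
    ((IDisposable)client).Dispose();
});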

Now, if Microsoft hasn’t thought of it already, it would be nice to see a similar way of using System.Threading.Tasks / C# async on the service implementation side as well. I’m guessing it wouldn’t be too hard to create, and it would probably involve a custom operation invoker that replaces the default one in the WCF dispatcher (hint: I’m looking into this now).


Easier Unit Testing of WCF Services with ServiceTestContext


Hi, and apologies for being so awfully quiet the last couple of months. Expect the traffic to pick up again (I’ll explain the silence in a blog post later).

Now, when unit testing WCF services, I’ve often ended up cluttering my tests with a lot of plumbing code to wire up the SUT – that is, the WCF service I want to exercise.

Being a lazy guy who hates wiring up the same (redundant) plumbing code again and again, I often end up trying to extract the essence and put it into a tool or helper class.

So, this is my first shot at a fluent helper class that lets you test your WCF services.

The screenshot below pretty much sums up the functionality. It should be pretty self-explanatory: you end up writing an Action<TContract> implementation that acts as the client.

It will wire up an OperationContextScope automatically, but it can be disabled if you don’t need it.

[Screenshot: the ShouldReturnHttp200 test]
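In sketch form, such a test could look something like this – note that the method names (Create, WithClientAction, Execute and friends) are hypothetical and only illustrate the shape of a fluent API like this, not the actual ServiceTestContext surface:

[TestMethod]
public void ShouldReturnHttp200()
{
    ServiceTestContext<IPingService>
        .Create<PingService>()                          // the SUT – your WCF service implementation
        .WithBaseAddress("http://localhost:8731/ping")  // self-host the service for the test run
        .WithClientAction(client =>                     // the Action<TContract> acting as the client
        {
            var reply = client.Ping("hello");
            Assert.AreEqual("hello", reply);
        })
        .Execute();
}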

The state/quality of the code is “Proof of Concept” and can be found here.


The book shelf of a Connected Systems MVP


A few days ago, Gøran Hansen of Capgemini, an active member of the Norwegian Microsoft scene as well as of the Twittersphere, wrote a blog post called “A Software Craftsman’s Bookshelf” containing a picture of his book shelf with software development-related books, as well as a brief review of the titles. He tagged a bunch of other people, including me, so here’s my contribution to this book shelf meme.

(I actually wonder why a UI geek like Gøran chose a dull JPEG for visualizing his book shelf, so I’m stepping up to show that this non-UI guy can use the Stitch functionality in Deep Zoom Composer. The final product is hosted on DeepZoomPix – a Microsoft site for hosting Deep Zoom pictures.)

Update: It seems like the stupid wordpress.com blog hosting strips javascript and object tags, so until I get around to moving this blog to a more sane hosting provider, I’ll have to put up a preview picture that hyperlinks to the DeepZoomPix site :-(

[Bookshelf preview image]

(A click on the image will bring you to the real Deep Zoom image)

I actually thought about rotating the stitched picture counter-clockwise, so that the titles of the books would be easier to read, but I’ve postponed that to a moment when I have more time for stuff like this :-)

A description of the books + Amazon links, as well as a list of people I’d like to tag will be added later.


Syntax highlighting with MGrammar


Since I started exploring the possibilities of the various bits of codename “Oslo”, there has been one thing that has really annoyed me (and this is not Oslo’s fault): the lack of a decent tool for syntax highlighting of M, MGrammar & custom DSLs. Such a tool is vital to be able to communicate the intentions of a bit of source code when you blog about it.

Since I’ve been using the bits in System.Dataflow in a couple of projects now, I knew of the existence of the Lexer etc. in that assembly. I started to investigate further with .NET Reflector and found one class that seemed quite relevant for the tool I wanted to write: System.Dataflow.LexerReader. You initialize the LexerReader with a ParserContext and a stream of input data (typically the source code) and iterate over the tokens that the lexer discovers.

So, the basic requirements for the utility I wanted to create were:

  • Take a compiled MGrammar (Mgx) as input.
  • Take a piece of source code that conforms to the MGrammar as input.
  • Output an HTML fragment with syntax-highlighted source code.

Since the MGrammar language has a notion of attributes – more specifically, it supports the @{Classification} attribute that lets the language developer classify/group the different tokens into Keywords, Literals, Strings, Numerics etc. – I started digging into System.Dataflow, hoping to find a mechanism for retrieving that metadata during the lexing phase.

After some hours of intensive searching with .NET Reflector and the Visual Studio debugger, I found the solution: when you iterate over the LexerReader instance, you end up with ParseTokenReference instances that describe both the token and its content. They don’t contain the classification information directly, and that was the big puzzle I had to solve. It turned out that the DynamicParser instance that I used to load up the Mgx file and build the ParserContext has a GetTokenInfo() method that takes an integer as its only parameter, tokenTag – and the ParseTokenReference instances have a .Tag property. Bingo!

So, I’ve put together a small spike that I’m intending to clean up – it’s located here at the moment and will be licensed under the Apache License.

Below is a sample output from the utility – the input is an MGrammar that I wrote for an answer to a thread in the Oslo/MSDN forum.

For the first version it will probably be a command line tool – but it would probably be a good idea to create both an ASP.NET frontend and a Windows Live Writer add-in for it.

module LarsW.Languages
{
    language nnnAuthLang
    {
        syntax Main = ar:AuthRule* => Rules { valuesof(ar) };
        syntax AuthRule = ad:AllowDeny av:AuthVerb
            tOpenParen rl:RoleList tCloseParen tSemiColon
                          => AuthRule { Type {ad}, AuthType{av}, Roles
                          { valuesof(rl)} };
        syntax RoleList = ri:RoleItem  => List { ri }
                        | ri:RoleItem tComma rl:RoleList
                          => List { ri, valuesof(rl) };
        syntax RoleItem = tRoleName;
        syntax AllowDeny = a:tAllow => a
                         | d:tDeny => d;
        syntax AuthVerb = tText;
        token tText = ("a".."z"|"A".."Z")+;
        @{Classification["Keyword"]}token tAllow = "Allow";
        @{Classification["Keyword"]}token tDeny = "Deny";
        token tOpenParen = "(";
        token tCloseParen = ")";
        token tSemiColon = ";";
        token tComma = ",";
        token Whitespace = " "|"\t"|"\r"|"\n";
        token tRoleName = Language.Grammar.TextLiteral;
        interleave Skippable = Whitespace;
    }
}



Parsing the command line with MGrammar – part 2


In the first installment of this series we took a look at the basic grammar for parsing the command line with MGrammar. In this part I’ll show you how we can load in a compiled version of the MGrammar and parse the input (i.e. the command line) to produce a valid MGraph that we in turn can process in the backend code.

A quick reminder from part 1; the code is located here:
http://github.com/larsw/larsw.commandlineparser

You can download the code either by using git or downloading it as an archive. Once you’ve done that, open the solution LarsW.CommandLineParser.sln in Visual Studio 2008.

Most likely you will be presented with a dialog box informing you that opening the solution (or, more correctly, the LarsW.CommandLineParser C# project inside it) can pose a security risk. The reason for this is that I’ve included the MSBuild task that ships with the Oslo SDK for compiling MGrammar files (.mg) into .mgx. Select “Load project normally” and press OK.

Let’s first take a look at the extra plumbing I’ve added to the project to get the .mg file to compile. Right-click the LarsW.CommandLineParser project in the Solution Explorer and choose Unload Project. Next, right-click it again and choose Edit LarsW.CommandLineParser.csproj. This brings up the project file as raw XML in the editor window.

In the first <PropertyGroup> I’ve added seven lines that I borrowed from a project created with the “M” template. They basically set up the paths to various M-specific tools and auxiliary files.

The only one of these lines that really matters – and that I had to tweak in order to get this right – is the <MgTarget> element. Out of the box this is set to Mgx, which instructs the Mg compiler to spit out the result of the compilation as a .mgx file. As we will see later, the value needs to be set to MgResource in order to get the DynamicParser to load the .mgx as a resource.

If you navigate to the end of the project file, you’ll see that I’ve also added an <Import> element that pulls in some MGrammar-specific MSBuild tasks, and – the most important thing – in the last <ItemGroup> section I’ve changed the element type from <None> to <MgCompile> for the cmd.mg file.

Well, we’ve been mucking around in the MSBuild plumbing long enough now, haven’t we? Right-click the project again and choose Reload Project. When the project has loaded up again, build it to ensure that everything is fine and dandy. Even though I haven’t stated it before, it should be obvious that the project depends on the latest Oslo SDK (as of now, that is the January 2009 CTP Refresh).

The core component is the CommandLineProcessor class.

It loads up the language (the compiled version of cmd.mg) with DynamicParser.LoadFromResource(). The reason why we had to specify MgResource as the MgTarget earlier is that if we don’t, and instead add the compiled .mgx file as a plain resource, the .LoadFromResource() method won’t find it. As of now, it seems that it will only look for resources with the .resource extension.

We then pass the command line, wrapped in a StringReader instance, to the .Parse<T>() method on the DynamicParser instance. Even though it’s not specified, the T has to be object or a type that implements System.Dataflow.ISourceInfo. The internal/inner Node classes in GraphBuilder are what will be handed out by default, but you can also create your own GraphBuilder and produce nodes from your own domain model.

So, by calling parser.Parse<object>(null, commandLineReader, ErrorReporter.Standard) we get back an instance of the root of the Abstract Syntax Tree (AST) if the input matches the grammar. The AST is basically a representation of the MGraph.
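Spelled out, the core of it is just a couple of lines (a condensed sketch of what the CommandLineProcessor does internally; the variable names are mine):

var commandLineReader = new StringReader(commandLine);

// astRoot is the root node of the MGraph/AST when the input matches the grammar.
object astRoot = parser.Parse<object>(null, commandLineReader, ErrorReporter.Standard);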

The next step is to traverse the AST and act upon the different node types. The grammar for this project is quite trivial, and the traversal is mostly done by the private ProcessParameter() method in the CommandLineProcessor class. I suggest that you take a look at it if you’re interested in doing something similar.

So, just create an instance of the CommandLineProcessor and pass in an instance of an arbitrary class that contains the methods that will handle the command line arguments. To specify that a method is an argument handler, decorate it with the CommandLineArgumentHandler attribute. The attribute takes three parameters: the short and long forms of the argument keyword, and a description. For now the description isn’t used for anything, but the idea is that the command line processor can auto-generate a usage screen for you (typically shown with -?).
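In sketch form, the usage could look something like this – the handler class is made up, and the way the processor is invoked (constructor and method names) may differ from the actual code on GitHub:

public class MyHandlers
{
    [CommandLineArgumentHandler("v", "verbose", "Turns on verbose output.")]
    public void HandleVerbose()
    {
        // toggle a flag, set up logging, etc.
    }

    [CommandLineArgumentHandler("o", "output", "Path to the output file.")]
    public void HandleOutput(string path)
    {
        // remember the path for later use
    }
}

// Hand the processor an instance of the handler class and let it chew on the raw command line:
var processor = new CommandLineProcessor(new MyHandlers());
processor.Process(Environment.CommandLine);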

That’s about it – if you find it useful or modify the code, please let me know. With git you can push me a change set and I will try to merge it if you’ve come up with a cool feature.


Configurable PrincipalPermission attribute


A while ago, a question came up in the WCF Forum about configuring the role and/or user name properties of the PrincipalPermission attribute. As I answered there, it is possible to create a custom version of the attribute (deriving from CodeAccessSecurityAttribute, since the PrincipalPermission attribute is sealed) and pull the property values from the {web|app}.config file.

I implemented a solution for this about a year ago and planned to put up a blog post about it, but it never made it out to the public (the main cause is probably that I experienced a blog-block period of my life :-P).

The same approach may be a viable solution in a system I’m currently working on for a customer, so I dug through my archives and found the old code.

I’ve polished it a bit and made it available here under the Apache License 2.0.

The extended version, PrincipalPermissionEx, can be used in two modes: either as a “normal” derivable PrincipalPermission attribute, or as an attribute that uses the configuration system (or a combination of both).

Instead of using the generic PrincipalPermission attribute, you make a derived version for each system role, with a sensible name – making it more reliable and resistant to typos, e.g.:

[MustHaveSuperUserPrivilegesPermission]
public void PrivilegedOperation(…)
{
}

instead of:

[PrincipalPermission(SecurityAction.Demand, Role = @"MYDOMAIN\SuperUsers")]
public void PrivilegedOperation(…)
{
}


Take a look at the supplied sample code to see how this is implemented.
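The core idea boils down to something like this minimal sketch – the configuration key and the exact names are illustrative rather than taken from the real PrincipalPermissionEx code:

using System;
using System.Configuration;
using System.Security;
using System.Security.Permissions;

[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method, AllowMultiple = true)]
public class MustHaveSuperUserPrivilegesPermissionAttribute : CodeAccessSecurityAttribute
{
    // Parameterless constructor so the attribute reads nicely at the call site;
    // it always demands the permission.
    public MustHaveSuperUserPrivilegesPermissionAttribute()
        : base(SecurityAction.Demand)
    {
    }

    public override IPermission CreatePermission()
    {
        // Pull the role from configuration instead of hard-coding it, so it can
        // differ between environments without recompiling.
        string role = ConfigurationManager.AppSettings["SuperUserRole"];
        return new PrincipalPermission(null, role);
    }
}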

PrincipalPermission-based authorization is useful in a variety of scenarios; it can be applied to WCF services, ASP.NET and smart client applications. Note that if you put the user name/role in the configuration file, you will need to ensure that the file is locked down with an appropriate ACL to prevent tampering by malicious users. This might not be an issue for solutions hosted on a locked-down server (i.e. IIS-hosted web applications and services), but for smart/desktop clients, where the user might have higher privileges on the local file system, it is something to be aware of.

As always, feedback is welcome :-)



LINQ to XML: XPathSelectElement Annoyance


It may just be me – I’m no XPath (or XSLT) pro – but the following is, in my book, a bug, or at least a category 3 annoyance:

Given the following XML document loaded into an XDocument:

<?xml version="1.0" encoding="utf-8"?>
<Elements>
  <Element Id="1" />
  <Element Id="2" />
  <Element Id="3" />
  <Element Id="4" />
  <Element Id="5" />
</Elements>

The following XPath should yield the first element of the list:

"//Element[@Id = '1']"


Guess what? If you use the .XPathSelectElement() extension method, the result will be null – nada!

"//Element[@Id='1']"


The same query without the whitespace around the equal sign will give you the right result.
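For reference, the repro boils down to a few lines (elements.xml here stands for the document shown above):

using System.Xml.Linq;
using System.Xml.XPath;

var document = XDocument.Load("elements.xml");

// The variant with whitespace around the equal sign came back null for me...
var withWhitespace = document.XPathSelectElement("//Element[@Id = '1']");

// ...while the exact same query without the whitespace returns <Element Id="1" />.
var withoutWhitespace = document.XPathSelectElement("//Element[@Id='1']");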

If you’re an XPath pro I would like your opinion on the matter – or else I’m turning this issue over to http://connect.microsoft.com/

Sigh.


Codename “Velocity” WF/WCF Persistence Provider


So, it’s been a bit quiet here lately. The natural causes are (in no particular order):

  • A lot of work
  • Spending quality time with my son
  • Hacking on different kinds of technology bits (mainly pieces released at the PDC 2008)

I’ve also tried to get a clear picture of my “blind spots” when it comes to WCF. Even though I feel quite competent, there are still tons of stuff that I don’t touch daily so I still have to “rehearse”.

Since I have “Get to know Workflow Foundation – for real” on my TODO list I spent some time playing with durable services.

The persistence provider mechanism that is located in System.WorkflowServices is not exclusive to Workflows / Workflow Services. It can also be used with “vanilla” WCF services.

The idea is that the framework can persist the service instance after you have invoked a method, and when a future method invocation comes down the wire, it can pull the instance from the persistence store, revive it and pass the call to the “same” instance. A perfect fit for long-running services.

So how do you enable durable services? It is quite easy. First, you decorate your service implementation with [DurableService] and one of the mechanisms that marks the type as serializable (I chose [Serializable] for the sake of simplicity).

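A minimal sketch of what such a service can look like (the shopping-cart contract and all the names are illustrative):

using System;
using System.Collections.Generic;
using System.ServiceModel;
using System.ServiceModel.Description;

[ServiceContract]
public interface IShoppingCartService
{
    [OperationContract]
    void AddItem(string item);

    [OperationContract]
    IList<string> Checkout();
}

[Serializable]
[DurableService]
public class ShoppingCartService : IShoppingCartService
{
    private readonly List<string> _items = new List<string>();

    // A call to this operation is allowed to create (and persist) a new service instance.
    [DurableOperation(CanCreateInstance = true)]
    public void AddItem(string item)
    {
        _items.Add(item);
    }

    // ...and this one completes the instance, removing it from the persistence store.
    [DurableOperation(CompletesInstance = true)]
    public IList<string> Checkout()
    {
        return _items;
    }
}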

In this code snippet we also see that there is another attribute that can be used to tell the persistence mechanism that a call to an operation creates the instance or tears it down: [DurableOperation].

The next thing you have to do is to wire up a persistence provider, either through configuration or programmatically.

Out of the box there is only one persistence provider: one suited for persisting the service instances to SQL Server – System.ServiceModel.Persistence.SqlPersistenceProviderFactory. You will have to set up a SQL Server database with the schema located in C:\Windows\Microsoft.NET\Framework\v3.5\SQL\EN.

But that was a digression – now back to my custom “Velocity” Persistence Provider. If you don’t know what Codename “Velocity” is, I suggest that you head over here and read more about it. The short description:

It is Microsoft’s attempt to create an in-memory, high-performance, distributed cache that supports different scenarios and can suit many needs, in a web farm or other places where caching is needed. The current version that is publicly available is CTP2. We should expect a new CTP in March (around the time of MIX’09) and the RTW/RTM in mid-2009.

To implement a custom persistence provider, you will have to create two classes: the persistence provider implementation and its factory. It is the fully qualified type name of the factory that is specified when you set up the configuration.

The following configuration snippet shows how a custom service behavior is set up. You will have to set the behaviorConfiguration attribute on the service element to “defaultServiceBehavior” in this case.

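Something along these lines – the type and assembly names of the Velocity provider factory are made up for illustration; use the fully qualified name of your own factory:

<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <behavior name="defaultServiceBehavior">
        <!-- Points the runtime at the custom PersistenceProviderFactory. -->
        <persistenceProvider
            type="LarsW.ServiceModel.Persistence.VelocityPersistenceProviderFactory, LarsW.ServiceModel.Persistence"
            persistenceOperationTimeout="00:00:30" />
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>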

The code for the provider is available here (Licensed under the Apache License 2.0).

Cheers :-)