Monday, 28 September 2015

VSO Build vNext – versioning assemblies with MSBuild

In this blog post I will show you how to version assemblies during the Visual Studio Online vNext build.

Why version assemblies?

Versioning assemblies helps you detect which library version has been deployed to your environment. You can obviously do it manually for each of your projects, but that is a mundane and error-prone task. Instead, I'll show you how to easily automate the process.

As a result, each project library will be automatically versioned. The version number format will follow the recommended standard:

[Major version[.Minor version[.Build Number[.Revision]]]]

How to version assemblies?

By default the assembly version is defined in the AssemblyInfo.cs file, which can be found in the project's Properties folder.
[assembly: AssemblyVersion("")]
If you have many projects it's much easier to keep all common assembly information in a single file shared by all projects. This way you only need to update a property once and it will be picked up by all projects.

To do that, create a separate file called CommonAssemblyInfo.cs and place it in the solution's root folder. Move the AssemblyVersion definition to that file. Then link the file from all projects:

  1. Right-click the project
  2. Click Add → Existing Item…
  3. Select the created CommonAssemblyInfo.cs
  4. Click the small arrow next to the Add button
  5. Select Add as Link
  6. Drag the linked file to the Properties folder.
The result should look like this:
Obviously you can move more common properties to the CommonAssemblyInfo file, e.g. AssemblyCompany, AssemblyCopyright, etc.
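For illustration, a minimal CommonAssemblyInfo.cs could look like the sketch below. The company name and version values are placeholders of my own, not taken from any real solution:

```csharp
using System.Reflection;

// Shared assembly metadata, linked into every project in the solution.
// All values below are placeholders - adjust them to your solution.
[assembly: AssemblyCompany("Contoso")]
[assembly: AssemblyCopyright("Copyright © Contoso")]
[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]
```

Each project then keeps only its project-specific attributes (e.g. AssemblyTitle) in its own AssemblyInfo.cs.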

Automate versioning

Now that we store the assembly version in a single file we can easily automate the versioning process. The idea is that before executing the actual build we run another MSBuild script that updates the version in the common file. The script uses the AssemblyInfo task from the MSBuild Community Tasks.

The script takes two parameters: BuildId and Revision:

<?xml version="1.0" encoding="utf-8" ?>
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- path to MSBuild community tasks -->
  <Import Project=".\tasks\MSBuild.Community.Tasks.Targets"/>
  <Target Name="UpdateAssemblyInfo">
    <Message Text="Updating assembly versions with Build Id $(BuildId) and Revision $(Revision)" />
    <!-- update assembly and file versions in the C# CommonAssemblyInfo -->
    <AssemblyInfo OutputFile="..\CommonAssemblyInfo.cs"
                  CodeLanguage="CS"
                  AssemblyVersion="1.0.$(BuildId).$(Revision.Replace('C',''))"
                  AssemblyFileVersion="1.0.$(BuildId).$(Revision.Replace('C',''))" />
  </Target>
</Project>
The script must be added to your source control together with the MSBuild Community Tasks files, so you can reference it in your VSO build definition.

VSO Build definition

Now that we have all the components it's time to define the VSO build. The first step is the execution of our custom build script:
As you can see, we use two predefined variables here:
  • $(Build.BuildId) – the id of the build
  • $(Build.SourceVersion) – the latest version control changeset included in this build. If you are using TFS it will have the format "C1234", so you need to remove the "C" prefix (see the build script above)
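To see what the Replace call in the build script produces, here is a small standalone C# sketch. The sample values are made up; the variable names are my own:

```csharp
using System;

class VersionDemo
{
    static void Main()
    {
        // Hypothetical values as the build might supply them:
        string buildId = "357";     // $(Build.BuildId)
        string revision = "C1234";  // $(Build.SourceVersion) from TFS

        // Mirrors the MSBuild expression 1.0.$(BuildId).$(Revision.Replace('C',''))
        string version = string.Format("1.0.{0}.{1}", buildId, revision.Replace("C", ""));
        Console.WriteLine(version); // → 1.0.357.1234
    }
}
```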
Then we can use the regular MSBuild step to build the solution. All assemblies that have the CommonAssemblyInfo.cs file linked should get the correct version number. Now you can add more steps to the build definition: running unit tests, publishing artifacts, etc.

Alternative approach

You can also achieve the same functionality using PowerShell instead of MSBuild. There is a good example here. Which one you choose is a matter of personal preference – I prefer my solution, as it requires less code.

Tuesday, 4 November 2014

SharePoint 2013 sticky footer

Adding a footer to a SharePoint masterpage can be a bit tricky, since SharePoint automatically recalculates the height and scrolling properties of some default div containers on page load. Today I will show how to add a so-called "sticky footer" to a SharePoint masterpage using JavaScript. The sticky footer is always displayed at the bottom of the page, even if there is little content. We will build on the SharePoint 2013 "Seattle" masterpage.

Masterpage structure changes

First we need to add a footer container (div) to our masterpage that will hold the footer content. We add it at the end of the default "s4-workspace" div, right after the "s4-bodyContainer" div:
<div id="s4-workspace" class="ms-core-overlay">
    <div id="s4-bodyContainer">
        <!-- default page content -->
    </div>
    <div id="footer">Your footer content goes here</div>
</div>
Now you need to populate your footer with content and set its CSS properties, e.g. height.

Javascript code

Now that we have our footer container let's position it with some javascript code:
// generic function for resizing elements within their containers
function adjustContainerHeightToObject(container, content, offset) {
    var container = $(container);
    var content = $(content, container);
    if (container.height() > content.height()) {
        content.height(container.height() + offset);
    }
}

// specific function for resizing the s4-body container
function resizeMainContent() {
    // as offset we provide the negative value of the height of our footer
    adjustContainerHeightToObject('#s4-workspace', '#s4-bodyContainer', -50); // for a footer with 50px height
}

// call the resize function on page load
_spBodyOnLoadFunctionNames.push("resizeMainContent");
_spBodyOnLoadFunctionNames.push() vs. $(document).ready()

Notice that instead of using the regular jQuery ready() event we are using SharePoint's custom mechanism for calling functions after the page loads. This ensures that all of SharePoint's own resizing code has already executed before our function runs.

Wednesday, 13 August 2014

SSIS Error: The requested OLE DB provider Microsoft.Jet.OLEDB.4.0 is not registered. If the 64-bit driver is not installed, run the package in 32-bit mode.

I have an SSIS solution that reads from an Excel file. I recently deployed it to a different server and tried executing it from Visual Studio. I got the following error:

[Excel Source [2]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0209303. There may be error messages posted before this with more information on why the AcquireConnection method call failed.

[Connection manager "Excel Connection Manager"] Error: The requested OLE DB provider Microsoft.Jet.OLEDB.4.0 is not registered. If the 64-bit driver is not installed, run the package in 32-bit mode. Error code: 0x00000000. An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x80040154 Description: "Class not registered".


As the error message suggests, you need to run the package in 32-bit mode. To do that:
  1. Right-click your project in Solution Explorer
  2. Click 'Properties'
  3. Select the 'Debugging' node
  4. Set the 'Run64BitRuntime' property to False
  5. Save and re-run your package

Tuesday, 22 April 2014

SSIS integration with Dynamics CRM using ExecuteMultipleRequest for bulk operations

There are several tutorials on the Web explaining how to integrate SSIS with Dynamics CRM using the script component. All of them, however, show only the basic setup, where records from a data source are processed one by one when executing CRM commands (e.g. creating CRM records). In this post I would like to show you how to leverage the ExecuteMultipleRequest class from the CRM SDK to run bulk operations for records from the SSIS data source.

Tutorial scenario

  1. First we will create a simple database with one table that stores user names
  2. Then we will create an SSIS project
  3. Next, we will add our db table as a data source, so SSIS can read information about users
  4. Then, we will add a script component that creates a contact in CRM for each user from the table
  5. Finally, we will modify the script to import CRM contacts in batches
  6. At the end we will compare the execution times of both scripts

Basic setup

Let's create a basic db table with only 2 columns:
CREATE TABLE Contacts (
    FirstName VARCHAR(100) NOT NULL,
    LastName VARCHAR(100) NOT NULL
)
Now populate your table with some dummy data; in my case I've added 1000 records.

SSIS project
  1. Open "SQL Server Data Tools" (based on Visual Studio 2010)
  2. Go to File -> New -> Project...
  3. Select "Integration Services Project", provide a project name and click OK
  4. When the project is created, add a Data Flow task to your main package:
Data Source
  1. Double-click your Data Flow task to open it
  2. Double-click "Source Assistant" in the toolbox
  3. On the first screen of the wizard select "SQL Server" as the source type and select "New..."
  4. On the second screen provide your SQL Server name and authentication details and select your database
  5. A new block will be added to your Data Flow, representing your DB table. It has an error icon because we haven't selected the table yet. Also, you will see a new connection manager representing your DB connection:
  6. Double-click the new block, select the Contacts table we created from the dropdown and hit OK. The error icon should disappear
Script component
  1. Drag and drop the Script Component from the toolbox onto your Data Flow area
  2. Create a connection (arrow) from your data source to your script:
  3. Double-click your script component to open it
  4. Go to the "Input Columns" tab and select all columns
  5. Go to the "Inputs and Outputs" tab and rename "Input 0" to "ContactInput"

1-by-1 import

Now that we have the basic components set up, let's write some code! In this step we will create basic code for importing contacts into CRM. I'm assuming you have basic knowledge of the CRM SDK, so the CRM-specific code will not be explained in detail.

Open the script component created in the previous steps and click "Edit Script...". A new instance of Visual Studio will open with a new, auto-generated script project. By default the main.cs file will be opened – this is the only file you need to modify. However, before modifying the code you need to add references to the following libraries:

  • Microsoft.Crm.Sdk.Proxy
  • Microsoft.Xrm.Client
  • Microsoft.Xrm.Sdk
  • System.Runtime.Serialization
Now we are ready to write the code. Let's start by creating a connection to your CRM organization. This is done in the existing PreExecute() method like this:
OrganizationService _service;

public override void PreExecute()
{
    base.PreExecute();
    var crmConnection = CrmConnection.Parse(@"Url=https://******; Username=******; Password=*********;");
    _service = new OrganizationService(crmConnection);
}
Now that we have the connection created, let's write the code that actually imports our contacts into CRM. This can be done by modifying the existing ContactInput_ProcessInputRow method:
public override void ContactInput_ProcessInputRow(ContactInputBuffer Row)
{
    var contact = new Entity("contact");
    contact["firstname"] = Row.FirstName;
    contact["lastname"] = Row.LastName;
    _service.Create(contact);
}
Obviously the code above requires some null-checks, error handling, etc., but in general that's all you need to do to import your contacts into CRM. If you close the VS instance with the script project, it will be automatically saved and built.

You can now hit F5 in the original VS window to perform the actual migration.

Bulk import

In the basic setup described above there is one CRM call for each record passed to the script component. Calling web services over the network can be very time-consuming. The CRM team is aware of that, which is why they introduced the ExecuteMultipleRequest class: it allows you to build a set of CRM requests on the client side and send them all at once in a single web service call. In response you receive an instance of the ExecuteMultipleResponse class, allowing you to process the result of each individual request.

Let's modify the script code to leverage the power of the ExecuteMultipleRequest class. To do that, override the ContactInput_ProcessInput method. The default implementation can be found in the ComponentWrapper.cs file and it is as simple as this:

 public virtual void ContactInput_ProcessInput(ContactInputBuffer Buffer)
 {
     while (Buffer.NextRow())
     {
         ContactInput_ProcessInputRow(Buffer);
     }
 }
As you can see, by default it calls the ContactInput_ProcessInputRow method (implemented in the previous step) for each record from the source. We need to modify it so that it collects records into batches and sends each batch to CRM at once:
List<Entity> _contacts = new List<Entity>();

public override void ContactInput_ProcessInput(ContactInputBuffer Buffer)
{
    int index = 0;
    while (Buffer.NextRow())
    {
        _contacts.Add(GetContactFromBuffer(Buffer));
        index++;

        // Let's use buffer size 500. CRM allows up to 1000 requests per single call
        if (index == 500)
        {
            ImportBatch();
            index = 0;
        }
    }

    // import the remaining records
    ImportBatch();
}

private void ImportBatch()
{
    if (_contacts.Count > 0)
    {
        // Create and configure multiple requests operation
        var multipleRequest = new ExecuteMultipleRequest()
        {
            Settings = new ExecuteMultipleSettings()
            {
                ContinueOnError = true, // Continue, if processing of a single request fails
                ReturnResponses = true // Return responses so you can get processing results
            },
            Requests = new OrganizationRequestCollection()
        };

        // Build a CreateRequest for each record
        foreach (var contact in _contacts)
        {
            CreateRequest reqCreate = new CreateRequest();
            reqCreate.Target = contact;
            reqCreate.Parameters.Add("SuppressDuplicateDetection", false); // Enable duplicate detection
            multipleRequest.Requests.Add(reqCreate);
        }

        ExecuteMultipleResponse multipleResponses = (ExecuteMultipleResponse)_service.Execute(multipleRequest);

        // TODO: process responses for each record if required e.g. to save record id

        _contacts.Clear();
    }
}

private Entity GetContactFromBuffer(ContactInputBuffer Row)
{
    Entity contact = new Entity("contact");
    contact["firstname"] = Row.FirstName;
    contact["lastname"] = Row.LastName;
    return contact;
}
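The TODO above can be filled in by iterating the Responses collection of the ExecuteMultipleResponse. The sketch below is a fragment, not a complete program – it assumes the _service call and the CRM SDK types used in the script above:

```csharp
// Fragment: inspect per-request results after _service.Execute(multipleRequest).
// Assumes ReturnResponses = true, as configured above.
foreach (var responseItem in multipleResponses.Responses)
{
    if (responseItem.Fault != null)
    {
        // the request at this index failed
        Console.WriteLine("Request {0} failed: {1}",
            responseItem.RequestIndex, responseItem.Fault.Message);
    }
    else
    {
        // for a CreateRequest the response carries the id of the new record
        var createResponse = responseItem.Response as CreateResponse;
        if (createResponse != null)
            Console.WriteLine("Created contact {0}", createResponse.id);
    }
}
```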

Execution time comparison

As you can see, the code for sending requests in batches is a bit longer (but still quite simple, I believe), so you may be tempted to go with the simpler version. If you don't care much about performance (little data, no time limitations), that might be the way to go. However, it's always better to know your options and make a conscious decision. SSIS packages usually process large amounts of data, which often takes a lot of time. If you add a step that performs CRM operations via the CRM SDK (i.e. via CRM web services), you can be sure it will significantly affect the execution time.

I've measured the execution time for both methods. Importing 1000 contacts into CRM took:

  • 1-by-1 import – 2:22 (2 min 22 s)
  • Bulk import – 0:44 (44 s)
In my simple scenario the bulk import was roughly 3x faster than the 1-by-1 import. The more data you send to CRM, the bigger the difference may be.
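The difference is easy to rationalize: the dominant cost is the number of web service round-trips. A tiny sketch of that arithmetic (the record count matches my test; batch size matches the script above):

```csharp
using System;

class CallCount
{
    static void Main()
    {
        int records = 1000;
        int batchSize = 500; // as used in the script above

        // 1-by-1: one web service call per record
        int oneByOneCalls = records;

        // bulk: one call per full or partial batch (ceiling division)
        int bulkCalls = (records + batchSize - 1) / batchSize;

        Console.WriteLine(oneByOneCalls); // → 1000
        Console.WriteLine(bulkCalls);     // → 2
    }
}
```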

Thursday, 17 April 2014

C#: Retrieve user data from Active Directory

The code snippet below shows how to retrieve user information from ActiveDirectory using the PrincipalSearcher class:
// requires a reference to System.DirectoryServices.AccountManagement
using System.DirectoryServices.AccountManagement;

var context = new PrincipalContext(ContextType.Domain, "");
var user = new UserPrincipal(context);

// search by alias
user.SamAccountName = "useralias";

// You can also search by other properties e.g. Display Name
//user.DisplayName = "John Doe";

// perform the search
var search = new PrincipalSearcher(user);
user = (UserPrincipal)search.FindOne();

if (user != null) {
   // user found – read the properties you need, e.g.:
   Console.WriteLine(user.DisplayName);
} else {
   Console.WriteLine("No user found");
}

Searching across multiple domains

The code above will search for users in the specified domain only. However, you will often want to search across multiple domains. In that case you need to provide the parent domain name together with the appropriate port. Let's say you have a hierarchy like this:
  - ...
To search across all children of the domain, construct your PrincipalContext like this:
var context = new PrincipalContext(ContextType.Domain, "parent.domain.com:3268"); // placeholder parent domain; 3268 is the Global Catalog port

Tuesday, 24 December 2013

WPF DataGrid - Custom template for generic columns

Recently I had to bind a WPF DataGrid to a System.Data.DataSet. This is quite straightforward and there are many tutorials on how to achieve this.

By default all table columns are auto-generated using 4 predefined templates (Text, Hyperlink, CheckBox, and ComboBox) that support read-only and edit modes. If you wish to customize the way some columns are rendered you can also define a custom template and assign it to some columns by hooking into the AutoGeneratingColumns event of the DataGrid as described here.

Problem with generic columns

As you can see, creating custom templates for columns is pretty straightforward as long as the column names are fixed. If your WPF app uses a table that doesn't change dynamically, you are all good. The problem starts when you use your datagrid to display tables whose column names change, e.g. tables loaded from a file at runtime. This is because you can't hard-code the column name in your custom template.

Solution 1 - Create template programmatically

In this solution you build your custom template in code and assign it to the chosen column at runtime, in the AutoGeneratingColumn event handler.
private void DataGrid_AutoGeneratingColumn(object sender, DataGridAutoGeneratingColumnEventArgs e)
{
    // First get the corresponding DataColumn
    var colName = e.PropertyName;
    var table = this.DataContext as DataTable;
    var tableColumn = table.Columns[colName];

    // choose columns to customize e.g. by type
    if (tableColumn.DataType == typeof(string)) // example condition
    {
        var templateColumn = new DataGridTemplateColumn();
        templateColumn.Header = colName;
        templateColumn.CellTemplate = this.BuildCustomCellTemplate(colName);
        templateColumn.SortMemberPath = colName;
        e.Column = templateColumn;
    }
}

// builds custom template
private DataTemplate BuildCustomCellTemplate(string columnName)
{
    var template = new DataTemplate();

    var button = new FrameworkElementFactory(typeof(Button));
    template.VisualTree = button;

    var binding = new Binding();
    binding.Path = new PropertyPath(columnName);
    button.SetBinding(ContentControl.ContentProperty, binding);

    return template;
}
The code above would create the following template for selected columns:

Obviously this is just an example - in real life you would need more than just a button that does nothing. In your code you can define full templates, use binding converters, assign commands etc. However, the code gets pretty complex. Therefore this solution is suitable for simple templates.

Solution 2 - Create template skeleton

Alternatively, you can create the template skeleton in XAML and replace all bindings in your event handler:

And the event handler:
private void DataGrid_AutoGeneratingColumn(object sender, DataGridAutoGeneratingColumnEventArgs e)
{
    var colName = e.PropertyName;
    var templateColumn = new DataGridTemplateColumn { Header = colName, SortMemberPath = colName };

    // Create wrapping template in code and populate all bindings accordingly
    string xaml = @""; // XAML skeleton with {0} (binding) and {1} (template resource) placeholders
    var template = (DataTemplate)XamlReader.Parse(string.Format(xaml, "{Binding " + colName + "}", "{StaticResource customCellTemplate}"));
    templateColumn.CellTemplate = template;
    e.Column = templateColumn;
}
The advantage of this approach is that you can create more complex templates in XAML, and in your event handler you only populate the required bindings. The limitation is that the custom template needs to be defined at the application level. I found this solution here.

Monday, 2 December 2013

Claims based authorization in MVC4

Recently I worked on a sample MVC4 application that was using claims based authentication. I used the Identity and Access Visual Studio extension to help me configure Windows Identity Foundation (WIF) in my app. In short, the tool updates your web.config by adding the system.identityModel and system.identityModel.services sections to enable WIF. As a result, my application redirects all unauthenticated users to my Identity Provider, which then generates a security token that is returned back to my app.

Once I had the authentication part done I started working on authorization. I wanted it to be role-based, i.e. very similar to what you get by default in the MVC model:

[Authorize(Roles = "Administrator")]
public class AdminController : Controller
{
    // Controller code here
}
In theory, if your Identity Provider issues a token containing the role claim (http://schemas.microsoft.com/ws/2008/06/identity/claims/role) with the value of the user's current role, the default authorization code above should work. And it actually does! This is because some basic claims from the token are automatically used to populate the user's identity object, including roles. So when your app's authorization code checks the user's role, it uses the values provided in the token (if any were provided).

Membership database issue

The above solution worked fine for me at the beginning. What I was not aware of is the fact that, by default, the Authorize attribute also connects to your Membership database, regardless of the token content. By default MVC uses the local ASPNETDB.mdf file as the membership database. I realized that when I moved the application to a different server without moving the mdf file. Suddenly I started getting the following SQL exception when the Authorize attribute was invoked:
A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)
I guess there is an easy way to configure ASP.NET not to connect to the database if roles are provided in the token. However, I decided to take a different approach to have more control over the code.

Custom authorization attribute

I decided to write a custom authorization attribute that searches for the user's role directly in the claims provided in the token:
public class ClaimsAuthorizeAttribute : AuthorizeAttribute
{
    private string claimType;
    private string claimValue;

    public ClaimsAuthorizeAttribute(string type, string value)
    {
        this.claimType = type;
        this.claimValue = value;
    }

    public override void OnAuthorization(AuthorizationContext filterContext)
    {
        var identity = (ClaimsIdentity)Thread.CurrentPrincipal.Identity;
        var claim = identity.Claims.FirstOrDefault(c => c.Type == claimType && c.Value == claimValue);
        if (claim == null)
        {
            // no matching claim in the token – reject the request
            filterContext.Result = new HttpUnauthorizedResult();
        }
    }
}
This approach is more flexible, as it allows me to use other types of claims for authorization in the future, not only roles. The usage of the attribute is still very simple:
[ClaimsAuthorize(ClaimTypes.Role, "Administrator")]
public class AdminController : Controller
{
    // Controller code here
}
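The claim lookup the attribute performs can be tried out in isolation with plain System.Security.Claims types. The identity values below are made up for the demo:

```csharp
using System;
using System.Linq;
using System.Security.Claims;

class ClaimsLookupDemo
{
    static void Main()
    {
        // An identity similar to the one WIF builds from the token (made-up values)
        var identity = new ClaimsIdentity(new[]
        {
            new Claim(ClaimTypes.Name, "jdoe"),
            new Claim(ClaimTypes.Role, "Administrator")
        }, "Federation");

        // The same lookup ClaimsAuthorizeAttribute performs
        var claim = identity.Claims.FirstOrDefault(
            c => c.Type == ClaimTypes.Role && c.Value == "Administrator");

        Console.WriteLine(claim != null); // → True
    }
}
```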

Additional notes

When using claims based authorization it is often advised to use the existing ClaimsPrincipalPermission attribute together with a configured ClaimsAuthorizationManager. In my case this seemed like overkill, especially as I wanted to keep the code similar to the default authorization model.