Posted on 11. August 2017

Solution Packager and global optionset enum support in spkl Task Runner

I’ve published version 1.0.9 of spkl to NuGet - this adds the following new features:

  1. Global optionset enum generation for early bound classes.
  2. Solution Packager support

Global Optionset enum generation

This was a tricky one because CrmSvcUtil doesn't make it easy to prevent multiple enums being output where a global optionset is used, but you can now add the following to the early bound section of your spkl.json to generate global optionset enums.

  "earlyboundtypes": [
      "generateOptionsetEnums": true,

In a future update, I’ll add the ability to filter out the enums to only those used.

Solution Packager Support

The solution packager allows you to manage your Dynamics metadata inside a Visual Studio project by extracting the solution into separate xml files. When you need to combine multiple updates from your source control system, you can then use the packager to re-combine the files and import them into Dynamics. To configure the solution packager task, add the following to your spkl.json:

  /*
  The solutions section defines a solution that can be extracted to individual xml files to make
  versioning of Dynamics metadata (entities, attributes etc.) easier
  */
  "solutions": [
      {
          "profile": "default,debug",
          /* The unique name of the solution to extract, unpack, pack and import */
          "solution_uniquename": "spkltestsolution",
          /* The relative folder path to store the extracted solution metadata xml files */
          "packagepath": "package",
          /* Set to 'true' to increment the minor version number before importing from the xml files */
          "increment_on_import": false
      }
  ]

There are two .bat files provided that will call:

spkl unpack

This will extract the solution specified in the spkl.json into the packagepath as multiple xml files.

spkl import

This will re-pack the xml files and import them into Dynamics - optionally incrementing the version number of the solution to account for the new build.

Posted on 9. April 2015

Control Async Workflow Retries

The Dynamics CRM Async Server is a great mechanism for hosting integrations to external systems without affecting the responsiveness of the user interface. One such example is calling SharePoint, as I describe in my series – SharePoint Integration Reloaded.

A drawback of this approach (compared to using an integration technology such as BizTalk) is that any exceptions thrown during the integration will simply fail the workflow job with no retries. You might already know that the Async Server does in fact have a retry mechanism built in, as described by this excellent post from the archives. There are some built-in exceptions for which CRM will automatically retry a number of times, as defined by the 'AsyncMaximumRetries' deployment property. The interval between these retries is defined by the 'AsyncRetryBackoffRate' property.

So how can we make use of this retry mechanism with our custom code?

There is an undocumented and unsupported way of using this retry mechanism in your custom workflow activities. I first used this technique back in the CRM 4.0 days and I'm pleased to report that it still works with CRM 2015!

Although unsupported, the worst that could happen is that the workflow would stop retrying in a future version of Dynamics CRM and revert to simply reporting the exception. Even so, it wouldn't be a good idea to rely on this functionality for mission-critical operations.

So…plz show me the codez…

catch (Exception ex)
{
    // TODO: Roll back any non-transactional operations here

    OrganizationServiceFault fault = new OrganizationServiceFault();
    fault.ErrorCode = -2147204346; // This will cause the Async Server to retry
    fault.Message = ex.ToString();
    var networkException = new FaultException<OrganizationServiceFault>(fault);
    throw networkException;
}

When an exception is thrown by your custom workflow activity you'll see the workflow enter into a 'Waiting' state and the 'Postpone until' attribute will be set to the time when the retry will take place. Once the maximum number of retries has been performed, it will enter the 'Failed' status.

You can use this technique to ensure that any temporary integration failures, such as service connectivity issues, are dealt with by a graceful retry loop. It only remains to ensure that, before throwing the exception, you roll back any operations already performed by your code (such as creating records) so that the retries will not create duplicate records.

Hope this helps!


Posted on 16. February 2015

Business Rules & “SecLib::RetrievePrivilegeForUser failed - no roles are assigned to user”

When publishing your Ribbon Workbench solution you may receive the following error:

"SecLib::RetrievePrivilegeForUser failed - no roles are assigned to user."

The first step in diagnosing these issues is to try and export the same solution using the CRM solutions area and then immediately re-import to see what error is shown. This is exactly what the Ribbon Workbench does behind the scenes.

When doing this you might see:

Upon downloading the log file you'll see that the error occurs on a 'Process', and the same error message is shown as reported by the Ribbon Workbench. The User ID is also given, which can be used in a URL similar to the following to find the user record causing the issue.

https://<orgname><User GUID>%7d

You will most likely discover that this user has been disabled or has no roles assigned – this could be that they have left the company or changed job role. You will need to find any Workflows or Dialogs that are assigned to them and re-assign to yourself before you can import the solution.

In most cases, you should not include Dialogs or Workflows in the solution that is loaded by the Ribbon Workbench, since this only slows down the download/upload publish process. There is one exception to this – Business Rules. Business Rules are associated with their particular entity and cannot be excluded from the solution. Oddly, like Workflows and Dialogs, they are also owned by a user, but the owner is not shown in the user interface – nor is there an option to re-assign them. There is a handy option on a user record that allows you to 'Reassign Records'.

You would think that this would reassign any Business Rules but unfortunately you'll get the following error:

"Published workflow definition must have non null activation id."

The only way to reassign these Business Rules is to perform an Advanced Find for Processes of Type 'Definition' and Category 'Business Rule'.
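If you prefer FetchXML over the Advanced Find UI, the equivalent query looks roughly like this (a sketch, assuming the standard values on the workflow entity of Type 'Definition' = 1 and Category 'Business Rule' = 2):

```xml
<fetch>
  <entity name="workflow">
    <attribute name="name" />
    <attribute name="ownerid" />
    <filter type="and">
      <condition attribute="type" operator="eq" value="1" />
      <condition attribute="category" operator="eq" value="2" />
    </filter>
  </entity>
</fetch>
```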

The results can then be reassigned to yourself using the 'Assign' button on the results grid.


Posted on 3. February 2015

The dependent component Attribute (Id=transactioncurrencyid) does not exist. Failure trying to associate it with Workflow

I recently answered a question on this error on the Community forums and coincidentally I've just come up against exactly the same issue!

When importing a solution you receive the error 'There was an error calculating dependencies for this component' and on downloading the log you see the full message similar to:

The dependent component Attribute (Id=transactioncurrencyid) does not exist. Failure trying to associate it with Workflow (Id=<GUID>) as a dependency. Missing dependency lookup type = AttributeNameLookup.

Although this message can appear for other attributes, this post is specifically to do with the transactioncurrencyid attribute being referenced.

When you add a money field to an entity CRM automatically adds a N:1 relationship to the currency entity to hold the currency that the value is stored against. The foreign key attribute is always named transactioncurrencyid.

In this increasingly 'agile' software development world, attributes are added and removed fairly regularly during the development phase. If a money field is added to an entity and then removed, the transactioncurrencyid attribute is not removed. Because the relationship is created automatically by the system, it is not re-created when deploying to another environment via a solution import if that solution contains no money fields. This leaves your development environment with an additional relationship that the target environment does not have. This wouldn't cause a problem except that when you create a workflow with a 'create' or 'update' step, the currency field is usually pre-populated with the default currency. Consequently, when you try to import this workflow into another organization that does not have the currency relationship, you will see this error.

The solution is either to delete the transactioncurrencyid field from your development environment and from the workflow create/update steps, or simply to add a dummy currency field to your target environment in order to create the relationship to currency.


Posted on 29. December 2014

SharePoint Integration Reloaded – Part 3

In Part 1 and Part 2 of this series we discussed how the new server-to-server integration with SharePoint works under the covers. In this post I'll show you how to integrate with SharePoint directly from a sandboxed workflow activity/plugin rather than relying on the out of the box integration.

Using the standard integration, a new folder will be created for each record underneath the default site. In some solutions you'll find that you want to modify this behaviour so that folders are created in a custom location. You may for example want to have an opportunity folder created under a site that is specific to a particular client rather than all under the same site.

The challenge with integrating with SharePoint using a CRM Online Workflow Activity/Plugin is that you can't reference the SharePoint assemblies, which makes authenticating and calling the SharePoint web service somewhat harder. Thanks go to fellow Dynamics CRM MVP Rhett for his blog post that provided a starting point for this sample. The sample code in this post shows how to create a folder in SharePoint and then associate it with a Document Location. The authentication with SharePoint works via ADFS, and since the out of the box integration uses a trust between CRM and SharePoint that is not accessible from a sandbox (even if you try and ILMerge it!), we have to provide a username and password for a privileged user that can create folders in SharePoint. I have left a function where you can add your own credentials or implement a method to retrieve them from a secure entity in CRM that only administrators have access to. Look in the code for the 'GetSecureConfigValue' function.

The sample contains a custom workflow activity that works in a CRM online 2013/2015 sandbox accepting the following parameters:

  • Site – A reference to the site that you want to create a folder in. You could store a lookup to a site for each customer and populate this parameter from the related account.
  • Record Dynamic Url – The 'Dynamic Record Url' for the record that you want the SharePoint document location to be related to. This uses my Polymorphic input parameter technique. You simply need to pass the Record Url (Dynamic) for the record that you wish to create the folder for.
  • Document Library Name – The name of the document location to create the folder underneath. In the out of the box integration this is the entity logical name (e.g. account)
  • Record Folder Name – The name of the folder to create. You could use the client name, client ID etc. – but it will automatically have the GUID appended to it to ensure uniqueness just like the out of the box integration.

Calling the workflow activity might look like:

The workflow activity is deployed using the Developer Toolkit for Dynamics CRM and performs the following:

  1. Checks if the document location already exists for the given site/document library – if so it simply returns a reference to that
  2. Checks if a document location exists for the given document library – if not, one is created
  3. Creates a SharePoint folder using the SpService class. It is worth noting that if the folder already exists, no exception is thrown by SharePoint. The SpService class must first authenticate using the SpoAuthUtility class.
  4. Creates a Document Location for the newly created folder.

You could choose to run the workflow in real time or asynchronously on create of a record – the downside of real time is that it will increase the time that the record takes to save.

Check out the code in MSDN Samples – you'll need to do a NuGet package restore to pick up the referenced assemblies.

View/Download Code

That's all for now – have a Happy New Year!




Posted on 2. August 2014

Polymorphic Workflow Activity Input Arguments

I often find myself creating 'utility' custom workflow activities that can be used on many different types of entity. One of the challenges with writing this kind of workflow activity is that InArguments can only accept a single type of entity (unlike activity regarding object fields).

The following code works well for accepting a reference to an account, but if you want to accept account, contact or lead you'd need to create 3 input arguments. If you wanted the parameter to accept a custom entity type that you don't know about when writing the workflow activity, then you're stuck!

[Input("Account")]
[ReferenceTarget("account")]
public InArgument<EntityReference> AccountReference { get; set; }

There are a number of workarounds to this that I've tried over the years, such as starting a child workflow and using the workflow activity context, or creating an activity and using its regarding object field – but I'd like to share with you the best approach I've found.

Dynamics CRM workflows and dialogs have a neat feature of being able to add hyperlinks to records into emails/dialog responses etc., which is driven by a special attribute called 'Record Url (Dynamic)'.

This field can be used also to provide all the information we need to pass an Entity Reference.

The sample I've provided is a simple Workflow Activity that accepts the Record Url and returns the Guid of the record as a string, along with the Entity Logical Name – this isn't much use on its own, but you'll be able to use the DynamicUrlParser.cs class in your own Workflow Activities.

[Input("Record Dynamic Url")]
public InArgument<string> RecordUrl { get; set; }
The DynamicUrlParser class can then be used as follows:

var entityReference = new DynamicUrlParser(RecordUrl.Get<string>(executionContext));
RecordGuid.Set(executionContext, entityReference.Id.ToString());
EntityLogicalName.Set(executionContext, entityReference.GetEntityLogicalName(service));
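For reference, the parsing part can be sketched as below. This is a simplified, self-contained illustration that assumes the Record Url carries the record id and entity type code as `id` and `etc` query parameters; the real DynamicUrlParser in the sample also resolves the entity logical name from the type code via the organization service:

```csharp
using System;

// Simplified sketch of Record Url (Dynamic) parsing. A Record Url looks like:
// https://myorg.crm.dynamics.com/main.aspx?etc=1&id=%7b<GUID>%7d
public class DynamicUrlParser
{
    public Guid Id { get; private set; }
    public int EntityTypeCode { get; private set; }

    public DynamicUrlParser(string url)
    {
        var uri = new Uri(url);
        foreach (string pair in uri.Query.TrimStart('?').Split('&'))
        {
            string[] parts = pair.Split('=');
            if (parts.Length != 2)
                continue;

            string value = Uri.UnescapeDataString(parts[1]);
            if (parts[0] == "id")
                Id = Guid.Parse(value);            // Guid.Parse accepts the {..} braces
            else if (parts[0] == "etc")
                EntityTypeCode = int.Parse(value); // entity type code, resolved to a logical name elsewhere
        }
    }
}
```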


The full sample can be found in my MSDN Code Gallery.


Posted on 6. June 2014

Monitor, Monitor, Monitor

I once heard someone say that "the great thing about Dynamics CRM is that it just looks after itself". Whilst CRM 2013 is certainly very good at performing maintenance tasks automatically, if you have a customised system it is important to Monitor, Monitor, Monitor! There are some advanced ways of setting up monitoring using tools such as System Center, but some regular, simple monitoring tasks will go a long way for very little investment on your part:

1) Plugin Execution Monitoring

There is a super little entity called 'Plug-in Type Statistics' that often seems to be overlooked in the long list of advanced find entities. This entity is invaluable for tracking down issues before they cause problems for your users; as defined by the SDK, it is "used by the Microsoft Dynamics CRM 2011 and Microsoft Dynamics CRM Online platforms to record execution statistics for plug-ins registered in the sandbox (isolation mode)."

The key here is that it only records statistics for your sandboxed plugins. Unless there is a good reason not to (security access etc.), I would recommend that all of your plugins be registered in sandbox isolation. Of course, Dynamics CRM Online only allows sandboxed plugins anyway, so you don't want to put up barriers to moving to the cloud.

To monitor this you can use Advanced Find to show a list sorted by execution time or failure count descending:

If you spot any issues you can then proactively investigate them before they become a problem. In the screen shot above there are a few plugins that are taking more than 1000ms (1 second) to execute, but their execution count is low. I look for plugins that have a high execution count and high execution time, or those that have a high failure percent.

2) Workflow & Asynchronous Job Execution Monitoring

We all know workflows can start failing for various reasons. Because of their asynchronous nature, these failures can go unnoticed by users until it's too late and you have thousands of issues to correct. To proactively monitor this you can create a view (and even add it to a dashboard) of System Jobs filtered by Status = Failed or Waiting and where the Message contains data. The Message attribute contains the full error description and stack trace, whilst the Friendly Message contains just the information that is displayed at the top of the workflow form in the notification box.
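The same view can be expressed as a FetchXML query over the asyncoperation entity – a sketch, assuming the standard status code values (10 = Waiting, 31 = Failed):

```xml
<fetch>
  <entity name="asyncoperation">
    <attribute name="name" />
    <attribute name="regardingobjectid" />
    <attribute name="message" />
    <filter type="and">
      <filter type="or">
        <condition attribute="statuscode" operator="eq" value="31" />
        <condition attribute="statuscode" operator="eq" value="10" />
      </filter>
      <condition attribute="message" operator="not-null" />
    </filter>
  </entity>
</fetch>
```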

3) Client Latency & Bandwidth Monitoring

Now that you've got the server side under control, you should also look at the client connectivity of your users. There is a special hidden diagnostics page that can be accessed by appending /tools/diagnostics/diag.aspx to your organization's root URL:

http://<YourOrganizationRootUrl>/tools/diagnostics/diag.aspx

As described by the implementation guide topic, "Microsoft Dynamics CRM is designed to work best over networks that have the following elements:

  • Bandwidth greater than 50 KB/sec
  • Latency under 150 ms"

After you click 'Run' on this test page you will get results similar to those shown below. You can see that this user is just above these requirements!

You can read more about the Diagnostics Page in Dynamics CRM. You can also monitor the client side using the techniques I describe in my series on Fiddler.

If you take these simple steps to proactively monitor your Dynamics CRM solution then you are much less likely to have a problem that goes un-noticed until you get 'that call'!


Posted on 28. February 2014

Real Time Workflow or Plugin?

I have been asked "should I use Real Time Workflows instead of Plugins?" many times since CRM2013 first introduced this valuable new feature. Real Time Workflows (RTWFs) certainly have many attractive benefits over Plugins including:

  • Uses the same interface as standard workflows making it very quick & simple to perform straightforward tasks.
  • Can be written & extended without coding skills
  • Easily extend using custom workflow activities that can be re-used in many places.

If we can use custom workflow activities to extend the native workflow designer functionality (e.g. making updates to records over 1:N and N:N relationships) then it raises the question why should we use Plugins at all?

I am a big fan of RTWFs – they add considerable power to the framework that ultimately makes the solution more effective and lowers the implementation costs. That said, I still believe that considering plugins is an important part of any Dynamics CRM Solution design - the reasons can be split into the following areas:


Performance

Performance is one of those areas where it is very tempting to get bogged down too early in 'premature optimisation'. In most cases performance should be considered initially by adhering to best practices (e.g. not querying or updating more data than needed) and ensuring that you use supported & documented techniques. If we ensure that our system is structured in an easy to follow and logical way, then Dynamics CRM will usually look after performance for us and enable us to scale up and out when needed (Dynamics CRM Online will even handle this scaling for you!). It is true there are some 'gotchas' that are the exception to this rule, but on the whole I think it is better to design for maintainability and simplicity over performance in the first instance. Once you have a working design you can then identify the areas that are going to be the main bottlenecks and focus optimisation efforts on those areas alone. If we try to optimise all parts of the system from the start when there is no need, I find that it can actually end up reducing overall performance, resulting in an overly complex system that has a higher total cost of ownership.

It is true that Plugins will outperform RTWFs in terms of throughput, but if the frequency of the transactions is going to be low then this usually will not be an issue. If you have logic that is going to be fired at a high frequency by many concurrent users, then that is the time to consider selecting a plugin over a RTWF.

In some simple tests I found that RTWFs took twice (x2) as long as a Plugin when performing a simple update. The main reason for this is that the plugin pipeline is a very efficient mechanism for inserting logic inside of the platform transaction. A RTWF inserts an additional transaction inside that parent transaction that contains many more database queries required to set up the workflow context. The component based nature of workflow activity steps means the same data must be read multiple times to make each step work in isolation from the others. Additionally, if you update a record in a RTWF, it will apply an update immediately to the database within this inner transaction. This database update is in addition to the overall plugin update. Using a plugin there will only be a single update since the plugin is updating the 'in transit' pipeline target rather than the database.

Another reason that the RTWF takes longer to complete the transaction is that it appears to always retrieve the entire record from the database even when using only a single value.

Pre/Post Images

When using plugins you have very fine control over determining what data is being updated, and you can see what the record looked like before the transaction and what it will look like after. RTWFs don't offer you this same control, so if you need to determine whether a value has changed from one specific value to another (say a specific state transition), it is harder to find out what the value was before the workflow started. When a RTWF reads a record from the database it will load all values, but with a plugin you can select only a small number of attributes to include in the pipeline or query.


Impersonation

RTWFs allow you to select whether to run as the calling user or the owner of the workflow, whereas a Plugin gives you full control to execute an SDK call as the system user or an impersonated user at different points during the transaction.

Code vs Configuration

With RTWFs your business logic tends to become fragmented over a much higher surface area of multiple child RTWFs. This makes unit testing and configuration control much harder. With a plugin you can write unit tests and check the code into TFS. With every change you can quickly see the differences from the previous version and easily see the order that the code is executed.

I find that it is good practice to have a single RTWF per triggering event/entity combination and call child workflows rather than have many RTWFs kicked off by the same trigger, otherwise there is no way of determining the order that they will execute in.

A very compelling reason to use plugins is the deterministic nature of their execution. You can control the order that your code executes in by simply sequencing the logic in a single plugin and then unit test this sequence of events by using a mock pipeline completely outside of the Dynamics CRM Runtime.

So just tell me which is best?!

Which is best? Each have their strengths and weaknesses so the answer is "it depends" (as it so often is!).

After all this is said and done, a very compelling reason to use RTWFs is the fact that business users can author and maintain them in a fraction of the time it takes to code and deploy a plugin, often with far fewer bugs thanks to the component-based building blocks.

As a rule of thumb, I use two guiding principles:

  1. If there is already custom plugin code on an entity (or it is planned), then use a plugin over a RTWF to keep the landscape smaller and deterministic.
  2. If you know up-front that you need a very high throughput for a particular function with a high degree of concurrency, then consider a plugin over a RTWF for this specific area.


Posted on 15. November 2013

Using the CRM 2013 Developer Toolkit to target CRM 2011

If you find yourself constantly juggling multiple development environments, you'll no doubt come up against the issue of the Developer Toolkit only supporting a single installation – either for CRM 2011 or CRM 2013.

To overcome this, I have been using the CRM 2013 version against CRM 2011 with no issues. The only thing to remember is that if you create a new project that needs to target CRM 2011, you must remove the version 6 assemblies from your plugin and workflow projects and re-add the version 5 ones. You can do this using NuGet with a specific version:

Install-Package Microsoft.CrmSdk.CoreAssemblies -Version 5.0.17

See my post for more information.


Posted on 20. September 2013

How to change process and stage programmatically


UPDATE: With CRM2015 - the new client side process API should be used to set the current step. The following approach should be only used for setting the process to the first step.

If you have more than one Dynamics CRM 2013 Business Process Flow for a given entity users can easily change the current process by using the 'Switch Process' Command Bar button:

Once changed, you can identify which process is running by hovering over the 'I' on the bottom right of the process flow:

If your users would like to change the Business Process Flow stage dynamically based on a given set of attribute values then it is not immediately obvious how this can be achieved. This post provides one solution using a custom workflow activity in combination with the new Synchronous Workflow feature of Dynamics CRM 2013.


The currently selected process stage on a given entity record is stored in CRM using the following attributes:

  • processid – GUID of the selected Business Process (workflow)
  • stageid – GUID of the selected Process Stage (processstage)

Initially I thought that these attributes might be able to be updated using a workflow directly, but it turns out you can only update them via code.

You will need to know how to write Custom Workflow Activities and add JavaScript Web Resources to follow these steps:

1. Using the Visual Studio Developer Toolkit found in the SDK, create a new Custom Workflow Activity that accepts the following input arguments:

[Input("Process Name")]
public InArgument Process { get; set; }

[Input("Process Stage Name")]
public InArgument ProcessStage { get; set; }

These parameters need to be strings since you cannot pass in EntityReferences to the Workflow and ProcessStage entities.

2. Add to the Execute code the following:

using (var ctx = new OrganizationServiceContext(service))
{
    // Get the processid using the name provided
    var process = (from p in ctx.CreateQuery<Workflow>()
                   where p.Name == Process.Get(executionContext)
                      && p.StateCode == WorkflowState.Activated
                   select new Workflow
                   {
                       WorkflowId = p.WorkflowId
                   }).FirstOrDefault();

    if (process == null)
        throw new InvalidPluginExecutionException(
            String.Format("Process '{0}' not found", Process.Get(executionContext)));

    // Get the stage id using the name provided
    var stage = (from s in ctx.CreateQuery<ProcessStage>()
                 where s.StageName == ProcessStage.Get(executionContext)
                    && s.ProcessId.Id == process.WorkflowId
                 select new ProcessStage
                 {
                     ProcessStageId = s.ProcessStageId
                 }).FirstOrDefault();

    if (stage == null)
        throw new InvalidPluginExecutionException(
            String.Format("Stage '{0}' not found", ProcessStage.Get(executionContext)));

    // Change the stage by updating the record that triggered the workflow
    Entity updatedStage = new Entity(context.PrimaryEntityName);
    updatedStage.Id = context.PrimaryEntityId;
    updatedStage["stageid"] = stage.ProcessStageId;
    updatedStage["processid"] = process.WorkflowId;
    service.Update(updatedStage);
}

3. Deploy your custom workflow activity.

4. Create a New Workflow that is defined with the following parameters:

  • Activate As: Process
  • Run this workflow in the background : NO
  • Scope: Organization (if you want it to run for all users)
  • Start when: Before – Record fields change. Select the attributes that the process stage is dependent on so that it does not run on every change.

5. Add a set of conditions to determine if the stage should be changed, and call your custom workflow activity providing the Process Name and Process Stage to change to.

6. Since the Process and Stage will not automatically change on the form, you need to add some JavaScript to refresh the form by saving it when the dependent fields are changed. You can do this by adding an onChange event to the dependent fields that calls Xrm.Page.data.entity.save().

UPDATE: Rob Boyers pointed out that this doesn't reliably refresh the process; however, there isn't currently a supported way of doing it.
You could use the following unsupported method call to reload the whole form:



  1. This technique could also be written using a plugin - but the advantage of using a synchronous workflow is that the logic that determines the stage can be easily changed without re-compiling code.
  2. You can use the same technique to read the process and stage name so that you can make a decision using the current position in the process. I've included a sample of this too in the download.
  3. You can download the sample code from here -