Solution Packager and global optionset enum support in spkl Task Runner

I’ve published version 1.0.9 of spkl to NuGet - this adds the following new features:

- Global optionset enum generation for early bound classes
- Solution Packager support

Global Optionset enum generation

This was a tricky one due to CrmSvcUtil not making it easy to prevent multiple enums being output where a global optionset is used, but you can now add the following to the early bound section of your spkl.json to generate global optionset enums:

```json
{
  "earlyboundtypes": [
    {
      ...
      "generateOptionsetEnums": true,
      ...
    }
  ]
}
```
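With generateOptionsetEnums enabled, each global optionset is emitted as a C# enum alongside the early bound classes. A sketch of what the generated output might look like – the optionset name and values here are hypothetical, not taken from a real org:

```csharp
// Hypothetical example of a generated global optionset enum.
// Dynamics optionset values are large integers allocated from the
// publisher's option value prefix (100000000+ for the default publisher).
public enum new_approvalstatus
{
    Draft = 100000000,
    Submitted = 100000001,
    Approved = 100000002,
    Rejected = 100000003
}
```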

In a future update, I'll add the ability to filter the enums down to only those used.

Solution Packager Support

The solution packager allows you to manage your Dynamics metadata inside a Visual Studio project by extracting the solution into separate xml files. When you need to combine multiple updates from code commits, you can then use the packager to re-combine and import into Dynamics. To configure the solution packager task, add the following to your spkl.json:

```json
/* The solutions section defines a solution that can be extracted
   to individual xml files to make versioning of Dynamics metadata
   (entities, attributes etc.) easier */
"solutions": [
  {
    "profile": "default,debug",
    /* The unique name of the solution to extract, unpack, pack and import */
    "solution_uniquename": "spkltestsolution",
    /* The relative folder path to store the extracted solution metadata xml files */
    "packagepath": "package",
    /* Set to 'true' to increment the minor version number before importing from the xml files */
    "incrementonimport": false
  }
]
```

There are two .bat files provided that will call:

- `spkl unpack` – extracts the solution specified in the spkl.json into the packagepath as multiple xml files.
- `spkl import` – re-packs the xml files and imports into Dynamics, optionally incrementing the version number of the solution to account for the new build.
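The batch files are thin wrappers over the spkl command line. A sketch of what unpack.bat might contain – the relative path to spkl.exe is an assumption based on the default NuGet packages folder layout, not taken from the post:

```shell
REM unpack.bat - hypothetical wrapper; adjust the path to wherever
REM NuGet restored the spkl package
..\packages\spkl.1.0.9\tools\spkl.exe unpack
pause
```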

Not all Business Process Flow entities are created equal

As you probably know by now, when you create Business Process Flows in 8.2+ you'll get a new custom entity that is used to store running instances (if not, read my post on the new Business Process Flow entities). When your orgs are upgraded to 8.2 from a previous version, the Business Process Flow entities will be created automatically for you during the upgrade. They are named according to the format:


Notice that the prefix is new_. This bothered me when I first saw it because if you create a Business Process Flow as part of a solution then the format will be:


Here lies the problem: if you import a pre-8.2 solution into an 8.2 org, the Business Process Flow entities will be prefixed with the solution prefix – but if the solution is in-place upgraded, they will be prefixed with new_. Why is this a problem? Once you've upgraded the pre-8.2 org to 8.2, the Business Process Flow entities will stay named new_ and be included in the solution. When you then import an update into the target org, the names will conflict with each other and you'll get the error:

"This process cannot be imported because it cannot be updated or does not have a unique name."

The sequence of events looks like this:

1. The source 8.1 org contains a solution with the myprefix_ prefix.
2. The solution is imported into an empty 8.2 org – the BPF entity is created as myprefixBPFxxx.
3. The source org is upgraded to 8.2 – the BPF entity is created as newBPFxxx.
4. The updated solution is exported and imported into the target org – "This process cannot be imported because it cannot be updated or does not have a unique name." newBPFxxx conflicts with myprefixBPFxxx.

How to solve

Unfortunately, there isn't an easy way out of this situation. There are two choices:

1. If you have data in the target org that you want to keep, you'll need to recreate the BPFs in the source org so that they have the myprefix_ prefix. You can do this by following the steps here.
2. If you are not worried about data in the target org, you can delete those BPFs and re-import the solution exported from the upgraded 8.2 source org.

The good news is that this will only happen to those of you who have source and target orgs upgraded at different times – if you upgrade your DEV/UAT/PROD orgs at the same time, you'll get BPF entities all prefixed with new_. @ScottDurow

Simple, No fuss, Dynamics 365 Deployment Task Runner

Why?

I've used the Dynamics Developer Toolkit since it was first released by MCS for CRM4! I love the functionality it brings, however the latest version is still in beta, it isn't supported on VS2017 and there isn't a date when it's likely to be either (yes, you can hack it to make it work, but that's not the point!). Rather than using an add-in Visual Studio project type, I've been attracted by the VS Code style simple project approach, and so I decided to create a 'no-frills' alternative that uses a simple json config file (and that can be used in VS2017).

What?

- Deploy Plugins & Workflow Activities – uses reflection to read plugin registration information directly from the assembly. This has the advantage that the plugin configuration is in the same file as the code. You can use the 'instrument' task to pull down the plugin configuration from Dynamics and add the metadata to your classes if you already have an existing project.
- Deploy Web Resources – deploys webresources from file locations defined in the spkl.json configuration. You can use the 'get-webresources' task to create the spkl.json if you already have webresources deployed.
- Generate Early Bound Types – uses the spkl.json to define the entities to generate each time the task is run to make the process repeatable.
- Profile management – an optional profile can be supplied to select a different set of configuration from spkl.json, e.g. debug and release build profiles.

How?

Let's assume you have a project in the following structure:

```
Solution
|-Webresources
| |-html
| | |-HtmlPage.htm
| |-js
| | |-Somefile.js
|-Plugins
| |-MyPlugin.cs
|-Workflows
| |-MyWorkflowActivity.cs
```

On both the Plugin and Workflows projects, run the following from the NuGet Console:

```
Install-Package spkl
```

This will add spkl to the packages folder, together with the CrmPluginRegistrationAttribute.cs class that is used to mark up your classes so that spkl can deploy them. Some simple batch files are also included that you can use to get started.

If you already have plugins deployed, you can run the following command line in the context of the Plugins folder:

```
spkl instrument
```

This will prompt you for a Dynamics connection, and then search for any deployed plugins and their matching .cs files. If the MyPlugin.cs plugin is already deployed, it might end up with the following attribute metadata:

```csharp
[CrmPluginRegistration("Create", "account",
    StageEnum.PreValidation, ExecutionModeEnum.Synchronous,
    "name,address1_line1", "Create Step", 1, IsolationModeEnum.Sandbox,
    Description = "Description",
    UnSecureConfiguration = "Some config")]
```

A spkl.json file will be created in the project directory similar to:

```json
{
  "plugins": [
    {
      "solution": "Test",
      "assemblypath": "bin\\Debug"
    }
  ]
}
```

If you now build your plugins, you can then run the following to deploy them:

```
spkl plugins
```

You can run instrument for the workflow project using the same technique, which will result in code similar to the following being added to your workflow activity classes:

```csharp
[CrmPluginRegistration(
    "WorkflowActivity", "FriendlyName", "Description",
    "Group Name", IsolationModeEnum.Sandbox)]
```

…and then run the following to deploy:

```
spkl workflow
```
To get any currently deployed webresources matched to your project files, you can run the following from the Webresource project folder:

```
spkl get-webresources /s:new
```

where new is the solution prefix you've used. This will create a spkl.json similar to the following:

```json
{
  "webresources": [
    {
      "root": "",
      "files": [
        {
          "uniquename": "new_/js/somefile.js",
          "file": "js\\somefile.js",
          "description": ""
        },
        {
          "uniquename": "new_/html/HtmlPage.htm",
          "file": "html\\HtmlPage.htm",
          "description": ""
        }
      ]
    }
  ]
}
```

You can then deploy using:

```
spkl webresources
```

Profiles

For Debug/Release builds you can define multiple profiles that can be triggered using the /p:<profilename> parameter:

```json
{
  "plugins": [
    {
      "profile": "default,debug",
      "assemblypath": "bin\\Debug"
    },
    {
      "profile": "release",
      "solution": "Test",
      "assemblypath": "bin\\Release"
    }
  ]
}
```


The default profile will be used if no /p: parameter is supplied. You can specify a profile using:

```
spkl plugins /p:release
```
Referencing a specific assembly rather than searching the folder

If you have multiple plugins in a single deployment folder and you just want to deploy one, you can explicitly provide the path rather than using the folder search, e.g.:

```json
{
  "plugins": [
    {
      "assemblypath": "bin\\Debug\\MyPlugin.dll"
    }
  ]
}
```

Adding to a solution

If you'd like to automatically add the items deployed to a solution after deployment, you can use:

```json
{
  "webresources": [
    {
      "root": "",
      "solution": "Test",
      ...
    }
  ]
}
```

Combining spkl.json

Perhaps you want to have a single spkl.json rather than multiple ones per project. You can simply add them all together:

```json
{
  "webresources": [ ... ],
  "plugins": [ ... ]
}
```

Multiple project deployments

Since the spkl.json configuration files are searched for from the current folder, you can deploy multiple plugins/webresources using a single spkl call from a root folder. I'll be updating the github documentation page as things move forwards.
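For example, each project can keep its own spkl.json under a common root – a hypothetical layout, following the structure shown earlier:

```
Solution
|-Webresources
| |-spkl.json
|-Plugins
| |-spkl.json
|-Workflows
| |-spkl.json
```

Running spkl from the Solution folder would then pick up all three configurations in a single call.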

Limitations of Calculated Fields and the Data Export Service

You probably already know that I'm a big fan of the Data Export Service. The single fact of having a 'near real time' replica of your data in a SQL Azure Database to query in any way you want is simply amazing. Today I came across an interesting limitation with Calculated Fields.

Although Calculated Fields are created in the Dynamics database as SQL Server computed columns, they are output in the replica database as standard fields. This has a rather inconvenient side-effect when you have calculated fields that are linked to either date/time or a related record. Since the Azure replica sync is event based, when a related record is updated there is no corresponding event on the referencing record that contains the calculated field, and therefore it does not get updated. Likewise, if a calculated field changes depending on the date/time, there is no event that triggers the Azure replica to be updated. This means that although calculated fields may be correct at the time the record was created, subsequent updates can make the field become stale and inaccurate.

Lesson learned – you cannot guarantee the accuracy of calculated fields in the Azure replica if they contain:

- The Now() function
- A related record field

Interestingly, calculated fields that use data on the same record do get updated, so the event integration must do a compare of any calculated fields to see if they have changed. @ScottDurow

Simplified Connection Management & Thread Safety (Revisited)

There is one certainty in the world and that is that things don't stay the same! The Dynamics 365 world is no exception, with new features and SDK updates being released with pleasing regularity. Writing 'revisited' posts has become somewhat of a regular thing these days. In my previous post on this subject back in 2013, we looked at how you could use a connection dialog or connection strings to get a service reference from the Microsoft.Xrm.Client library and how it can be used in a thread safe way.

Microsoft.Xrm.Tooling

For a while now there has been a replacement for the Microsoft.Xrm.Client library – the Microsoft.Xrm.Tooling library. It can be installed from NuGet using:

```
Install-Package Microsoft.CrmSdk.XrmTooling.CoreAssembly
```

When you use the CrmServerLoginControl, the user interface should look very familiar because it's the same control that is used in all the SDK tools, such as the Plugin Registration Tool.

The sample in the SDK shows how to use this WPF control. The WPF control works slightly differently to the Xrm.Client ShowDialog() method – it gives you much more flexibility over how the dialog should behave and allows embedding inside your WPF application rather than always having a popup dialog.

Connection Strings

Like the dialog, Xrm.Tooling also has a new version of the connection string management – the new CrmServiceClient accepts a connection string in the constructor. You can see examples of these connection strings in the SDK.

```csharp
CrmServiceClient crmSvc = new CrmServiceClient(
    ConfigurationManager.ConnectionStrings["Xrm"].ConnectionString);
```

For Dynamics 365 online, the connection would be:

```xml
<connectionStrings>
  <add name="Xrm" connectionString="AuthType=Office365;; Password=passcode;Url=" />
</connectionStrings>
```
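Once constructed, the client can be used like any IOrganizationService. A minimal sketch – the WhoAmI request is just an illustrative call, not from the original post, and it assumes a valid "Xrm" connection string in app.config:

```csharp
using System;
using System.Configuration;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Tooling.Connector;

// Build the client from the app.config connection string
var crmSvc = new CrmServiceClient(
    ConfigurationManager.ConnectionStrings["Xrm"].ConnectionString);

if (crmSvc.IsReady)
{
    // Any SDK request can be executed through the client, e.g. WhoAmI
    var response = (WhoAmIResponse)crmSvc.Execute(new WhoAmIRequest());
    Console.WriteLine(crmSvc.ConnectedOrgFriendlyName + " / " + response.UserId);
}
```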

Thread Safety

The key to understanding the performance and thread safety of calling the Organization Service is the difference between the client proxy and the WCF channel. As described by the 'Improve service channel allocation performance' topic from the best practice entry in the SDK, the channel should be reused because creating it involves time-consuming metadata download and user authentication.

The old Microsoft.Xrm.Client was thread safe and would automatically reuse the WCF channel that was already authenticated. The Xrm.Tooling CrmServiceClient is no exception. You can create a new instance of CrmServiceClient and existing service channels will be reused if one is available on that thread. Any calls to the same service channel will be locked to prevent thread issues.

To demonstrate this, I first used the following code that ensures a single CrmServiceClient is created per thread:

```csharp
Parallel.For(1, numberOfRequests,
    new ParallelOptions() { MaxDegreeOfParallelism = maxDop },
    () =>
    {
        // This is run once to initialise each thread
        var client = new CrmServiceClient(username,
            CrmServiceClient.MakeSecureString(password),
            "EMEA", orgname,
            useUniqueInstance: false,
            useSsl: false,
            isOffice365: true);
        return client;
    },
    (index, loopState, client) =>
    {
        // Make a large request that takes a bit of time
        QueryExpression accounts = new QueryExpression("account")
        {
            ColumnSet = new ColumnSet(true)
        };
        client.RetrieveMultiple(accounts);
        return client;
    },
    (client) =>
    {
        // Thread-local finaliser - nothing to clean up
    });
```

With a Degree of Parallelism of 4 (the number of threads that can be executing in parallel) and a request count of 200, there will be a single CrmServiceClient created for each thread and the fiddler trace looks like this:

Now to prove that CrmServiceClient handles thread concurrency automatically, I moved the instantiation into the loop so that every request creates a new client:

```csharp
Parallel.For(1, numberOfRequests,
    new ParallelOptions() { MaxDegreeOfParallelism = maxDop },
    (index) =>
    {
        // This is run for every request
        var client = new CrmServiceClient(username,
            CrmServiceClient.MakeSecureString(password),
            "EMEA", orgname,
            useUniqueInstance: false,
            useSsl: false,
            isOffice365: true);

        // Make a large request that takes a bit of time
        QueryExpression accounts = new QueryExpression("account")
        {
            ColumnSet = new ColumnSet(true)
        };
        client.RetrieveMultiple(accounts);
    });
```

Running this still shows a very similar trace in fiddler:

This proves that the CrmServiceClient is caching the service channel and returning a pre-authenticated version per thread. In contrast to this, if we set the useUniqueInstance property to true on the CrmServiceClient constructor, we get the following trace in fiddler:

So now each request re-runs the channel authentication for each query – far from optimal! The nice thing about the Xrm.Tooling library is that it is used exclusively throughout the SDK – where the old Xrm.Client was a satellite library that came from the legacy ADX portal libraries. Thanks to my friend and fellow MVP Guido Preite for nudging me to write this post! @ScottDurow

How to get assistance without the Relationship Assistant!

Dynamics 365 has brought with it a new and amazing feature called the 'Relationship Assistant'. It is part of a preview feature (unsupported and US only) called 'Relationship Insights' which promises to bring some amazing productivity tools to the Dynamics 365 platform. Relationship Assistant shows actionable cards in both the web client and mobile client using a mix of both plain old filter conditions and machine learning.

Read about Relationship Assistant Read about Relationship Insights

Machine Learning Cards

One of the most exciting parts of the Relationship Assistant is the use of machine learning to examine the contents of your emails and predict what you need to do next:

Customer Question Card

Issue Detected Card

'Plain old query' Cards

Whilst the machine learning aspects may be out of reach for us mere mortals at this time, the cards that are based on simpler filter conditions such as 'Due Today' and 'Meeting Today' are items that can easily be shown in a dashboard without this preview feature. Here are some examples of information that can be gained from simple date queries:

Due Today Card

Meeting Today Card

Missed Close Date Card

(Images taken from the Relationship Assistant Card reference.)

Create your own 'Relationship Assistant' Dashboard

The main challenge with producing the information shown above is the date aspect of the query. We can easily show a single set of records that use the 'Next X days' type of operator, but you could not easily use today's date in a dashboard chart – at least not until CRM2015 introduced calculated fields. Now it is rather easy to produce a dashboard similar to the following:

The key feature of dashboards is that they can be tailored to show your own data, which can be drilled into to show the underlying records. This is comparable to the 'actionable' aspect of the Relationship Assistant, where you can drill into the tasks due today and open them to work upon. Notice the field 'Due' that can have the value 'Completed', 'Due Next 7 Days', 'Due Today', 'Overdue', or 'Scheduled'. This field isn't stored as a persistent field in the database; instead it is a calculated field, so there are no nightly jobs or workflows required to update it based on the current date.

Adding a 'Due Status' field to an Activity Entity

1. Create a solution with the Activity entity that you want to add the 'Due Status' field to.
2. Create a new field called 'Due Diff' – this will give us a field that shows the number of days before/after the activity due date.
3. Click 'Edit' and type the expression DiffInDays(scheduledstart, Now()). Note: this assumes that this is an Appointment and you want to use the scheduledstart date to control the due date.
4. Add a new global Option Set that holds the possible values for the Due status.
5. Create a new calculated Option Set field called 'Due' on the activity record, using the existing Option Set created above.
6. Click 'Edit' on the calculated field type and add the condition logic for each Due status value.
7. Create a chart over the 'Due' field.
8. Publish and add the charts to a dashboard!
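The calculated-field conditions were shown as a screenshot in the original post; the branches below are my reconstruction from the field values mentioned above – the exact thresholds and the sign convention of DiffInDays are assumptions – expressed as C# purely for clarity:

```csharp
// Hypothetical reconstruction of the 'Due' calculated-field branches.
// Assumes dueDiff holds DiffInDays(scheduledstart, Now()), with a
// positive value meaning the scheduled start is already in the past.
static string GetDueStatus(bool completed, int dueDiff)
{
    if (completed) return "Completed";     // activity state is Completed
    if (dueDiff > 0) return "Overdue";     // scheduled start has passed
    if (dueDiff == 0) return "Due Today";
    if (dueDiff >= -7) return "Due Next 7 Days";
    return "Scheduled";                    // more than 7 days away
}
```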

Of course, other more complex options exist, but with all the excitement and awesomeness of Machine Learning it is important to remember that we can achieve great things with just the right kind of queries, charts and dashboards!

Hope this helps!