Posted on 16. October 2017

Counting Sheeps

One of the strangest parts of the Dynamics CRM WebApi is the pluralisation of entity names.

In the old OData endpoint, the entity set name was <EntityLogicalName>Set – however, in the OData 4.0 endpoint, the Logical Name is pluralised using a simplistic set of rules that often picks the wrong plural name.

This introduced a conundrum – performance vs. correctness: do we query the metadata for the entity set name at runtime, or maintain a duplicate set of over-simplified rules in our JavaScript?
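To see why a simplistic rule set goes wrong, here is a minimal sketch (purely illustrative – not the platform's actual algorithm) of the kind of rules involved:

// Illustrative only – NOT the platform's actual pluralisation algorithm
function naivePluralise(logicalName) {
    if (/(s|x|z|ch|sh)$/.test(logicalName)) {
        return logicalName + "es";               // "bus" -> "buses"
    }
    if (/[^aeiou]y$/.test(logicalName)) {
        return logicalName.slice(0, -1) + "ies"; // "opportunity" -> "opportunities"
    }
    return logicalName + "s";                    // but also "sheep" -> "sheeps"!
}

Rules like these handle the regular cases but fall over on irregular nouns – hence the incorrect plurals.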

The New Version 9 Client Side API

The good news is that with version 9, the Xrm Api now supports:


Xrm.Utility.getEntitySetName("contact")

This will return "contacts", so we can safely use it without worrying whether the plural name is correct – or indeed whether it changes in the future.
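For example, you could use it when composing a WebApi URL by hand – a minimal sketch, assuming the v9 GlobalContext API (contactId is a hypothetical record id):

// contactId is a hypothetical record id used for illustration
var entitySetName = Xrm.Utility.getEntitySetName("contact"); // "contacts"
var url = Xrm.Utility.getGlobalContext().getClientUrl() +
    "/api/data/v9.0/" + entitySetName + "(" + contactId + ")?$select=fullname";
fetch(url, { headers: { "Accept": "application/json" } })
    .then(function (response) { return response.json(); })
    .then(function (contact) { console.log(contact.fullname); });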

Hope this helps!


Posted on 15. September 2017

Folders are back!

It's a long time since I used the old SharePoint list component and, for the most part, I've not missed it. Server-to-server integration is slick and just works.

That said, the one thing that I do miss is support for folders – but whilst testing the new 9.0 Enterprise Edition, I noticed that folder support has been added in this latest release!

I was so excited I just had to share a little video of what it looks like:

Folders are back

Maybe in the release after this, we'll get support for content types and metadata properties!


Posted on 11. August 2017

Solution Packager and global optionset enum support in spkl Task Runner

I’ve published version 1.0.9 of spkl to NuGet - this adds the following new features:

  1. Global optionset enum generation for early bound classes.
  2. Solution Packager support

Global Optionset enum generation

This was a tricky one, since CrmSvcUtil doesn't make it easy to prevent multiple copies of an enum being output when a global optionset is used by more than one entity, but you can now add the following to the early-bound section of your spkl.json to generate global optionset enums:

{
  "earlyboundtypes": [
    {
      ...
      "generateOptionsetEnums": true,
      ...
    }
  ]
}

In a future update, I'll add the ability to filter the enums down to only those that are actually used.

Solution Packager Support

The solution packager allows you to manage your Dynamics metadata inside a Visual Studio project by extracting the solution into separate xml files. When you need to combine multiple updates from code commits, you can then use the packager to re-pack the files and import them into Dynamics. To configure the solution packager task, add the following to your spkl.json:

 /*
  The solutions section defines a solution that can be extracted to individual xml files to make
  versioning of Dynamics metadata (entities, attributes etc) easier
  */
  "solutions": [
    {
      "profile": "default,debug",
      /*
      The unique name of the solution to extract, unpack, pack and import
      */
      "solution_uniquename": "spkltestsolution",
      /*
      The relative folder path to store the extracted solution metadata xml files
      */
      "packagepath": "package",
      /*
      Set to 'true' to increment the minor version number before importing from the xml files
      */
      "increment_on_import": false
    }
  ]

There are two .bat files provided that will call:

spkl unpack

This will extract the solution specified in the spkl.json into the packagepath folder as multiple xml files.

spkl import

This will re-pack the xml files and import them into Dynamics – optionally incrementing the version number of the solution to account for the new build.

Posted on 24. June 2017

Not all Business Process Flow entities are created equal

As you probably know by now, when you create Business Process Flows in 8.2+ you'll get a new custom entity that is used to store running instances (if not then read my post on the new Business Process Flow entities).

When your orgs are upgraded to 8.2 from a previous version, the business process flow entities are created automatically for you during the upgrade. They are named according to the format:

new_BPF_<ProcessId>

Notice that the prefix is new_. This bothered me when I first saw it, because if you create a Business Process Flow as part of a solution, the format will be:

<SolutionPrefix>_BPF_<ProcessId>

Here lies the problem. If you import a pre-8.2 solution into an 8.2 org, the Business Process Flows will be prefixed with the solution prefix – but if the org containing the solution is upgraded in place, they will be prefixed with new_.

Why is this a problem?

Once you've upgraded the pre-8.2 org to 8.2, the Business Process Flow entities stay named with new_ and are included in the solution. When you then import an update into the target org, the names conflict with each other and you'll get the error:

"This process cannot be imported because it cannot be updated or does not have a unique name."

To illustrate the sequence:

  1. The source 8.1 org contains the Business Process Flow in a solution with the myprefix_ prefix.
  2. The solution is exported and imported into an empty 8.2 org – the BPF entity is created in the target as myprefix_BPF_xxx.
  3. The source org is upgraded to 8.2 – the BPF entity is created in the source as new_BPF_xxx.
  4. The solution is exported again and imported into the target – the import fails because new_BPF_xxx conflicts with myprefix_BPF_xxx.


How to solve

Unfortunately, there isn't an easy way out of this situation. There are two choices:

  1. If you have data in the target org that you want to keep, you'll need to recreate the BPFs in the source org so that they have the myprefix_ prefix – you can do this by following the steps here: https://support.microsoft.com/en-us/help/4020021/after-updating-to-dynamics-365-mismatched-business-process-flow-entity
  2. If you are not worried about data in the target org, you can delete those BPFs and re-import the solution exported from the upgraded 8.2 source org.

The good news is that this will only happen to those of you who have source and target orgs upgraded at different times – if you upgrade your DEV/UAT/PROD orgs at the same time, you'll get BPF entities all prefixed with new_.

@ScottDurow

Posted on 15. May 2017

Continuous Integration using spkl Task Runner

This is the third video in a series showing you how to quickly set up VSTS Continuous Integration with spkl.

Watch on YouTube

1. Learn more about the spkl task runner

2. Learn how to deploy plugins with the spkl task runner

3. Learn how to deploy webresources with the spkl task runner

Posted on 15. May 2017

Deploying Webresources using spkl Task Runner

This is the second video in a series showing you how to get up and running with spkl with no fuss!

Watch on YouTube

1. Learn more about the spkl task runner

2. Learn how to deploy plugins with the spkl task runner

Posted on 12. May 2017

Deploying Plugins using spkl Task Runner

Following on from my last blog post on the spkl Task Runner, this is the first video in a series showing you how to get up and running with spkl with no fuss!


Posted on 3. May 2017

Simple, No fuss, Dynamics 365 Deployment Task Runner

Why?

I've used the Dynamics Developer Toolkit since it was first released by MCS for CRM4! I love the functionality it brings, but the latest version is still in beta, it isn't supported on VS2017, and there isn't a date when it's likely to be either (yes, you can hack it to make it work, but that's not the point!).

Rather than using an add-in Visual Studio project type, I've been attracted by the VS Code-style simple project approach, so I decided to create a 'no-frills' alternative that uses a simple json config file (and that can be used in VS2017).

What?

  1. Deploy Plugins & Workflow Activities - Uses reflection to read plugin registration information directly from the assembly. This has the advantage that the plugin configuration is in the same file as the code. You can use the 'instrument' task to pull down the plugin configuration from Dynamics and add the metadata to your classes if you already have an existing project.
  2. Deploy Web Resources – deploy webresources from file locations defined in the spkl.json configuration. You can use the 'get-webresources' task to create the spkl.json if you already have webresources deployed.
  3. Generate Early Bound Types – Uses the spkl.json to define the entities to generate each time the task is run to make the process repeatable.
  4. Profile management – An optional profile can be supplied to select a different set of configuration from spkl.json. E.g. debug and release build profiles.

How?

Let's assume you have a project in the following structure:

Solution
    |-Webresources
    |    |-html
    |    |    |-HtmlPage.htm
    |    |-js
    |    |    |-Somefile.js
    |-Plugins
    |    |-MyPlugin.cs
    |-Workflows
    |    |-MyWorkflowActivity.cs

On both the Plugins and Workflows projects, run the following from the NuGet Console:

Install-Package spkl

This will add spkl to the packages folder, along with the CrmPluginRegistrationAttribute.cs metadata class that is used to mark up your classes so that spkl can deploy them. Some simple batch files are also included that you can use to get started.

If you already have plugins deployed, you can run the following command line in the context of the Plugins folder:

spkl instrument

This will prompt you for a Dynamics connection and then search for any deployed plugins and their matching .cs files. If the MyPlugin.cs plugin is already deployed, it might end up with the following attribute metadata:

[CrmPluginRegistration("Create","account",
    StageEnum.PreValidation,ExecutionModeEnum.Synchronous,
    "name,address1_line1", "Create Step",1,IsolationModeEnum.Sandbox,
    Description ="Description",
    UnSecureConfiguration = "Some config")]

A spkl.json file will be created in the project directory, similar to:

{
  "plugins": [
    {
      "solution": "Test",
      "assemblypath": "bin\\Debug"
    }
  ]
}

If you now build your plugins, you can then run the following to deploy:

spkl plugins

You can run instrument for the workflow project using the same technique, which will result in code similar to the following being added to your workflow activity classes:

[CrmPluginRegistration(
        "WorkflowActivity", "FriendlyName","Description",
        "Group Name",IsolationModeEnum.Sandbox)]

…and then run the following to deploy:

spkl workflow			

To get any currently deployed webresources matched to your project files you can run the following from the Webresource project folder:

spkl get-webresources /s:new			

    Where new is the solution prefix you've used

This will create a spkl.json similar to the following:

{
  "webresources": [
    {
      "root": "",
      "files": [
        {
          "uniquename": "new_/js/somefile.js",
          "file": "js\\somefile.js",
          "description": ""
        },
        {
          "uniquename": "new_/html/HtmlPage.htm",
          "file": "html\\HtmlPage.htm",
          "description": ""
        }
      ]
    }
  ]
}

You can then deploy using:

spkl webresources

Profiles

For Debug/Release builds you can define multiple profiles that can be triggered using the /p:<profilename> parameter.

{
  "plugins": [
    {
      "profile": "default,debug",
      "assemblypath": "bin\\Debug"
    },
    {
      "profile": "release",
      "solution": "Test",
      "assemblypath": " bin\\Release"
    }
  ]
 
}

The default profile will be used if no /p: parameter is supplied. You can specify a profile using:

spkl plugins /p:release			

Referencing a specific assembly rather than searching the folder

If you have multiple plugins in a single deployment folder and you just want to deploy one, you can explicitly provide the path rather than using the folder search. E.g.

{
  "plugins": [
    {
      "assemblypath": "bin\\Debug\\MyPlugin.dll"
    }
  ]
}
Adding to a solution

If you'd like to automatically add the items deployed to a solution after deployment you can use:

{
  "webresources": [
    {
      "root": "",
      "solution": "Test",
      "files": […]
    }
  ]
}

Combining spkl.json

Perhaps you want to have a single spkl.json rather than multiple ones per project. You can simply add them all together:

{
  "webresources": […],
  "plugins": […]
}

Multiple project deployments

Since the spkl.json configuration files are searched from the current folder, you can deploy multiple plugins/webresources using a single spkl call from a root folder.
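For example (an illustrative layout – the file names are hypothetical), with a spkl.json in each project:

Solution
    |-Webresources
    |    |-spkl.json
    |    |-js
    |    |    |-Somefile.js
    |-Plugins
    |    |-spkl.json
    |    |-MyPlugin.cs

running spkl plugins or spkl webresources from the Solution folder should pick up the matching configuration sections in both projects.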

I'll be updating the GitHub documentation page as things move forwards.

Posted on 15. March 2017

Limitations of Calculated Fields and the Data Export Service

You probably already know that I'm a big fan of the Data Export Service. Having a 'near real time' replica of your data in a SQL Azure Database to query in any way you want is simply amazing.

Today I came across an interesting limitation with Calculated Fields. Although Calculated Fields are created in the Dynamics database as SQL Server computed columns, they are output to the Replica Database as standard fields.

This has a rather inconvenient side-effect when you have calculated fields that depend on either the date/time or a related record. Since the Azure replica sync is event based, when a related record is updated there is no corresponding event on the referencing record that contains the calculated field, and so it does not get updated. Likewise, if a calculated field changes depending on the date/time, there is no event that triggers the Azure replica to be updated. This means that although calculated fields may be correct at the time the record was created, subsequent updates can make the field become stale and inaccurate.

Lesson learned - you cannot guarantee the accuracy of calculated fields in the Azure Replica if they contain:

  1. The Now() function
  2. A related record field (e.g. accountid.name)

Interestingly, calculated fields that use data on the same record do get updated, so the event integration must compare calculated fields to see if they have changed.

@ScottDurow

Posted on 11. March 2017

Simplified Connection Management & Thread Safety (Revisited)

There is one certainty in the world: things don't stay the same! The Dynamics 365 world is no exception, with new platform and SDK features being released with pleasing regularity. Writing 'revisited' posts has become somewhat of a regular thing these days.

In my previous post on this subject back in 2013 we looked at how you could use a connection dialog or connection strings to get a service reference from the Microsoft.Xrm.Client library and how it can be used in a thread safe way.

Microsoft.Xrm.Tooling

For a while now there has been a replacement for the Microsoft.Xrm.Client library – the Microsoft.Xrm.Tooling library. It can be installed from NuGet using:

Install-Package Microsoft.CrmSdk.XrmTooling.CoreAssembly

When you use the CrmServerLoginControl, the user interface should look very familiar because it's the same one used in all the SDK tools, such as the Plugin Registration Tool.

The sample in the SDK shows how to use this WPF control.

The WPF control works slightly differently to the Xrm.Client ShowDialog() method – it gives you much more flexibility over how the dialog behaves and allows embedding inside your WPF application rather than always showing a popup dialog.

Connection Strings

Like the dialog, the Xrm.Tooling also has a new version of the connection string management – the new CrmServiceClient accepts a connection string in the constructor. You can see examples of these connection strings in the SDK.

CrmServiceClient crmSvc = new CrmServiceClient(ConfigurationManager.ConnectionStrings["Xrm"].ConnectionString);

For Dynamics 365 online, the connection string would be:

<connectionStrings>
    <add name="Xrm" connectionString="AuthType=Office365;Username=jsmith@contoso.onmicrosoft.com; Password=passcode;Url=https://contoso.crm.dynamics.com" />
</connectionStrings>

Thread Safety

The key to understanding the performance and thread safety of calling the Organization Service is the difference between the client proxy and the WCF channel. As described in the 'Improve service channel allocation performance' topic from the best practices entry in the SDK, the channel should be reused because creating it involves a time-consuming metadata download and user authentication.

The old Microsoft.Xrm.Client was thread safe and would automatically reuse the WCF channel that was already authenticated. The Xrm.Tooling CrmServiceClient is no exception. You can create a new instance of CrmServiceClient, and existing service channels will be reused if one is available on that thread. Any calls to the same service channel will be locked to prevent threading issues.

To demonstrate this, I first used the following code, which ensures that a single CrmServiceClient is created per thread:

Parallel.For(1, numberOfRequests,
    new ParallelOptions() { MaxDegreeOfParallelism = maxDop },
    () =>
    {
        // This is run for each thread
        var client = new CrmServiceClient(username,
               CrmServiceClient.MakeSecureString(password),
               "EMEA",
               orgname,
               useUniqueInstance: false,
               useSsl: false,
               isOffice365: true);
        
        return client;
    },
    (index, loopState, client) =>
    {
        // Make a large request that takes a bit of time
        QueryExpression accounts = new QueryExpression("account")
        {
            ColumnSet = new ColumnSet(true)
        };
        client.RetrieveMultiple(accounts);
        return client;
    },
    (client) =>
    {
    });

With a Degree of Parallelism of 4 (the number of threads that can be executing in parallel) and a request count of 200, a single CrmServiceClient is created for each thread, and the Fiddler trace shows the authentication happening only once per thread.

Now to prove that the CrmServiceClient handles thread concurrency automatically, I moved the instantiation into the loop so that every request would create a new client:

Parallel.For(1, numberOfRequests,
    new ParallelOptions() { MaxDegreeOfParallelism = maxDop },
    (index) =>
    {
        // This is run for every request
        var client = new CrmServiceClient(username,
               CrmServiceClient.MakeSecureString(password),
               "EMEA",
               orgname,
               useUniqueInstance: false,
               useSsl: false,
               isOffice365: true);
        // Make a large request that takes a bit of time
        QueryExpression accounts = new QueryExpression("account")
        {
            ColumnSet = new ColumnSet(true)
        };
        client.RetrieveMultiple(accounts);
    });

Running this still shows a very similar trace in Fiddler – the authentication is not repeated for every request.

This proves that the CrmServiceClient is caching the service channel and returning a pre-authenticated version per thread.

In contrast to this, if we set the useUniqueInstance property to true on the CrmServiceClient constructor, we get a very different trace in Fiddler.

So now each request is re-running the channel authentication for each query – far from optimal!

The nice thing about the Xrm.Tooling library is that it is used exclusively throughout the SDK – whereas the old Xrm.Client was a satellite library that came from the legacy ADX portal libraries.

Thanks to my friend and fellow MVP Guido Preite for nudging me to write this post!

@ScottDurow