Wednesday, April 4, 2018

Dynamics 365 S2S OAuth authentication with certificates

After participating in a recent thread in the Microsoft Dynamics 365 community on Facebook, I decided to write up a blog post on how to do S2S (server-to-server) OAuth 2.0 authentication with Dynamics 365 using certificates. To follow along you need the following:


  • AzureRM PowerShell modules (specifically AzureRM.Resources)
  • Azure Active Directory administrative access
  • Optional: Download/clone my repo

Creating a self-signed certificate

Disclaimer: Most of the scaffolding of the certificate code is copied from other blog posts. There are some tweaks that are my own.
I’ve created/modified a PowerShell script which creates a new self-signed certificate with a 1-year validity starting from the date you run the script. You could increase this if you want, but I recommend using a certificate rollover strategy instead of relying on certificates with an infinite period of validity. When you’ve got a proper certificate strategy it doesn’t really matter if it’s 1 year, 2 years, 3 months, or whatever; it’s something that should be automated and easy to maintain. The steps of the script are as follows:
  1. Specify an FQDN, and use something descriptive (“MSDYN365 cert” is not really descriptive)
  2. Enter a password used to encrypt exported certificate PFX
  3. Enter path to store exported PFX
  4. Enter desired AAD App name
  5. Enter desired AAD App homepage Uri (does not have to be a valid address)
  6. Enter desired AAD App identifier Uri (does not have to be a valid address)
  7. Log in to AAD with administrative credentials (need to have permissions to create an AAD app)
  8. Verify that login is successful.
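The certificate part of the steps above boils down to something like this (a sketch only; the FQDN, path, and password are example values, and the full script in the repo also creates the AAD app):

```powershell
# Sketch of the certificate creation steps; values below are examples
$fqdn = "dyn365-s2s.example.com"
$pfxPath = "C:\certs\dyn365-s2s.pfx"
$password = Read-Host -Prompt "PFX password" -AsSecureString

# Self-signed certificate with 1 year validity, stored in the local machine store
$cert = New-SelfSignedCertificate -DnsName $fqdn `
    -CertStoreLocation "Cert:\LocalMachine\My" `
    -KeyExportPolicy Exportable `
    -NotAfter (Get-Date).AddYears(1)

# Export to PFX so it can be installed elsewhere (or pushed to Key Vault)
Export-PfxCertificate -Cert $cert -FilePath $pfxPath -Password $password
```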

At this point the following things have happened:

  • There is a new certificate stored in the personal store ([Win] + [R], type MMC, [CTRL] + [M], select Certificates and add, choose Computer Account, choose Local Computer, then expand Certificates => Personal => Certificates; here you will find the newly created self-signed certificate)
  • The certificate has been exported to the folder you entered during the script execution
Now, as an addition, I’m adding the certificate to my personal user store just for this test. Right-click the PFX, choose Install, and place it in the personal store of your user account.

Adding an application user in Dynamics 365

Now we go into Dynamics 365, then go to Settings and Security, and finally open Users.
Change the default view to Application Users, then select New from the ribbon.
Add a username and the application ID of the newly created AAD application (you can find this through the Azure Portal as well). Additionally, add a name and email address. More information about creating application users can be found in the Microsoft Docs. When you save the user, the rest of the information will be filled out, which lets you know that it found the application and managed to load the application details from Azure AD.
Finally, give it a security role so it has permissions to do stuff.

Log in to Dynamics 365 using the newly created certificate

The complete code for this is available in my public GitHub repo, found here.
First of all, you need to collect the application ID and the reply URL from the Azure AD application registered earlier. Additionally, you have to get the organization URL for your Dynamics 365 organization, and you need the certificate thumbprint from the certificate generated in the first step.

Don’t know how to find the thumbprint?

Open up MMC ([Win]+[R], type MMC and hit OK). Now add the certificates snap-in ([CTRL]+[M], add Certificates, choose “My user account”, hit Finish and OK). Expand personal certificates and find the name of the self-signed certificate. Open the certificate information, go to the Details tab and scroll down to the bottom, where you find the thumbprint. It should look something like this


Create a new .NET Framework console project in Visual Studio (or just copy/clone my repo), then do the following:
  • Add a NuGet package; search for Microsoft.CrmSdk.XrmTooling.CoreAssembly
  • Open app.config and add the following keys (inside an <appSettings> element within the <configuration></configuration> section)

      <add key="CertificateThumbPrint" value="certificate thumbprint here" />
      <add key="ClientId" value="application id from AAD app"/>
      <add key="RedirectUri" value="redirecturl from AAD app"/>
      <add key="DynamicsUrl" value="https://<organizationname>"/>
  • Add the values that you collected at the beginning of this section into the app config you just created
  • Add a reference to System.Configuration in your project
Awesome! You’re now one step closer to Certificateville, which is either a lot safer or a lot less safe than you think!
Now, if you haven’t cloned or copied my code already, copy the contents of this CS-file into your Program.cs
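In outline, the program reads the config keys from above and locates the certificate in the store before handing it over to CrmServiceClient (a sketch using only framework types; the actual CrmServiceClient call and the exact overload are in the linked repo):

```csharp
using System;
using System.Configuration;
using System.Security.Cryptography.X509Certificates;

class Program
{
    static void Main()
    {
        // Values from the app.config created earlier in this section
        var thumbprint = ConfigurationManager.AppSettings["CertificateThumbPrint"];
        var clientId = ConfigurationManager.AppSettings["ClientId"];
        var dynamicsUrl = ConfigurationManager.AppSettings["DynamicsUrl"];

        // Locate the certificate in the current user's personal store by thumbprint
        var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        store.Open(OpenFlags.ReadOnly);
        var matches = store.Certificates.Find(
            X509FindType.FindByThumbprint, thumbprint, validOnly: false);
        store.Close();

        if (matches.Count == 0)
            throw new InvalidOperationException("Certificate not found in store");

        var certificate = matches[0];
        // From here, pass certificate, clientId and dynamicsUrl to the certificate
        // overload of CrmServiceClient (Microsoft.CrmSdk.XrmTooling.CoreAssembly)
        Console.WriteLine($"Found certificate: {certificate.Subject}");
    }
}
```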
Add a breakpoint at the end of the code, then hit F5 and watch all good things come to fruition:

So what if I have a certificate file?

The overloads that accept certificates are somewhat lacking in documentation (as of this writing). If you load a certificate from disk or similar, pass the certificate object as input. If you load it from the store, pass the store name and the thumbprint as input. If you pass a “physical” certificate object, then the storeName and thumbprint parameters can be any value (storeName is an enum, so it has to be something other than null, but thumbprint can be null). See the code for a concrete example.
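For example, loading a “physical” certificate from a PFX on disk looks something like this (the path and password are placeholders):

```csharp
using System.Security.Cryptography.X509Certificates;

// Load the certificate from disk; the storeName/thumbprint arguments passed to
// CrmServiceClient can then be dummy values, as described above
var certificate = new X509Certificate2(@"C:\certs\dyn365-s2s.pfx", "pfx-password-here");
```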

The wrap-up

Certificate authentication works like a charm with Dynamics 365 Online. If you combine this with certificate storage in Azure Key Vault, then you can securely authenticate and integrate with Dynamics 365 without having to worry about app user credentials and password expiration (you still have to worry about certificates, though, which isn’t really trivial).
We might see support for managed service accounts in the future, but for now this is a decent way to avoid the whole password management scheme for application users.

Friday, February 2, 2018

Use reference tables for previews in Power BI

In my quest to provide customers with awesome reports I’ve been looking at why the performance in Power BI can be so slow at times. As I pointed out in earlier posts, OData.Feed can be quite the resource hog depending on the endpoint, so I’ve started to fall back on Json.Document to get much better preview times (but not without its drawbacks, like record count, connected tables and option set labels).

Because I’m kinda new to this, I like to perform a lot of the logic using M in the query editor, then I sew it all together with relationships and measures inside the reports I build. I noticed that the previews kept getting slower the more logic I added, so I got out the ultimate magic tool and looked at what was happening.

First step: Json.Document(Web.Contents(“https://<org>”))


OK, so it makes a request to Dynamics and retrieves the data in 3.435 seconds. That’s not fantastic, but it’s not bad.

Second step: = Source[value]


OK, no requests made. That’s what I would expect from it

Third step: = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error)


Still no requests made, that’s great

Fourth step: = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"actualclosedate", "actualvalue", "budgetamount", "estimatedclosedate", "estimatedvalue", "name", "opportunityid", "pzl_countryregion", "statuscode"}, {"Column1.actualclosedate", "Column1.actualvalue", "Column1.budgetamount", "Column1.estimatedclosedate", "Column1.estimatedvalue", "Column1.name", "Column1.opportunityid", "Column1.pzl_countryregion", "Column1.statuscode"})

Guess what? Still no requests.

Fifth step: = Table.TransformColumnTypes(#"Expanded Column1",{{"Column1.estimatedclosedate", type date}})

No requests. Great, so we only ever make that one request in total, as long as we don’t merge in more columns? Not quite.
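For reference, the five steps above collapse into a single query along these lines (the URL is elided here just as it is in the screenshots):

```m
let
    Source = Json.Document(Web.Contents("https://<org>")),
    value = Source[value],
    #"Converted to Table" = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1",
        {"actualclosedate", "actualvalue", "budgetamount", "estimatedclosedate", "estimatedvalue", "name", "opportunityid", "pzl_countryregion", "statuscode"},
        {"Column1.actualclosedate", "Column1.actualvalue", "Column1.budgetamount", "Column1.estimatedclosedate", "Column1.estimatedvalue", "Column1.name", "Column1.opportunityid", "Column1.pzl_countryregion", "Column1.statuscode"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1", {{"Column1.estimatedclosedate", type date}})
in
    #"Changed Type"
```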

Refresh preview


Wait, what? 5 requests, that’s 1 request per step. That makes the loading time increase from the original 3+ seconds to well over 15 seconds. I don’t have the answer for why it makes a new request for each and every step, but let me show you the consequence of this when you use OData.Feed instead of Json.Document.

These are the steps taken:

  1. = OData.Feed("https://<org>")
  2. = Table.SelectColumns(Source,{"actualclosedate", "actualvalue", "budgetamount", "createdon", "estimatedclosedate", "estimatedvalue", "name", "new_calculatedrevenue", "new_product", "opportunityid", "pzl_countryregion"})
  3. = Table.TransformColumnTypes(#"Removed Other Columns",{{"createdon", type datetime}})


Now that should be enough to make you think twice before reporting on Dynamics data in Power BI. As you can see, it’s the metadata documents which take the longest, but for some reason there are 4 requests to the “all opportunities” endpoint and the metadata endpoint, while there are 3 requests to the opportunity endpoint with a reduced number of columns (fields). Also, there is one top=1000 request which completes before the last two. This tells me that when you refresh the data, the requests are made in sequence for each step, but the steps are performed in parallel. It almost smells like premature optimization.

So let’s break this down and do it step by step.

Step 1


OK, great, one request for the opportunities and one for the metadata, finally one to get the top 1000 records for the preview.

Step 2


Now we see that it makes the same three requests as it did in the previous step, but it also performs another set of requests against “all opportunities” and metadata before it performs a request against opportunities with reduced number of columns (fields).

Step 3


Finally, it does all the requests from previous steps, but it also adds another duplicate for this current step (which only converts the type, and doesn’t require collecting more data from the source system).

The magic workaround

Just by pure luck, when I tested this originally I got 4 duplicates of the “all opportunities”+select requests, and I couldn’t figure out where the 4th duplicate came from. Turns out it was because I had made a reference from this opportunity query into a new query, and the data is refreshed in all referencing queries as well as the parent(s).

That made me test one final thing. I created a new reference query after I had done all my logic in the main query.


Then I refreshed the data while in the reference query.

Step 1: = Query1


Voilà! Only one set of requests, even though it’s technically the fourth step.

This means I just saved myself many duplicate requests, and the data loads much faster for each subsequent step. This was only for 3 steps; imagine how much this will matter if you have 10 or 20 steps in your queries.

The wrap up

So in summary, when you refresh the preview for a query in Power BI, all the requests are repeated and added on for each step you have in that query. The number of requests grows rapidly (each step repeats all the requests of the steps before it), and can make everything painfully slow.

Creating new queries with references removes the need to duplicate all the requests, and you can keep working with and massaging the data without having to worry about minutes of data reloading.

Wednesday, January 31, 2018

How to retrieve OData v4 fast in Power BI

I recently blogged about how Power BI has a bug which makes it ignore the select statement when querying an OData v4 endpoint. Even though the extra columns don’t include any data, it still bugged me so much that I wanted to find a way around it.

Turns out that’s much easier than you would expect.

I recently explored some of the different data loading capabilities in PBI, and it turns out that there’s a Json.Document function which allows you to handle OData at an even lower level.

So what I did was take my original uri and query string: https://<org>$select=name

Then I put it into a Json.Document(Web.Contents())

And voila, this is the result


Navigating into that list, you can then right-click the column and select “To Table”. This will present you with a table of records ready to be expanded:


So, time to do a little speed testing

Using: = OData.Feed("https://<org>$select=name")


Using: = Json.Document(Web.Contents("https://<org>$select=name"))


Enough said?

Power BI Desktop ignores select statements in OData v4

I’ve been working a bit with Power BI and Dynamics the last few weeks, and I’ve encountered an issue which I at first thought was because of the Dynamics REST OData service. Turns out this is an issue with Power BI.
The issue is that when you specify a select statement in the query URL, Power BI ignores it and retrieves ALL the fields/columns instead.
To demonstrate the issue, here is what happens when I use the old OData service from Dynamics CRM
And here is what happens when I use the modern REST service
This is what happens when I try to use the graph api
So as you can see, when we use OData v4 endpoints the select statement is ignored, and the consequence is that all fields/columns are included. However, the extra columns do not include any data, so it is still much faster than retrieving all columns with data.

So what can we do about it?

Nothing, for now, but if you vote for my idea to fix it we might get something soon (hopefully). Unfortunately, the Power BI community forum doesn’t integrate with Microsoft services, so you have to create a new account (if you don’t already have one). I BEG of you to please take the time to vote.

Monday, January 29, 2018

Dynamic option sets in PowerBI

I recently had a requirement from a customer who needed to filter reports based on global option sets. The kicker for this requirement was that the option set would change rapidly for some time (don’t ask), so they needed a dynamic way to retrieve the labels and use them in a report.
Now, there are several solutions out there to retrieve the option set data, and I got some tips from the talented Ulrik “Why is he not an MVP yet” Carlsson. While his solutions work, I don’t want a static transformation in PBI, I don’t want a custom entity to store information already in the system, and I don’t want to use fetchxml as a source. Are those unreasonable demands? Maybe. Is that going to stop me? Never.
So to start with, I checked the global option set definitions for my organization, found from the following URL
Sure enough, I found all the option sets in the system, so I tried to add that URL as a source in Power BI to see if I could get the option set values. After waiting some time for it to load, I could see that there are no lists or subsets to load that give me anything useful.
So I try to load up the direct url for the option set in question, using the metadata id of the option set. The url looks like this:
While this does give me a result, it’s not anything I can use.
So I load the URL into chrome to look at the data. Here I can clearly see all of the option sets, complete with locales and values. That means that the data is available, but I am not getting it in the format I want.

Alternative 1, parse the content as a JSON document

Instead of using an OData.Feed to load the data, I can just take the specific optionset metadata and load it into a Json.Document instead. To do this, create a new blank query, then paste the following value
= Json.Document(Web.Contents("https://<org>"))
This will provide you with “Options” as an attribute which contains a list of records. You can then click on the List to navigate to it.
You now get a list of records, where you can right click the column and select “To Table”. This will convert the list into a table, and from here you can expand the columns. I chose to expand Value and Label, as that will give me the option set int value and the label string. The label is a record itself, so you have to expand it to get more options. The choices here is LocalizedLabels and UserLocalizedLabel. The difference being that the LocalizedLabels contains all the labels for all the localizations defined in your system, while the UserLocalizedLabel contains only the label for the user’s language.


Expanding the UserLocalizedLabel gives us a new record, which we can then expand to Label, LanguageCode, and other additional metadata attributes. I’ve chosen to expand the first two, and we now have a usable table with both int values and label strings in the user’s language.
You can now rename the columns and use them to join with the primary records in your PBI report.
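Put together, the UserLocalizedLabel route looks roughly like this (the URL is elided as earlier, and the step names mirror what the query editor generates, so yours may differ slightly):

```m
let
    Source = Json.Document(Web.Contents("https://<org>")),
    Options = Source[Options],
    #"Converted to Table" = Table.FromList(Options, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1",
        {"Value", "Label"}, {"Column1.Value", "Column1.Label"}),
    #"Expanded Label" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1.Label",
        {"UserLocalizedLabel"}, {"Column1.Label.UserLocalizedLabel"}),
    #"Expanded UserLocalizedLabel" = Table.ExpandRecordColumn(#"Expanded Label", "Column1.Label.UserLocalizedLabel",
        {"Label", "LanguageCode"}, {"Label", "LanguageCode"})
in
    #"Expanded UserLocalizedLabel"
```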


Expanding the LocalizedLabels will give us lists of labels for each value. You can’t expand a list to new columns straight away, so you will have to expand the table with new rows. If you have additional languages, which is probably why you’re doing this, then you will get duplicates of the integer values for all option set values that have several language definitions.
In my example I’ve just duplicated the table because I don’t have multiple languages enabled, so don’t worry about duplicate values, it’s on purpose.
So the next step is to right-click the Column1.Value column and select Group By. Give the new column a sensible name (I named mine Labels), then choose the operation “Max” and select the label column as the column. This is just to get a basis for the grouping.
Now, at this stage, the formula should look somewhat like this:
= Table.Group(#"Appended Query", {"Column1.Value"}, {{"Labels", each List.Max([Column1.Label.LocalizedLabels.Label]), type text}})
You need to change it so it looks like this instead:
= Table.Group(#"Appended Query", {"Column1.Value"}, {{"Labels", each Text.Combine([Column1.Label.LocalizedLabels.Label], ","), type text}})
The result will look something like this:
Now, right-click the Labels column and select “Split Columns” –> “By delimiter”. Choose to split at each occurrence of the delimiter. The resulting table now looks like this:
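The generated split step will look something like this (the delimiter and the number of new column names depend on your data, so treat these as example values):

```m
= Table.SplitColumn(#"Grouped Rows", "Labels",
    Splitter.SplitTextByDelimiter(",", QuoteStyle.Csv), {"Labels.1", "Labels.2"})
```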
You can now give the columns sensible names and use them in your PBI report.

Alternative 2, add query options to get the list of labels

The other way to retrieve the labels is to make the OData feed include additional columns. As we saw earlier, OData.Feed(… just gave us metadata about the entity, but none of the options. This is because the options, which include data, are excluded by default. To include them in the query, simply change the source definition from this:
= OData.Feed("https://<org>")
To this:
= OData.Feed("https://<org>", null, [MoreColumns = true])
The “null” is for additional headers, while the second argument is a PBI record which states that MoreColumns should be true (default false). All the available options are specified in the following article.
The result of this is the following screen.
We now have an attribute named “More Columns”. Clicking on that will navigate into the record, which is a list of records. Right click the column and choose “To Table”. You will now have a table where you can expand the columns just like specified in alternative 1.

Wrap up

Getting option set labels isn’t obvious when you’re creating reports in Power BI, thanks to how attributes and attribute metadata are defined in Dynamics 365. Luckily, there are ways to retrieve additional data in a dynamic way, which allows us to utilize the system in awesome ways.
Until next time!

Tuesday, January 9, 2018

Dynamics 365 App for Outlook stops working after URL change

I encountered an error where the Dynamics 365 App for Outlook stopped working for a customer after we changed the URL for his instance.

As you might already know, as a D365 administrator you can go to the Dynamics Admin Portal to change the organization settings, and one of those settings is the organization URL. When we did this for a customer who was going live with a Dynamics 365 v9 organization the App for Outlook stopped working.

To fix this simply go to Settings –> Dynamics 365 App for Outlook


Even though the app isn’t working, the status will say “Added to Outlook”. To fix this, simply select the users with the app, then click the “ADD APP TO OUTLOOK” (or “ADD APP FOR ALL ELIGIBLE USERS”) button to re-publish the app to the users. After a few minutes the app was, again, available and working for all users.


Wednesday, January 3, 2018

Using flow to automate document locations in Dynamics 365

I recently had the following requirement to solve in Dynamics 365

When a customer (account only) is created a new Teams channel should be created.

All documents regarding the customer should be put into this channel folder in a specific group.

This should happen automatically, and be easy to maintain and support.

I decided to solve this in Microsoft Flow, as this seems like the type of problem it is meant to solve, and the following are the brick walls I ran headfirst into trying to get it working. Before starting, make sure the following prerequisites are in place:


  1. Server side document integration enabled
  2. Group site added to SharePoint sites
  3. Documents (Shared Documents) added as SharePoint Document Location
  4. Guid of the document location record in Dynamics

Creating a flow which triggers on account creation


Starting out I create a new flow from blank, and set the trigger to “When a record is created” in Dynamics 365. If this is the first time you create a flow then you might get a JSON error message in the organization drop down. To work around this simply go to “Connectors” from the toolbar, and find the Dynamics 365 connector. Click on the “When a record is created” trigger, and a flow editor will start with a working Organization drop-down.

Select the organization you want to connect to, and then choose Accounts from the Entity drop down. If this is the first time you use the Dynamics connector it can take a minute or two before the drop down is populated.

Creating a channel for Microsoft Teams

My first thought was to create a condition to check if there already is a channel with the same name. Unfortunately, the only actions for Microsoft Teams as of today are “Create a channel”, “List channels”, “List teams” and “Post message”.


While List channels might sound like a good way to check whether a channel exists, you’re getting all the channels from a specific team without filters. This also means that looping through the list will generate a lot of unnecessary actions in Flow (which again might start to cost money if you’re not careful). An alternative, as Mikael “I dare you to ask me about search” Svenson pointed out, is to write your own formula to check the content for an existing channel. The result looks something like this


This works fine, but the formula used is barely human-readable, and not something I feel comfortable leaving in a customer’s environment to be maintained by someone else. I decided to try and strong-arm the Flow into doing what I wanted instead.

I put in the “Create a channel” action, where I pick the team ID and set the display name to the Account Name from the MSDYN365 trigger. This means that every time an account is created, a corresponding channel is created. Next I hit Save flow and Done.


Testing channel creation

The first thing I did was create a new channel in Teams. Then I replaced the account name in the “Create a channel” action with the same name. I did this to see what happens when the flow tries to create an existing channel. Creating a new account which triggered the flow produced the following error:


So when we try to create an existing channel we get an error 400 telling us that the channel already exists. That’s great, that means we have a good description that we can use in a condition to handle existing channels.

We’ll create a new Condition from that step, change to advanced mode and paste in the following formula (true if the body contains NameAlreadyExists, or does not contain “error”:):

@or(contains(body('Create_a_channel'), 'NameAlreadyExists'), not(contains(body('Create_a_channel'), '"error":')))

Then click on the ellipsis (…), click the “Configure run after” option, and make sure both “is successful” and “has failed” are checked. It should look something like this


In the failed branch you can add actions to notify when an unexpected error occurs. For this example I’ve chosen to notify myself whenever the condition is false, where the formula for the email body is:

Next I add a Terminate action to prevent the flow from processing more steps. The result looks like this


Creating a new document location in Dynamics

After we’ve handled errors on creation of a new Teams channel, we have to create a new document location in Dynamics. Add the “Create a new record” for Dynamics 365 action. Choose the organization you’re working with, and choose Document Locations as the entity name.

Click on the Show advanced options expansion link to enable filling out everything we need.

  • Organization Name –> Select the organization you’re working with
  • Entity Name –> Document Locations
  • Parent Site or Location –> The Guid of the Documents (Shared Documents) record in Dynamics (prerequisites in the beginning of this post)
  • Parent Site or Location type –> Select sharepointdocumentlocations
  • Regarding –> Expand the “When a record is created” selection from the dynamic content, and select Account (Unique identifier of the account)
  • Regarding Type –> Select accounts

The end result should look like the following:


Creating a SharePoint folder

Testing the flow it looks like everything runs fine, and checking in Dynamics we can see that the document location has been created successfully. However, if you try to browse to the document location on the account created you get an error message saying that the folder does not exist in SharePoint. After poking around a bit I found out that creating a Teams channel from Flow (which in turn uses the Graph API) does not create a corresponding folder immediately. Instead, it is picked up by some timer job in the backend. I feel like that is kind of risky business, so I decided to create the folder manually to speed up the process, and prevent unnecessary error messages in the user interface.

Unfortunately, there is no way to just create a new folder in a document library using Microsoft Flow. This left me quite frustrated until I found out that creating a new file will create subfolders automatically if they don’t already exist. We’ll use that feature to create the folder for us, by creating a new dummy file.

Start by adding a new action above the “Create a new record” action, and choose the “SharePoint – Create File” action. Choose the site for the Group you want to work with, and in the folder path type /Shared Documents/, then append Account Name from the dynamic variables. Name the file deleteme.txt, and type some useful content should it fail to be deleted.


Now, create a new action beneath this one and select “SharePoint – Delete File”. Select the site where the file was created, and for File Identifier select Path from the “Create file” action.


The entire flow should now look like this


Taking this beauty out for a spin

To test out our new flow I created a new channel in Teams first, then created a new account with the same name as the channel. After a short while the flow triggers, and I can see that it fails on the create channel step, but it hits the “if yes” condition because it’s an existing channel, and the result says success.


I create another new account with a name that doesn’t exist as a channel already, and this time all the blocks are marked with green check marks



So there you have it. A flow which automatically creates a new channel in Teams and adds a document location to the Dynamics 365 account so you can share and collaborate on documents.