Friday, February 2, 2018

Use reference tables for previews in Power BI

In my quest to provide customers with awesome reports I’ve been looking at why the performance in Power BI can be so slow at times. As I pointed out in earlier posts, OData.Feed can be quite the resource hog depending on the endpoint, so I’ve started to fall back on Json.Document to get much better preview times (but not without its drawbacks, like record count, connected tables and option set labels).

Because I’m kinda new to this I like to perform a lot of the logic using M in the query editor, then I sew it all together with relationships and measures inside the reports I build. I noticed that the previews kept getting slower the more logic I added, so I got out the ultimate magic tool (a web debugging proxy) and looked at what was happening.

First step: = Json.Document(Web.Contents("https://<org>.api.crm4.dynamics.com/api/data/v8.2/opportunities"))

image

OK, so it makes a request to Dynamics and retrieves the data in 3.435 seconds. That’s not fantastic, but it’s not bad.

Second step: = Source[value]

image

OK, no requests made. That’s what I would expect from it.

Third step: = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error)

image

Still no requests made. That’s great.

Fourth step: = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"actualclosedate", "actualvalue", "budgetamount", "estimatedclosedate", "estimatedvalue", "name", "opportunityid", "pzl_countryregion", "statuscode"}, {"Column1.actualclosedate", "Column1.actualvalue", "Column1.budgetamount", "Column1.estimatedclosedate", "Column1.estimatedvalue", "Column1.name", "Column1.opportunityid", "Column1.pzl_countryregion", "Column1.statuscode"})

Guess what? Still no requests.

Fifth step: = Table.TransformColumnTypes(#"Expanded Column1",{{"Column1.estimatedclosedate", type date}})

No requests. Great, so we only make 1 request per step, as long as we don’t merge in more columns? Not quite.
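For reference, here is the whole query the five steps above build up — nothing new, just the steps from above assembled into one let expression (the `<org>` placeholder stands in for your organization):

```m
let
    Source = Json.Document(Web.Contents("https://<org>.api.crm4.dynamics.com/api/data/v8.2/opportunities")),
    value = Source[value],
    // Each record in the "value" list becomes one row
    #"Converted to Table" = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1",
        {"actualclosedate", "actualvalue", "budgetamount", "estimatedclosedate", "estimatedvalue", "name", "opportunityid", "pzl_countryregion", "statuscode"},
        {"Column1.actualclosedate", "Column1.actualvalue", "Column1.budgetamount", "Column1.estimatedclosedate", "Column1.estimatedvalue", "Column1.name", "Column1.opportunityid", "Column1.pzl_countryregion", "Column1.statuscode"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1", {{"Column1.estimatedclosedate", type date}})
in
    #"Changed Type"
```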

Refresh preview

image

Wait, what? 5 requests, that’s 1 request per step. That makes the loading time increase from the original 3+ seconds to well over 15 seconds. I don’t have the answer for why it makes a new request for each and every step, but let me show you the consequence of this when you use OData.Feed instead of Json.Document.

These are the steps taken:

  1. = OData.Feed("https://<org>.api.crm4.dynamics.com/api/data/v8.2/opportunities")
  2. = Table.SelectColumns(Source,{"actualclosedate", "actualvalue", "budgetamount", "createdon", "estimatedclosedate", "estimatedvalue", "name", "new_calculatedrevenue", "new_product", "opportunityid", "pzl_countryregion"})
  3. = Table.TransformColumnTypes(#"Removed Other Columns",{{"createdon", type datetime}})
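Assembled into a single query, the three steps look like this (same `<org>` placeholder and step names as above):

```m
let
    Source = OData.Feed("https://<org>.api.crm4.dynamics.com/api/data/v8.2/opportunities"),
    // Keep only the columns needed for the report
    #"Removed Other Columns" = Table.SelectColumns(Source,
        {"actualclosedate", "actualvalue", "budgetamount", "createdon", "estimatedclosedate", "estimatedvalue", "name", "new_calculatedrevenue", "new_product", "opportunityid", "pzl_countryregion"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Removed Other Columns", {{"createdon", type datetime}})
in
    #"Changed Type"
```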

image

Now that should be enough to make you think twice before reporting on Dynamics data in Power BI. As you can see it’s the metadata documents which take the longest, but for some reason there are 4 requests to the “all opportunities” endpoint and the metadata endpoint, while there are 3 requests to the opportunity endpoint with a reduced number of columns (fields). Also, there is one $top=1000 request which completes before the last two. This tells me that when you refresh the data, the requests are made in sequence for each step, but the steps are performed in parallel. It almost smells like premature optimization.

So let’s break this down and do it step by step.

Step 1

image

OK, great, one request for the opportunities and one for the metadata, and finally one to get the top 1000 records for the preview.

Step 2

image

Now we see that it makes the same three requests as it did in the previous step, but it also performs another set of requests against “all opportunities” and metadata before it performs a request against opportunities with a reduced number of columns (fields).

Step 3

image

Finally, it does all the requests from previous steps, but it also adds another duplicate for this current step (which only converts the type, and doesn’t require collecting more data from the source system).

The magic workaround

Just by pure luck, when I tested this originally I got 4 duplicates of the “all opportunities”+select requests, and I couldn’t figure out where the 4th duplicate came from. Turns out it was because I had made a reference from this opportunity query into a new query, and the data is refreshed in all referencing queries as well as the parent(s).

That made me test one final thing. I created a new reference query after I had done all my logic in the main query.

image

Then I refreshed the data while in the reference query.

Step 1: = Query1

image

Voilà! Only one set of requests, even though it’s technically the fourth step.

This means I just saved myself many duplicate requests, and the data loads much faster for each subsequent step. This was only for 3 steps; imagine how much this will matter if you have 10 or 20 steps in your queries.
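In M terms the workaround is just a second query whose source is the first. The names below (Query1, Query2) are whatever the query editor generated for you:

```m
// Query2 – created via right click > Reference on Query1.
// All the shaping logic stays in Query1; adding further steps here
// (and refreshing the preview) only fires one set of requests.
let
    Source = Query1
in
    Source
```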

The wrap up

So in summary, when you refresh the preview for a query in Power BI, all the requests are repeated and added on for each step you have in that query. The amount of data gathered grows quickly with the number of steps, and can make everything slow as nothing else.

Creating new queries with references removes the need to duplicate all the requests, and you can keep working with and massaging the data without having to worry about minutes of data reloading.

Wednesday, January 31, 2018

How to retrieve OData v4 fast in Power BI

I recently blogged about how Power BI has a bug which ignores the select statements when querying an OData v4 endpoint. Even though the columns wouldn’t include any data, it still bugged me so much that I wanted to find a way around it.

Turns out that’s much easier than you would expect.

I recently explored some of the different data loading capabilities in PBI, and it turns out that there’s a Json.Document function which allows you to handle OData at an even lower level.

So what I did was take my original uri and query string: https://<org>.api.crm4.dynamics.com/api/data/v8.2/opportunities?$select=name

Then I put it into a Json.Document(Web.Contents())

And voila, this is the result

image

Navigating into that list, you can then right click the column and select “To Table”. This will present you with a table of records ready to be exploded:
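Put together, the navigation described above comes out as a query along these lines (step names are the ones the query editor generates):

```m
let
    Source = Json.Document(Web.Contents("https://<org>.api.crm4.dynamics.com/api/data/v8.2/opportunities?$select=name")),
    // The records live in the "value" list of the OData response
    value = Source[value],
    #"Converted to Table" = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"name"}, {"name"})
in
    #"Expanded Column1"
```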

image


So, time to do a little speed testing

Using: = OData.Feed("https://<org>.api.crm4.dynamics.com/api/data/v8.2/opportunities?$select=name")

Screenshot_20180131-160714

Using: = Json.Document(Web.Contents("https://<org>.api.crm4.dynamics.com/api/data/v8.2/opportunities?$select=name"))

Screenshot_20180131-160738


Enough said?

Power BI Desktop ignores select statements in OData v4

I’ve been working a bit with Power BI and Dynamics the last few weeks, and I’ve encountered an issue which I at first thought was because of the Dynamics REST OData service. Turns out this is an issue with Power BI.
The issue is, when you specify a select statement in the query URL, Power BI ignores it and retrieves ALL the fields/columns instead.
To demonstrate the issue, here is what happens when I use the old OData service from Dynamics CRM
image
And here is what happens when I use the modern REST service
image
This is what happens when I try to use the graph api
image
So as you can see, when we use OData v4 endpoints the select statement is ignored, and the consequence is that all fields/columns are included. However, the columns do not include any data, so it is much faster than retrieving all columns with data.

So what can we do about it?

Nothing, for now, but if you vote for my idea to fix it we might get something soon (hopefully). Unfortunately, the Power BI community forum doesn’t integrate with Microsoft services, so you have to create a new account (if you don’t already have one). I BEG of you to please take the time to vote:
http://community.powerbi.com/t5/Issues/Select-query-in-odata-v4-is-ignored/idi-p/348974

Monday, January 29, 2018

Dynamic option sets in PowerBI

I recently had a requirement from a customer who needed to filter reports based on global option sets. The kicker for this requirement was that the option set would change rapidly for some time (don’t ask), so they needed a dynamic way to retrieve the labels and use them in a report.
Now, there are several solutions out there to retrieve the option set data, and I got some tips from the talented Ulrik “Why is he not an MVP yet” Carlsson. While his solutions work, I don’t want a static transformation in PBI, I don’t want a custom entity to store information already in the system, and I don’t want to use fetchxml as a source. Are those unreasonable demands? Maybe. Is that going to stop me? Never.
So to start with, I checked the global option set definitions for my organization, found from the following URL
https://<org>.crm4.dynamics.com/api/data/v8.2/GlobalOptionSetDefinitions
Sure enough I found all the option sets in the system, so I tried to add that URL as a source in Power BI to see if I could get the option set values. After waiting some time for it to load I can see that there are no lists or subsets to load that give me anything useful.
image
So I try to load up the direct url for the option set in question, using the metadata id of the option set. The url looks like this:
https://<org>.crm4.dynamics.com/api/data/v8.2/GlobalOptionSetDefinitions(bd3519cf-3d67-43f2-8a94-d6849872f615)
While this does give me a result, it’s not anything I can use.
image
So I load the URL into chrome to look at the data. Here I can clearly see all of the option sets, complete with locales and values. That means that the data is available, but I am not getting it in the format I want.

Alternative 1, parse the content as a JSON document

Instead of using an OData.Feed to load the data, I can just take the specific optionset metadata and load it into a Json.Document instead. To do this, create a new blank query, then paste the following value
= Json.Document(Web.Contents("https://<org>.crm4.dynamics.com/api/data/v8.2/GlobalOptionSetDefinitions(bd3519cf-3d67-43f2-8a94-d6849872f615)"))
This will provide you with “Options” as an attribute which contains a list of records. You can then click on the List to navigate to it.
image
You now get a list of records, where you can right click the column and select “To Table”. This will convert the list into a table, and from here you can expand the columns. I chose to expand Value and Label, as that will give me the option set int value and the label string. The label is a record itself, so you have to expand it to get more options. The choices here are LocalizedLabels and UserLocalizedLabel. The difference is that LocalizedLabels contains all the labels for all the localizations defined in your system, while UserLocalizedLabel contains only the label for the user’s language.
image

UserLocalizedLabel

Expanding the UserLocalizedLabel gives us a new record, which we can then expand to Label, LanguageCode, and other additional metadata attributes. I’ve chosen to expand the first two, and we now have a usable table with both int values as well as label strings in the user’s language.
image
You can now rename the columns and use them to join with the primary records in your PBI report.
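For reference, a sketch of the full query for this alternative. The expanded column names are assumptions based on the expansions described above; your generated step names may differ:

```m
let
    Source = Json.Document(Web.Contents("https://<org>.crm4.dynamics.com/api/data/v8.2/GlobalOptionSetDefinitions(bd3519cf-3d67-43f2-8a94-d6849872f615)")),
    Options = Source[Options],
    #"Converted to Table" = Table.FromList(Options, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    // Int value plus the label record
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"Value", "Label"}, {"Value", "Label"}),
    #"Expanded Label" = Table.ExpandRecordColumn(#"Expanded Column1", "Label", {"UserLocalizedLabel"}, {"UserLocalizedLabel"}),
    // Label string and language code in the user's language
    #"Expanded UserLocalizedLabel" = Table.ExpandRecordColumn(#"Expanded Label", "UserLocalizedLabel", {"Label", "LanguageCode"}, {"Label", "LanguageCode"})
in
    #"Expanded UserLocalizedLabel"
```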

LocalizedLabels

Expanding the LocalizedLabels will give us lists of labels for each value. You can’t expand a list to new columns straight away, so you will have to expand the table with new rows. If you have additional languages, which is probably why you’re doing this, then you will get duplicate integer values for all option set values that have several language definitions.
In my example I’ve just duplicated the table because I don’t have multiple languages enabled, so don’t worry about duplicate values, it’s on purpose.
image
So the next step is to right click the Column1.Value column, and select group by. Give the new column a sensible name (I named mine labels), then choose operation “Max” and select the label column as the column. This is just to get a basis for the grouping.
Now, in the next stage, the formula should look somewhat like this:
= Table.Group(#"Appended Query", {"Column1.Value"}, {{"Labels", each List.Max([Column1.Label.LocalizedLabels.Label]), type text}})
You need to change it so it looks like this instead:
= Table.Group(#"Appended Query", {"Column1.Value"}, {{"Labels", each Text.Combine([Column1.Label.LocalizedLabels.Label], ","), type text}})
The result will look something like this:
image
Now, right click the Labels column and select “Split Columns” –> “By delimiter”. Choose to split at each occurrence of the delimiter. The resulting table now looks like this:
image
You can now give the columns sensible names and use them in your PBI report.
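Behind the UI click, the split step produces a formula along these lines (the step name #"Grouped Rows" and the number of Labels.N columns are assumptions; the query editor picks them based on your data):

```m
= Table.SplitColumn(#"Grouped Rows", "Labels",
    Splitter.SplitTextByDelimiter(",", QuoteStyle.Csv),
    {"Labels.1", "Labels.2"})  // one column per language found
```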

Alternative 2, add query options to get the list of labels

The other solution to retrieve the odata feed is to add additional columns. As we saw earlier the OData.Feed(… just gave us metadata about the entity, but none of the options. This is because the options, which includes data, is excluded by default. To include them in the query simply change the source definition from this:
= OData.Feed("https://<org>.crm4.dynamics.com/api/data/v8.2/GlobalOptionSetDefinitions(4565bea1-7c02-e811-8113-5065f38bb391)")
To this:
= OData.Feed("https://<org>.crm4.dynamics.com/api/data/v8.2/GlobalOptionSetDefinitions(4565bea1-7c02-e811-8113-5065f38bb391)", null, [MoreColumns = true])
The “null” is for additional headers, while the third argument is a record which states that MoreColumns should be true (default false). All the available options are specified in the following article: https://msdn.microsoft.com/en-us/library/mt260868.aspx
The result of this is the following screen.
image
We now have an attribute named “More Columns”. Clicking on that will navigate into it, which is a list of records. Right click the column and choose “To Table”. You will now have a table where you can expand the columns just like in alternative 1.
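As a rough sketch, the navigation for this alternative comes out something like the following (the “More Columns” field name is what the navigator shows; the exact shape of the OData.Feed result here is an assumption):

```m
let
    Source = OData.Feed("https://<org>.crm4.dynamics.com/api/data/v8.2/GlobalOptionSetDefinitions(4565bea1-7c02-e811-8113-5065f38bb391)", null, [MoreColumns = true]),
    // The option records are tucked into the "More Columns" attribute
    Options = Source[More Columns],
    #"Converted to Table" = Table.FromList(Options, Splitter.SplitByNothing(), null, null, ExtraValues.Error)
in
    #"Converted to Table"
```

From here you expand Value and Label exactly as in alternative 1.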

Wrap up

Getting option set labels isn’t obvious when you’re creating reports in Power BI thanks to how attributes and attribute metadata are defined in Dynamics 365. Luckily there are ways to retrieve additional data dynamically, which allows us to utilize the system in awesome ways.
Until next time!

Tuesday, January 9, 2018

Dynamics 365 App for Outlook stops working after URL change

I encountered an error where the Dynamics 365 App for Outlook stopped working for a customer after we changed the URL for his instance.

As you might already know, as a D365 administrator you can go to the Dynamics Admin Portal to change the organization settings, and one of those settings is the organization URL. When we did this for a customer who was going live with a Dynamics 365 v9 organization the App for Outlook stopped working.

To fix this simply go to Settings –> Dynamics 365 App for Outlook

image

Even though the app isn’t working, the status will say “Added to Outlook”. To fix this simply select the users with the app, then click the “ADD APP TO OUTLOOK” (or “ADD APP FOR ALL ELIGIBLE USERS”) button to re-publish the app to the users. After a few minutes the app was, again, available and working for all users.

image

Wednesday, January 3, 2018

Using flow to automate document locations in Dynamics 365

I recently had the following requirement to solve in Dynamics 365

When a customer (account only) is created a new Teams channel should be created.

All documents regarding the customer should be put into this channel folder in a specific group.

This should happen automatically, and be easy to maintain and support.

I decided to solve this in Microsoft Flow, as this seems like the type of problem it is meant to solve, and the following are the brick walls I ran headfirst into trying to get it working.

Prerequisites:

  1. Server side document integration enabled
  2. Group site added to SharePoint sites
  3. Documents (Shared Documents) added as SharePoint Document Location
  4. Guid of the document location record in Dynamics


Creating a flow which triggers on account creation

image

Starting out I create a new flow from blank, and set the trigger to “When a record is created” in Dynamics 365. If this is the first time you create a flow then you might get a JSON error message in the organization drop down. To work around this simply go to “Connectors” from the toolbar, and find the Dynamics 365 connector. Click on the “When a record is created” trigger, and a flow editor will start with a working Organization drop-down.

Select the organization you want to connect to, and then choose Accounts from the Entity drop down. If this is the first time you use the Dynamics connector it can take a minute or two before the drop down is populated.

Creating a channel for Microsoft Teams

My first thought was to create a condition to check if there already is a channel with the same name. Unfortunately, the only actions for Microsoft Teams as of today are “Create a channel”, “List channels”, “List teams” and “Post message”.

image

While List channels might sound like a good way to check whether a channel exists, you get all the channels from a specific team without filters. This also means that looping through the list will generate a lot of unnecessary actions in Flow (which again might start to cost money if you’re not careful). An alternative, as Mikael “I dare you to ask me about search” Svenson pointed out, is to write your own formula to check the content for an existing channel. The result looks something like this:

image

This works fine, but the formula used is barely human readable, and not something I feel comfortable leaving in a customer’s environment to be maintained by someone else. I decided to try and strong-arm the Flow into doing what I wanted instead.

I put in the “Create channel” action, where I pick the team Id and I set the display name to the Account Name from the MSDYN365 trigger. This means that every time an account is created, a corresponding channel is created. Next I hit Save flow and Done.

image

Testing channel creation

First thing I did was create a new channel in Teams. Then I replaced the account name in the “Create a channel” action with the same name. I did this to see what happens when it tries to create an existing channel. Creating a new account which triggered the flow produced the following error:

image

So when we try to create an existing channel we get an error 400 telling us that the channel already exists. That’s great, that means we have a good description that we can use in a condition to handle existing channels.

We’ll create a new Condition from that step, change to the advanced method and paste in the following formula (true if the body contains NameAlreadyExists, or does not contain “error”:):

@or(contains(body('Create_a_channel'), 'NameAlreadyExists'), not(contains(body('Create_a_channel'), '"error":')))

Then click on the ellipsis (…) and click the “Configure run after” option, and make sure both “is successful” and “has failed” are checked. It should look something like this:

image

In the failed step you can add actions to notify when an unexpected error occurs, for this example I’ve chosen to notify myself whenever the condition is false, where the formula for the email body is:
body('Create_a_channel')

Next I add a Terminate action to prevent the flow from processing more steps. The result looks like this

image

Creating a new document location in Dynamics

After we’ve handled errors on creation of a new Teams channel, we have to create a new document location in Dynamics. Add the “Create a new record” for Dynamics 365 action. Choose the organization you’re working with, and choose Document Locations as the entity name.

Click on the Show advanced options expansion link to enable filling out everything we need.

  • Organization Name –> Select the organization you’re working with
  • Entity Name –> Document Locations
  • Parent Site or Location –> The Guid of the Documents (Shared Documents) record in Dynamics (prerequisites in the beginning of this post)
  • Parent Site or Location type –> Select sharepointdocumentlocations
  • Regarding –> Expand the “When a record is created” selection from the dynamic content, and select Account (Unique identifier of the account)
  • Regarding Type –> Select accounts

The end result should look like the following:

image

Creating a SharePoint folder

Testing the flow it looks like everything runs fine, and checking in Dynamics we can see that the document location has been created successfully. However, if you try to browse to the document location on the account created you get an error message saying that the folder does not exist in SharePoint. After poking around a bit I found out that creating a Teams channel from Flow (which in turn uses the Graph API) does not create a corresponding folder immediately. Instead, it is picked up by some timer job in the backend. I feel like that is kind of risky business, so I decided to create the folder manually to speed up the process, and prevent unnecessary error messages in the user interface.

Unfortunately, there is no way to just create a new folder in a document library using Microsoft Flow. This left me quite frustrated until I found out that creating a new file will create subfolders automatically if they don’t already exist. We’ll use that feature to create the folder for us, by creating a new dummy file.

Start by adding a new action above the “Create a new record” action, and choose the “SharePoint – Create File” action. Choose the site for the Group you want to work with, and in the folder path type /Shared Documents/, then append Account Name from the dynamic variables. Name the file deleteme.txt, and type some useful content should it fail to be deleted.

image

Now, create a new action beneath this one and select “SharePoint – Delete File”. Select the site where the file was created, and for File Identifier select Path from the “Create file” action.

image

The entire flow should now look like this

image

Taking this beauty out for a spin

To test out our new flow I created a new channel in Teams first, then created a new account with the same name as the channel. After a short while the flow triggers, and I can see that it fails on the create channel step, but it hits the “if yes” condition because it’s an existing folder, and the result says success.

image

I create another new account with a name that doesn’t exist as a channel already, and this time all the blocks are marked with green check marks

image

Summary

So there you have it. A flow which automatically creates a new channel in teams and adds a document location to the Dynamics 365 account so you can share and collaborate on documents.

Wednesday, December 6, 2017

Set Properties not working when using editable grids

I found a little issue using editable grids. The following error occurs in all versions since Editable Grids were introduced.

When I have editable grids on the primary entity form, I cannot set the values for that entity inside a workflow. To reproduce the error do the following:

  1. Create a new custom entity (or use an existing entity if you dare)
  2. Add a 1:N relationship from entity1 to any other entity
  3. Add a sub-grid to the form of entity1
  4. Go to the controls tab of the sub-grid, and choose add control. Select the Editable Grid control, and activate it for web
    image
  5. Save and publish the form
  6. Create a new workflow for the entity
  7. Add an update or create step for the record, and open the “set properties” window
  8. When you select any field, the operator drop-down and field drop-down will be empty, and you cannot select any values
    image

Changing the control to the standard read-only grid will prevent this error. Also, this error only occurs for custom entities with editable grids, not for default entities with editable sub grids.

Diving in to the debug console of your browser you will see the following error:

SCRIPT5009: '$P_CRM' is undefined
File: crminternalutility.js, Line: 1, Column: 1786

Browsing through the code it looks like $P_CRM should be an instance of jQuery, but it is never assigned.