Channel: Scott Durow's Activities

How to design a custom grid using HTML Web Resource


Hi,

I want to add a custom table layout to an existing entity using an HTML web resource. How can I do this?

Which is the better approach: an HTML web resource, or C#?

I want to display the product details in the Purchase Order entity in a proper format. How can I design that?

I want to design something like this:

I want to perform an action exactly like this, but through code. How can I do it?

http://www.microsoft.com/en-US/dynamics/crm-customer-center/create-a-team-template-and-add-to-an-entity-form.aspx


How to update multiple records in a model-driven app grid using Power Fx commanding


If you want to add a button to a command bar that updates multiple records in a grid, it is easy to write a formula that performs poorly because of multiple grid refreshes. This post outlines the most performant way of applying updates to multiple records from a Power Fx command button.

Step 1 - Add your button

Inside the modern command bar editor, add a button to the Main Grid or Sub Grid command bars of the Account entity.

Step 2 - Set the Visibility rule

Any buttons on a grid that apply to selected records will only become visible if you provide a visibility rule.

Select the Visibility property, and select Show on condition from formula.

In the formula bar at the top, enter the following:

!IsEmpty(Self.Selected.AllItems)

Step 3 - Add the command formula

Select the Action of the button, and select Run formula.

In the formula bar at the top, enter the following:

// Parallel Updates
Patch(
    Accounts, 
    ForAll(Self.Selected.AllItems, 
        {
            Account:ThisRecord.Account,'Credit Hold':'Credit Hold (Accounts)'.Yes 
        }
    )
)

Note: If you are adding a button for a different entity, you will need to change the table name (Accounts) and primary key column name (Account).

Step 4 - Save and Publish, then Play your app!

The changes will take a short while to appear in your app. You will see a message similar to the following when the changes are ready:

Parallel vs Sequential Patches

If you used the following formula, it would result in multiple grid refreshes since the Patches will be done in sequence.

// Sequential Updates
ForAll(Self.Selected.AllItems, 
    Patch(
        Accounts, 
        ThisRecord, 
        { 'Credit Hold':'Credit Hold (Accounts)'.No }
    )
);

The parallel version above will attempt as many simultaneous updates as the browser can handle. Browsers limit the number of HTTP requests that can be sent at the same time, and you can see in the timeline below that only 6 parallel requests are in progress at once.

Despite this, the technique is considerably more efficient than performing the updates sequentially, and the grid is refreshed only once, instead of after each record update.
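The browser-side throttling described above can be sketched in TypeScript (hypothetical helper names, not part of Power Fx or the platform): updates are split into groups so that no more than roughly six requests are in flight at once.

```typescript
// Split an array into chunks of at most `size` items.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Hypothetical: send each group of updates in parallel, one group at a time,
// so we never exceed the browser's per-host connection limit (~6).
async function patchInGroups<T>(
  updates: T[],
  send: (u: T) => Promise<void>,
  limit = 6
): Promise<void> {
  for (const group of chunk(updates, limit)) {
    await Promise.all(group.map(send));
  }
}
```

This is only an illustration of what the platform effectively does for you; with the parallel Patch formula you do not need to manage the grouping yourself.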

 

Perform complex Dataverse FetchXml queries using Power Fx (from a Canvas App)


One of the constant challenges we face when writing canvas apps and custom pages using Power Fx is ensuring that the queries we use are always delegable. Not all operators are delegable to the server when using Filter or Sort, which can sometimes create a performance challenge. Furthermore, some types of queries, such as group-by and complex link-entity queries, are not possible at all. Wouldn't it be great to be able to run a FetchXML query and then use the results in your app? In this post, I'll show you a pattern that allows you to do just that using the new ParseJSON Power Fx function.

Creating the query mechanism

We will use a Cloud Flow to perform the FetchXml query, which will be called from Power Fx. This side-steps any delegation issues since we know that the query is always run on the server.

1. Create a new instant cloud flow (I will call it 'PowerApp: FetchXml Query'). Select Power Apps as the trigger.

2. Add a Dataverse List Rows action, and then in the Table name and Fetch Xml Query parameters, add Ask in Power Apps under Dynamic Content.

3. Add a Respond to Power Apps action, add a string output, and enter the expression:

outputs('List_rows')?['body']['value']

Your flow should look similar to the following:

Perform query using Power Fx

Imagine that you wanted to find all the distinct accounts that had at least one contact with an activity that is due in the current month. This would be difficult using standard Power Fx since you would need to perform multiple queries, possibly causing performance issues. 

1. First we create the query using the Accounts view in a model-driven app. The query looks like the following:

2. Now we can use Download FetchXML to get the query. You can also use the awesome FetchXML builder from Jonas.

3. Inside your canvas app, enable the ParseJSON feature inside Settings:

NOTE: You will need to save and reload your app for this setting to take effect.

4. Under the Power Automate tab, add the FetchXml Query Cloud Flow so that it is available to our app.

5. Inside your app, make a call to the following Power Fx. This could be inside a Screen OnVisible, or a button:

// Get all accounts with at least one contact that has an activity due this month
UpdateContext({ctxAccountsThisMonth:
    ForAll(
        Table(
            ParseJSON(
                'PowerApp:FetchXmlQuery'.Run("<fetch version='1.0' output-format='xml-platform' mapping='logical' no-lock='true' distinct='true'><entity name='account'><attribute name='name' /><attribute name='accountid' /><filter type='and'><condition attribute='statecode' operator='eq' value='0' /><condition attribute='industrycode' operator='not-null' /></filter><link-entity alias='accountprimarycontactidcontactcontactid' name='contact' from='contactid' to='primarycontactid' link-type='outer' visible='false'></link-entity><link-entity name='contact' alias='aa' link-type='inner' from='parentcustomerid' to='accountid'><link-entity name='activitypointer' alias='ae' link-type='inner' from='regardingobjectid' to='contactid'><filter type='and'><condition attribute='scheduledend' operator='this-month' /></filter></link-entity></link-entity></entity></fetch>", "accounts").results
            )
        ),
        {
            accountid: GUID(Value.accountid),
            name: Text(Value.name)
        }
    )
})

This code simply calls the FetchXml query and then maps the results into a collection, picking out each attribute from the results and converting it to the correct data type (e.g. Text or Number).

You can now use this data in your app! Each time you call this flow, the query is re-evaluated without any caching, so be careful how many times you call it in order to minimise API entitlement consumption. To join the results back to the standard Power Fx Accounts data source - for example, when showing the results in a gallery - you could use:

LookUp(Accounts, Account = Gallery1.Selected.accountid).'Account Name'
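The mapping step can be sketched in TypeScript (a rough analogue only, assuming the flow's string output is a JSON array of objects with accountid and name fields):

```typescript
// Hypothetical raw row shape returned in the flow's string output.
interface RawAccountRow {
  accountid: string;
  name: string;
}

// Parse the flow's JSON string and coerce each field to the expected type,
// mirroring what the ForAll/Table/ParseJSON formula does in Power Fx.
function parseAccounts(json: string): { accountid: string; name: string }[] {
  const rows = JSON.parse(json) as RawAccountRow[];
  return rows.map((r) => ({
    accountid: String(r.accountid), // GUID kept as a plain string here
    name: String(r.name),
  }));
}
```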

Performing aggregate group by queries from your canvas app

Imagine that you wanted to show the total number of accounts per industry code inside your app. You could easily use a Power BI query for this - however, there are times when you need the data to be native to the app.

Using the awesome FetchXML builder, you might create the Fetch XML to look similar to:

<fetch aggregate='true'><entity name='account'><attribute name='industrycode' alias='group' groupby='true' /><attribute name='industrycode' alias='industrycode_count' aggregate='count' /></entity></fetch>

If you use this in your Power Fx in a similar way to the code above, you will find that the Cloud Flow gives an error similar to:

An ODataPrimitiveValue was instantiated with a value of type 'Microsoft.Xrm.Sdk.OptionSetValue'. ODataPrimitiveValue can only wrap values which can be represented as primitive EDM types.

The reason behind this is that the Dataverse connector for Flow does not know how to interpret the metadata being returned from the aggregate query since each row is not an account, but an aggregate row. You will get a similar error if you try and group by a Lookup:

An ODataPrimitiveValue was instantiated with a value of type 'Microsoft.Xrm.Sdk.EntityReference'. ODataPrimitiveValue can only wrap values which can be represented as primitive EDM types.

To work around this we must create a 'primitive' data type column to group by. So for an OptionSet/Choice, we would create a numeric column that contains the Choice value, and for a Lookup, we would create a string column that contains the Lookup GUID. We can then group by these columns. I call these 'materialized columns'. To do this you could either create a plugin or a Cloud Flow. The Cloud Flow should be triggered when a record is created or updated, and then update the two materialized 'primitive' columns.

1. First create the Cloud flow to be triggered when an account is created or updated:

2. Add a second Dataverse Update a row step, and set the following:

Row Id: triggerOutputs()?['body/accountid']

Account Status ID (the custom string column): triggerOutputs()?['body/_dev1_accountstatus_value']

Industry Code Value (the custom number column): triggerOutputs()?['body/industrycode']

3. After you save your flow, when you update your accounts you should find that the two custom materialized columns will contain the primitive version of the Choice and Lookup columns. This now allows us to perform grouping/aggregate queries inside a flow.

4. Inside your app, add the following Power Fx to a screen OnVisible:

Concurrent(
    // Get accounts by industry code
    UpdateContext({ctxCountByIndustry:
        ForAll(
            Table(
                ParseJSON(
                    'PowerApp:FetchXmlQuery'.Run("<fetch aggregate='true'><entity name='account'><attribute name='dev1_industrycodevalue' alias='group' groupby='true' /><attribute name='dev1_industrycodevalue' alias='industrycode_count' aggregate='count' /></entity></fetch>", "accounts").results
                )
            ),
            {
                group: Value(Value.group),
                industrycode_count: Value(Value.industrycode_count)
            }
        )
    }),
    // Get the industry code name/value pairs for the current language
    UpdateContext({ctxIndustryCodes:
        ForAll(
            Table(
                ParseJSON(
                    'PowerApp:FetchXmlQuery'.Run("<fetch><entity name='stringmap'><attribute name='stringmapid' /><attribute name='attributevalue' /><attribute name='displayorder' /><attribute name='value' /><filter><condition attribute='attributename' operator='eq' value='industrycode' /><condition attribute='objecttypecode' operator='eq' value='1' /><condition attribute='langid' operator='eq' value='1033' /></filter></entity></fetch>", "stringmaps").results
                )
            ),
            {
                attributevalue: Value(Value.attributevalue),
                value: Text(Value.value),
                displayorder: Value(Value.displayorder)
            }
        )
    })
);

There are two queries here, the first is the aggregate query returning the accounts grouped by industry code (the custom materialized column). The second is a query that returns the name/value pairs for the industry code choice column in the current language since we can no longer use the standard enum (Industry (Accounts)) that Power Fx provides us. The Power Fx enums are text-based only and you cannot get access to the Choice numeric value.

Notice that the two queries are performed inside a Concurrent function to ensure that they are run in parallel so that they can run as quickly as possible.

5. We can now show the results inside a Chart by binding the Chart.Items to:

AddColumns(Filter(ctxCountByIndustry, !IsBlank(group)), "IndustryName", LookUp(ctxIndustryCodes, attributevalue = group).value)

Note: The IndustryName column is added so that the chart can show the Choice Text value instead of the numeric value that we used to group by. The result might look something like this:

So that's it. I hope that eventually these kinds of queries will be possible natively in Power Fx without the need to call a Cloud Flow - maybe we will even be able to query Dataverse using the SQL endpoint.

Hope this helps!

@ScottDurow

 

Perform ExecuteMultiple in batches and changesets from TypeScript using dataverse-ify version 2


My free tutorial course on writing Dataverse web resources has been available for over a year now, and it has had over 1000 people enrol! The course uses version 1 of dataverse-ify, and over that time I've been working on version 2, which is currently available in beta.

What is Dataverse-ify?

Dataverse-ify aims to simplify calling the Dataverse WebAPI from TypeScript inside model-driven apps, Single Page Applications (SPAs), and integration tests running inside VSCode/Node. It uses a small amount of metadata that is generated using dataverse-gen, together with a set of early-bound types, to make it easy to interact with Dataverse tables and columns using an API similar to the IOrganizationService you may have used in C#.

You can use the new version by adding @2 on the end of the node modules:

For example:

npx dataverse-auth@2
npx dataverse-gen@2
npm install dataverse-ify@2

Soon, I will be removing the beta tag and publishing it so that it will install by default. There are a few breaking changes detailed in the Upgrading readme, but I will be publishing more samples including a Single Page Application that uses dataverse-ify even where the Xrm.WebApi is not available.

I wanted to give you a peek at one of the features that I am really excited about in version 2 - support for ExecuteMultiple with batches and change sets. A batch allows you to send multiple requests in a single HTTP request, and a change set allows you to send multiple requests that are executed as a single transaction - if one request fails, they all fail. This can give your client-side code a performance boost. Custom API requests can even be wrapped up in executeMultiple!

Imagine that you have a Command Bar button that calls a JavaScript function from a grid that needs to make an update to a column on all of the selected records, and then wait for a triggered flow to run, as indicated by the updated column being reset. The updates can be wrapped up in an ExecuteMultiple batch rather than being made by lots of Update requests.

Create the top-level function

When a command bar calls a JavaScript function it can return a Promise if there is asynchronous work being performed. In our case, we don't want the model-driven app to wait until our flows are run, so we can use Promise.resolve on an internal function to 'fire and forget' a long-running task:

static async CreateProjectReportTrigger(entityIds: string[]): Promise<void> {
  // Fire and forget the internal command so it does not cause a ribbon action timeout
  Promise.resolve(ProjectRibbon.CreateProjectReportTriggerInternal(entityIds));
}

Create the Internal function and initialize the metadata cache

Inside the internal function, we first set the metadata that was created using dataverse-gen - this provides dataverse-ify with some of the information it needs to work out the data types of columns that are not present in the WebApi responses. We also create a random value to update the column that will trigger the flow:

setMetadataCache(metadataCache);
const requestCount = entityIds.length;
const trigger = "trigger" + Math.random().toString();

Make the update using executeMultiple (this is not C# remember, it's TypeScript!)

This is where the magic happens - we can create an array of UpdateRequest objects using the entityIds provided to the function from the Command Bar:

// Trigger the flow for each selected project (using a batch)
const updates = entityIds.map((id) => {
  return {
    logicalName: "Update",
    target: {
      logicalName: dev1_projectMetadata.logicalName,
      dev1_projectid: id,
      dev1_reportgenerationstatus: trigger,
    } as dev1_Project,
  } as UpdateRequest;
});
const serviceClient = new XrmContextDataverseClient(Xrm.WebApi);
await serviceClient.executeMultiple(updates);

You can see that the updates array is simply passed into executeMultiple, which then bundles them up inside a $batch request. If you want to, you can run the updates inside a transaction by simply wrapping the batch inside an array:

await serviceClient.executeMultiple([updates]);

This array could actually contain multiple change sets which each would run independently inside a transaction.
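The nesting can be sketched with plain TypeScript types (hypothetical shapes, loosely modelled on the UpdateRequest used above - not dataverse-ify's actual type definitions):

```typescript
// Hypothetical request shape, loosely modelled on dataverse-ify's UpdateRequest.
interface UpdateRequest {
  logicalName: "Update";
  target: Record<string, unknown>;
}

// A flat array is sent as one batch; each nested array within the outer
// array is executed as a transactional change set inside that batch.
type ExecuteMultipleInput = (UpdateRequest | UpdateRequest[])[];

const update = (id: string): UpdateRequest => ({
  logicalName: "Update",
  target: { dev1_projectid: id },
});

// Two change sets that each succeed or fail as a unit:
const input: ExecuteMultipleInput = [
  [update("id-1"), update("id-2")], // change set 1
  [update("id-3")],                 // change set 2
];
```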

So the resulting function would be:

static async CreateProjectReportTriggerInternal(entityIds: string[]): Promise<void> {
  // Update a column on the selected records, to trigger a flow
  try {
    setMetadataCache(metadataCache);
    const requestCount = entityIds.length;
    const trigger = "trigger" + Math.random().toString();
    // Trigger the flow for each selected project (using a batch)
    const updates = entityIds.map((id) => {
      return {
        logicalName: "Update",
        target: {
          logicalName: dev1_projectMetadata.logicalName,
          dev1_projectid: id,
          dev1_reportgenerationstatus: trigger,
        } as dev1_Project,
      } as UpdateRequest;
    });
    const serviceClient = new XrmContextDataverseClient(Xrm.WebApi);
    await serviceClient.executeMultiple(updates);
    // Monitor the result
    const query = `<fetch aggregate="true"><entity name="dev1_project"><attribute name="dev1_projectid" alias="count_items" aggregate="countcolumn" /><filter><condition attribute="dev1_reportgenerationstatus" operator="eq" value="${trigger}" /></filter></entity></fetch>`;
    let complete = false;
    do {
      const inProgressQuery = await serviceClient.retrieveMultiple(query, { returnRawEntities: true });
      complete = inProgressQuery.entities.length === 0;
      if (!complete) {
        const inProgressCount = inProgressQuery.entities[0]["count_items"] as number;
        complete = inProgressCount === 0;
        // Report status
        Xrm.Utility.showProgressIndicator(`Generating Reports ${requestCount - inProgressCount}/${requestCount}`);
        await ProjectRibbon.sleepTimeout(2000);
      }
    } while (!complete);
    Xrm.Utility.closeProgressIndicator();
  } catch (e) {
    Xrm.Utility.closeProgressIndicator();
    Xrm.Navigation.openErrorDialog({ message: "Could not generate reports", details: JSON.stringify(e) });
  }
}

static sleepTimeout(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

This code adds polling for the number of records that have not yet had the flow run and reset the dev1_reportgenerationstatus attribute. It shows a progress indicator until all the records are complete, or reports an error if something goes wrong.

The batch request would look similar to:

--batch_1665710705198
Content-Type: application/http
Content-Transfer-Encoding: binary

PATCH /api/data/v9.0/dev1_projects(2361e495-1419-ed11-b83e-000d3a2ae2ee) HTTP/1.1
Content-Type: application/json

{"dev1_reportgenerationstatus":"trigger0.39324146578062336","@odata.type":"Microsoft.Dynamics.CRM.dev1_project"}
--batch_1665710705198
Content-Type: application/http

PATCH /api/data/v9.0/dev1_projects(e8184b63-1823-ed11-b83d-000d3a39d9b6) HTTP/1.1
Content-Type: application/json

{"dev1_reportgenerationstatus":"trigger0.39324146578062336","@odata.type":"Microsoft.Dynamics.CRM.dev1_project"}

--batch_1665710705198--

The code can obviously be improved by adding a timeout and better reporting of errors - but this shows the general idea of using executeMultiple with dataverse-ify version 2.

There are lots of other improvements in version 2 - so if you've used version 1 please do give version 2 a go whilst it's in beta and report any issues inside GitHub.

In my next post on version 2, I'll show you how to call a Custom API using a batch and changeset. If you want to peek before then - take a look at the tests for version 2 - they give lots of examples of its use.

@ScottDurow

 

CRM Configuration Migration Error: User Assignment Required


Hi there,

When trying to import data using the CRM Configuration Migration Tool, I constantly receive a warning "User assignment required" and an additional error "To continue, a user assignment is required" - and then nothing happens at all.

The message sounds as if I could click somewhere and then continue, but I cannot.

Does anyone know what this error means exactly? I've tried Google already but found nothing (maybe because it's labelled differently in English - I'm using the German version).

Thanks for any help,
Tom

Editable Grids Lookup Field Custom Filter


Hello guys,

Lately I've been exploring Dynamics 365's new feature, Editable Grids.
On a normal form, it is possible to add a custom FetchXml filter to a lookup field using JavaScript (addPreSearch, addCustomFilter). Is it possible to add a custom filter to an editable grid's lookup cell using the same method? I've searched everywhere but can't find any clue about this, and Microsoft's documentation doesn't state anything either (https://msdn.microsoft.com/en-us/library/mt788311.aspx). Any help or clue will be appreciated.

P.S.: Sorry if this post turned out to be in the wrong place.

Thank you.

Power Fx delegation of 'in' operator against Dataverse


Delegation of queries in Canvas Apps/Custom Pages has long been a troublesome topic and we are always looking out for the triangle of doom, or the double blue underline of eternal stench (well, it is Halloween soon!)

I try to keep a close eye on the connector delegation support table in the official documentation for any changes and additions. Part of what I love about the Power Platform is that new features are constantly being released, often without fanfare! 

Here is the current delegation support at the time of writing (for posterity from the docs):

| Item | Number [1] | Text [2] | Choice | DateTime [3] | Guid |
|---|---|---|---|---|---|
| Filter | Yes | Yes | Yes | Yes | Yes |
| Sort | Yes | Yes | Yes | Yes | - |
| SortByColumns | Yes | Yes | Yes | Yes | - |
| Lookup | Yes | Yes | Yes | Yes | Yes |
| =, <> | Yes | Yes | Yes | Yes | Yes |
| <, <=, >, >= | Yes | Yes | No | Yes | - |
| In (substring) | - | Yes | - | - | - |
| In (membership) (preview) | Yes | Yes | Yes | Yes | Yes |
| And/Or/Not | Yes | Yes | Yes | Yes | Yes |
| StartsWith | - | Yes | - | - | - |
| IsBlank | Yes [4] | Yes [4] | No [4] | Yes [4] | Yes |
| Sum, Min, Max, Avg [5] | Yes | - | - | No | - |
| CountRows [6] [7], CountIf [5] | Yes | Yes | Yes | Yes | Yes |

 

The Caveats are important - especially around aggregation limits:

  1. Numeric with arithmetic expressions (for example, Filter(table, field + 10 > 100)) aren't delegable. Language and TimeZone aren't delegable.

  2. Doesn't support Trim[Ends] or Len. Does support other functions such as Left, Mid, Right, Upper, Lower, Replace, Substitute, etc.

  3. DateTime is delegable except for DateTime functions Now() and Today().

  4. Supports comparisons. For example, Filter(TableName, MyCol = Blank()).

  5. The aggregate functions are limited to a collection of 50,000 rows. If needed, use the Filter function to select 50,000 rows based on criteria.

  6. CountRows on Dataverse uses a cached value. For non-cached values where the record count is expected to be under 50,000 records, use CountIf(table, True).

  7. For CountRows, ensure that users have appropriate permissions to get totals for the table.

Old 'in' delegation limit

The really exciting addition to this table is the mention of 'In (membership)'. It is currently marked as preview but can be used in the latest version of canvas studio.

Previously, if you had written a formula to get all the accounts that had a primary contact of A or B it might look like:

Set(varInFilter, 
    [
        First(Contacts).Contact, 
        Last(Contacts).Contact
    ]);

ClearCollect(colAccounts,
    Filter(Accounts, 'Primary Contact'.Contact in varInFilter)
);

In this situation, previously you would have been presented with the delegation warnings:

When you execute the query you would have seen the warning:

The reason is that the query executed against Dataverse would be:

/api/data/v9.0/accounts?
$select=accountid,dev1_AccountStatus,primarycontactid,_dev1_accountstatus_value,_primarycontactid_value

Here there are no filters that are sent to the server to filter by the primary contact, so the delegation limit will be hit.

The new 'In' server-side delegation!

With the new behaviour, if you are using version 3.22102.32 or later (See all versions), the 'in' operator is now delegable to Dataverse. This means you will see no warning:

And inside the monitor, you see a clean delegated query!

This is because the filtering is now performed on the server using the OData query:

/api/data/v9.0/accounts?
$filter=(primarycontactid/contactid eq ... or primarycontactid/contactid eq ...)&$select=accountid,primarycontactid,_primarycontactid_value

The key part here is that primarycontactid is filtered on the server using an OR query. This is great news because we will no longer hit that delegation limit.
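The translation from the membership list to the OData filter can be sketched as follows (a hypothetical helper for illustration only - this is not how the Power Fx compiler is actually implemented):

```typescript
// Build the OData $filter clause that the delegated 'in' operator
// effectively translates to: one equality test per id, joined with 'or'.
function buildInFilter(column: string, ids: string[]): string {
  const clauses = ids.map((id) => `${column} eq ${id}`);
  return `$filter=(${clauses.join(" or ")})`;
}
```

For two contact ids this produces the same shape of OR filter as the query shown above.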

Those troublesome polymorphic relationships

One of the constant challenges in Power Fx is the support for polymorphic relationships in Dataverse when performing delegated queries. This new support is no exception, unfortunately. If you were to write the following formula you would still hit the delegation limit:

ClearCollect(colcontacts,
    Filter(Contacts, AsType('Company Name',[@Accounts]).Account in varInFilter)
)

I'm going to be keeping an eye out for this to be supported in the future and I'll let you know! 

Check out my video showing this new 'in' delegation when used with the Creator Kit! 

@ScottDurow

First look at PCF dynamic schema object outputs


One of the new features now supported in PCF (Power Apps Component Framework) code components is 'object outputs'. A PCF component has a manifest file that defines the input/output properties it accepts, and each of those properties has a data type. Until now, only data types supported by Dataverse (e.g. Decimals, Lookups, Choices, etc.) could be used. Object-typed property support was introduced in 1.19.4 of pcf-scripts (see the release notes). It has not yet made it into the control manifest schema documentation, but I expect that to follow soon once the component samples have been updated.

With object-typed output properties, we can now specify a schema for the output record which will then be picked up in Power Fx. Amongst the scenarios that this unlocks are:

  • Performing calculations or calling APIs, and then returning the results as an object output with a static schema. The output property can then have arrays and nested objects that will be visible in Canvas Apps at design time.
  • Raise the OnChange event and provide a Record as an output similar to the built-in Selected property using a dynamic schema definition. 

For the second scenario, imagine that you have a grid, and when the user selects a command in a row, you want to output both the event type and the source row, without needing to provide a key that must be used to look up the record. For this scenario, we will take the input schema of the dataset being passed to the grid (e.g. Accounts or Contacts) and map it to an object output schema for a property named EventRow. When the schema of the input dataset changes, the schema of the output property changes to match.

Define the output property in the ControlManifest.Input.xml

For each object output property, there must be a dependent Schema property that will be used by Canvas Apps to display the auto-complete on the object. We add two properties, the output and the schema:

<property name="EventRow" display-name-key="EventRow" of-type="Object" usage="output"/>
<property name="EventRowSchema" display-name-key="EventRowSchema" of-type="SingleLine.Text" usage="bound" hidden="true"/>

Now we must also indicate that the Schema property is used as the schema for the EventRow property by adding the following below inside the control element:

<property-dependencies>
  <property-dependency input="EventRowSchema" output="EventRow" required-for="schema" />
</property-dependencies>

Notice that the property-dependency element joins the EventRowSchema and EventRow properties together to be used to determine the schema as indicated by required-for="schema".

Define the JSON Schema

In our example, whenever the input dataset changes, we must update the output schema to reflect the same schema so that we can see the same properties. The output schema is defined using the json-schema format.

To use the JSON schema types, we can add the DefinitelyTyped node module using:

npm install --save @types/json-schema

Once this has been installed, you can use the type JSONSchema4 to describe the output schema by adding the following to your index.ts:

private getInputSchema(context: ComponentFramework.Context<IInputs>) {
    const dataset = context.parameters.records;
    const columnProperties: Record<string, any> = {};
    dataset.columns
        .filter((c) => !c.isHidden && (c.displayName || c.name))
        .forEach((c) => {
            const properties = this.getColumnSchema(c);
            columnProperties[c.displayName || c.name] = properties;
        });
    this.columnProperties = columnProperties;
    return columnProperties;
}
private getColumnSchema(column: ComponentFramework.PropertyHelper.DataSetApi.Column): JSONSchema4 {
    switch (column.dataType) {
        // Number Types
        case 'TwoOptions':
            return { type: 'boolean' };
        case 'Whole.None':
            return { type: 'integer' };
        case 'Currency':
        case 'Decimal':
        case 'FP':
        case 'Whole.Duration':
            return { type: 'number' };
        // String Types
        case 'SingleLine.Text':
        case 'SingleLine.Email':
        case 'SingleLine.Phone':
        case 'SingleLine.Ticker':
        case 'SingleLine.URL':
        case 'SingleLine.TextArea':
        case 'Multiple':
            return { type: 'string' };
        // Other Types
        case 'DateAndTime.DateOnly':
        case 'DateAndTime.DateAndTime':
            return {
                type: 'string',
                format: 'date-time',
            };
        // Choice Types
        case 'OptionSet':
            // TODO: Can we return an enum type dynamically?
            return { type: 'string' };
        case 'MultiSelectPicklist':
            return {
                type: 'array',
                items: {
                    type: 'number',
                },
            };
        // Lookup Types
        case 'Lookup.Simple':
        case 'Lookup.Customer':
        case 'Lookup.Owner':
            // TODO: What is the schema for lookups?
            return { type: 'string' };
        // Other Types
        case 'Whole.TimeZone':
        case 'Whole.Language':
            return { type: 'string' };
    }
    return { type: 'string' };
}

As you can see, each Dataverse data type is mapped to a JSON schema equivalent. I am still trying to establish the correct schema for complex objects such as Choices and Lookups, so I'll update this post when I find out more, but I expect that some of them, such as Choice columns, may not be possible.
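To make the mapping concrete: for a hypothetical dataset with an 'Account Name' text column and an 'Annual Revenue' currency column, the generated properties map would look something like this (a hand-written sketch following the switch statement above, not actual tool output):

```typescript
// Sketch of the JSON Schema fragment produced for a hypothetical dataset
// with a SingleLine.Text column and a Currency column.
const exampleSchema = {
  "Account Name": { type: "string" },   // SingleLine.Text -> string
  "Annual Revenue": { type: "number" }, // Currency -> number
};
```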

Output the schema

Since the input schema can change at any time, we add the following to detect if it has changed, and then call notifyOutputChanged if it has:

private updateInputSchemaIfChanged() {
    const newSchema = JSON.stringify(this.getInputSchema(this.context));
    if (newSchema !== this.inputSchema) {
        this.inputSchema = newSchema;
        this.eventRow = undefined;
        this.notifyOutputChanged();
    }
}

Inside updateView, we then simply make a call to this to detect the change. I've not worked out a way of detecting the change other than comparing the old and new schema - it would be good if there were a flag in the context.updatedProperties array, but there does not seem to be one as far as I can tell.

Generate the output record object to match the schema

In our example, each time the selection changes we raise the OnChange event and output the row that was selected (similar to the Selected property that raises the OnSelect event). In order to do this, we have to map the selected record onto an object that has the properties that the schema defines:

private getOutputObjectRecord(row: ComponentFramework.PropertyHelper.DataSetApi.EntityRecord) {
    const outputObject: Record<string, string | number | boolean | number[] | undefined> = {};
    this.context.parameters.records.columns.forEach((c) => {
        const value = this.getRowValue(row, c);
        outputObject[c.displayName || c.name] = value;
    });
    return outputObject;
}
private getRowValue(
    row: ComponentFramework.PropertyHelper.DataSetApi.EntityRecord,
    column: ComponentFramework.PropertyHelper.DataSetApi.Column,
) {
    switch (column.dataType) {
        // Number Types
        case 'TwoOptions':
            return row.getValue(column.name) as boolean;
        case 'Whole.None':
        case 'Currency':
        case 'Decimal':
        case 'FP':
        case 'Whole.Duration':
            return row.getValue(column.name) as number;
        // String Types
        case 'SingleLine.Text':
        case 'SingleLine.Email':
        case 'SingleLine.Phone':
        case 'SingleLine.Ticker':
        case 'SingleLine.URL':
        case 'SingleLine.TextArea':
        case 'Multiple':
            return row.getFormattedValue(column.name);
        // Date Types
        case 'DateAndTime.DateOnly':
        case 'DateAndTime.DateAndTime':
            return (row.getValue(column.name) as Date)?.toISOString();
        // Choice Types
        case 'OptionSet':
            // TODO: Can we return an enum?
            return row.getFormattedValue(column.name) as string;
        case 'MultiSelectPicklist':
            return row.getValue(column.name) as number[];
        // Lookup Types
        case 'Lookup.Simple':
        case 'Lookup.Customer':
        case 'Lookup.Owner':
            // TODO: How do we return Lookups?
            return (row.getValue(column.name) as ComponentFramework.EntityReference)?.id.guid;
        // Other
        case 'Whole.TimeZone':
        case 'Whole.Language':
            return row.getFormattedValue(column.name);
    }
}

Again, I am unsure of the shape needed to support Lookup and Choice columns, so for now I am simply mapping them to numbers and strings.
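To make the mapping concrete, here is a hypothetical output object for a single selected row; the column display names ('Account Name', 'Credit Limit', 'Credit Hold') are illustrative, not taken from the post:

```typescript
// Hypothetical output for one selected row, keyed by column display name
// (falling back to the logical name), as getOutputObjectRecord does.
const outputObject: Record<string, string | number | boolean | number[] | undefined> = {
    'Account Name': 'Contoso Ltd', // SingleLine.Text -> formatted string
    'Credit Limit': 50000,         // Currency -> number
    'Credit Hold': false,          // TwoOptions -> boolean
};
```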

We can now use this to output the record when the selection changes:

this.eventRow = this.getOutputObjectRecord(dataset.records[ids[0]]);
this.notifyOutputChanged();

In the getOutputs method, we then simply add:

public getOutputs(): IOutputs {
    return {
        EventRowSchema: this.inputSchema,
        EventRow: this.eventRow,
    } as IOutputs;
}

 

Implement getOutputSchema

Notice that above we output both the selected record and its schema. If the schema has changed, this triggers Power Apps to call the getOutputSchema method. This is where the actual JSON schema is returned and used by Power Apps:

public async getOutputSchema(context: ComponentFramework.Context<IInputs>): Promise<Record<string, unknown>> {
    const eventRowSchema: JSONSchema4 = {
        $schema: 'http://json-schema.org/draft-04/schema#',
        title: 'EventRow',
        type: 'object',
        properties: this.getInputSchema(context),
    };
    return Promise.resolve({
        EventRow: eventRowSchema,
    });
}
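For illustration, with two hypothetical columns bound to the grid ('Account Name' as SingleLine.Text and 'Credit Hold' as TwoOptions), the resolved value returned from getOutputSchema would be shaped like this:

```typescript
// Hypothetical resolved output of getOutputSchema for an illustrative
// two-column grid; real property names come from the column display names.
const exampleOutputSchema = {
    EventRow: {
        $schema: 'http://json-schema.org/draft-04/schema#',
        title: 'EventRow',
        type: 'object',
        properties: {
            'Account Name': { type: 'string' },  // SingleLine.Text
            'Credit Hold': { type: 'boolean' },  // TwoOptions
        },
    },
};
```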

The result

Once this is done and published, your component will now have a new EventRow property, inheriting the same schema as the input record - with the caveat that Choices and Lookups will be strings, rather than complex types.

If you had bound the grid to Accounts, the EventRow property would contain the selected account's values, keyed by column display name.

You can grab the code for this example from GitHub: https://github.com/scottdurow/PCFDynamicSchemaOutputExample 

This functionality takes us one step closer to parity with the first-party controls in canvas apps - now I'm just waiting for custom events to be supported!

 


Refresh Opportunity Form using Plugin after Create Operation


Hi All,

I have a scenario where, after creating an Opportunity from a plugin, I need to refresh the form.

From my research, there is no way to refresh the form from a plugin, because plugins run server-side.

Is there any way to refresh the form using a plugin?

Regards,

Arunkumar.V


Importing XLSX from anything other than template doesn't work


Whenever I try to import my data, I get errors unless I use the "Import File Template" with all my data manually moved over.

When I upload my file, an error box pops up and says either:

Invalid Format in Import File

The import file does not have the format that Dynamics CRM uses to map the data to CRM fields. It is possible some hidden sheets that the system uses have been modified or corrupted. Try again with a file exported from CRM

-or-

Error

An error has occurred.

Try this action again. If the problem continues, check the Microsoft Dynamics CRM Community for solutions or contact your organization's Microsoft Dynamics CRM Administrator. Finally, you can contact Microsoft Support

Under no circumstances can I get the "mapping" wizard to appear. If I use the template it skips that step; if I don't, I get the error.

Any suggestions?

Import Global Option Set values


Is there any possibility to import global option set values in the newer versions of Dynamics CRM?

It doesn't matter whether the values are created directly when importing records, or imported separately by themselves.

thanks.

High severity vulnerability in pcf-scripts package due to dependency on xml2js


Have you noticed recently that when you run npm install on your PCF projects, you get a high severity vulnerability warning (or maybe you were spammed by the GitHub Dependabot like I was)?
Luckily, it's not necessarily a reason to panic!

As of the time of writing this (14th April 2023), there is currently a vulnerability in the xml2js package which pcf-scripts depends on, so if you run npm audit, you will see something like:

# npm audit report

xml2js  <0.5.0
Severity: high
xml2js is vulnerable to prototype pollution  - https://github.com/advisories/GHSA-776f-qx25-q3cc
No fix available
node_modules/xml2js
  pcf-scripts  *
  Depends on vulnerable versions of xml2js
  node_modules/pcf-scripts
  pcf-start  *
  Depends on vulnerable versions of xml2js
  node_modules/pcf-start

3 high severity vulnerabilities

This error is not as scary as it sounds. The good news is that the pcf-scripts package is only used at build time; it doesn't get used at run time. The xml2js package doesn't affect the functionality or security of your PCF control at all (unless you are using it in your own code, of course!), since it is not included in the final bundle.js that pcf-scripts produces.

So how do you fix this? 

Well, until the owner of the xml2js package releases a new version, or the pcf-scripts package is updated not to require it, there isn't anything you can do!

Since pcf-scripts is included in the devDependencies section of package.json and is only used for development purposes, the way to determine whether you have any issues that will impact your PCF bundle.js is to run the command:

npm audit --omit=dev

This will check only the packages that are in the dependencies section, and you should get the message:

found 0 vulnerabilities

Congratulations!
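For context, the reason --omit=dev is safe here is the dependency split in package.json: the PCF build tooling lives under devDependencies, while only true runtime packages live under dependencies. A minimal sketch (package names real, versions illustrative):

```json
{
  "dependencies": {
    "react": "^16.14.0"
  },
  "devDependencies": {
    "pcf-scripts": "^1",
    "pcf-start": "^1",
    "typescript": "^4.9.5"
  }
}
```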

Gamification - Can't add players or set up a game


Just installed Gamification into our Dynamics 365 production environment, but I can't seem to add players or set up a game! I've been on all the blogs, but I don't seem to have the same steps. If I go to my 'games' there is nothing there and nowhere to add one; same with players. I have access to the portal, but I can't see any options for adding games or players. The images I have seen of other people's portals seem different to mine. Can anyone help me?





