Adding a New User to Microsoft Dynamics 365 Online Organization – Something’s Changed

I recently noticed that the procedure for adding users to a Microsoft Dynamics 365 Online organization no longer works.
Adding a user via Office 365 administration and granting a license did not add the user to the Microsoft Dynamics 365 organization, no matter how long I waited.
After some research, I found out that an additional step was added to the procedure: you need to associate the user created in Office 365 with the target organization via the Power Platform Admin portal.

So here is the whole procedure, step by step:

  • Add an Office 365 user
In your Microsoft Dynamics 365 organization, click Advanced Settings
Navigate to Settings -> Security -> Users and click New
Click Add and License Users
In the Microsoft 365 Admin Center Active users list, click Add a user
Fill in the user details and click Next
Assign a Dynamics 365 CE license to the user and click Next -> click Next -> Finish adding -> Close
  • Associate the Office 365 user with the target organization
Navigate to https://admin.powerplatform.microsoft.com/environments and select the target environment (organization)
Click Settings
Click Users + permissions -> Users
Note the newly added user. If your new user does not appear, click Add user and look it up
Click the new user to open the user’s record in Dynamics 365.
Click Assign Roles to assign a Security Role and complete the procedure

Designing Accountable Integration

Reviewing various Dynamics 365 integration mechanisms for over a decade now (yes, Dynamics CRM back then) has taught me some valuable lessons:

  • Integration mechanisms break down silently the second you look away. Usually metadata or data changed without notification, an endpoint moved, or some port was closed due to security considerations.
  • Sooner or later after an integration mechanism breaks, someone will claim the problem is on your side of the integration.
  • An integration mechanism’s performance may degrade over time. Sometimes the data volume you started with on day one has grown rapidly, so queries/business logic operations now consume more time and resources. Other times, an external endpoint you are using is now responding slowly.
  • The application user used for integration on the Dynamics 365 side almost always has more privileges than it actually requires, and is often granted the System Administrator role.

In this post I would like to share some design tips that can help you build an accountable integration. For me, accountable integration implements a few key concepts I’ll explain next.
While these design concepts may sound trivial, it is surprising how rarely they are actually implemented.

  1. Failing gracefully
    When an integration mechanism fails, the system administrator must be able to get the full failure details, but end users and the integrating party should not be exposed to them, as the details usually mean nothing to them and they can’t do anything about it anyway. For them, it is enough to know a failure occurred, with the minimum required details. Another concern here is security: unhandled exception details can expose sensitive information to hackers and the like.

    That means no exception should go unhandled. Hermetically seal your code with try/catch blocks, and if you do let an exception surface, make sure it does not expose any redundant details.
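As a sketch of this principle, consider a hypothetical helper (not from any SDK): the full exception, stack trace and all, goes to the administrator’s private log, while callers receive only a generic message carrying a correlation id they can report.

```csharp
using System;

// Hypothetical helper illustrating the 'fail gracefully' principle:
// full failure details are kept for the administrator, callers get
// a generic message with a correlation id and nothing more.
public static class SafeRunner
{
    public static string Run(Action step, Action<string> adminLog)
    {
        try
        {
            step();
            return "OK";
        }
        catch (Exception ex)
        {
            string correlationId = Guid.NewGuid().ToString("N");
            // Full details (including stack trace) go to the private log only.
            adminLog($"[{correlationId}] {ex}");
            // The caller learns a failure occurred and how to report it - nothing more.
            return $"Operation failed. Reference: {correlationId}";
        }
    }
}
```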
  2. Diagnostic enabled
    The first element I look for in a failure scenario is diagnostics, as it should point to the problem’s cause and sometimes even allow solving it without recompiling code.
    In many implementations, diagnostic elements are missing or turned off. In others, critical information is missing.
    Any diagnostics element should at least be able to confirm that an integration mechanism actually executed, even if it did not operate on any data or perform any action. Additionally, did it succeed or fail, and if it failed, what is the failure message? What is the time stamp for each execution, and what were the incoming and outgoing parameters?

    If you are using Custom Workflow Activities or Plug-ins, that means using the Trace service. Azure elements like Functions and Logic Apps also allow tracing.
    Make your own convention, but make sure you are tracing the following key events:

    1. Execution start/end along with time stamps. This will give you an indication of every execution, even if it did nothing.
    2. Incoming and outgoing parameters. This will allow you to diagnose problems faster, as unexpected parameter values are the culprit most of the time.
    3. Any outgoing calls to external APIs. Trace before and after any such call, as well as the outgoing message. This will allow you to measure the time each call actually took and capture the outgoing request and response.
    4. Any exception
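A minimal sketch of such a convention (a hypothetical tracer, not the SDK’s ITracingService): every entry is timestamped, and external calls are traced before and after with the outgoing request, the response and the elapsed time.

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

// Hypothetical tracer sketching the convention above: timestamped entries,
// plus before/after tracing of external calls including request, response
// and elapsed time.
public class IntegrationTrace
{
    public List<string> Entries { get; } = new List<string>();

    public void Trace(string message) =>
        Entries.Add($"{DateTime.UtcNow:O} {message}");

    public string TraceCall(string callName, string request, Func<string, string> call)
    {
        Trace($"CALL {callName} request: {request}");
        var watch = Stopwatch.StartNew();
        string response = call(request);
        Trace($"CALL {callName} response ({watch.ElapsedMilliseconds}ms): {response}");
        return response;
    }
}
```

Wrapping your integration’s entry point with `Trace("Execution start")` / `Trace("Execution end")` covers the first checklist item even when the mechanism has nothing to do.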
  3. Proactive
    Many implementations have logging/tracing enabled, but no one even knows how to access or analyze it. Sometimes a failure is detected days or weeks after it actually occurred.
    When a failure occurs, the system administrator should know about it ASAP. I don’t expect system administrators to spend time scanning traces and logs searching for anomalies.
    An integration mechanism should be able to send an immediate notification of failure in the form of an email, SMS or push notification.

    If you are using Custom Workflow Activities, you can leverage the hosting process’s built-in emailing capabilities. Set one of the outgoing parameters to indicate success/failure, so you can send an email to the system administrator every time a failure occurs.
    Azure elements can leverage the Application Insights Alerts capability to send an SMS, email or push notification on failure.
  4. Least Exposed
    What’s wrong with granting a designated application user (used only for integration scenarios) the System Administrator role? Everything.
    If this omnipotent user’s credentials leak (and they sometimes do), all of your data is exposed and an attacker can do maximum damage.

    When setting up an application user to perform integration operations within Dynamics 365, make sure it has the least privileges possible to allow the required operations. It is best to create a custom Security Role.

In the past, I posted about Custom Code Tracing & Exception Logging in Dynamics. That solution embeds most of the accountable integration concepts discussed here and may be a good starting point.

Implementing No Code Dynamics 365 Service Bus Listener using Logic Apps

My last post demonstrated how to implement a Dynamics 365 Service Bus listener using an Azure Function.
That implementation type requires writing code, and if you need absolute flexibility, Azure Function is a great solution. Additionally, Azure Function Apps support pay-as-you-go pricing with free grants, which should be considered when planning your architecture costs.

So why would you want to replace Azure Function with Logic Apps?
Mainly because in many cases, no code is required to implement patterns like Inbound Asynchronous Integration. This enables non-developers to implement complex integration scenarios in a very short time.
How is this achieved? By leveraging Logic Apps built-in connectors to ASB, Dynamics 365 and numerous other applications. You can even expose custom connectors to your organizational APIs.
When working with Logic Apps, you should note that triggers (except the HTTP WebHook trigger) use polling to detect relevant business events. While you can manage the trigger polling interval, you are billed for each polling action (trigger invocation) even if there is no relevant business event to operate on.

Using the built-in Dynamics 365 connector allows simple CRUD operations, but using the Command pattern can enable a more advanced approach, where Logic Apps creates a Command (custom entity) record and a Dynamics 365 async Workflow operates on the request.

In this post I’ll walk through the process of setting up a Logic Apps process as an ASB one-way listener which posts incoming messages into Dynamics 365.
To demonstrate posting a message to ASB I’ll use Postman.

Business Scenario:

Whenever a new Account record is created in Contoso’s ERP application, reflect the new Account data into Dynamics 365 CE.

Prerequisites:

  1. Azure subscription allowing you to provision Azure Service Bus and Azure Logic Apps
  2. Microsoft Dynamics 365 9.x instance with admin privileges, a trial will do

Walkthrough:

  1. Setup Azure Service Bus queue 
  2. Setup Logic App process as ASB listener
    In Azure portal click ‘Create a resource’, type ‘Logic App’ and then ‘Create’
    Type in Logic App details and click ‘Create’.
    Once created, navigate to the new resource
    Select ‘When a message is received in a Service Bus queue’ trigger
    Click +
    Enter a meaningful connection name and click to select existing ASB queue. Then click ‘Create’ and ‘Continue’
    Change the polling interval if you want and click ‘+ New step’
    Type ‘Data operations’ and select it from the menu. Then select the Parse JSON action
    Select the content field and feed in the following expression:
    decodeBase64(triggerBody()?['ContentData'])
    Click ‘Use sample payload to generate schema’ and then paste in the following JSON sample payload.
    {"name": "The Krusty Krab", "websiteurl": "http://thekrustykrab.com", "numberofemployees": 3}
    Click ‘Save’ on the command bar.
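For context, ASB delivers the message body base64-encoded in the ContentData property, which is why the decodeBase64 expression is needed before parsing. A C# illustration of the same round trip:

```csharp
using System;
using System.Text;

// ASB message content arrives base64-encoded in ContentData; the Logic Apps
// decodeBase64 expression recovers the original JSON, equivalent to:
string originalJson = "{\"name\": \"The Krusty Krab\", \"numberofemployees\": 3}";
string contentData = Convert.ToBase64String(Encoding.UTF8.GetBytes(originalJson));
string decoded = Encoding.UTF8.GetString(Convert.FromBase64String(contentData));
Console.WriteLine(decoded == originalJson); // prints True
```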

    Click ‘New Step’ on your process, then type in Dynamics 365. Select Dynamics 365 and then the ‘Create Dynamics 365 record’ action
    Click ‘Sign in’ and sign into your Dynamics 365 organization
    Select your organization name, the target entity and then map each required field to the matching detail in the JSON payload.
    To map the numberofemployees integer data, put in the following expression:
    int(body('Parse_JSON')?['numberofemployees'])
    You should now have the following process defined:
  3. Test by clicking ‘Run’, which will make your process start polling ASB.
    Next, post a message with the required format to ASB queue.
    Note that if you left the trigger interval at its 3-minute default, you may wait up to 3 minutes for the message to be processed.
    You can do this using Postman and this useful collection

    Once Logic Apps picks up the new message, you’ll see the process run
    Finally, check the new Account record created in your Dynamics 365 organization

    If you just set this listener up for learning purposes and you don’t actually need it running, make sure you disable your Logic App process so it does not burden your Azure budget for nothing.


Implementing Dynamics 365 Service Bus Listener using Azure Function

One of the built-in triggers for Azure Function App is the Service Bus (ASB) queue trigger, which makes an Azure Function a listener for ASB events.
While Azure Service Bus provides a robust decoupling mechanism, the receiving side has to poll for messages or implement a listener.

This trigger makes Azure Functions an ideal implementation component for Inbound Asynchronous Integration with Microsoft Dynamics 365,
one that allows you to write custom code, but releases you from explicitly writing a listener.
An Inbound Asynchronous Integration pattern describes scenarios where a business event occurs in an external application and must be reflected into Dynamics 365.
Processing of the event in Dynamics 365 does not occur immediately and sometimes no response is required from Dynamics 365 side (one way).

In this post I’ll walk through the process of setting up an Azure Function as an ASB one-way listener which posts incoming messages into Dynamics 365.
To demonstrate posting a message to ASB I’ll use Postman.

Business Scenario:

Whenever a new Account record is created in Contoso’s ERP application, reflect the new Account data into Dynamics 365 CE.

Prerequisites:

  1. Azure subscription allowing you to register apps, provision Azure Service Bus and Azure Function App
  2. Microsoft Dynamics 365 9.x instance with admin privileges, a trial will do

Walkthrough:

  1. Setup Azure Service Bus queue and make a note of the connection string
  2. Register an AAD App and application user to allow using the Microsoft Dynamics 365 Web API. Note the Application Id and secret
  3. Setup Azure Function App with Service Bus queue trigger
     
    1. In Azure Portal, click ‘Create a resource’
    2. Select Function App
    3. Search for Function App (classic) and create a new one
    4. Set Function details and click ‘Create’
    5. Select the new Function and click Function app settings
    6. Select 1 as Runtime version
    7. Select the Functions and Click New function
    8. Select Azure Service Bus Queue trigger with C#
    9. Set triggers details. Under Service Bus connection click new and select details for the ASB queue you set on stage 1

    10. Once back in the Function code, replace existing code with the code found here.
      This code parses the incoming message, authenticates to Dynamics 365 and creates a new Account record.
      Replace the ApplicationId, Secret and WebAPIURL values to match the details you noted on step 2 and your organization’s Web API URL.
      Save the Function and make sure it compiles.

  4. Test by posting a message with the required format to ASB queue.
    You can post messages to ASB using Postman and this useful collection
    Make sure Azure Function processes the message
    and that the record was created in Dynamics 365

Post message to Azure Service Bus via Custom Workflow Activity

While Dynamics 365 Plug-ins allow declarative integration with Azure Service Bus via ServiceEndPoints, integrating through Custom Workflow Activity requires some coding.

So why would you prefer a Custom Workflow Activity over a Plug-in?
Here are just a few good reasons:

  1. Cost-efficiency: a Custom Workflow Activity component can be embedded in Custom Actions, async Workflows and Dialogs (while they’re still here), while a Plug-in component can only be called and operate via the Plug-in execution context. When you put time into writing code, you want to allow as much re-use as possible.
  2. Declarative: once registered, a Custom Workflow Activity can be leveraged by non-developers in various Processes, while a Plug-in is usually accessible only to developers.
  3. Ubiquity: a Custom Action wrapping a Custom Workflow Activity can be accessed from other processes, client-side code, server-side code and consumers external to Dynamics 365
  4. Flexibility: unlike the built-in declarative integration, addressing Azure Service Bus via code allows you to add conditional logic and also reduce the message payload (see the component code).

Convinced or not, the following walkthrough demonstrates using the Post Message To Azure Service Bus Custom Workflow Activity component to post a message to an ASB queue.
You can view the source code here and modify as required.

Prerequisites:

  1. Azure subscription allowing you to provision Azure Service Bus
  2. Microsoft Dynamics 365 9.x instance with admin privileges, a trial will do
  3. Access to Plugin Registration Tool

Walkthrough:

  1. Setup Azure Service Bus and make a note of the connection string
  2. Use the Plugin Registration Tool to register a ServiceEndPoint and the noted connection string
  3. Download, import and publish the unmanaged solution available here
  4. Configure the Workflow contained in the solution by selecting the registered ServiceEndPoint, specifying a semicolon-delimited properties list and an optional custom message. Activate the Workflow Rule.


  5. Test by manually activating the Workflow on a Contact record.
    If all goes well, a new message will appear in ASB queue. You can view the queue dashboard in Azure Portal.

    If you want to view the actual message content, you can use Azure Service Bus Explorer


Few implementation notes:

  1. While the IWorkflowContext allows accessing the target record properties via Pre/Post entity images, this technique is not documented and is therefore considered unsupported.
    For this reason, the code explicitly retrieves and pushes the required attribute values into the SharedVariables collection.
    The propertiesSet input parameter allows sending in a list of required properties.
  2. Clearing the IWorkflowContext data-heavy collections before posting it to Azure Service Bus can reduce the message payload weight by ~80%, which can simplify handling the message and improve performance.





Plug-in ServiceEndPoint Integration – Minimize IPluginExecutionContext payload

Integrating Microsoft Dynamics 365 with Azure Service Bus (ASB) can be easily implemented via the Plug-in Registration Tool without writing any code.
With this integration method, each time the target business event occurs, the complete operation IPluginExecutionContext payload is automatically sent to ASB, though often you need only a small portion of the data.

For some integration scenarios you need to apply conditional logic to decide if and which ASB queue to target. For these scenarios, you can write custom code which allows you to programmatically post messages to ASB from Plug-in/Custom Workflow Activity components.
This method also allows minimizing the IPluginExecutionContext payload, which can simplify the logic required to parse it and generally improve performance. Unfortunately, I could not find a way to reduce the payload to just the required data, only to minimize it.

The following screenshots (ASB Explorer) show two similar messages (a Contact creation event) posted to ASB before (4656 bytes) and after (1842 bytes) minimizing the IPluginExecutionContext payload.
As you can see, the full payload weighs ~2.5 times the minimized payload.


To minimize the IPluginExecutionContext payload while maintaining the data you need, follow these simple steps in your code:

  1. Extract the required data from IPluginExecutionContext and add it to the SharedVariables collection
  2. Clear the data heavy IPluginExecutionContext collections
    • InputParameters
    • OutputParameters
    • PreEntityImages
    • PostEntityImages

The following sample code extracts the lastname attribute from the IPluginExecutionContext, clears the redundant collections and posts the minimized context to ASB:

        public void Execute(IServiceProvider serviceProvider)
        {
            // Retrieve the execution context.
            IPluginExecutionContext context =
                (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            IServiceEndpointNotificationService cloudService =
                (IServiceEndpointNotificationService)serviceProvider.GetService(typeof(IServiceEndpointNotificationService));

            // Extract the tracing service.
            ITracingService tracingService = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            // Extract the target record data from the InputParameters collection.
            Entity contact = (Entity)context.InputParameters["Target"];
            // Extract the required data.
            string lastName = contact["lastname"].ToString();

            // Add the required data to the SharedVariables collection.
            context.SharedVariables.Add("lastname", lastName);

            // Clear the data-heavy collections from the context.
            context.PreEntityImages.Clear();
            context.PostEntityImages.Clear();
            context.InputParameters.Clear();
            context.OutputParameters.Clear();

            try
            {
                tracingService.Trace("Posting the execution context.");

                // serviceEndpointId holds the Id of the ServiceEndPoint
                // registered via the Plugin Registration Tool.
                string response = cloudService.Execute(new EntityReference("serviceendpoint",
                    serviceEndpointId),
                    context);

                if (!String.IsNullOrEmpty(response))
                {
                    tracingService.Trace("Response = {0}", response);
                }

                tracingService.Trace("Done.");
            }
            catch (Exception e)
            {
                tracingService.Trace("Exception: {0}", e.ToString());
                throw;
            }
        }

Dynamics 365 Power Tel Aviv – Azure Integration Patterns

I had the privilege of delivering a lecture about Dynamics 365 Azure Integration Patterns at the DynamicsPower! Tel Aviv event last week.

This is the first 365 Saturday event in Israel, gathering Power Platform MVPs and experts from around the world to share their insights with the community.

I would like to thank Victor Dantas & Liran Debbi for their efforts in organizing this awesome event.

Click here to download my presentation.

Dynamics 365 Power Israel team

The Strange Case of The Liquid Template fetchxml Tag

I have been asked by one of my customers to investigate a problem related to the Liquid Template fetchxml tag.
If you aren’t familiar with this powerful Liquid tag yet, it allows retrieving Dynamics 365 data using a standard FetchXML query. You can even leverage this tag to retrieve data asynchronously.
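For readers unfamiliar with the tag, a typical usage looks roughly like this (a sketch based on the common Portals syntax; the entity and attribute names are illustrative):

```
{% fetchxml contacts %}
<fetch top="5">
  <entity name="contact">
    <attribute name="fullname" />
    <filter type="and">
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
{% endfetchxml %}
{% for item in contacts.results.entities %}
  {{ item.fullname }}
{% endfor %}
```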

Trying to reproduce the problem, I added a simple FetchXML query within a fetchxml tag in one of the Portal pages via the actual Web Page record, knowing that the front-side editor disrupts FetchXML query syntax.

Fetchxml query

Testing the query results in the target Portal page, I noticed that the query condition was ignored – the query returned all Contact records.

Returning to the Web Page and refreshing it, I noticed that the FetchXML query had mysteriously changed – the closing slash for each attribute element was replaced with a full closing </attribute> tag.

Changed query

It seems that before saving the Web Page form, the text editor diligently ‘corrects’ my query to comply with the XML hierarchy format. This behavior was reproduced with the Chrome/Edge browsers on both the UCI and the old forms UI.
Strangely enough, the changed query syntax is considered valid by Dynamics 365, but any query condition is completely ignored.
I am guessing a recent features update changed the text editor behavior, as this behavior does not exist on earlier Dynamics 365 versions.

After searching the Microsoft Portal documentation for the fetchxml Liquid Template tag and finding no results, I started considering the option that this is not a bug, but maybe some kind of feature phase-out. Naaah.

So until Microsoft fixes this glitch, how can you use conditional FetchXML queries and fetchxml Liquid tags? By using the XrmToolBox Portal Code Editor plug-in, which allows Portal content editing (among other features) without disrupting FetchXML queries.


Integrating Dynamics 365 with Azure Function using Managed Identity

I love Azure Functions. These days, I have to restrain the tendency to solve every problem with this Swiss army knife. 

When designing Azure Function integration with Dynamics 365, one of the immediate questions raised is where to store connection strings, credentials and other sensitive details. 
The immediate answer is ‘Not in your code’. So where?

One option is to store sensitive details in the Application Settings store, which is ‘encrypted at rest and transmitted over an encrypted channel’.
While this option is quite easy to use, it isn’t considered the most secure.

Another option is using a Managed Identity with the Azure Key Vault service.
With this option, the Azure Function App is assigned an AAD identity, which is then authorized to access specific Key Vault secrets.
This option is considered more secure, as the Function App is granted access only to specific sensitive data, while data stored in Application Settings is generally exposed.

In this post, I’ll walk through the process of setting up and using a Managed Identity to integrate an Azure Function with Dynamics 365. In this case, Dynamics 365 access details will be stored in Azure Key Vault.

For the walkthrough, I’ll use the Lead Landing Page scenario, replacing Flow component with Azure Function. Although a bit verbose, the process is quite simple.

Implementation Notes

  • Like most Azure services, Azure Key Vault usage costs money: every secret retrieval is billed. To reduce costs, some caching mechanism (which will not be discussed in this post) is in order.
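One simple approach is sketched below (hypothetical code, not from the walkthrough; the fetchSecret delegate stands in for the actual Key Vault client call): fetch each secret once and serve repeats from memory.

```csharp
using System;
using System.Collections.Concurrent;

// Hypothetical secret cache: since every Key Vault retrieval is billed,
// fetch each secret once per process lifetime and serve repeats from memory.
// The fetchSecret delegate stands in for the real Key Vault client call.
public class SecretCache
{
    private readonly ConcurrentDictionary<string, string> _cache =
        new ConcurrentDictionary<string, string>();
    private readonly Func<string, string> _fetchSecret;
    private int _fetchCount;

    public int FetchCount => _fetchCount;

    public SecretCache(Func<string, string> fetchSecret) => _fetchSecret = fetchSecret;

    public string Get(string secretName) =>
        _cache.GetOrAdd(secretName, name =>
        {
            _fetchCount++; // count billed round trips to Key Vault
            return _fetchSecret(name);
        });
}
```

A production version would also expire cached entries after some window, so rotated secrets are eventually picked up.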

Prerequisites

  1. An accessible Microsoft Dynamics 365 instance
  2. Access to an Azure environment with sufficient privileges to create an Azure Function and a Key Vault store.

Walkthrough

  1. Register App with AAD

    Register a new Web App/API in Azure AD with a Client Secret and copy Application Id key and secret to Notepad.

  2. Add an App User to Dynamics 365

    Follow this article: Add a new Application User in Dynamics 365

  3. Create a Function App


    Set Function App details and click ‘Create’


    Add a new Function


    Select the HTTP trigger option


    Click ‘Create’ to complete Function creation


    Leave the function code as is for now, we will alter it later on.


  4. Assign Function App with a Managed Identity

    Go to the Function App Platform Features tab and click ‘Identity’


    Click the ‘On’ button to assign a System identity and click ‘Save’


    Click ‘Yes’ to enable the system assigned managed identity

    You can see the newly assigned identity object ID


  5. Setup Azure Key Vault store

    Create a new Azure Key Vault store


    Click ‘Create’ at the bottom of the screen


    Click ‘Add new’ under the Access policies blade.
    In the Add access policy, check Get and List in the Key Management Operations group.
    Under the Principal blade, find and select your Function App. Click ‘Select’.
    This will grant our Function identity access to the Key Vault secrets.


    Next, select a Resource Group for the Key Vault store. Click ‘Create’ to complete Azure Key Vault store creation


  6. Store secrets in Azure Key Vault

    Find the newly created Azure Key Vault store or access it from the dashboard if possible.


    Access the Secrets area


    Click Secrets and ‘Generate/import’ to generate a new secret


    Set the secret Name (select a self-explanatory name, since once created, you won’t be able to see the actual secret value in this area).
    Set the secret string in the Value field. Click ‘Create’.
    In this case, the secret I defined is the Dynamics Web API URL, similar to https://<ORGNAME>.api.crm<DATACENTERCODE>.dynamics.com/api/data/v9.1/

    In the same manner, add additional secrets to hold the applicationId and secret keys you copied after registering an app in AAD (step 1 in this walkthrough).


    Click each of the newly created secrets and copy the Secret Identifier, which will be used in the code to access the secret value


  7. Update Azure Function Code 

    Go back to the Function created on step 3 above.
    Click View Files, add a new project.json file and paste in the following definition. Click ‘Save’.

    {
        "frameworks": {
            "net46": {
                "dependencies": {
                    "Microsoft.Azure.KeyVault": "2.4.0-preview",
                    "Microsoft.Azure.Services.AppAuthentication": "1.1.0-preview"
                }
            }
        }
    }


    Go back to the function code and replace the existing code with the code found here (placed in Github for convenience).

    This code, triggered by an HTTP request from the Lead landing page, performs the following tasks:
    – Receives and parses the Lead data
    – Extracts Dynamics access details from Azure Key Vault
    – Uses the access details to generate an access token
    – Creates a new Dynamics Lead record using the Web API
    – Returns the operation result to the caller

    In the GetDyn365AccessDetails method, replace the URLs for the three keys
    dyn365WebAPIURL, dyn365AppIdKVURL and dyn365secretKVURL with the Secret Identifier URLs copied on step 6.
    Click ‘Save’.

    Click ‘Get Function URL’ and copy the URL somewhere, as it will be used next

  8. Hookup Lead Landing Page to Azure Function

    Last, create a new HTML page, and copy the HTML found here (placed in Github for convenience).
    Find the AzureFunctionRequestURL variable and replace its value with the Azure Function URL copied in the previous step. Save.


  9. Test

    To test the solution, run the Lead Landing HTML page. Submitting Lead data should result in a new Lead record in Dynamics 365.

    If the flow fails, add the address from which the HTML page is executed to the Azure Function CORS collection to authorize it.

Implementing Lead Landing Page with Flow (in 10 min. or less)

Microsoft Flow, along with Logic Apps, Power Apps and CDS has revolutionized integration with Microsoft Dynamics 365.
I have been working with Dynamics products since 2005 and when comparing the resources required back then to hook up a landing page to Dynamics, I estimate that modern solutions require less than 5%.
In addition, you don’t have to be an expert developer to implement simple integration scenarios, as declarative mechanisms like Flow and Logic Apps can handle the heavy lifting.    

In this post I’ll walk through the process of implementing a Lead Landing Page with Flow while writing the minimum amount of required code.

The ingredients:

  1. A ‘Handle Landing Page Lead’ Microsoft Flow, triggered by an HTTP request, creating a Lead record in a Microsoft Dynamics 365 instance using Flow’s built-in Dynamics Create Record action.
  2. An HTML page implementing the landing page UI, along with JavaScript code sending Lead data to the Handle Landing Page Lead Flow.

Prerequisites:

  1. Access to a Microsoft Dynamics 365 online instance and a Flow environment residing in the same tenant

Walkthrough:

  1. Create a ‘Handle Landing Page Lead’ Flow 

    Start a Flow from scratch and create a Request trigger. Once saved, the URL value will be filled in automatically.

    Next, add a Parse JSON action and paste in the following schema

    {
        "type": "object",
        "properties": {
            "topic": {
                "type": "string"
            },
            "firstName": {
                "type": "string"
            },
            "lastName": {
                "type": "string"
            },
            "email": {
                "type": "string"
            },
            "mobilePhone": {
                "type": "string"
            }
        }
    }
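    For reference, a request body matching this schema might look like the following (the field values are illustrative only):

```javascript
// Sample request body matching the Parse JSON schema above
// (all values are illustrative).
const leadPayload = {
    topic: "Product inquiry",
    firstName: "Jane",
    lastName: "Doe",
    email: "jane.doe@example.com",
    mobilePhone: "+1-555-0100"
};

// This is the JSON string the landing page will POST to the Flow.
console.log(JSON.stringify(leadPayload, null, 2));
```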

    Next, add a Dynamics Create Record action. Select your target Dynamics 365 instance and map to the Lead entity.
    Map the previous Parse JSON action output values to Lead attributes.
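    Conceptually, this mapping pairs each landing-page field with a Lead attribute logical name (subject, firstname, lastname, emailaddress1, and mobilephone are the standard Lead attributes; the mapping itself is a sketch of what you configure in the Create Record action, not code you need to write):

```javascript
// Sketch of the field-to-attribute mapping configured in the
// Create Record action: landing-page field names (left) mapped
// to Lead entity attribute logical names (right).
function toLeadAttributes(parsed) {
    return {
        subject: parsed.topic,
        firstname: parsed.firstName,
        lastname: parsed.lastName,
        emailaddress1: parsed.email,
        mobilephone: parsed.mobilePhone
    };
}

const lead = toLeadAttributes({
    topic: "Product inquiry",
    firstName: "Jane",
    lastName: "Doe",
    email: "jane.doe@example.com",
    mobilePhone: "+1-555-0100"
});
console.log(lead.subject);  // "Product inquiry"
console.log(lead.lastname); // "Doe"
```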

    Finally, add a parallel branch element. Add two HTTP Response actions, one handling Lead creation success and the other failure.
    Both return code 200, each with its respective message in the response body.
    Click each Response action’s ellipsis and set the ‘Configure run after’ option to handle the
    previous step’s success and failure respectively.
    Finally, save the Flow.
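    Note that because both Response actions return status 200, the landing page cannot use the HTTP status code to distinguish success from failure; it has to inspect the response body text instead. A minimal sketch (the "Lead created" marker is an assumption; match it to whatever message you configured in the success Response action):

```javascript
// Both Flow Response actions return HTTP 200, so the caller must
// inspect the response body to tell success from failure.
// "Lead created" is an assumed success marker - use the message
// you configured in the success Response action.
function isLeadCreated(responseText) {
    return responseText.indexOf("Lead created") !== -1;
}

console.log(isLeadCreated("Lead created successfully")); // true
console.log(isLeadCreated("Lead creation failed"));      // false
```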

  2. Create HTML Landing page

    Create the following HTML page. Replace the flowRequestURL variable value with the Flow Request trigger URL generated in the first step.

    <html>
    <head>
        <meta charset="utf-8" />
        <title>Implementing Lead Landing Page with Flow</title>
        <script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
    </head>
    <body style="font-family:'Segoe UI';background-color:lightskyblue">
            <b>Contact us</b>
            <table>
                <tr>
                    <td><label>Topic</label></td>
                    <td><input type="text" id="topic" name="topic" placeholder="insert topic"></td>
                </tr>
                <tr>
                    <td><label>First Name</label></td>
                    <td><input type="text" id="firstName" name="firstName" placeholder="insert first name"></td>
                </tr>
                <tr>
                    <td><label>Last Name</label></td>
                    <td><input type="text" id="lastName" name="lastName" placeholder="insert last name"></td>
                </tr>
                <tr>
                    <td><label>Email Address</label></td>
                    <td><input type="text" id="email" name="email" placeholder="insert email address"></td>
                </tr>
                <tr>
                    <td><label>Mobile Phone</label></td>
                    <td><input type="text" id="mobilePhone" name="mobilePhone" placeholder="insert mobile phone number"></td>
                </tr>
                <tr>
                    <td colspan="2">
                        <input value="Send" type="button" onclick="sendRequest()">
                    </td>
                </tr>
            </table>    
        <br />
        <div id="messageArea"></div>
    
        <script type="text/javascript">
    
            // Send new Lead request to Microsoft Flow
            sendRequest = function () {
    
                //set Flow request URL - available once Flow Request step is saved 
                var flowRequestURL = "https://FLOWREQUESTURL";
    
                //set Lead object
                var leadData = {
                    topic: $("#topic").val(),
                    firstName: $("#firstName").val(),
                    lastName: $("#lastName").val(),
                    email: $("#email").val(),
                    mobilePhone: $("#mobilePhone").val()                
                };
    
                //prepare request
                var req = new XMLHttpRequest();
                req.open("POST", flowRequestURL, true);
                req.setRequestHeader("Accept", "application/json");
                req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
                req.onreadystatechange = function () {
                    if (this.readyState == 4) {
                        req.onreadystatechange = null;
                        //handle success
                        if (this.status == 200) {
                            $("#messageArea").text(this.response);
                        }
                        //handle failure
                        else {
                            $("#messageArea").text(this.response);
                        }
                    }
                }
                //send request
                req.send(window.JSON.stringify(leadData));
            }
        </script>
    </body>
    </html>
    

Test your landing page by submitting a new Lead. If all works as expected, it will be instantly created in your Dynamics 365 instance.