Designing Accountable Integration

Reviewing various Dynamics 365 integration mechanisms for over a decade now (yes, Dynamics CRM back then) has taught me some valuable lessons:

  • Integration mechanisms break down silently the second you look away. Usually, metadata or data changed without notification, an endpoint moved, or some port was closed due to security considerations.
  • Sooner or later, after an integration mechanism breaks, someone will claim the problem is on your side of the integration.
  • An integration mechanism's performance may degrade over time. Sometimes the volume of data you started with on day one has grown rapidly, so queries and business logic operations now consume more time and resources. Other times, an external endpoint you are using is now responding slowly.
  • The application user used for integration on the Dynamics 365 side almost always has more privileges than it actually requires, and is often granted the System Administrator role.

In this post I would like to share some design tips that can help you build an accountable integration. For me, an accountable integration implements a few key concepts I’ll explain next.
While these design concepts may sound trivial, it is surprising how rarely they are actually implemented.

  1. Failing gracefully
    When an integration mechanism fails, the system administrator must be able to get the full failure details, but end users and the integrating party should not be exposed to them, as the details usually mean nothing to these audiences and they can’t do anything about the failure anyway. For them, it is enough to know a failure occurred, with the minimum required details. Another concern here is security: unhandled exception details can expose sensitive information to attackers.

    That means no exception should go unhandled. Hermetically seal your code with try/catch blocks, and if you do let an exception propagate, make sure it does not expose any unnecessary details.
  2. Diagnostic enabled
    The first element I look for in a failure scenario is diagnostics, as it should point to the cause of the problem and sometimes even allow solving it without recompiling code.
    In many implementations, diagnostic elements are missing or turned off. In others, critical information is missing.
    Any diagnostics element should at least be able to confirm that an integration mechanism actually executed, even if it did not operate on any data or perform any action. Additionally, did it succeed or fail, and if it failed, what is the failure message? What is the time stamp for each execution, and what were the incoming and outgoing parameters?

    If you are using Custom Workflow Activities or Plug-ins, that means using the Trace service. Azure elements like Functions and Logic Apps also allow tracing.
    Make your own convention, but make sure you are tracing the following key events (a sketch combining these concepts appears after this list):

    1. Execution start/end along with time stamps. This will give you an indication of any execution, even if it did nothing.
    2. Incoming and outgoing parameters. This will allow you to diagnose problems faster, as unexpected parameter values are the culprit most of the time.
    3. Any outgoing calls to external APIs. Trace before and after any such call, as well as the outgoing message. This will allow you to measure the time each call actually took and capture the outgoing request and response.
    4. Any exception.
  3. Proactive
    Many implementations have logging/tracing enabled, but no one even knows how to access or analyze it. Sometimes a failure is detected days or weeks after it actually occurred.
    When a failure occurs, the system administrator should know about it ASAP. I don’t expect system administrators to spend time scanning traces and logs searching for anomalies.
    An integration mechanism should be able to send an immediate notification of failure in the form of an email, SMS or push notification.

    If you are using Custom Workflow Activities, you can leverage the hosting process’s built-in emailing capabilities. Set one of the outgoing parameters to indicate success/failure, so you can send an email to the system administrator every time a failure occurs.
    Azure elements can leverage Application Insights Alerts capability to send SMS, email or push notification on failure.
  4. Least Exposed
    What’s wrong with granting a designated application user (used only for integration scenarios) the System Administrator Role? Everything.
    If this omnipotent user’s credentials leak (and they sometimes do), all of your data is exposed and an attacker can do maximum damage.

    When setting up an application user to perform integration operations within Dynamics 365, make sure it has the least privileges that still allow the required operations. It is best to create a custom Security Role.
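
To make the first three concepts concrete, here is a minimal Custom Workflow Activity sketch that traces its execution with time stamps and parameter values, hermetically seals its logic in try/catch, and surfaces a success/failure flag through output parameters. The parameter names and the placeholder business logic are hypothetical, not taken from any specific component:

    using System;
    using System.Activities;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Workflow;

    public class AccountableActivity : CodeActivity
    {
        //hypothetical incoming parameter, traced on entry
        [Input("Target Record Id")]
        public InArgument<string> TargetRecordId { get; set; }

        //success/failure flag a Workflow can branch on to email the administrator
        [Output("Is Success")]
        public OutArgument<bool> IsSuccess { get; set; }

        //minimal, sanitized failure message that is safe to show consumers
        [Output("Failure Message")]
        public OutArgument<string> FailureMessage { get; set; }

        protected override void Execute(CodeActivityContext context)
        {
            ITracingService trace = context.GetExtension<ITracingService>();
            trace.Trace($"Execution started at {DateTime.UtcNow:O}");
            trace.Trace($"Incoming parameter TargetRecordId: {TargetRecordId.Get(context)}");

            try
            {
                //...business logic and external API calls go here,
                //each wrapped with before/after trace calls...

                IsSuccess.Set(context, true);
            }
            catch (Exception ex)
            {
                //full details go to the trace, visible to the administrator only
                trace.Trace($"Exception: {ex}");
                IsSuccess.Set(context, false);
                FailureMessage.Set(context, "Integration failed. Contact your administrator.");
            }
            finally
            {
                trace.Trace($"Execution ended at {DateTime.UtcNow:O}");
            }
        }
    }

A consuming Workflow can then branch on the Is Success output and use the built-in Send Email step to notify the system administrator the moment a failure occurs.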

In the past, I posted about Custom Code Tracing & Exception Logging in Dynamics. That solution embeds most of the accountable integration concepts discussed here and may be a good starting point.

Implementing Dynamics 365 Service Bus Listener using Azure Function

One of the built-in triggers for Azure Function Apps is the Service Bus (ASB) queue trigger, which makes an Azure Function a listener for ASB events.
While Azure Service Bus provides a robust decoupling mechanism, the receiving side has to poll for messages or implement a listener.

This trigger makes Azure Functions an ideal implementation component for Inbound Asynchronous Integration with Microsoft Dynamics 365,
one that allows you to write custom code, but releases you from explicitly writing a listener.
An Inbound Asynchronous Integration pattern describes scenarios where a business event occurs in an external application and must be reflected into Dynamics 365.
Processing of the event in Dynamics 365 does not occur immediately, and sometimes no response is required from the Dynamics 365 side (one way).

In this post I’ll walk through the process of setting up an Azure Function as an ASB one-way listener which posts incoming messages into Dynamics 365.
To demonstrate posting a message to ASB I’ll use Postman.

Business Scenario:

Whenever a new Account record is created in Contoso’s ERP application, reflect the new Account data into Dynamics 365 CE.

Prerequisites:

  1. Azure subscription allowing you to register apps, provision Azure Service Bus and Azure Function App
  2. Microsoft Dynamics 365 9.x instance with admin privileges, a trial will do

Walkthrough:

  1. Set up an Azure Service Bus queue and make a note of the connection string
  2. Register an AAD App and an application user to allow using the Microsoft Dynamics 365 Web API. Note the Application Id and Secret
  3. Set up an Azure Function App with a Service Bus queue trigger
     
    1. In Azure Portal, click ‘Create a resource’
    2. Select Function App
    3. Search for Function App (classic) and create a new one
    4. Set Function details and click ‘Create’
    5. Select the new Function and click Function app settings
    6. Select 1 as Runtime version
    7. Select the Functions node and click New function
    8. Select Azure Service Bus Queue trigger with C#
    9. Set trigger details. Under Service Bus connection, click ‘new’ and select the details for the ASB queue you set up in step 1

    10. Once back in the Function code, replace the existing code with the code found here.
      This code parses the incoming message, authenticates to Dynamics 365 and creates a new Account record (a sketch of this logic appears right after this walkthrough).
      Replace the ApplicationId, Secret and WebAPIURL values to match the details you noted in step 2 and your organization’s Web API URL.
      Save the Function and make sure it compiles.

  4. Test by posting a message with the required format to the ASB queue.
    You can post messages to ASB using Postman and this useful collection.
    Make sure the Azure Function processes the message and that the record was created in Dynamics 365.
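
For reference, here is a minimal sketch of what such a function can look like, assuming the v1 C# script runtime and an incoming JSON message such as {"name":"Contoso Ltd."}. This is not the exact linked code; the constants, tenant details and message format are placeholders you must adapt to the values noted in steps 1 and 2:

    #r "Newtonsoft.Json"

    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;
    using Newtonsoft.Json.Linq;

    //placeholders - replace with the values noted earlier
    private const string ApplicationId = "<application-id>";
    private const string Secret = "<client-secret>";
    private const string TenantId = "<tenant-id>";
    private const string OrganizationUrl = "https://yourorg.crm.dynamics.com";
    private const string WebAPIURL = OrganizationUrl + "/api/data/v9.0/";

    private static readonly HttpClient client = new HttpClient();

    public static async Task Run(string myQueueItem, TraceWriter log)
    {
        //trace the incoming ASB message
        log.Info($"Incoming message: {myQueueItem}");
        JObject account = JObject.Parse(myQueueItem);

        //acquire an access token using AAD client credentials
        var tokenRequest = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"] = "client_credentials",
            ["client_id"] = ApplicationId,
            ["client_secret"] = Secret,
            ["resource"] = OrganizationUrl
        });
        var tokenResponse = await client.PostAsync(
            $"https://login.microsoftonline.com/{TenantId}/oauth2/token", tokenRequest);
        string accessToken = (string)JObject.Parse(
            await tokenResponse.Content.ReadAsStringAsync())["access_token"];

        //create the Account record via the Dynamics 365 Web API
        var request = new HttpRequestMessage(HttpMethod.Post, WebAPIURL + "accounts")
        {
            Content = new StringContent(account.ToString(), Encoding.UTF8, "application/json")
        };
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        var response = await client.SendAsync(request);
        log.Info($"Create Account response: {response.StatusCode}");
    }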

Post message to Azure Service Bus via Custom Workflow Activity

While Dynamics 365 Plug-ins allow declarative integration with Azure Service Bus via ServiceEndPoints, integrating through a Custom Workflow Activity requires some coding.

So why would you prefer a Custom Workflow Activity over a Plug-in?
Here are just a few good reasons:

  1. Cost-efficiency: a Custom Workflow Activity component can be embedded in Custom Actions, synchronous/asynchronous Workflows and Dialogs (while they’re still here), while a Plug-in component can only be called and operate via the Plug-in execution context. When you put time into writing code, you want to allow as much re-use as possible.
  2. Declarative: once registered, a Custom Workflow Activity can be leveraged by non-developers in various Processes, while a Plug-in is usually accessible only to developers.
  3. Ubiquity: a Custom Action wrapping a Custom Workflow Activity can be accessed from other processes, client-side code, server-side code and consumers external to Dynamics 365.
  4. Flexibility: unlike the built-in declarative integration, addressing Azure Service Bus via code allows you to add conditional logic and also reduce the message payload (see the component code).

Convinced or not, the following walkthrough demonstrates using the Post Message To Azure Service Bus Custom Workflow Activity component to post a message to an ASB queue.
You can view the source code here and modify as required.

Prerequisites:

  1. Azure subscription allowing you to provision Azure Service Bus
  2. Microsoft Dynamics 365 9.x instance with admin privileges, a trial will do
  3. Access to Plugin Registration Tool

Walkthrough:

  1. Set up Azure Service Bus and make a note of the connection string
  2. Use the Plugin Registration Tool to register a ServiceEndPoint with the noted connection string
  3. Download, import and publish the unmanaged solution available here
  4. Configure the Workflow contained in the solution by selecting the registered ServiceEndPoint, stating a semicolon-delimited properties list and an optional custom message. Activate the Workflow Rule.


  5. Test by manually activating the Workflow on a Contact record.
    If all goes well, a new message will appear in the ASB queue. You can view the queue dashboard in the Azure Portal.

    If you want to view the actual message content, you can use Azure Service Bus Explorer.


A few implementation notes:

  1. While the IWorkflowContext allows accessing the target record properties via Pre/Post entity images, this technique is not documented and is therefore considered unsupported.
    For this reason, the code explicitly retrieves the required attribute values and pushes them into the SharedVariables collection.
    The propertiesSet input parameter allows sending in a list of required properties.
  2. Clearing the IWorkflowContext data-heavy collections before posting it to Azure Service Bus can reduce the message payload weight by ~80%, which can simplify handling the message and improve performance. A sketch illustrating both notes follows.
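
For illustration, here is a minimal sketch of an Execute method reflecting both notes. The input parameter names (Properties Set, Service Endpoint) follow the walkthrough's configuration, but the exact signatures may differ from the linked source code:

    using System.Activities;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Query;
    using Microsoft.Xrm.Sdk.Workflow;

    public class PostMessageToServiceBus : CodeActivity
    {
        //semicolon-delimited list of attributes to forward, e.g. "firstname;lastname"
        [Input("Properties Set")]
        public InArgument<string> PropertiesSet { get; set; }

        //the ServiceEndPoint registered via the Plugin Registration Tool
        [Input("Service Endpoint")]
        [ReferenceTarget("serviceendpoint")]
        public InArgument<EntityReference> ServiceEndpointReference { get; set; }

        protected override void Execute(CodeActivityContext executionContext)
        {
            IWorkflowContext context = executionContext.GetExtension<IWorkflowContext>();
            IOrganizationServiceFactory factory =
                executionContext.GetExtension<IOrganizationServiceFactory>();
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            //explicitly retrieve the required attributes (the supported alternative
            //to reading the undocumented Pre/Post entity images)
            string[] properties = PropertiesSet.Get(executionContext).Split(';');
            Entity target = service.Retrieve(
                context.PrimaryEntityName, context.PrimaryEntityId, new ColumnSet(properties));

            //push the attribute values into SharedVariables for the receiving side
            foreach (var attribute in target.Attributes)
                context.SharedVariables[attribute.Key] = attribute.Value;

            //clear the data-heavy collections to shrink the serialized message
            context.InputParameters.Clear();
            context.OutputParameters.Clear();
            context.PreEntityImages.Clear();
            context.PostEntityImages.Clear();

            //post the trimmed context to the Azure Service Bus endpoint
            executionContext.GetExtension<IServiceEndpointNotificationService>()
                .Execute(ServiceEndpointReference.Get(executionContext), context);
        }
    }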





Dynamics 365 Power Tel Aviv – Azure Integration Patterns

I had the privilege of delivering a lecture about Dynamics 365 Azure Integration Patterns at the DynamicsPower! Tel Aviv event last week.

This was the first 365 Saturday event in Israel, gathering Power Platform MVPs and experts from around the world to share their insights with the community.

I would like to thank Victor Dantas & Liran Debbi for their efforts in organizing this awesome event.

Click here to download my presentation.

Dynamics 365 Power Israel team

Walkthrough: Execute Azure Function from Microsoft Dynamics Entity Form

Azure Functions is a fantastic mechanism for various integration scenarios. Here are a few key characteristics:

  • Being a serverless application, Azure Function has the best time to market when it comes to deploying a web service
  • Pay-per-use pricing model means you pay only for what you use
  • Built-in integration options in PowerApps and Flow allow you to give non-developers new building blocks when designing applications and processes
  • CORS (Cross-Origin Resource Sharing) support allows consuming Functions from server/client side in any domain you find suitable

What can you do with Azure Functions in the context of Microsoft Dynamics integration scenarios? Just about anything:

  • Export/Import data to/from external application/data storage
  • Notify an external application on a business event
  • Get notification from external source
  • Handle complex or lengthy computation process (considering the Plug-in/Custom Workflow Activity execution timeout limitation)
  • Allow a 3rd party to interact with your Dynamics 365 organization without directly exposing a Dynamics endpoint and access credentials

So why would you want to execute an Azure Function from an Entity Form?

  • To provide responsive and fluent UX
  • To avoid writing Plug-in/Custom Workflow Activity code, which is not easy to debug
  • To respond to form level events such as field value change as it occurs

In this post, I’ll walk through executing an Azure Function from a Microsoft Dynamics 365 Entity form. This walkthrough was set up with Microsoft Dynamics 365 v9.0 but can be easily adapted to v8.2.
My sample scenario demonstrates sending new Lead data to Azure data storage using an Azure Function. Of course, you can use this walkthrough with an On-premises deployment, but you will have to allow access to Azure.
Thank you Nishant Rana for an enlightening post that helped set up this walkthrough.

 

  1. Set up an Azure Function App

    The Function App is a container that will hold your Functions and allow you to manage these components.
    In your Azure Portal (open a trial if you don’t have a subscription), type ‘Function App’ in the search box and select the Function App item in the Marketplace.

    Select Function App

    In the next dialog, type in the Function App name (which will be part of the Function URL) and fill in all other required fields.
    Create a new Resource Group if you need one, as well as Storage. Pin your Function App to the dashboard for easy access.

    Define Function App

    After a few seconds you will be navigated to the dashboard. Wait while your Function App is set up, and then you will be navigated to the Function App design area.

    Function App design area

    Click the Functions node and then click the ‘+ New function’ button to add a new Function.

    Select the Generic webhook option coupled with the C# language.

    In the next dialog, give your function a meaningful name and click ‘Create’.

    Since we want to handle new Lead data by queuing it, click the Integrate node to add a new Function output and select the Azure Queue Storage.

    In the next dialog, note the Message parameter name as it will be part of the function code. Click ‘Save’.

    Click the Function node to access the Function code.

    Replace the existing code with the following code block and click ‘Save’

    #r "Newtonsoft.Json"
    
    using System.Net;
    using Newtonsoft.Json;
    
    public class Lead
    {
        public string Topic { get; set;}
        public string FullName { get; set;}
        public string Email { get; set;}
    }
    
    //function entry point 
    public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log, IAsyncCollector<Lead> outputQueueItem)
    {
        //trace incoming request 
        log.Info($"New HandleNewLead request received");
    
        //parse request into Lead object
        string jsonContent = await req.Content.ReadAsStringAsync();
        var lead = JsonConvert.DeserializeObject<Lead>(jsonContent);
    
        //trace Lead data
        log.Info($"Lead data: topic: {lead.Topic}, full name: {lead.FullName}, email: {lead.Email}");
    
        // add lead object to queue
        await outputQueueItem.AddAsync(lead);
    
        //return response to the caller 
        return req.CreateResponse(HttpStatusCode.OK, new { message = "Lead processed successfully" });
    }

    Opening the Logs pane below, you can see a successful compilation message.

    To test your Function, open the right pane and click the Test tab. Feed in some test JSON data and click Run (a sample payload appears at the end of this post). If all went well, you will receive a success message as well as a Function response.

    Clicking the Monitor node, you can see the queued test Lead data.

    Before leaving the Function area, click Get function URL and copy it. You will be using it later in your Entity form JavaScript code.

  2. Configure Cross-Origin Resource Sharing

    In order to consume your Function from a Microsoft Dynamics 365 organization which may reside in a different domain, you’ll need to define that organization as an allowed origin under the CORS element:

    Add your Microsoft Dynamics 365 organization base URL and click ‘Save’.

  3. Set up entity form code and event

    Head into your Microsoft Dynamics 365 organization and define a new JavaScript Web Resource named dyn_AzureServicesLib.js with the following code.
    Replace the AZURE_BASE_ENDPOINT and AZURE_FUNCTION_ENDPOINT constants with the Function URL you copied earlier. Note the separation of the base URL part from the function and code part.

    (function (ns) {
    
        //constants 
        Constants = function () {
            this.CALLING_MODULE_NAME = "dyn_AzureServicesLib.js";
            this.AZURE_BASE_ENDPOINT = "https://dyneventhandlersample.azurewebsites.net/api/";
            this.AZURE_FUNCTION_ENDPOINT = "HandleNewLead?code=xxx";
            this.FORM_TYPE_CREATE = 1;
            this.MSG_OPERATION_SUCCESS = "Lead successfully exported :)";
            this.MSG_OPERATION_FAILURE = "Something went wrong :(";
    
            return this;
        }();
    
        //members
        var formContext = null;
    
        //public methods 
    
        //Export newly created Lead record to external storage/application 
        ns.exportLead = function (executionContext) {
            debugger
    
            //get form context 
            formContext = executionContext.getFormContext();
    
            //get form type
            var formType = formContext.ui.getFormType();
    
            //operate for newly created records only
            if (formType == Constants.FORM_TYPE_CREATE) {
                //extract Lead details; property names must match the Function's Lead class
                var lead = {
                    Topic: formContext.getAttribute("subject").getValue(),
                    FullName: formContext.getAttribute("firstname").getValue() + " " +
                        formContext.getAttribute("lastname").getValue(),
                    Email: formContext.getAttribute("emailaddress1").getValue()
                }
    
                //send Lead details to Azure Function 
                executeAzureFunction(lead, handleExportSuccess, handleExportFailure);
            }
        }
    
        //private methods
    
        //handle operation success
        handleExportSuccess = function (response) {
            formContext.ui.setFormNotification(Constants.MSG_OPERATION_SUCCESS, "INFO", null);
        }
    
        //handle operation failure
        handleExportFailure = function (response) {
            formContext.ui.setFormNotification(Constants.MSG_OPERATION_FAILURE, "ERROR", null);
        }
    
        //execute Azure Function to process Lead details
        executeAzureFunction = function (lead, successHandler, failureHandler) {
            debugger
    
            //set Azure Function endpoint
            var endpoint = Constants.AZURE_BASE_ENDPOINT + Constants.AZURE_FUNCTION_ENDPOINT;
    
            //define request
            var req = new XMLHttpRequest();
            req.open("POST", endpoint, true);
            req.setRequestHeader("Accept", "application/json");
            req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
            req.setRequestHeader("OData-MaxVersion", "4.0");
            req.setRequestHeader("OData-Version", "4.0");
            req.onreadystatechange = function () {
                if (this.readyState == 4) {
                    req.onreadystatechange = null;
    
                    if (this.status == 200) {
                        successHandler(JSON.parse(this.response));
                    }
                    else {
                        failureHandler(JSON.parse(this.response).error);
                    }
                }
            }
            //send request
            req.send(window.JSON.stringify(lead));
        }
    })(window.AzureServicesLib = window.AzureServicesLib || {});
    

    Next, go to the Lead entity form designer and add the dyn_AzureServicesLib.js Web Resource in the Form Properties dialog.
    Bind the form OnSave event to the AzureServicesLib.exportLead function. Make sure you check the ‘Pass execution context…’ option.
    Save and Publish.


  4. Test

    Create a new Lead record and save it. If all went well, you will see a form level success notification.

    Going back to your Function monitoring area, you should see your new Lead data queued successfully.
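
For reference, here is a sample of the JSON payload the form script sends, which you can also feed into the Function’s Test tab. The property names match the Function’s Lead class; the values are arbitrary:

    {
        "Topic": "Interested in Azure integration",
        "FullName": "Jane Doe",
        "Email": "jane.doe@contoso.com"
    }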