Designing Accountable Integration

Reviewing various Dynamics 365 integration mechanisms for over a decade now (yes, Dynamics CRM back then) has taught me some valuable lessons:

  • Integration mechanisms break down silently the second you look away. Usually, metadata or data changed without notice, an endpoint moved, or a port was closed due to security considerations.
  • Sooner or later, after an integration mechanism breaks, someone will claim the problem is on your side of the integration.
  • An integration mechanism’s performance may degrade over time. Sometimes the data volume you started with on day one has grown rapidly, so queries and business logic operations now consume more time and resources. Other times, an external endpoint you rely on starts responding slowly.
  • The application user used for integration on the Dynamics 365 side almost always has more privileges than it actually requires, often being granted the System Administrator role.

In this post I would like to share some design tips that can help you build an accountable integration. For me, an accountable integration implements a few key concepts I’ll explain next.
While these design concepts may sound trivial, it is surprising how rarely they are actually implemented.

  1. Failing gracefully
    When an integration mechanism fails, the system administrator must be able to get the full failure details, but end users and the integrating party should not be exposed to them, as the details usually mean nothing to them and they can’t do anything about them anyway. For them, it is enough to know that a failure occurred, with the minimum required details. Another concern here is security: unhandled exception details can expose sensitive information to hackers and the like.

    That means no exception should go unhandled. Hermetically seal your code with try/catch blocks, and if you do let an exception bubble up, make sure it does not expose any redundant details (a minimal sketch appears at the end of this section).
  2. Diagnostic enabled
    The first element I look for in a failure scenario is diagnostics, as it should point to the problem’s cause and sometimes even allow solving it without recompiling code.
    In many implementations, diagnostic elements are missing or turned off. In others, they lack critical information.
    Any diagnostics element should at least be able to confirm that an integration mechanism actually executed, even if it did not operate on any data or perform any action. Additionally: did it succeed or fail, and if it failed, what is the failure message? What is the timestamp for each execution, and what were the incoming and outgoing parameters?

    If you are using Custom Workflow Activities or Plug-ins, that means using the Tracing service. Azure elements like Functions and Logic Apps also allow tracing.
    Make your own convention, but make sure you are tracing the following key events (the sketch at the end of this section illustrates them):

    1. Execution start/end along with time stamps. This will give you an indication of any execution, even if it did nothing.
    2. Incoming and outgoing parameters. This will allow you to diagnose problems faster, as unexpected parameter values are the culprit most of the time.
    3. Any outgoing calls to external APIs. Trace before and after any such call, as well as the outgoing message. This will allow you to measure the time each call actually took and capture the outgoing request and response.
    4. Any exception.
  3. Proactive
    Many implementations have logging/tracing enabled, but no one knows how to access or analyze it. Sometimes a failure is detected days or weeks after it actually occurred.
    When a failure occurs, the system administrator should know about it ASAP. I don’t expect system administrators to spend time scanning traces and logs searching for anomalies.
    An integration mechanism should be able to send an immediate notification of failure in the form of an email, SMS or push notification.

    If you are using Custom Workflow Activities, you can leverage the hosting process’s built-in emailing capabilities. Set one of the outgoing parameters to indicate success/failure, so an email can be sent to the system administrator every time a failure occurs.
    Azure elements can leverage the Application Insights Alerts capability to send an SMS, email or push notification on failure.
  4. Least Exposed
    What’s wrong with granting a designated application user (used only for integration scenarios) the System Administrator role? Everything.
    If this omnipotent user’s credentials leak (and they sometimes do), all of your data is exposed and an attacker can do maximum damage.

    When setting up an application user to perform integration operations within Dynamics 365, make sure it has the least privileges possible to allow the required operations. It is best to create a custom Security Role.

In the past, I posted about Custom Code Tracing & Exception Logging in Dynamics. This actual solution embeds most of the accountable integration concepts discussed here and may be a good starting point.
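
To make the first two concepts concrete, here is a minimal sketch of a Custom Workflow Activity that traces its execution boundaries and fails gracefully; the activity name, the external call placeholder and the Succeeded output parameter (which a hosting Workflow could use for the notification idea in concept 3) are hypothetical examples, not part of that solution.

using System;
using System.Activities;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Workflow;

public class SyncAccountsActivity : CodeActivity
{
    //outgoing parameter the hosting Workflow can check to notify the administrator (concept 3)
    [Output("Succeeded")]
    public OutArgument<bool> Succeeded { get; set; }

    protected override void Execute(CodeActivityContext context)
    {
        ITracingService trace = context.GetExtension<ITracingService>();
        trace.Trace("SyncAccountsActivity started at {0} (UTC)", DateTime.UtcNow);

        try
        {
            //trace outgoing call boundaries to measure duration and capture the outgoing request
            trace.Trace("Calling external API at {0} (UTC)", DateTime.UtcNow);
            //... the actual integration work goes here ...
            trace.Trace("External API call returned at {0} (UTC)", DateTime.UtcNow);

            Succeeded.Set(context, true);
        }
        catch (Exception ex)
        {
            //full failure details go to the trace, for the system administrator's eyes only
            trace.Trace("SyncAccountsActivity failed: {0}", ex.ToString());

            //end users and the integrating party only learn that the step failed
            Succeeded.Set(context, false);
        }

        trace.Trace("SyncAccountsActivity ended at {0} (UTC)", DateTime.UtcNow);
    }
}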

Dynamics 365 Power Tel Aviv – Azure Integration Patterns

I had the privilege to deliver a lecture about Dynamics 365 Azure Integration Patterns in the DynamicsPower! Tel Aviv event last week.

This is the first 365 Saturday event in Israel, gathering Power Platform MVPs and experts from around the world to share their insights with the community.

I would like to thank Victor Dantas & Liran Debbi for their efforts in organizing this awesome event.

Click here to download my presentation.

Dynamics 365 Power Israel team

Integrating Dynamics 365 with Azure Function using Managed Identity

I love Azure Functions. These days, I have to restrain the tendency to solve every problem with this Swiss army knife. 

When designing Azure Function integration with Dynamics 365, one of the immediate questions raised is where to store connection strings, credentials and other sensitive details. 
The immediate answer is ‘Not in your code’. So where?

One option is to store sensitive details in the Application Settings store which is ‘encrypted at rest and transmitted over an encrypted channel.’
While this option is quite easy to use, it isn’t considered the most secure.

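For completeness, reading such a value from C# Function code is a one-liner, since Application Settings surface as environment variables at runtime; the setting name below is a hypothetical example:

//hypothetical setting name, defined under the Function App's Application Settings
string dyn365Url = Environment.GetEnvironmentVariable("Dyn365WebApiUrl", EnvironmentVariableTarget.Process);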

Another option is using Managed Identity with Azure Key Vault service.
With this option, the Azure Function App is assigned an AAD identity, which is then authorized to access specific Key Vault secrets.
This option is considered more secure, as the Function App is granted access only to specific sensitive data, while data stored in Application Settings is generally exposed.

In this post, I’ll walk through the process of setting up and using Managed Identity to integrate an Azure Function with Dynamics 365. In this case, the Dynamics 365 access details will be stored in Azure Key Vault.

For the walkthrough, I’ll use the Lead Landing Page scenario, replacing the Flow component with an Azure Function. Although a bit verbose, the process is quite simple.

Implementation Notes

  • Like most Azure services, using Azure Key Vault costs money: every retrieval of a secret is paid for. In order to reduce costs, some caching mechanism (which will not be discussed in this post) is in order.
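
Purely as a rough illustration of the kind of caching meant here (not part of the walkthrough below), a static in-memory cache around secret retrieval could look like this; the method and cache names are hypothetical:

using System.Collections.Concurrent;
using System.Threading.Tasks;
using Microsoft.Azure.KeyVault;

//illustration only: keep retrieved secrets in memory so repeated executions do not pay for repeated Key Vault reads
static readonly ConcurrentDictionary<string, string> secretCache = new ConcurrentDictionary<string, string>();

static async Task<string> GetSecretCachedAsync(KeyVaultClient keyVaultClient, string secretIdentifier)
{
    if (secretCache.TryGetValue(secretIdentifier, out string cachedValue))
        return cachedValue;

    var secret = await keyVaultClient.GetSecretAsync(secretIdentifier);
    secretCache[secretIdentifier] = secret.Value;
    return secret.Value;
}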

Prerequisites

  1. Have an accessible Microsoft Dynamics 365 instance
  2. Have access to an Azure environment with sufficient privileges to create an Azure Function and a Key Vault store.

Walkthrough

  1. Register App with AAD

    Register a new Web App/API in Azure AD with a Client Secret and copy the Application Id and secret key to Notepad.

  2. Add an App User to Dynamics 365

    Follow this article: Add a new Application User in Dynamics 365

  3. Create a Function App

    Set Function App details and click ‘Create’

    Add a new Function

    Select the HTTP trigger option

    Click ‘Create’ to complete Function creation

    Leave the function code as is for now, we will alter it later on.

  4. Assign Function App with a Managed Identity

    Go to the Function App Platform Features tab and click ‘Identity’

    Click the ‘On’ button to assign a System identity and click ‘Save’

    Click ‘Yes’ to enable system assigned managed identity

    You can see the newly assigned identity object ID

  5. Setup Azure Key Vault store

    Create a new Azure Key Vault store

    Click ‘Create’ at the bottom of the screen

    Click ‘Add new’ under the Access policies blade.
    In the Add access policy blade, check Get and List in the Secret Management Operations group.
    Under the Principal blade, find and select your Function App. Click ‘Select’.
    This will grant our Function identity access to the Key Vault secrets.

    Next, select a Resource Group for the Key Vault store. Click ‘Create’ to complete the Azure Key Vault store creation

  6. Store secrets in Azure Key Vault

    Find the newly created Azure Key Vault store or access it from the dashboard if possible.

    Access the Secrets area

    Click Secrets and ‘Generate/import’ to generate a new secret

    Set the secret Name (select a self-explanatory name, since once created, you won’t be able to see the actual secret value in this area).
    Set the secret string in the Value field. Click ‘Create’.
    In this case, the secret I defined is the Dynamics Web API URL, similar to https://<ORGNAME>.api.crm<DATACENTERCODE>.dynamics.com/api/data/v9.1/

    In the same manner, add additional secrets to hold the applicationId and secret keys you copied after registering an app in AAD (step 1 in this walkthrough).

    Click each of the newly created secrets and copy the Secret Identifier, which will be used in the code to access the secret value

  7. Update Azure Function Code 

    Go back to the Function created on step 3 above.
    Click View Files, add a new project.json file and paste in the following definition. Click ‘Save’.

    {
        "frameworks": {
            "net46": {
                "dependencies": {
                    "Microsoft.Azure.KeyVault": "2.4.0-preview",
                    "Microsoft.Azure.Services.AppAuthentication": "1.1.0-preview"
                }
            }
        }
    }

    Go back to the function code and replace the existing code with the code found here (placed in GitHub for convenience).

    This code, triggered by an HTTP request from the Lead landing page, performs the following tasks:
    – Receives and parses the Lead data
    – Extracts Dynamics access details from Azure Key Vault
    – Uses the access details to generate an access token
    – Creates a new Dynamics Lead record using the Web API
    – Returns the operation result to the caller

    In the GetDyn365AccessDetails method, replace the URLs for the three keys
    dyn365WebAPIURL, dyn365AppIdKVURL and dyn365secretKVURL with the Secret Identifier URLs copied in step 6.
    Click ‘Save’.
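
    For reference, the essence of that retrieval logic, based on the two packages referenced in project.json above, looks roughly like the sketch below; the Secret Identifier in the usage comment is a placeholder for the values copied in step 6:

    using System.Threading.Tasks;
    using Microsoft.Azure.KeyVault;
    using Microsoft.Azure.Services.AppAuthentication;

    //sketch: read a secret using the Function App's managed identity (no credentials in code or settings)
    static async Task<string> GetSecretAsync(string secretIdentifier)
    {
        var tokenProvider = new AzureServiceTokenProvider();
        var keyVaultClient = new KeyVaultClient(
            new KeyVaultClient.AuthenticationCallback(tokenProvider.KeyVaultTokenCallback));

        var secret = await keyVaultClient.GetSecretAsync(secretIdentifier);
        return secret.Value;
    }

    //usage, with a placeholder Secret Identifier copied in step 6:
    //string webApiUrl = await GetSecretAsync("https://<VAULTNAME>.vault.azure.net/secrets/<SECRETNAME>");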

    Click ‘Get Function URL’ and paste it somewhere, as it will be used next

  8. Hookup Lead Landing Page to Azure Function

    Last, create a new HTML page and copy in the HTML found here (placed in GitHub for convenience).
    Find the AzureFunctionRequestURL variable and replace its value with the Azure Function URL copied in the previous step. Save.

  9. Test

    To test the solution, run the Lead Landing HTML page. Submitting Lead data should result in a new Lead record in Dynamics 365.

    If the flow fails, add the address from which the HTML page is executed to the Azure Function CORS collection to authorize it.

Referencing Dynamics Assemblies with Azure Function Apps v.2

Just stumbled upon a new Azure environment where Azure Function Apps have been upgraded to version 2.
Right away, I noticed that my Azure Function code referencing Dynamics assemblies does not compile, complaining about:

The type or namespace name 'Xrm' does not exist in the namespace 'Microsoft' (are you missing an assembly reference?)

After digging around, I found out that project.json is no longer valid with v.2.
Instead, the function.proj file must be created and reference Dynamics assemblies in the following manner:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net461</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.CrmSdk.CoreAssemblies" Version="9.0.0.7"/>
    <PackageReference Include="Microsoft.CrmSdk.XrmTooling.CoreAssembly" Version="9.0.0.7"/>
  </ItemGroup>
</Project>

Microsoft Graph API – Assign DYN365 License to AAD User

The automated process of user provisioning is becoming common in many projects. Often, the process includes components outside of Dynamics 365, such as creating a user in Azure Active Directory, assigning plans and licenses and adding the user to AAD groups. All of these can be automated using the Microsoft Graph API.

In this post, I’ll demonstrate assigning a Microsoft Dynamics 365 license to an existing user with Postman. You can later convert the Postman requests to JS or C# code, or use them in Flow and Logic Apps.

Prerequisites

  1. Have Postman application installed
  2. Have access to Office 365 and Azure environments

Walkthrough

  1. Register a new Web App/API in Azure AD with a Client Secret and copy the Application Id, callback URL and secret to Notepad.
    Make sure your app is granted the ‘Read directory data’ privilege

  2. Set a request to retrieve available SKUs

    In Postman, create a new GET request to the following address, after replacing <YOUR_DOMAIN_NAME> with your actual domain name (e.g. bikinibottom.onmicrosoft.com)

    https://graph.microsoft.com/v1.0/<YOUR_DOMAIN_NAME>.onmicrosoft.com/subscribedSkus
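
    If you later convert this request to C# code (as mentioned in the intro), a sketch might look like the following; it mirrors the URL above and assumes an access token has already been acquired, for example the one obtained in the next step:

    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;

    //sketch: retrieve the tenant's subscribed SKUs; domainName and accessToken are placeholders
    static async Task<string> GetSubscribedSkusAsync(string domainName, string accessToken)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

            HttpResponseMessage response = await client.GetAsync(
                "https://graph.microsoft.com/v1.0/" + domainName + "/subscribedSkus");
            response.EnsureSuccessStatusCode();

            //the JSON response contains the skuId values used in the following steps
            return await response.Content.ReadAsStringAsync();
        }
    }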

  3. Get an Authorization Token

    Before actually accessing the Graph API, you’ll need an access token to authenticate your requests.
    Click the Authorization tab and set the type to OAuth 2.0.
    Click ‘Get New Access Token’ and, in the dialog that opens, select the Authorization Code option for the Grant Type attribute.
    Fill in the Client ID, Client Secret and Callback URL details copied previously to Notepad.

    Click the ‘Request Token’ button to receive a dialog containing an Access Token. Scroll down and click the ‘Use Token’ button

  4. Retrieve Subscriptions

    Click ‘Send’ to execute the request, which will retrieve a list of the commercial subscriptions your organization has acquired.

    If you have Dynamics 365 licenses, the following node will be included in the response.
    Copy the skuId value, which will be used in the next step.

  5. Assign License to User

    Create an additional POST request to assign a license to the user, using the following URL. Replace the target user principal name (e.g. sandycheeks@bikinibottom.onmicrosoft.com)

    https://graph.microsoft.com/v1.0/users/<USER_PRINCIPAL_NAME>/assignLicense

    In the Authorization tab, set the same settings as the first request. As you already have a token, there is no need to get a new one.
    Add a header of Content-Type = application/json

    Set the request body with the previously copied skuId
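
    The assignLicense body pairs addLicenses (carrying the copied skuId) with removeLicenses. Converted to a C# sketch, the request might look like this; the user principal name, skuId and access token values are placeholders:

    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    //sketch: assign a license to a user via the Graph assignLicense action
    static async Task<string> AssignLicenseAsync(string userPrincipalName, string skuId, string accessToken)
    {
        //body format: licenses to add (by skuId) and licenses to remove
        string body =
            "{ \"addLicenses\": [ { \"disabledPlans\": [], \"skuId\": \"" + skuId + "\" } ], \"removeLicenses\": [] }";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

            HttpResponseMessage response = await client.PostAsync(
                "https://graph.microsoft.com/v1.0/users/" + userPrincipalName + "/assignLicense",
                new StringContent(body, Encoding.UTF8, "application/json"));

            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }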

    Click ‘Send’ to send the request and assign the Dynamics 365 license to the target user.
    A correct response looks like this

Execute a Recurring Job in Microsoft Dynamics 365 with Azure Scheduler

The requirement for recurring job execution is quite common in Microsoft Dynamics implementations. Here are some of the business requirements I have encountered:

  • Send monthly newsletter to target customers
  • Synchronize MSCRM Users details with Active Directory once a day
  • Once a month, disqualify all leads that have no open activities
  • Once every hour, export Appointments from mail server and import into Dynamics 365

Microsoft Dynamics 365 has no reliable built-in scheduling mechanism that can be leveraged for custom solutions. The Asynchronous Batch Process Pattern I have written about in the past can be used for daily recurring jobs, but when it comes to hourly resolution or less, it becomes unsteady.

Azure Scheduler is a reliable service that can run jobs in or out of Azure on a predefined schedule, multiple times or just once. So why not harness this mechanism to schedule Microsoft Dynamics 365 recurring jobs?

In this post, I’ll demonstrate how to use Azure Scheduler to execute a recurring job in Microsoft Dynamics 365.

Sample business requirement

Each day, automatically email a birthday greeting to contacts whose birthday occurs on that same day.

Implementation Details

Here are the solution’s main components:

  1. Custom Action dyn_SendBirthdayGreetings: activates a Custom Workflow Activity, SendBirthdayGreeting, which retrieves all relevant Contact records by birthdate and creates and sends an email for each Contact record.
  2. Azure Function BirthdayGreetingsFunction: invokes the dyn_SendBirthdayGreetings Custom Action via the Microsoft Dynamics 365 API.
  3. Azure Scheduler BirthdayGreetingsSchduler: set to execute once a day at 9:00 for unlimited occurrences and invoke the BirthdayGreetingsFunction Azure Function

Architectural Notes

Why use a Custom Action? Although it is possible to manage the required business logic in the Azure Function, Dynamics-related business logic should reside within Dynamics, managed and packaged in a Solution. This way, the scheduling and executing components are kept as agnostic and isolated as possible, and are therefore easily maintained.
Having said that, you should be aware of the Sandbox Execution Timeout Limitation and consider whether to use a Custom Workflow Activity after assessing the business logic at hand.

Implementation Steps:

  1. Define dyn_SendBirthdayGreetings Custom Action

    Download, import and publish the unmanaged BirthdayGreetingAutomationSample_1_0_0_0 solution.
    It contains a Custom Action called dyn_sendBirthdayGreeting, which will later be called from the BirthdayGreetingsFunction Azure Function.
    By default, the Custom Action will create a birthday greeting email but will not send it. To change this, deactivate the Custom Action, edit its only step and change the ‘Send greeting email after creation?’ property value to true. Note that this may actually send emails to your contacts.
    The SendBirthdayGreeting Custom Workflow Activity code can be found here.
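
    The linked Custom Workflow Activity code is the full implementation; its core is a query along the lines of this simplified sketch, which retrieves Contacts that have a birthdate and keeps those whose day and month match today (the method name is hypothetical and paging is omitted):

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Query;

    //simplified sketch: find contacts celebrating a birthday today
    static List<Entity> GetBirthdayContacts(IOrganizationService service)
    {
        var query = new QueryExpression("contact")
        {
            ColumnSet = new ColumnSet("firstname", "lastname", "emailaddress1", "birthdate")
        };
        query.Criteria.AddCondition("birthdate", ConditionOperator.NotNull);

        EntityCollection contacts = service.RetrieveMultiple(query);

        //the birth year varies, so the day/month comparison is done in code
        return contacts.Entities
            .Where(c =>
            {
                DateTime birthdate = c.GetAttributeValue<DateTime>("birthdate");
                return birthdate.Day == DateTime.Today.Day && birthdate.Month == DateTime.Today.Month;
            })
            .ToList();
    }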

  2. Define BirthdayGreetingsFunction Azure Function

    After creating a Function App (follow the first 3 steps here), create a new C# Function of type Generic webhook under your Function App’s functions collection and name it SendBirthdayGreetingFunction.

    Click the App Service Editor option in the Application Settings tab

    Add a new file under your Function root, name it project.json and copy the following JSON snippet into the text editor

    {
    	"frameworks": 
    	{
    		"net46":
    		{
    			"dependencies": 
    			{
    				"Microsoft.CrmSdk.CoreAssemblies": "9.0.0.7",
    				"Microsoft.CrmSdk.XrmTooling.CoreAssembly": "9.0.0.7"
    			}
    		}
    	}
    }
    

    Close the App Service Editor to get back to your function. Select your Function App and click the Application settings tab.

    Scroll down to the Connection strings area and add a new Connection string named TargetOrganization. This will be used to connect and authenticate to your Microsoft Dynamics 365 Organization.
    For the connection string value, set your organization details in the following format:

    AuthType=Office365;Username=XXX@ORGNAME.onmicrosoft.com;Password=PASSWORD;Url=https://ORGNAME.crm.dynamics.com

    Note the targeted data center: crm.dynamics.com targets an organization in the North America data center.

    Click Save to save the newly created Connection String.

    Navigate back to your SendBirthdayGreetingFunction function and replace the default Function code with the following code snippet.
    Note that the code executes a Custom Action named dyn_SendBirthdayGreetings. It also uses the TargetOrganization connection string when accessing the Microsoft Dynamics 365 API.

    using System.Net;
    using System.Configuration;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Client;
    using Microsoft.Xrm.Tooling.Connector;
    
    public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
    {
        //define Dynamics Custom target Action name 
        string actionName = "dyn_SendBirthdayGreetings";
    
        log.Info("Processing new SendBirthdayGreetingFunction request");
    
        //init Dynamics connection, authenticate and get a reference to the Organization Service
        IOrganizationService organizationService = InitDynamicsConnection();
    
        //execute Custom Action (a failure here throws and surfaces as an error response)
        OrganizationRequest sendBirthdayGreetingsReq = new OrganizationRequest(actionName);
        OrganizationResponse sendBirthdayGreetingsRes = 
            (OrganizationResponse)organizationService.Execute(sendBirthdayGreetingsReq);
    
        //return completion status response 
        return sendBirthdayGreetingsRes == null
            ? req.CreateResponse(HttpStatusCode.BadRequest, "An error occured")
            : req.CreateResponse(HttpStatusCode.OK, "Action completed successfully");
    }
    
    //init Dynamics connection
    private static IOrganizationService InitDynamicsConnection()
    {
        IOrganizationService result = null;
    
        ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
        CrmServiceClient client = new CrmServiceClient(
            ConfigurationManager.ConnectionStrings["TargetOrganization"].ConnectionString);
    
        result = client.OrganizationServiceProxy;
    
        return result;
    }
    

    Navigate to the Function Integrate area and set the Mode to Standard. This will enable consumption of the function using the HTTP GET method.

    Back in your Function code editor, expand the collapsed Test pane and set the HTTP method to GET. Click Run to test your function. If all went well, a success message will be returned.

    Finally, click the </> Get function URL button on top and copy your function endpoint address.

  3. Define BirthdayGreetingSchduler Azure Scheduler

    Click the + New button on the left menu, type in Scheduler and select the Scheduler element returned. Click Create

    Set the scheduler name to BirthdayGreetingSchduler.

    Click the Action settings tile. Set the Action to Https, the method to Get and paste in the Function URL you copied at the end of step 2 above. Click OK

    Click the Schedule tile and set the schedule to Recurring, the frequency to Days and End to Never. Click the Advanced schedule pane and set 9 in the Hours text box. Click Ok.
    This will trigger your function every day at 9:00 for an unlimited number of times.
    Check Pin to dashboard and click Create.

    After a few seconds, you will be navigated to the Scheduler management area.

  4. Test

    To test the complete solution, update a Microsoft Dynamics 365 test Contact record’s birthdate to match the day and month of the current date.

    Go back to your Scheduler Job and click Run now.

    Clicking the History tab, you can monitor the Scheduler job completion status.

    Refreshing the sample Contact’s activities list, you should be able to view the newly created birthday greeting email.

Walkthrough: Execute Azure Function from Microsoft Dynamics Entity Form

Azure Function is a fantastic mechanism for various integration scenarios. Here are a few key characteristics:

  • Being a serverless application, Azure Function has the best time to market when it comes to deploying a web service
  • Pay-per-use pricing model means you pay only for what you use
  • Built-in integration options in PowerApps and Flow allow you to give non-developers new building blocks when designing applications and processes
  • CORS (Cross-Origin Resource Sharing) support allows consuming Functions from server/client side in any domain you find suitable

What can you do with Azure Functions in the context of Microsoft Dynamics integration scenarios? Just about anything:

  • Export/Import data to/from external application/data storage
  • Notify an external application on a business event
  • Get notification from external source
  • Handle complex or lengthy computation processes (considering the Plug-in/Custom Workflow Activity execution timeout limitation)
  • Allow a 3rd party to interact with your Dynamics 365 organization without directly exposing a Dynamics endpoint and access credentials

So why would you want to execute an Azure Function from an Entity Form?

  • To provide responsive and fluent UX
  • To avoid writing Plug-in/Custom Workflow Activity code which is not easy to debug
  • To respond to form level events such as field value change as it occurs

In this post, I’ll walk through executing an Azure Function from a Microsoft Dynamics 365 Entity form. This walkthrough was set up with Microsoft Dynamics 365 v9.0 but can be easily adapted to v8.2.
My sample scenario demonstrates sending new Lead data to Azure data storage using an Azure Function. Of course, you can use this walkthrough with an On-premises deployment, but you will have to allow access to Azure.
Thank you Nishant Rana for an enlightening post that helped set up this walkthrough.

  1. Setup an Azure Function App

    The Function App is a container that holds your Functions and allows you to manage these components.
    In your Azure Portal (open a trial if you don’t have a subscription), type in ‘Function App’ in the search box and select the Function App item in the Marketplace.

    In the next dialog, type in the Function App name (which will be part of the function URL) and fill in all other required fields.
    Create a new Resource Group if you need one, as well as Storage. Pin your Function App to the dashboard for easy access.

    After a few seconds you will be navigated to the dashboard. Wait while your Function App is set up, and then you will be navigated to the Function App design area

    Click the Functions node and then click the ‘+ New function’ button to add a new Function.

    Select the Generic webhook option coupled with the C# language

    In the next dialog, give your function a meaningful name and click ‘Create’

    Since we want to handle new Lead data by queuing it, click the Integrate node to add a new Function output and select the Azure Queue Storage.

    In the next dialog, note the Message parameter name as it will be part of the function code. Click ‘Save’

    Click the Function node to access the Function code

    Replace the existing code with the following code block and click ‘Save’

    #r "Newtonsoft.Json"
    
    using System.Net;
    using Newtonsoft.Json;
    
    public class Lead
    {
        public string Topic { get; set;}
        public string FullName { get; set;}
        public string Email { get; set;}
    }
    
    //function entry point 
    public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log, IAsyncCollector<Lead> outputQueueItem)
    {
        //trace incoming request 
        log.Info("New HandleNewLead request received");
    
        //parse request into a Lead object
        string jsonContent = await req.Content.ReadAsStringAsync();
        var lead = JsonConvert.DeserializeObject<Lead>(jsonContent);
    
        //trace Lead data
        log.Info($"Lead data: topic: {lead.Topic}, full name: {lead.FullName}, email: {lead.Email}");
    
        // add lead object to queue
        await outputQueueItem.AddAsync(lead);
    
        //return response to the caller 
        return req.CreateResponse(HttpStatusCode.OK, new { message = "Lead processed successfully" });
    }

    Opening the Logs pane below, you can see a successful compilation message

    To test your Function, open the right pane and click the Test tab. Feed in some test JSON data and click Run. If all went well, you will receive a success message as well as a Function response

    Clicking the Monitor node, you can see the queued test Lead data

    Before leaving the Function area, click the Get function URL and copy it. You will be using it later in your Entity form JavaScript code

  2. Configure Cross-Origin Resource Sharing

    In order to consume your Function from a Microsoft Dynamics 365 organization, which may be residing in a different domain, you’ll need to define it as an allowed origin under the CORS element:

    Add your Microsoft Dynamics 365 organization base URL and click ‘Save’

  3. Setup entity form code and event

    Head into your Microsoft Dynamics 365 organization and define a new JavaScript Web Resource named dyn_AzureServicesLib.js with the following code.
    Replace the AZURE_BASE_ENDPOINT and AZURE_FUNCTION_ENDPOINT constants with the Function URL you copied earlier. Note the separation of the base URL part from the function and code part

    (function (ns) {
    
        //constants 
        Constants = function () {
            this.CALLING_MODULE_NAME = "dyn_AzureServicesLib.js";
            this.AZURE_BASE_ENDPOINT = "https://dyneventhandlersample.azurewebsites.net/api/";
            this.AZURE_FUNCTION_ENDPOINT = "HandleNewLead?code=xxx";
            this.FORM_TYPE_CREATE = 1;
            this.MSG_OPERATION_SUCCESS = "Lead successfully exported :)";
            this.MSG_OPERATION_FAILURE = "Something went wrong :(";
    
            return this;
        }();
    
        //members
        var formContext = null;
    
        //public methods 
    
        //Export newly created Lead record to external storage/application 
        ns.exportLead = function (executionContext) {
            debugger
    
            //get form context 
            formContext = executionContext.getFormContext();
    
            //get form type
            var formType = formContext.ui.getFormType();
    
            //operate for newly created records only
            if (formType == Constants.FORM_TYPE_CREATE) {
                //extract Lead details; property names match the Lead class expected by the Azure Function
                var lead = {
                    topic: formContext.getAttribute("subject").getValue(),
                    fullName: formContext.getAttribute("firstname").getValue() + " " +
                        formContext.getAttribute("lastname").getValue(),
                    email: formContext.getAttribute("emailaddress1").getValue()
                }
    
                //send Lead details to Azure Function 
                executeAzureFunction(lead, handleExportSuccess, handleExportFailure);
            }
        }
    
        //private methods
    
        //handle operation success
        handleExportSuccess = function (response) {
            formContext.ui.setFormNotification(Constants.MSG_OPERATION_SUCCESS, "INFO", null);
        }

        //handle operation failure
        handleExportFailure = function (response) {
            formContext.ui.setFormNotification(Constants.MSG_OPERATION_FAILURE, "ERROR", null);
        }
    
        //execute Azure Function to process Lead details
        executeAzureFunction = function (lead, successHandler, failureHandler) {
            debugger
    
            //set Azure Function endpoint
            var endpoint = Constants.AZURE_BASE_ENDPOINT + Constants.AZURE_FUNCTION_ENDPOINT;
    
            //define request
            var req = new XMLHttpRequest();
            req.open("POST", endpoint, true);
            req.setRequestHeader("Accept", "application/json");
            req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
            req.setRequestHeader("OData-MaxVersion", "4.0");
            req.setRequestHeader("OData-Version", "4.0");
            req.onreadystatechange = function () {
                if (this.readyState == 4) {
                    req.onreadystatechange = null;
    
                    if (this.status == 200) {
                        successHandler(JSON.parse(this.response));
                    }
                    else {
                        failureHandler(JSON.parse(this.response).error);
                    }
                }
            }
            //send request
            req.send(window.JSON.stringify(lead));
        }
    })(window.AzureServicesLib = window.AzureServicesLib || {});
    

    Next, go to the Lead entity form designer and add the dyn_AzureServicesLib.js Web Resource in the Form Properties dialog.
    Bind the form OnSave event to the AzureServicesLib.exportLead function. Make sure you check the ‘Pass execution context…’ option.
    Save and Publish.

  4. Test

    Create a new Lead record and save it. If all went well, you will see a form-level success notification.

    Going back to your Function monitoring area, you should see your new Lead data queued successfully.