Handling large datasets efficiently in Microsoft Dynamics 365 CRM is crucial for optimal system performance, especially when integrating external services such as Azure Functions. When creating, updating, or deleting records in bulk, minimizing the number of requests sent to the server is essential, and Execute Multiple with a controlled batch size is one of the best techniques for doing so, particularly when processing large datasets from external integrations like Azure Functions.
Purpose of this Article
This article demonstrates how to use Execute Multiple with Batch Size in Azure Functions for bulk operations in Dynamics 365 CRM, along with a practical example.
What is Execute Multiple?
Execute Multiple is a powerful feature in Dynamics 365 CRM that allows you to group multiple requests (e.g., Create, Update, Delete) and execute them in a single round-trip to the server. Using Execute Multiple reduces the number of server requests, improving performance and scalability for bulk operations.
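At the SDK level, Execute Multiple is just an ExecuteMultipleRequest that wraps an OrganizationRequestCollection plus a couple of settings. The minimal sketch below assumes `service` is an already-connected IOrganizationService (such as the ServiceClient used later in this article), and the two contact records are placeholder data:

// Bundle two creates into one ExecuteMultipleRequest (a single round-trip to the server)
var bulkRequest = new ExecuteMultipleRequest
{
    Settings = new ExecuteMultipleSettings
    {
        ContinueOnError = true, // keep processing if one request faults
        ReturnResponses = true  // return a response item for each request
    },
    Requests = new OrganizationRequestCollection()
};

bulkRequest.Requests.Add(new CreateRequest { Target = new Entity("contact") { ["lastname"] = "Sample A" } });
bulkRequest.Requests.Add(new CreateRequest { Target = new Entity("contact") { ["lastname"] = "Sample B" } });

// Both creates are executed in one call
var bulkResponse = (ExecuteMultipleResponse)service.Execute(bulkRequest);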
What is Batch Size?
Batch size refers to the number of requests that are executed in one go using the Execute Multiple feature. In Dynamics 365 CRM, the maximum batch size is 1000 requests per batch. For example, if you’re updating 3000 records, you’ll divide the operation into three batches of 1000 each; updating 2500 records takes two batches of 1000 and a final batch of 500.
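As a quick illustration of the arithmetic (the variable names are just for this sketch), the number of batches is the record count divided by the batch size, rounded up:

// Integer arithmetic for the number of batches (rounding up)
int maxBatchSize = 1000;
int totalRecords = 2500;
int totalBatches = (totalRecords + maxBatchSize - 1) / maxBatchSize; // = 3 batches: 1000 + 1000 + 500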
Why Use Execute Multiple in Azure Functions?
Azure Functions lets you run small pieces of code (functions) in the cloud. Functions are well suited for external integrations, background processing, and executing logic outside CRM’s plugin architecture. When you integrate with Dynamics 365 CRM, an Azure Function can efficiently handle bulk data operations using Execute Multiple, allowing you to offload complex operations and improve system performance.
Here’s why you should consider using Azure Functions with Execute Multiple:
- Scalability: Azure Functions can scale automatically based on load.
- External Processing: Offload heavy data processing from CRM and handle it in the cloud.
- Flexibility: Azure Functions can be triggered by multiple events, making them highly versatile for CRM integrations.
Example: Execute Multiple with Batch Size in an Azure Function
Let’s walk through an example where we update the “Annual Revenue” field of multiple accounts using Execute Multiple in an Azure Function.
Scenario:
You need to update the annual revenue of 2500 accounts in Dynamics 365 CRM. Instead of sending individual update requests for each account, you’ll use Execute Multiple to batch the updates in groups of 1000.
Step 1: Create the Azure Function
We will create a timer-triggered Azure Function that connects to Dynamics 365 CRM through the Dataverse ServiceClient (Microsoft.PowerPlatform.Dataverse.Client).
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.PowerPlatform.Dataverse.Client;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Query;
using System;
using System.Collections.Generic;

public static class BulkAccountRevenueUpdateFunction
{
    [FunctionName("BulkAccountRevenueUpdateFunction")]
    public static void Run([TimerTrigger("0 0 * * * *")] TimerInfo myTimer, ILogger log)
    {
        log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");

        // Connect to Dynamics 365 CRM via the ServiceClient
        var serviceClient = new ServiceClient("connection_string_here");

        // Fetch account records (for simplicity, this assumes 2500 accounts)
        QueryExpression query = new QueryExpression("account")
        {
            ColumnSet = new ColumnSet("accountid", "revenue")
        };
        query.Criteria.AddCondition("revenue", ConditionOperator.NotNull);
        EntityCollection accounts = serviceClient.RetrieveMultiple(query);

        // Execute Multiple for batch updates
        ExecuteMultipleRequest executeMultipleRequest = new ExecuteMultipleRequest()
        {
            Settings = new ExecuteMultipleSettings
            {
                ContinueOnError = true,
                ReturnResponses = false
            },
            Requests = new OrganizationRequestCollection()
        };

        int batchSize = 1000;
        int count = 0;

        foreach (Entity account in accounts.Entities)
        {
            // Create an update request for each account
            UpdateRequest updateRequest = new UpdateRequest
            {
                Target = new Entity("account", account.Id)
                {
                    ["revenue"] = new Money(5000000) // Set the new revenue value
                }
            };

            executeMultipleRequest.Requests.Add(updateRequest);
            count++;

            // Execute batch when reaching the batch size limit
            if (count == batchSize)
            {
                ExecuteBatch(serviceClient, executeMultipleRequest, log);
                count = 0;
            }
        }

        // Execute the last batch if there are remaining requests
        if (count > 0)
        {
            ExecuteBatch(serviceClient, executeMultipleRequest, log);
        }
    }

    // Function to execute a batch of requests
    private static void ExecuteBatch(ServiceClient serviceClient, ExecuteMultipleRequest executeMultipleRequest, ILogger log)
    {
        ExecuteMultipleResponse executeMultipleResponse = (ExecuteMultipleResponse)serviceClient.Execute(executeMultipleRequest);

        if (executeMultipleResponse.IsFaulted)
        {
            foreach (var response in executeMultipleResponse.Responses)
            {
                if (response.Fault != null)
                {
                    // Log or handle the errors
                    log.LogError($"Error: {response.Fault.Message}");
                }
            }
        }

        // Clear the requests after batch execution
        executeMultipleRequest.Requests.Clear();
    }
}
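A note on the connection string: the "connection_string_here" placeholder stands for a Dataverse connection string. The snippet below is a hedged example using client-secret (service principal) authentication; every value is a placeholder, and in practice you would keep the string in the Function App’s application settings rather than in source code.

// Hedged example of a client-secret Dataverse connection string (placeholder values only)
string connectionString =
    "AuthType=ClientSecret;" +
    "Url=https://yourorg.crm.dynamics.com;" +
    "ClientId=00000000-0000-0000-0000-000000000000;" +
    "ClientSecret=your-client-secret";

var serviceClient = new ServiceClient(connectionString);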
Explanation:
- Batching the Update Requests: We query the 2500 accounts that need to be updated, create an UpdateRequest for each account, and add it to the ExecuteMultipleRequest collection. The batch is executed once the collection reaches 1000 requests.
- Executing the Batch: The ExecuteBatch method sends the batched requests to the server and handles any errors encountered during execution by logging the fault messages.
- Final Batch Execution: After the full batches of 1000 have been processed, the remaining requests (fewer than 1000) are sent to the server.
Benefits of Using Execute Multiple with Batch Size in Azure Functions
- Improved Performance: By sending multiple requests in a single batch, you reduce the number of round trips between the function and the CRM server. This decreases the overall time taken for bulk operations.
- Reduced Server Load: Fewer round trips to the CRM server mean reduced load. The server processes requests in chunks, optimizing resource utilization.
- Scalability with Azure Functions: Azure Functions scale automatically based on demand, making this solution ideal for handling large datasets without worrying about server resources.
- Error Handling: The ContinueOnError setting allows the batch to continue even if some requests fail. This is useful when you don’t want a single failed operation to halt the entire batch.
- Flexibility in Integrations: Azure Functions can be triggered by various events, such as a timer (as shown in the example) or an external HTTP request, making them a versatile solution for CRM integrations (see the sketch after this list).
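For instance, the same batching logic could be exposed behind an HTTP trigger instead of a timer. The signature below is a hedged sketch: the function name and route are illustrative, and it assumes the Microsoft.Azure.WebJobs.Extensions.Http package used by the in-process model. The body would reuse the ExecuteBatch pattern shown earlier.

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class BulkAccountRevenueUpdateHttpFunction
{
    [FunctionName("BulkAccountRevenueUpdateHttp")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = "accounts/bulk-revenue-update")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Bulk revenue update requested over HTTP.");

        // Reuse the same batching logic shown above:
        // build UpdateRequests, add them to an ExecuteMultipleRequest,
        // and call ExecuteBatch every 1000 requests.

        return new OkObjectResult("Bulk update started.");
    }
}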
Execute Multiple with Batch Size is a must-have tool for anyone working with bulk operations in Microsoft Dynamics 365 CRM, and using it within Azure Functions further enhances the performance and scalability of large data operations. By incorporating this feature into your external processing strategy, you can handle bulk updates, creations, or deletions of records in an efficient, scalable, and error-resilient manner.
If you’re managing a large CRM implementation with frequent bulk operations, using Azure Functions with Execute Multiple will save time and resources and reduce server bandwidth usage, all while offloading the heavy lifting to the cloud.