
Divide and conquer – processing high volume records within Dynamics CRM

Next week, I will revive my blog with an article about a design I’ve recently come up with to process high volume records within Dynamics CRM. Here is a preview:

Dynamics CRM is a great product for enterprises that want to efficiently track and optimise all interactions with their customers. Often there is a need to perform a particular operation on a batch of records. For example, in a customer-loyalty scenario, one might want to automatically re-evaluate a customer’s gold-star status on a nightly schedule. In another scenario, one might want to recalculate an account balance based on the previous day’s transactions. In this post I will outline a robust design that lets actions be executed for an unlimited number of records.

When working in CRM Online, we as CRM developers, vendors and partners have no control over the execution limits that CRM imposes on server-side custom code components. One of these limits is that a custom workflow activity may run for no longer than two minutes.
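To give a flavour of what working around this looks like, here is a minimal sketch of a custom workflow activity that watches its own elapsed time and gives up its slot before the two-minute limit is reached. The class, output argument and helper names are my own hypothetical placeholders; only the SDK types (CodeActivity, IWorkflowContext, IOrganizationServiceFactory) are real.

```csharp
// Sketch only: a workflow activity that stops well before the two-minute sandbox limit.
using System;
using System.Activities;
using System.Diagnostics;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Workflow;

public class ProcessBatchActivity : CodeActivity
{
    // Safety margin: stop after ~100 seconds so the record being processed can finish in time.
    private static readonly TimeSpan Budget = TimeSpan.FromSeconds(100);

    [Output("Completed")]
    public OutArgument<bool> Completed { get; set; }

    protected override void Execute(CodeActivityContext context)
    {
        var serviceFactory = context.GetExtension<IOrganizationServiceFactory>();
        var workflowContext = context.GetExtension<IWorkflowContext>();
        IOrganizationService service = serviceFactory.CreateOrganizationService(workflowContext.UserId);

        var stopwatch = Stopwatch.StartNew();
        bool moreWorkLeft = false;

        while (TryProcessNextRecord(service))
        {
            if (stopwatch.Elapsed > Budget)
            {
                // Give up the time slot; a follow-up run can continue where we left off.
                moreWorkLeft = true;
                break;
            }
        }

        Completed.Set(context, !moreWorkLeft);
    }

    private bool TryProcessNextRecord(IOrganizationService service)
    {
        // Hypothetical placeholder: retrieve and process one pending record,
        // returning false when no work remains.
        return false;
    }
}
```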

In addition to the time limit, there is also a depth limitation, which prevents malfunctioning custom code from ending up in an infinite loop. This limitation means that custom code calling itself, or calling other custom code components, can do so at most 7 times within an hour, for a maximum depth of 8.
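A common defensive pattern, again sketched here as an illustration rather than as part of the design I will present, is to inspect the Depth property of the execution context and stop before the platform throws:

```csharp
// Sketch only: IExecutionContext is the real SDK interface (implemented by both
// plugin and workflow contexts); the DepthGuard name is hypothetical.
using Microsoft.Xrm.Sdk;

public static class DepthGuard
{
    // Depth starts at 1 for the original request and increments every time the
    // custom code triggers another plugin or workflow in the same call chain.
    public static bool ShouldStop(IExecutionContext context)
    {
        return context.Depth >= 8;
    }
}
```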

These limitations make processing large volumes of records from within custom code in CRM quite challenging.

The Microsoft Dynamics CRM 2015 Asynchronous Batch Process Solution is a good attempt at a framework for performing scheduled operations on a collection of records defined by a Fetch XML query. However, due to the aforementioned limitations of CRM, it is not a robust framework. We implemented this solution on one of my client’s projects in CRM Online. It worked fine for low volumes of records, but we were only able to process about 3000 records before CRM timed out and the batch job failed.

To overcome this problem, the client is usually presented with an external solution that processes these high volumes of records outside of CRM. In this article, I will present a design that allows CRM itself to process an unbounded number of records based on a Fetch XML query.
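The full design will follow in next week’s article, but the Fetch XML side of it boils down to standard FetchXML paging. The sketch below is my own illustration, not the framework code: it assumes the SDK’s FetchExpression together with the fetch element’s page, count and paging-cookie attributes, and the FetchPager name is hypothetical.

```csharp
// Sketch only: walk an arbitrarily large FetchXML result set one page at a time.
using System;
using System.Xml.Linq;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class FetchPager
{
    public static void ForEachRecord(IOrganizationService service, string fetchXml,
                                     Action<Entity> process, int pageSize = 500)
    {
        int pageNumber = 1;
        string pagingCookie = null;

        while (true)
        {
            // Inject paging attributes into the fetch element of the query.
            var fetch = XDocument.Parse(fetchXml);
            fetch.Root.SetAttributeValue("count", pageSize);
            fetch.Root.SetAttributeValue("page", pageNumber);
            if (pagingCookie != null)
            {
                fetch.Root.SetAttributeValue("paging-cookie", pagingCookie);
            }

            EntityCollection page = service.RetrieveMultiple(new FetchExpression(fetch.ToString()));
            foreach (Entity record in page.Entities)
            {
                process(record);
            }

            if (!page.MoreRecords)
            {
                break;
            }

            pagingCookie = page.PagingCookie;
            pageNumber++;
        }
    }
}
```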

Note, however, that CRM is not meant to act as a batch processing system, so it remains the responsibility of the system’s architect to determine whether CRM should be made responsible for the operation or whether an external system would be the better solution.

Stay tuned to find out more.
