Friday, October 9, 2015

Improving performance when a large portion of data matches a workflow activity criterion

Here is a real customer scenario from a related Support Center thread:

"Hi, I have an application with several workflows that work ok in my local computer (testing with few records), but when I deploy the workflows to a server and the workflow gets approximately 1,500,000 records that match the criteria the workflow service crashes...
I've done some testing and this only happens when the workflow needs to deal with a lot of data, workflows that deal with less records don't have a problem. I've been looking for an alternative like XPCursor or something similar to ask the workflow to page the records that meet the criteria but I haven't found any."


After some time, we replicated this behavior locally and found that it could easily be improved on our side. In short, our solution uses XafDataView together with a form of paging when first querying and processing the records that match a workflow activity definition's criteria in the workflow service.
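To illustrate the general idea, here is a minimal sketch of paged processing. This is not the actual DevExpress implementation: the query variable and the processTargetObject callback are placeholders, and the query is assumed to have a stable sort order so that paging is deterministic.

[C#]

using System;
using System.Linq;

static class PagedProcessingSketch
{
    // Conceptual sketch: process the records matching a criterion in
    // fixed-size pages rather than loading all of them into memory.
    // 'query' stands for any IQueryable over the matching records;
    // 'processTargetObject' stands for the per-record workflow work.
    static void ProcessInPages<T>(IQueryable<T> query,
        Action<T> processTargetObject, int pageSize = 2000)
    {
        int processed = 0;
        while (true)
        {
            // Materialize only the next page instead of all matches at once.
            var page = query.Skip(processed).Take(pageSize).ToList();
            if (page.Count == 0)
                break;
            foreach (var target in page)
                processTargetObject(target);
            processed += page.Count;
        }
    }
}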

Here are more technical details from the original thread: 

"To reduce memory usage in cases when a large amount of workflow data is used, we have implemented partial loading of target objects in the next version (v15.1.8) of our product. Starting with this new version, you can configure the maximum number of target objects that can be simultaneously operated by a certain workflow activity 
via the static MaxSimultaneouslyOperatedTargetObjectsCount  property of the DevExpress.ExpressApp.Workflow.Server.StartWorkflowListenerService class.In a typical scenario, you can set this property in the WorkflowServerService.xx file of the workflow service project as follows:"

[C#]

using DevExpress.ExpressApp.Workflow.Server;

// Process matching target objects in batches of at most 2,000
// instead of loading all of them into memory at once.
StartWorkflowListenerService.MaxSimultaneouslyOperatedTargetObjectsCount = 2000;
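A smaller value keeps fewer objects in memory at a time at the cost of more database round trips, so choose a number based on the size of your objects and the memory available to the workflow service; 2,000 here is just an example value.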
