Tuesday, August 2, 2016

Preventing trigger recursion and handling a Workflow field update

Like a good developer, I've included recursion protection in my managed package's after update triggers to prevent interactions with other triggers in an org from creating an infinite update loop that would ultimately end in a "maximum trigger depth exceeded" exception. The recursion protection mechanism is fairly basic: it uses a class-level static Set of processed record Ids. The first thing the trigger does is skip any record whose Id is already in the Set. After a record is first processed by the trigger, its Id is added to the static Set.
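The classic version of that guard looks something like the following sketch. This is illustrative only; the trigger and class names are hypothetical, not the ones in the actual package:

```apex
trigger RecursionGuardedTrigger on Opportunity (after update) {
    for (Opportunity opp : Trigger.new) {
        // Skip any record already processed earlier in this transaction
        if (ProcessedRecordTracker.processedIds.contains(opp.Id)) {
            continue;
        }
        ProcessedRecordTracker.processedIds.add(opp.Id);
        // ... perform the actual processing for opp ...
    }
}

public class ProcessedRecordTracker {
    // Static, so it survives across trigger invocations within the same transaction
    public static Set<Id> processedIds = new Set<Id>();
}
```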

The functionality of the triggers in question is dynamic in nature. Admins who install the managed package can configure a list of fields of interest on the Opportunity that will be dynamically mapped to custom records related to the Opportunity. For example, they may map the Opportunity field with the API name "Description" into a custom record related to the Opportunity, which is then used for further processing when integrating with an external system. The important part is that it is entirely dynamic: users of the managed package should be able to configure any Opportunity field API name and it will be mapped by the trigger to the custom record for further processing.
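Conceptually, that kind of mapping relies on dynamic SObject field access via SObject.get(). A minimal sketch, assuming the configured field names come from admin-maintained configuration (the variable names here are illustrative):

```apex
// configuredFieldNames would be loaded from the admin-maintained configuration
List<String> configuredFieldNames = new List<String>{ 'Description', 'StageName' };

for (String fieldName : configuredFieldNames) {
    // SObject.get() resolves the field API name at runtime rather than compile time
    Object fieldValue = opp.get(fieldName);
    // ... copy fieldValue into the related custom record for this mapping ...
}
```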

This setup works well with one exception: what if a subsequent trigger or workflow field update rule makes further changes to one of the mapped fields? In the Triggers and Order of Execution, workflow rules execute after the triggers. A workflow field update will cause the trigger to fire again, but the current recursion protection will prevent any further processing from occurring.

12. If the record was updated with workflow field updates, fires before update triggers and after update triggers one more time (and only one more time), in addition to standard validations. Custom validation rules, duplicate rules, and escalation rules are not run again. [Source]

I needed a mechanism that detects whether one of the dynamically mapped fields has subsequently changed and, if so, runs the trigger's auto mapping again. In the simplest case, where I was only interested in a single field changing, a Map from the record Id to the last processed field value could be used (see How to avoid recursive trigger other than the classic 'class w/ static variable' pattern?). The challenge here is that the fields of interest are dynamic in nature, so they can't be predefined in a Map.
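For that simplest single-field case, the guard could compare the last processed value directly. A sketch, assuming the field of interest is Description (the class members here are illustrative):

```apex
private static Map<Id, String> lastProcessedDescription = new Map<Id, String>();

public static Boolean needsProcessing(Opportunity opp) {
    // Re-process only if the field of interest changed since it was last handled
    if (lastProcessedDescription.containsKey(opp.Id)
            && lastProcessedDescription.get(opp.Id) == opp.Description) {
        return false;
    }
    lastProcessedDescription.put(opp.Id, opp.Description);
    return true;
}
```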

In my case the trigger field mapping functionality is idempotent. So while it was important that it didn't run recursively when nothing had changed on the base record, I didn't need to know exactly which fields were changing. Given this, I went with storing the System.hashCode(obj) for the Opportunity at the time it was last processed. The hash code helps here because any change to a field on the Opportunity will change the hash code, making it ideal for detecting whether there have been any field changes on the Opportunity.

The following example was put together directly by hand, so it might contain syntax errors etc...

trigger DynamicOpportunityFieldTrigger on Opportunity (after update) {
    OpportunityFieldMapper ofm = new OpportunityFieldMapper();
    ofm.mapFields(Trigger.new, Trigger.oldMap);
}

public class OpportunityFieldMapper {
    // Record Id => hash code of the Opportunity as it was last processed in this transaction
    private static Map<Id, Integer> visitedRecordIdsToLastSeenHashMap = new Map<Id, Integer>();

    // List of applicable Opportunities. Only includes records that are new to the
    // transaction or have changed since they were last processed
    private List<Opportunity> recordsToMapFieldsFor = new List<Opportunity>();

    public void addOpportunity(Opportunity opp) {
        Integer currentHash = System.hashCode(opp);

        if (visitedRecordIdsToLastSeenHashMap.containsKey(opp.Id)) {
            Integer lastSeenHash = visitedRecordIdsToLastSeenHashMap.get(opp.Id);

            if (lastSeenHash == currentHash) {
                System.debug(LoggingLevel.Debug, 'OpportunityFieldMapper.addOpportunity skipping visited OpportunityId: ' + opp.Id + '. Unchanged hash');
                return;
            }

            System.debug(LoggingLevel.Debug, 'OpportunityFieldMapper.addOpportunity Hash for OpportunityId: ' + opp.Id + ' changed from ' + lastSeenHash + ' to ' + currentHash + '. Not skipping due to change');
        }

        visitedRecordIdsToLastSeenHashMap.put(opp.Id, currentHash);
        // Queue for later field mapping.
        recordsToMapFieldsFor.add(opp);
    }

    public void mapFields(List<Opportunity> triggerNew, Map<Id, Opportunity> triggerOldMap) {
        for (Opportunity opp : triggerNew) {
            addOpportunity(opp);
        }
        // Use the recordsToMapFieldsFor collection to perform the actual mappings
    }
}
