Tuesday, May 19, 2015

The mystery of the nondeterministic Salesforce test case

After making some changes to a trigger that can ultimately fire after changes to an Opportunity, I encountered an existing test case that started failing. The particular test case bulk inserts Opportunity and OpportunityLineItem records and ultimately goes on to bulk update OpportunityLineItem records that have revenue schedules.

What's odd is:

  • The test case specifically declares
  • I checked and the test doesn't use instances of
    or similar
  • Same for

Due to the nature of the test case I thought the problem may have been around bulkification. As a starting point I dropped the counter down at the start of the test method that specifies how many Opportunity records to test against.

With only 2 opportunities being inserted the test case appeared to pass with every run.

Scaling it up to 50 opportunities resulted in the test case failing with almost every run.

Opportunity Count  Run 1  Run 2  Run 3  Run 4  Run 5
                2  Pass   Pass   Pass   Pass   Pass
               28  Pass   Fail   Pass   Fail   Pass
               30  Pass   Fail   Pass   Pass   Fail
               31  Fail   Pass   Fail   Pass   Fail
               50  Fail   Pass   Fail   Fail   Pass
               62  Fail   Fail   Fail   Fail   Fail
              150  Fail   Fail   Fail   Fail   Fail

This was odd, as I'd expect a test case that wasn't using random data to produce consistent results. What was more odd was that varying the number of Opportunities altered the pass/fail pattern. Generally, the more opportunities there were the greater the odds of the test failing.

The clue to the answer was in the test failure message, which in hindsight I should have focused on more than the bulkification.

System.DmlException: Insert failed. First exception on row 0; first error: CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY, MyNamespace.FooOpportunityTrigger: execution of AfterInsert

caused by: System.DmlException: Upsert failed. First exception on row 25; first error: DUPLICATE_VALUE, duplicate value found: MyNamespace__OpportunityIdDynamicPropertyIdUnique__c duplicates value on record with id: a0E4000001GRmaU: []

Class.MyNamespace.FooOpportunityHelper.mapUdfFields: line 636, column 1
Class.MyNamespace.FooOpportunityHelper.handleAfter: line 515, column 1
Trigger.MyNamespace.FooOpportunityTrigger: line 15, column 1: []

The OpportunityIdDynamicPropertyIdUnique__c custom field is used on a custom object to make a composite key out of the Opportunity Id and another Id. Its primary purpose is to enforce uniqueness.

The field is populated using a workflow rule formula as follows:
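The formula itself isn't preserved in this copy of the post; a representative version (object and field names hypothetical) would simply concatenate the two lookup Ids:

```
Opportunity__c & DynamicProperty__c
```

A lookup field referenced in a formula yields the 15-character case-sensitive Id, which is what sets up the collision described below.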

Knowing that the issue was around the custom field that is assigned the Opportunity Id, and that the failures were consistent with 62 records and occurred roughly 50% of the time with around 31 records, indicated the problem was the case sensitivity of the Opportunity Id in the field (see What are Salesforce IDs composed of?).

The Ids of the custom object weren't resetting with each test run. Instead the values were accumulating across each test run. For example, if the Opportunity Ids 0064000000fNiOM and 0064000000fNiOm appeared in the same test run it would fail immediately after the Workflow action completed.
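The failure pattern lines up with how sequential IDs are allocated. The last character of consecutively created records effectively cycles through a base-62 alphabet, so a large enough batch is guaranteed to contain two IDs that differ only by the case of that character. A rough Python sketch of the idea (an idealized model of allocation, not the exact scheme Salesforce uses):

```python
import string

# Base-62 alphabet used for each character of a Salesforce ID:
# 0-9, then a-z, then A-Z.
BASE62 = string.digits + string.ascii_lowercase + string.ascii_uppercase

def last_chars(start, count):
    """Final ID character for `count` sequentially allocated records,
    assuming the numeric part of the ID simply increments."""
    return [BASE62[i % 62] for i in range(start, start + count)]

def has_case_insensitive_duplicate(chars):
    """True if two IDs in the batch collide under a case-insensitive compare."""
    seen = set()
    for c in chars:
        if c.lower() in seen:
            return True
        seen.add(c.lower())
    return False

# A batch of 62 sequential records cycles through the whole alphabet, so it
# always contains a pair like 'm' and 'M': a guaranteed duplicate.
assert has_case_insensitive_duplicate(last_chars(0, 62))
# A small batch starting from '0' never collides.
assert not has_case_insensitive_duplicate(last_chars(0, 2))
```

With 62 sequential records every case pair is covered, matching the table above always failing at 62 opportunities, while a window of around 31 characters spans a letter and its other-case counterpart roughly half the time.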

The formula needed to use the case-safe 18-character version of the Opportunity Id, or the custom field's uniqueness check needed to be case sensitive. CASESAFEID() seemed like the easiest change, as the case sensitivity couldn't be changed on an existing managed package field. It did, however, create a migration problem for existing data.

Tuesday, April 21, 2015

FuseIT SFDC Explorer v2.5 features

The v2.5 release of the FuseIT SFDC Explorer adds some new features.

Reporting the rolling 24 hour API Requests

In addition to the existing tracking of API calls made for the duration of the session the rolling 24 hour API usage and limit will be reported:

Monitoring Asynchronous Test execution and results

When running the automated test cases I usually care first and foremost about any tests that are failing. If anything is failing, the other tests that passed are just noise.

The developer console, as at Spring 15, gives a tree control that shows the test classes and the status of the individual methods. To view any individual errors you need to bring up the table view for the entire test run.

You can also view the same data under Setup > App Setup > Develop > Apex Test Execution.

I've taken elements of each of these views and reworked them a bit to highlight any failed tests. Anything that passed is hidden away under an initially collapsed node. Selecting an individual failed test will show both the error message and the stack trace. You can also reselect individual test classes and use them to start a new test run for just those classes.

List of Apex Classes with coverage

Several releases ago Salesforce removed the really useful Code Coverage column from the Apex Classes page. The idea to add it back was rejected on the basis of not having to maintain it in both places. See the Idea: Put back Code Coverage column to Apex Classes and Apex Triggers page.

I found it to be a super useful column, as you could sort the Apex Classes by size (number of characters excluding comments) and get an indication of which classes needed the most additional code coverage, i.e. find the biggest class by size with the lowest code coverage. Improving the coverage on such a class gives you the biggest overall code coverage improvement.
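That prioritization can be illustrated with a quick sketch (class names and numbers are made up): rank classes by how much uncovered code each one holds, so the biggest wins surface first.

```python
# Hypothetical class metrics, purely for illustration.
classes = [
    {"name": "BigLowCoverage",   "length": 9000, "coverage": 0.40},
    {"name": "SmallLowCoverage", "length": 2000, "coverage": 0.10},
    {"name": "SmallWellCovered", "length":  500, "coverage": 0.95},
]

def uncovered_weight(cls):
    """Rough payoff of working on this class: how much uncovered code it holds."""
    return cls["length"] * (1 - cls["coverage"])

# Work on the classes holding the most uncovered code first.
for cls in sorted(classes, key=uncovered_weight, reverse=True):
    print(cls["name"], round(uncovered_weight(cls)))
```

A big class at 40% coverage can hold far more uncovered code than a small class at 10%, which is why sorting by size and coverage together beats sorting by coverage alone.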

Thursday, March 5, 2015

.NET Integration with Salesforce Tooling API Spring 15 (v33.0)

The Spring 15 Salesforce release brings some changes to the Tooling API. From the release notes:

Feature: Metadata namespace
Description: In previous versions of the SOAP Tooling API, metadata type elements were defined in the same namespace as the sObjects of the API, tooling.soap.sforce.com. In order to uniquely name these types, the suffix “Metadata” was appended to the metadata types. In this version of the API, a new namespace is introduced, metadata.tooling.soap.sforce.com, and the suffix is no longer used. The old names will continue to work with clients against older API endpoints.

You can see the change by diffing the v32.0 WSDL against the newer v33.0 WSDL. The metadata element for ApexClass (formerly ApexClassMetadata) is now also called ApexClass and resides in the namespace urn:metadata.tooling.soap.sforce.com (with the alias mns).

Having two complexTypes with the same name but different namespaces in the same WSDL creates a bit of a mess with Web References (WSDL.exe) and Service References (SvcUtil.exe).

With a web reference to the WSDL the ApexClass from the Metadata namespace gets the matching name and the ApexClass from the urn:tooling.soap.sforce.com namespace becomes ApexClass1. Not only does this break all the previous code that referenced ApexClass within the Tooling API but it is damn ugly.

    [System.CodeDom.Compiler.GeneratedCodeAttribute("System.Xml", "4.0.30319.34234")]
    public partial class ApexClass : Metadata {
        // ...
    }

    // ...
    [System.CodeDom.Compiler.GeneratedCodeAttribute("System.Xml", "4.0.30319.34234")]
    [System.Xml.Serialization.XmlTypeAttribute(TypeName="ApexClass", Namespace="urn:tooling.soap.sforce.com")]
    public partial class ApexClass1 : sObject {
        // ...
        private ApexClass metadataField;
        // ...
        public ApexClass Metadata {
            get {
                return this.metadataField;
            }
            set {
                this.metadataField = value;
            }
        }
        // ...
    }
To be fair, this is an issue with the .NET generation of the classes in different namespaces as much as anything. The same WSDL in the FuseIT version of Wsdl2Apex creates separate Apex classes for each namespace to keep the class definitions distinct.

One brute force way to fix this is to rename the classes after the codegen is complete. Find the ApexClass that inherits from Metadata and rename it to ApexClassMetadata. Then rename ApexClass1 to ApexClass. Rinse and repeat for any other types that derive from Metadata (WorkflowRule1, WorkflowFieldUpdate1, ValidationRule1, Profile1, ...).

It might be possible to use the /namespace: option on SvcUtil.exe to move each WSDL namespace into a separate C# namespace. It doesn't appear that Wsdl.exe has the same option.
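An untested sketch of what that invocation might look like (the C# namespace names here are hypothetical), mapping each WSDL target namespace to its own CLR namespace:

```shell
svcutil.exe tooling.wsdl /out:ToolingService.cs ^
    /namespace:urn:tooling.soap.sforce.com,SforceTooling ^
    /namespace:urn:metadata.tooling.soap.sforce.com,SforceTooling.Metadata
```

If it works as documented, both generated ApexClass types would keep the same name but live in separate C# namespaces, avoiding the ApexClass1 rename entirely.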

Thursday, February 19, 2015

Avoiding exception wrapping and exposing Exception.Data via log4net.

In .NET, exception handling often takes a layered approach involving several classes and methods catching a specific exception and then throwing a new exception with some additional details and the original exception wrapped as the inner exception. The additional details are often useful, but after passing through several layers of try-catch-throw-new-exception the root exception can become obscured somewhere down the stack trace and inner exceptions.

A common method's exception handling might look something like:

public string Foo(string someInput)
{
    try { /* ... */ }
    catch (Exception ex) { throw new FooIsBrokenException("Foo is bad for: " + someInput, ex); }
    finally { /* Common cleanup code to release any resources etc... */ }
}

If the exception is going to be exposed outside the current project then the wrapping exception will likely be useful. However, if it is only going to be for internal use as a way of adding additional context to the base exception you can save some effort by using Exception.Data.

public string Foo(string someInput)
{
    try { /* ... */ }
    catch (Exception ex)
    {   // Consider what to do if the key is already present.
        ex.Data.Add("someInput", someInput);
        throw; // Rethrow, preserving the original stack trace.
    }
    finally { /* Common cleanup code to release any resources etc... */ }
}


Now that the details are in the Data dictionary, they need to be exposed. I'm working on an older project that started in 2007 and uses log4net. There is a suggestion on Logging Exception.Data using Log4Net to use a custom PatternLayoutConverter on each layout appender to convert the data dictionary.

I've gone with a slightly different approach: by registering an IObjectRenderer with log4net you can append the additional Exception.Data values. The advantage here is that it will work with all the appenders when rendering exceptions.

public class ExceptionObjectLogger : IObjectRenderer
{
    public void RenderObject(RendererMap rendererMap, object obj, TextWriter writer)
    {
        var ex = obj as Exception;

        if (ex == null)
        {
            // Shouldn't happen if only configured for the System.Exception type.
            rendererMap.DefaultRenderer.RenderObject(rendererMap, obj, writer);
        }
        else
        {
            while (ex != null)
            {
                // Render the exception itself, then any Data entries, then
                // walk down into the inner exception.
                rendererMap.DefaultRenderer.RenderObject(rendererMap, ex, writer);
                RenderExceptionData(rendererMap, ex, writer);
                ex = ex.InnerException;
            }
        }
    }

    private void RenderExceptionData(RendererMap rendererMap, Exception ex, TextWriter writer)
    {
        foreach (DictionaryEntry entry in ex.Data)
        {
            if (entry.Key is string)
            {
                IObjectRenderer keyRenderer = rendererMap.Get(entry.Key.GetType());
                keyRenderer.RenderObject(rendererMap, entry.Key, writer);
            }

            writer.Write(": ");

            if (entry.Value is string)
            {
                IObjectRenderer valueRenderer = rendererMap.Get(entry.Value.GetType());
                valueRenderer.RenderObject(rendererMap, entry.Value, writer);
            }
        }
    }
}
Logging configuration additions. Note that it only applies to classes inheriting from Exception.
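The configuration itself wasn't captured in this copy of the post; a minimal sketch (the assembly-qualified type name is hypothetical) registers the renderer against System.Exception so it only applies to exceptions:

```xml
<log4net>
  <renderer renderingClass="MyCompany.Logging.ExceptionObjectLogger, MyCompany.Logging"
            renderedClass="System.Exception, mscorlib" />
  <!-- existing appender and logger configuration... -->
</log4net>
```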



Tuesday, February 17, 2015

Troubleshooting Salesforce Trailhead's execution of Developer Edition org code

I thought I'd knock out a quick Trailhead challenge to see what was involved.

When checking the challenge Manipulating Records with DML I was surprised that my Apex class failed with the message:

Challenge not yet complete... here's what's wrong:
Executing the 'insertNewAccount' method failed. Either the method does not exist, is not static, or does not insert the proper account.

I was reasonably confident my Apex class was correct.

I'm not going to post my class here, as that seems to go against the concept of Trailhead if you could just Google for an answer. Rather, I found something interesting when trying to figure out what was wrong.

Trailhead is using executeAnonymous via the Tooling API to examine the Apex class in my Developer Edition org.

It appears to take a black box approach to confirming that the Apex class is behaving as expected. If the method can be invoked and satisfy the given assertions it passes the challenge.

If you just open up the log in the developer console you can see the assertions being made, and in my case that the check was failing due to a STORAGE_LIMIT_EXCEEDED DmlException. So I could delete some records, free up some space, and get the challenge to pass.

Better yet, if you open the RAW Log you can extract the anonymous Apex that is being used to test the code. Note that the default developer console log viewer won't show the executed code or logging levels like the FuseIT SFDC Explorer does.

Execute Anonymous: Account a = AccountHandler.insertNewAccount('My Test Account'); System.assertNotEquals(a,null); if(a != null) {delete a;}

Once you know this, you can more accurately figure out why your Apex class might be failing. For example, imagine you had implemented a check in your org to ensure the Account Name is unique. If there was already an Account with the Name 'My Test Account' (maybe from a previous run that partially failed) this would show up in the debug log. The debug log can provide some good clues for failures. The FuseIT SFDC Explorer will show the debug log directly with the anonymous Apex.

In theory there is nothing stopping you implementing the minimal code to pass the assertions being made by the anonymous Apex. For example, with the above anonymous Apex, as long as insertNewAccount returns a non-null Account that can be deleted, the test will pass. Currently no check is made that the returned account has the correct name or was indeed inserted as part of the test. Have I gained anything here? Probably not; the goal with Trailhead is to learn about an aspect of Salesforce rather than game the system to earn a badge on your profile. Chances are that if you can figure out a way to work around the assertions you are also more than capable of completing the actual challenge.

I should note that a challenge may make multiple anonymous Apex calls. Salesforce may also change the assertions made for each challenge at any time.

Historically interesting footnote

The following was true when I was originally going to post this last week. After I contacted Salesforce about the issue, they subsequently resolved it. At the time the anonymous Apex appeared as:

Execute Anonymous: Account a = AccountHandler.insertNewAccount('My Test Account'); System.assertNotEquals(a,null); System.assertEquals('test','success');

An interesting line of Apex code here is the System.assertEquals('test','success'); assertion. This is similar to the technique mentioned in Adding Eval() support to Apex, where the resulting exception is used to return the result from anonymous Apex. If I just start my Apex class with that assertion then in theory the test would pass...

In fact, it does! Replacing the method body with just that assertion and returning null causes the challenge to pass. My guess would be the Trailhead code is looking for the "System.AssertException: Assertion Failed: Expected: test, Actual: success" message in the response.

UPDATE: As mentioned above, Salesforce have subsequently removed this assertion. I learned through Salesforce that the final assertion was intended to rollback any side effects of the challenge check. This makes sense. It seems like it would be an ideal scenario for my idea to Run anonymous apex as if it were a test case.

Thursday, November 27, 2014

Adding Eval() support to Apex

In the Codefriar blog post EVAL() in Apex, Kevin presents a technique to allow programmatic evaluation of an Apex string and the extraction of the result via an exception's message. Here I present an alternative approach using anonymous Apex and the debug log in place of the intentional exception.

Reasons you may or may not want to eval() Apex

Reasons for:

  • Where a rollback may be required, the separate context via the callout doesn't require the invoking Apex to roll back. You can still progress and make further callouts and subsequent DML operations.
  • JSON can be used in the response message to return structured data.
  • An odd way of increasing limits, e.g. each anonymous Apex context gets a new set of limits.

Reasons against:

  • The potential for security issues depending on how the anonymous Apex is composed.
  • The requirement for users to have sufficient permissions to call executeAnonymous. Typically this means having “Author Apex” or running with the restricted access as per Executing Anonymous Apex through the API and the “Author Apex” Permission.
  • The need to parse the DEBUG log message out of the response to get the result. Other code may also write ERROR-level debug messages, which will interfere with parsing the response. This could be addressed by improving the parsing of the Apex log, i.e. extract all the USER_DEBUG entries to a list and then read the last one. Another alternative is to use delimiters in the debug message to make it easier to parse out.
  • Each eval() call is a callout back to the Salesforce web services. This creates limits on the number of evals that can be made. (Don't forget to add the Remote Site Setting to allow the callout)

API Background

Both the Tooling API and the older Apex API provide an executeAnonymous web method. The main difference is that the older Apex API will return the resulting Apex debug log in a SOAP header. With the Tooling API the Apex debug log is generated but needs to be queried separately from the ApexLog. The older Apex API becomes more attractive here, as one API call can execute the dynamic Apex code and return the log that contains the output of the call.

By setting the DebuggingHeader correctly the size of the Apex debug log can be kept to a minimum. For example, capturing only the ERROR-level Apex_code messages makes extracting the required USER_DEBUG output easier and reduces the amount of superfluous data returned.
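A sketch of what that SOAP request header might look like, based on my reading of the DebuggingHeader structure in the Apex API WSDL (verify the element names against the WSDL itself):

```xml
<soapenv:Header>
  <apex:DebuggingHeader>
    <apex:categories>
      <apex:category>Apex_code</apex:category>
      <apex:level>ERROR</apex:level>
    </apex:categories>
  </apex:DebuggingHeader>
</soapenv:Header>
```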

It should be noted that using executeAnonymous won't execute in the same context the way a JavaScript eval() does. Any inputs need to be explicitly included in the Apex string to execute. Also, any return values need to be returned via the log and then parsed out to bring them into the context of the calling Apex.

Note that the current native Salesforce version of WSDL2Apex won't read the response's DebuggingInfo SOAP header. Instead it needs to be read via an HttpRequest and parsed out of the resulting XML response.

Sample Raw SOAP Request/Response

This is sent to the Apex API at https://na5.salesforce.com/services/Soap/s/31.0


<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:apex="http://soap.sforce.com/2006/08/apex">
         <apex:String>Integer i = 314159; System.debug(LoggingLevel.Error, i);</apex:String>


<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns="http://soap.sforce.com/2006/08/apex" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
         <debugLog>31.0 APEX_CODE,ERROR
Execute Anonymous: Integer i = 314159; System.debug(LoggingLevel.Error, i);
13:24:24.027 (27564504)|EXECUTION_STARTED
13:24:24.027 (27573409)|CODE_UNIT_STARTED|[EXTERNAL]|execute_anonymous_apex
13:24:24.028 (28065096)|USER_DEBUG|[1]|ERROR|314159
13:24:24.028 (28098385)|CODE_UNIT_FINISHED|execute_anonymous_apex
13:24:24.029 (29024086)|EXECUTION_FINISHED</debugLog>
            <compileProblem xsi:nil="true"/>
            <exceptionMessage xsi:nil="true"/>
            <exceptionStackTrace xsi:nil="true"/>

You can see the required output in the response next to |USER_DEBUG|[1]|ERROR|.

Given this the basic process becomes:

  • Build up the anonymous Apex string including any required inputs. Use a System.debug(LoggingLevel.Error, 'output here'); to send back the output data.
  • Call the Apex API executeAnonymous web method and capture the DebuggingInfo soap header in the response
  • Parse the USER_DEBUG Error message out of the Apex Log.
  • Convert the resulting string to the target data type if required.
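The parsing step above can be sketched as follows. This is a hypothetical helper in Python for illustration (in practice this logic lives in the Apex wrapper class), run against the sample log from the SOAP response above:

```python
# Sample debug log, abbreviated from the raw SOAP response above.
SAMPLE_LOG = """31.0 APEX_CODE,ERROR
Execute Anonymous: Integer i = 314159; System.debug(LoggingLevel.Error, i);
13:24:24.027 (27564504)|EXECUTION_STARTED
13:24:24.028 (28065096)|USER_DEBUG|[1]|ERROR|314159
13:24:24.029 (29024086)|EXECUTION_FINISHED"""

def parse_user_debug(log):
    """Extract the payload of the last USER_DEBUG entry in the log.

    Taking the last entry mitigates interference from other code that also
    writes ERROR-level debug messages earlier in the transaction."""
    payloads = [line.split("|", 4)[4]
                for line in log.splitlines()
                if line.count("|") >= 4 and line.split("|", 4)[1] == "USER_DEBUG"]
    return payloads[-1] if payloads else None

print(parse_user_debug(SAMPLE_LOG))  # -> 314159
```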


Here are several examples showing how to parse various data types from the Apex log.

 string output = soapSforceCom200608Apex.evalString('string first = \'foo\'; string second = \'bar\'; string result = first + second; System.debug(LoggingLevel.Error, result);');
 System.assertEquals('foobar', output);
 integer output = soapSforceCom200608Apex.evalInteger('integer first = 1; integer second = 5; integer result = first + second; System.debug(LoggingLevel.Error, result);');
 System.assertEquals(6, output);
 boolean output = soapSforceCom200608Apex.evalBoolean('boolean first = true; boolean second = false; boolean result = first || second; System.debug(LoggingLevel.Error, result);');
 System.assertEquals(true, output);

 string outputJson = soapSforceCom200608Apex.evalString('List<object> result = new List<object>(); result.add(\'foo\'); result.add(12345); System.debug(LoggingLevel.Error, JSON.serialize(result));');
 List<Object> result = (List<Object>)JSON.deserializeUntyped(outputJson);
 System.assertEquals(2, result.size());
 System.assertEquals('foo', result[0]);
 System.assertEquals(12345, result[1]);

Modified Apex API wrapper class

This wrapper around the Apex API SOAP web service uses HTTP Requests so that the DebuggingHeader can be extracted from the response. I've added several methods to execute the eval requests and parse out the expected response type.

Friday, November 7, 2014

Deep linking to the Salesforce Developer console

One of the nice things about Salesforce initially was, given a record ID, the ability to easily build a URL to view or edit that record. E.g. you could easily view an Apex Class with:


Or edit the same class by appending a /e. E.g.


This is still possible, but more and more of the built-in Apex editor functionality is moving into the Developer Console. Previously there was no way to deep link directly to a record in the developer console. Instead you needed to open the record using the provided menus.

With the Winter 15 release I noticed there were deep links to the Lightning Component Bundles under Setup > Build > Develop > Lightning Components.

This opens the developer console with the URL:


This seems to work equally well with an Apex Class ID and a hand crafted URL. E.g.


Of course, this link structure isn't officially supported. Salesforce may change it at any point in the future (as confirmed in the Dreamforce 2014 Meet the Developers session). Still, there would be times where a link directly into the Apex class or trigger in the developer console would be convenient.

IdeaExchange: Provide a deep link API to the Developer Console