Thursday, June 30, 2016

Monitoring your Salesforce API usage

This seems to be a fairly common request on the Salesforce forums that developers frequent.

What is the REQUEST_LIMIT_EXCEEDED: TotalRequests Limit exceeded error about?
How can I find out what caused me to hit it?
How do I determine what is making the API calls?
How can I get insight into an API call that took place on a certain date for one of our connected apps?

First, some context. This is the Total API Request Limits limit. It is a rolling limit for an organization over the last 24 hours: an API call made just now will count towards the limit until 24 hours from now. Don't expect the limit to reset back to zero at midnight.
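To make the rolling behaviour concrete, here is a minimal Python sketch (not Salesforce code, just an illustration) of how a rolling 24-hour counter differs from a counter that resets at midnight:

```python
from collections import deque
from datetime import datetime, timedelta

class RollingWindowCounter:
    """Counts events in a sliding 24-hour window, mimicking how
    Salesforce tracks the Total API Request limit."""

    def __init__(self, window=timedelta(hours=24)):
        self.window = window
        self.events = deque()

    def record(self, when):
        self.events.append(when)

    def count(self, now):
        # Calls older than 24 hours "fall off" the window.
        while self.events and now - self.events[0] >= self.window:
            self.events.popleft()
        return len(self.events)

counter = RollingWindowCounter()
start = datetime(2016, 6, 30, 9, 0)
counter.record(start)                       # a call made at 9am
counter.record(start + timedelta(hours=2))  # and another at 11am

# At midnight the 9am call still counts - no reset to zero.
print(counter.count(datetime(2016, 7, 1, 0, 0)))   # 2
# By 10am the next day the 9am call has dropped off.
print(counter.count(datetime(2016, 7, 1, 10, 0)))  # 1
```

The takeaway: if you burn through most of the limit in one burst, you get that headroom back gradually over the following day, not all at once.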

The exact size of this limit depends on the Salesforce Edition and the number of user licenses of a given type you have. It is possible to purchase additional API calls without needing more user licenses.
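As a rough illustration of how the limit is derived (the per-license and minimum figures below are placeholders in the style of an Enterprise Edition org; check the current Salesforce documentation for the real numbers for your edition):

```python
def total_api_request_limit(licenses, calls_per_license=1000, org_minimum=15000):
    """Illustrative only: each qualifying user license contributes API
    calls, subject to an org-wide minimum. Real figures vary by edition
    and contract, and purchased API add-ons would be added on top."""
    return max(licenses * calls_per_license, org_minimum)

print(total_api_request_limit(10))  # 15000 - the org minimum applies
print(total_api_request_limit(50))  # 50000
```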

Basic monitoring

There are two locations where it is easy to check the current API usage.

Under Setup > Company Profile > Company Information there is an API Requests field. This shows the current API call count and the maximum you can reach.

Then, within Reports > Administrative Reports there is API Usage Last 7 Days.

This provides slightly finer detail, such as the username that held the session and the Client Id that was used to make the call. The Client Id can be useful as it can identify which external app was consuming the API calls.

Receiving a warning

You can configure an email alert to a user when the API requests exceed a percentage of your maximum requests. This is a RateLimitingNotification record that you create from Setup > Administration Setup > Monitoring > API Usage Notifications.

Monitoring via the API

The REST API has a Limits resource and a specific Organizations Limits resource that includes the "DailyApiRequests".
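The response from the Organization Limits resource is plain JSON, so checking it programmatically is straightforward. A sketch in Python, working against a canned response (the values here are made up; the DailyApiRequests Max/Remaining shape follows the REST API docs):

```python
import json

# A trimmed example of what GET /services/data/vXX.0/limits/ returns.
# (Shape per the REST API docs; the numbers are fabricated.)
sample_response = '''
{
  "DailyApiRequests": {"Max": 15000, "Remaining": 14044}
}
'''

limits = json.loads(sample_response)
daily = limits["DailyApiRequests"]
used = daily["Max"] - daily["Remaining"]
percent_used = 100.0 * used / daily["Max"]
print(f"{used} of {daily['Max']} API calls used ({percent_used:.1f}%)")
```

In a real monitoring script you would fetch that JSON from your instance with a valid session token rather than hard-coding it.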

The SOAP APIs have a LimitInfoHeader that can be used to monitor API usage.

Event Monitoring API

The Event Monitoring API can provide much finer detail about API calls over a wider history. With this (paid) feature you can see exactly what API calls were made in a time period.

E.g. From the developer console query editor

select Id, EventType, LogDate, LogFileLength from EventLogFile where EventType = 'API' and LogDate = 2016-02-20T00:00:00.000Z

Look for the EventType of API and the LogDate for the UTC day of interest. Unfortunately there isn't a single comprehensive EventType that will allow you to monitor all events that contribute to the limit; there are also the Bulk API, Metadata API Operation, and REST API event types.

You can then pull down the single `LogFile` field, which is a base64 encoded CSV with all the API calls for that day.

select Id, LogFile from EventLogFile where ID = '0AT700000005WDaGAM'

The FuseIT SFDC Explorer has an Event Log tab that uses the same API calls and will extract the file for you and export it to a CSV.

For the LogFile, look for the USER_ID, CLIENT_IP, and CLIENT_NAME to help identify which app is making the calls.
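Decoding and aggregating that field is a few lines of Python. The column names below follow the API event type documentation; the log rows themselves are fabricated stand-ins:

```python
import base64
import csv
import io
from collections import Counter

# Fabricated stand-in for the base64-encoded LogFile field contents.
raw_csv = (
    '"EVENT_TYPE","USER_ID","CLIENT_IP","CLIENT_NAME"\n'
    '"API","005700000012abc","203.0.113.10","DataLoaderBulkUI/"\n'
    '"API","005700000012abc","203.0.113.10","DataLoaderBulkUI/"\n'
    '"API","005700000034def","198.51.100.7","SfdcInternalAPI/"\n'
)
log_file = base64.b64encode(raw_csv.encode('utf-8'))

# Decode the field and count calls per consuming client.
decoded = base64.b64decode(log_file).decode('utf-8')
calls_by_client = Counter(
    row['CLIENT_NAME'] for row in csv.DictReader(io.StringIO(decoded))
)
print(calls_by_client.most_common())
# [('DataLoaderBulkUI/', 2), ('SfdcInternalAPI/', 1)]
```

Grouping on USER_ID or CLIENT_IP instead works the same way, and quickly points the finger at whichever integration is chewing through the daily limit.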


Tuesday, June 28, 2016

Taming the size of a Salesforce Canvas

For an artist, facing a blank canvas can be a real challenge. For a Salesforce developer a Canvas app can be challenging for an entirely different reason - how to even define what size the Canvas is to start with?

In the ideal world you could just follow the docs and Automatically Resize the Canvas App to suit the content size. In practice this doesn't work so well for all scenarios.

I'm looking to embed an existing web application that was previously located in a Web Tab iframe into a Canvas Visualforce page using <apex:canvasApp />. My ideal goal is that the external web application blends in with Salesforce. There shouldn't be any jarring scrollbars on the iframe that make it look out of place.

The Web Tab approach worked well in that the width of the iframe didn't need to be defined, so it would get an <iframe width="100%">. The iframe can then shrink and grow to follow any changes to the browser size, and the nested app can correspondingly adjust its width to suit the available dimensions. The downside to this approach is that the iframe's height needs to be specified. This is more problematic, and requires a fixed height that is sufficient to hold the majority of content. E.g. 3000px. Ech! Crazy vertical scroll bar!

Back to the Canvas iframe and the problem with dimensions. Firstly, a plain default Canvas app won't size past 2000px in height and 1000px in width. You need to explicitly set the maxHeight and maxWidth attributes to "infinite" (documented for the attributes themselves, but easy to miss elsewhere). With the browser at full width on a 1080p screen the default 1000px width limit is way too low. Sadly there are currently no corresponding minWidth/minHeight attributes.
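For example, a minimal Visualforce page that lifts both limits might look like this (the developerName value is a placeholder for your own Canvas app):

```xml
<apex:page showHeader="true">
    <!-- maxHeight/maxWidth default to 2000px/1000px unless set to "infinite" -->
    <apex:canvasApp developerName="My_Canvas_App"
                    maxHeight="infinite"
                    maxWidth="infinite"/>
</apex:page>
```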

Now that we've pulled the limits off how big the iframe can get, how do we correctly size it to both the browser window and the content within? As mentioned above, the auto resize should be just the thing here. Unfortunately it doesn't play so well with content that dynamically scales to the available space. I found it would either default to the minimum page width defined by the content, or worse still, shrink as the iframe content used JavaScript to resize and then reacted to the auto resizing in an unending loop. If there were a way to define the minimum height and to let the iframe width stay at "100%" it would be infinitely more useful.

The autogrow() resizing script appears to come from canvas-all.js. It is basically a timer that periodically calls resize. I haven't gone through the finer details, but I believe part of the code communicates with the parent window so that the iframe can be resized accordingly.

How can I size a Canvas app's iframe in Visualforce to be the full window width with a height to fit the content?

At this stage I'm experimenting with either using a custom canvas-all.js implementation or manually calling Sfdc.canvas.client.resize(). I'll update this page when I get somewhere workable.

Another potential problem with the height is dynamic content, such as modal dialogs, that may not register as affecting the page height.


Friday, June 3, 2016

The importance of reading the Salesforce Release notes - a cautionary tale

By the time you read this the underlying problem is likely to be a non-issue with the transition to Summer '16 complete. It does highlight the importance of going through the release notes with a fine-tooth comb.

How carefully do you read the release notes for each of the three big releases every year? For Spring '16 the complete document weighed in at 486 pages. "Great!" you say, that's a lot of new features and fixes to play with.

Spoiler alert: I'll admit here that I read them, but didn't commit the entire document to memory. This caught me out as follows:

Some names and identifying details have been changed to protect the privacy of individual pods.

  • Client: For our FooBarWidget records, I need to be able to set if it has one of 4 possible values. Only those 4 values are applicable.
  • Me: That sounds like a good candidate for a picklist field. I'll add one with these values you provided and have it to you shortly for testing.
  • Client: Oh, and we need this in production ASAP.
  • Me: Got it, the usual.
  • Me to dev sandbox: Add a new picklist field to FooBarWidget please.
  • Spring '16 Dev Sandbox:
  • Me to dev sandbox: It's not really a global picklist. And when I looked further at those there is a big red BETA next to them. Let's just define the values here as a one off thing. That "Strictly enforce picklist values" option sounds good. Definitely don't want those rascally users putting inappropriate values in the picklist again. No siree!
  • <The sound of hammers, saws and typing. Maybe some random metal grinding to look good on camera. End result is a changeset for the new picklist field.>
  • Me to pre-production sandbox: Validate and then quick deploy this change set.
  • Spring '16 Pre-production sandbox: Done, and have a deployment fish for your troubles.
  • <High fives CS6 instance. Which was tricky with the whole cloud thing, but we made it work.>
  • Me to Client: Please test the functionality in pre-prod. When you are happy with it we can deploy it to production.
  • Client: It works. And you did it so quickly! You sir are the most meaningful and valued member of this team!
  • Me to Client: I do what I can.
  • Me to Production: Validate this change set.
  • Spring '16 Production: Woah, woah, woah, back the change set up.
    "Cannot have restricted picklists in this organization."
    No deployment fish for you!
  • Me mumbling to self: What the?
  • Me to Google: "Cannot have restric...
  • <Google reads mind>
  • Google: Error message: Cannot have restricted picklists in this organization
  • <Re-reads release notes>
  • Spring '16 Release Notes:
    If you have a Developer Edition org or sandbox org, no setup is required to use restricted picklists. You can package restricted picklists with an app only in Developer Edition or sandbox orgs.
    For all other editions, restricted picklists are available as a beta feature, which means they’re highly functional but have known limitations. Contact Salesforce to enable restricted picklists.
  • Me grumbling to self: It's enabled by default in all sandboxes and dev orgs, but won't work in production without begging to get on the pilot. AND THERE IS NO UI INDICATION THAT IT IS A BETA FEATURE!
    That's just brilliant!
  • Summer '16 Release Notes: Eliminate Picklist Clutter with Restricted Picklists (Generally Available)
  • Me to client: Soooooo, we can't deploy the change set as is to production. We need to do one of the following:
    1. Wait until Summer '16 releases for the feature to become GA. The trust website has it scheduled for one week from now.
    2. Remove the restriction from the picklist and look for other ways to prevent incorrect values in the short term. Make the field restricted again once Summer '16 deploys.
    3. Raise a support case and ask our AE to get on the pilot for the one week until Summer '16 arrives.
  • Client: I'm not so much with the meaningfulness and valuing right now.
    Anyhow... That last one sounds like fun. Let's do that!
  • ...

Things degenerate a bit from there and are best not recorded in this medium. The point is that this one was a bit of a pain. Adding the new picklist field in the sandbox gave no indication that things were anything but fine and business as usual. There were no warnings that the feature was still in beta outside of the sandbox and dev orgs. It even deployed just fine into the pre-production sandbox. Then it exploded when trying to deploy to production.

Needless to say, I'm not a huge fan of beta features being implicitly activated in all sandbox and dev orgs without a corresponding BETA indication in the UI.

Admittedly it was all documented right there in the Spring '16 release notes. And the imminent release of Summer '16 will make it all a moot point.

Moral of the story - keep reading those release notes. And I salute you if you can remember everything you read in them.

Thursday, May 19, 2016

Trailhead - Custom Metadata Types

When Custom Metadata Types were first introduced my immediate reaction was: what is the difference between Custom Settings and Custom Metadata Types? Why would you use one over the other?

There is now a convenient new Trailhead module that helps answer this question with gif animations in the first unit.

The key point from these gifs is the configuration records that represent the actual configuration. If you are currently using custom settings or custom objects to hold configuration for your org, it's well worth exploring how Custom Metadata Types can help with deployments.

Other useful areas of Custom Metadata Types you can explore in the module:

  • You can use the Custom Metadata Loader to bulk load up to 200 custom metadata records from a CSV.
  • How to configure new Custom Metadata Types and the corresponding records.
  • How custom metadata records are accessed in testing contexts.
  • How to control access to the Custom Metadata Types and fields
  • How to include both the Custom Metadata Type and the corresponding records in a package.
  • How to convert from list custom settings to custom metadata types.
  • There is also some wisdom hidden away in the challenge questions (although it may not be the answer Trailhead is looking for).


    Thursday, May 12, 2016

    Summer '16 new Apex Method - Get a Map of Populated SObject Fields

    I've just found my new favorite Apex method in the Summer '16 Release notes - Get a Map of Populated SObject Fields

    // In Summer ’16, we’ve introduced a new method on the SObject class that returns a map of populated field names and their corresponding values:
    Map<String, Object> getPopulatedFieldsAsMap()

    Where is this immediately helpful? Maybe this error message looks familiar:

    System.SObjectException: SObject row was retrieved via SOQL without querying the requested field: Account.Name

    In the simplest case it comes from something like the following:

    List<Account> accs = [Select Id from Account];
    System.debug(accs[0].Name); // Throws SObjectException - Name wasn't queried

    The Accounts were queried for just the Id field, and then the code immediately tries to access the Name field of an Account, which wasn't queried. The SOQL query needs to be expanded to something like: Select Id, Name from Account. In an ideal world fixing this would be as simple as back tracking a few lines of code to find where the Account was queried and then adding the missing fields. In practice the "back tracking" may not be all that simple. From when the object was first queried to when it reached the code that needs a specific field to be populated it could have traversed several methods and classes. Possibly even gone through a namespace change or passed off to an unknown class that implements an interface via Type.forName() and Type.newInstance().

    With the new Summer '16 sObject method we can do some explicit code checks to see if the random Account instance we've been passed has the expected fields populated.

    List<Account> accs = [Select Id from Account];
    System.assert(accs[0].getPopulatedFieldsAsMap().containsKey('Name'), 'The Account should be queried with the Name field');

    This will also work for sObjects that haven't been inserted yet. You can see which fields have been populated.

    Contact con = new Contact();
    con.FirstName = 'John';
    System.assert(con.getPopulatedFieldsAsMap().containsKey('LastName'), 'The Contact should have a LastName');

    In practice you would likely store the Map that comes out of getPopulatedFieldsAsMap() in a variable and do numerous checks on it.

    Having an assertion fail isn't the most elegant solution. Not much better than the existing SObjectException. So how do you handle a missing field?

    You could go back to the originating SOQL query and add the missing field. However, as mentioned above, you may have little or no control of that query. Instead, you could build up a dynamic SOQL query that will pull just the identified missing fields for all the sObjects affected. Then use the sObject.put method to merge the results back into the original sObjects.

    Contact con = [Select Id, FirstName from Contact limit 1]; // Whoops, forgot to query the LastName!
    List<String> requiredFields = new List<String> {'FirstName', 'LastName'};
    String dynamicQuery = 'Select Id';
    Boolean queryRequired = false;
    Map<String, Object> fieldsToValue = con.getPopulatedFieldsAsMap();
    for (String requiredField : requiredFields) {
        if (!fieldsToValue.containsKey(requiredField)) {
            dynamicQuery += ', ' + requiredField;
            queryRequired = true;
        }
    }
    if (queryRequired) {
        // Exercise for the reader: better bulkification support for dealing with multiple sObjects
        dynamicQuery += ' From Contact where Id in (\'' + con.Id + '\')';
        for (Contact mergingContact : Database.query(dynamicQuery)) {
            // TODO: Match up queried Contacts with base Contacts on Id (or an external Id)
            Map<String, Object> mergingFieldsToValue = mergingContact.getPopulatedFieldsAsMap();
            for (String fieldName : mergingFieldsToValue.keySet()) {
                con.put(fieldName, mergingFieldsToValue.get(fieldName));
            }
        }
    }
    System.debug(con); // 12:55:50:007 USER_DEBUG [12]|DEBUG|Contact:{FirstName=John, LastName=Doe}

    Things that would make it even more useful:

    • Being able to "depopulate" a field on an sObject. E.g. you've got an Account instance, but only want to send specific fields in for a DML update. Currently you would need to create a new sObject and set just the fields you wanted. If you could depopulate them instead you could use the original instance without the risk of updating fields that should be unchanged.
    • A built in way to retrieve additional fields into a collection of sObjects. My sample implementation above should work in theory, but it would be great to provide a collection of sObjects and the names of the additional fields you need populated and have Apex do the rest.
          List<Contact> someListOfContactsWithoutFirstName = //...
          List<string> additionalFields = new List<string> {'FirstName'};
          List<Contact> awesomeListOfContactsNowWithFirstName = Database.queryFields(someListOfContactsWithoutFirstName, additionalFields);

    Other Summer '16 highlights via Summer '16 Highlights for ISV Developers


    Thursday, April 7, 2016

    The Search for Astro

    Update: Finding Astro just became a whole lot harder as the module no longer appears to be available.

    In what has become the hallmark of Trailhead emerges a rather cockamamy module to locate the lost Astro mascot. It may seem like an excuse for the creators to make odd videos and run around in the woods Blair Witch style, and maybe it was considering it went live on the 1st of April, but you might actually learn something along the way (and maybe win a prize).

    The key to finding Astro will be in decoding the clues left in the trails. Expect to watch the videos of noir interrogations, Tanooki cosplay, goats, ... and then jump out to the indicated modules to find additional clues to complete the code. What code is this you ask? The first unit provides more details, but it looks something like this:

    Don't worry if you never finished reading the Cryptonomicon. Or if you don't know the difference between a one-time pad and using the Apex Crypto class with AES256.

    Instead, download the zip from the first part of the module that includes an Excel file you can use to complete the challenge. Alternatively, I've created an equivalent Google Sheet for decoding the clues.
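If the one-time pad mention piqued your interest, the core idea is tiny. This Python sketch is purely illustrative and has nothing to do with the module's actual spreadsheet or clues:

```python
def otp(text, pad, decrypt=False):
    """Shift each letter by the corresponding pad letter (mod 26).
    With a truly random pad, used once and kept secret, this is
    information-theoretically unbreakable."""
    sign = -1 if decrypt else 1
    out = []
    for ch, k in zip(text.upper(), pad.upper()):
        shifted = (ord(ch) - ord('A') + sign * (ord(k) - ord('A'))) % 26
        out.append(chr(shifted + ord('A')))
    return ''.join(out)

cipher = otp('ASTRO', 'XMCKL')
print(cipher)                      # XEVBZ
print(otp(cipher, 'XMCKL', True))  # ASTRO
```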

    Other things you might learn in the module:

    • How long it takes to get Cloudy the Goat groomed for a World Tour event.
    • Who sleeps on a pillow stuffed with Astro's hair.
    • "uploading data you get from a random dog you meet in the woods is NOT a Salesforce best practice"
    • Items that Cloudy the Goat has been using for mastication
    • #PancakeHands

    So, hit the trail. Decrypt the note. And bring Astro home!


    Wednesday, March 30, 2016

    Salesforce IDE superpowers uncovered

    Disclaimer: I've been informed by Salesforce that this is an exceptional case; the functionality is still being refined and will likely be exposed broadly in a future API version (#SafeHarbour), although perhaps not in this exact form. As with all undocumented API features, it could disappear at any time in a future release.

    Borrowing the Warranty phrasing from Scott Hanselman:
    Of course, this is just some dude's blog. Depending on undocumented API functionality is a recipe for losing your job and a messy divorce. Salesforce are likely to make changes to the existing functionality between major releases. As they don't know you are using this functionality they won't tell you. There's no warranty, express or implied. I don't know you and I don't know how you got here. Stop calling. Jimmy no live here, you no call back! [My current landline phone number used to belong to a Thai takeaway shop - True Story]

    My blog disclaimer also applies.

    In putting together an answer to a Salesforce StackExchange question I came across something odd with the IDE source code. The question needed a way to find details about installed packages.

    I knew from my keyprefix list that InstalledPackageVersion existed. It wasn't, however, exposed via SOQL to the Partner API or Tooling API. So why, when I was Googling around, did it show up in a SOQL query in the source code for the IDE?

        // P A C K A G E S
        DEVELOPMENT_PACKAGES("SELECT Id, Name, Description, IsManaged FROM DevelopmentPackageVersion"),
        INSTALL_PACKAGES("SELECT Id, Name, Description, IsManaged, VersionName FROM InstalledPackageVersion"),

    What makes the IDE so special that it can run SOQL queries that other API users can't?

    The answer lies in the SOAP API CallOptions.client header. The docs say this is "A string that identifies a client". I've used it in the past, after passing the AppExchange security review, to access the Partner API in Professional Edition orgs. It turns out this is also the key to accessing the hidden abilities of the IDE. Again from the source code for the IDE:

        //value is critical for Eclipse-only API support
        private final String clientIdName = "apex_eclipse";

    This is later combined with the API version to create the callOptions header.

    So what if we use the same string when we establish our Session using the Partner API and then on subsequent calls?
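As a sketch of what that request would look like, here is a Python snippet that builds a Partner API query envelope with the CallOptions.client header set. The apex_eclipse value comes from the IDE source quoted above; the session id is a placeholder, and the envelope is assembled by hand purely for clarity:

```python
import xml.etree.ElementTree as ET

SOAP_NS = 'http://schemas.xmlsoap.org/soap/envelope/'
PARTNER_NS = 'urn:partner.soap.sforce.com'

def build_query_envelope(session_id, soql, client_id='apex_eclipse'):
    """Builds a Partner API query request carrying a CallOptions.client header."""
    ET.register_namespace('soap', SOAP_NS)
    envelope = ET.Element(f'{{{SOAP_NS}}}Envelope')
    header = ET.SubElement(envelope, f'{{{SOAP_NS}}}Header')

    # The magic header that unlocks the IDE-only sObject types.
    call_options = ET.SubElement(header, f'{{{PARTNER_NS}}}CallOptions')
    ET.SubElement(call_options, f'{{{PARTNER_NS}}}client').text = client_id

    session_header = ET.SubElement(header, f'{{{PARTNER_NS}}}SessionHeader')
    ET.SubElement(session_header, f'{{{PARTNER_NS}}}sessionId').text = session_id

    body = ET.SubElement(envelope, f'{{{SOAP_NS}}}Body')
    query = ET.SubElement(body, f'{{{PARTNER_NS}}}query')
    ET.SubElement(query, f'{{{PARTNER_NS}}}queryString').text = soql

    return ET.tostring(envelope, encoding='unicode')

xml_request = build_query_envelope(
    '00D...', 'SELECT Id, Name FROM InstalledPackageVersion')
print('apex_eclipse' in xml_request)  # True
```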

    The FuseIT SFDC Explorer supports setting the Client Id on the login New Connection screen and on a saved connection string.

       <add name="ForceCom IDE Login" 

    The raw SOAP POST request:

    <?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
                   xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                   xmlns:xsd="http://www.w3.org/2001/XMLSchema">
        <soap:Header>
            <CallOptions xmlns="urn:partner.soap.sforce.com">
                <client>apex_eclipse/...</client>
            </CallOptions>
            <SessionHeader xmlns="urn:partner.soap.sforce.com">
                <sessionId>...</sessionId>
            </SessionHeader>
        </soap:Header>
        <!-- etc... -->
    </soap:Envelope>

    Now we have access to additional sObject types that were previously inaccessible. So far I've tried:

    • DevelopmentPackageVersion
    • InstalledPackageVersion
    • ApexClassIdentifier
    • ApexClassIdentifierRelationship

    I'll do some more poking around as time permits to see if there are any other hidden treasures.