Saturday, September 10, 2016

Getting involved in the Salesforce Developer Community

I'll let you in on a little secret. Salesforce as a platform is really big, and it is only getting bigger. See all those colored circles? Those are the Sales, Service, Marketing, Community, Analytics, and Platform & Apps products. Then there is IoT in the middle, integrating with many of those areas. Plus at Dreamforce '16 we are going to learn more about Salesforce Einstein, the new AI product, and where it will fit in.

I'll go out on a limb here and say there probably isn't anyone with really in-depth knowledge of all these products. And by that I mean deep vertical knowledge that only comes from working with them day in and day out solving real-world problems. The sort of person you turn to late on a Friday when the deployment fails with some newly discovered error that even Google has never heard of.

How are you, as a developer, supposed to work with any number of these products at any given time?

No developer is an Island

Thankfully, it isn't all doom and gloom.

There are a number of existing Salesforce communities to help fill in the gaps in your Salesforce knowledge. Best of all, they are all free.

Salesforce StackExchange (SFSE)

This is my personal favorite, so it goes first in the list. Think Stack Overflow specifically for Salesforce. It's a well-tuned site for asking and answering questions specific to working with Salesforce. It isn't a discussion forum, so you avoid the back-and-forth chitchat and associated noise that you get in other places. Plus, content gets moderated and refined over time by the community.

An example of community moderation in the StackExchange format is the drive towards removing duplicates and having one canonical answer where possible. We don't really need one question per person who encounters a System.QueryException: List has no rows for assignment to SObject exception. One is sufficient to outline how to check if a SOQL query returns at least one result.
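For reference, that canonical answer boils down to assigning the query result to a list and checking it, rather than assigning directly to a single SObject:

```apex
// Assigning directly, e.g. Account a = [SELECT Id FROM Account WHERE Name = 'Acme'];
// throws System.QueryException when no rows match. Assign to a list instead.
List<Account> accounts = [SELECT Id, Name FROM Account
                          WHERE Name = 'Acme' LIMIT 1];
if (!accounts.isEmpty()) {
    Account acc = accounts[0];
    // Safe to work with acc here.
}
```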

SFSE is definitely my first stop for any Salesforce related questions I might have. More often than not the system will suggest a related question to the one you are forming, saving you from creating a new one.

If there is a downside to the SFSE system, it is that it can be a bit unfriendly/daunting to new users. Approaching it like any other forum system likely won't get you the results you want. Before asking a new question, have a read of How do I ask a good question? You want the question to be on topic for Salesforce, about a specific problem, and with the right amount of relevant detail that someone could answer it.


Do:

  • Include the applicable lines of code.
    If you're asking a question about an exception, include the code that caused it. More importantly, indicate which of those lines caused the exception. It isn't always obvious when dealing with a snippet of code, as line numbers aren't meaningful.
  • Put some effort into formatting your question. Code blocks in particular.
    Use the built-in code formatting syntax. A bit of clean formatting makes it much easier for anyone attempting to answer your question to focus on the actual problem rather than misaligned code samples.
  • Vote for the constructive benefit of the site.
    Use your votes to encourage good questions and useful answers.
  • Acknowledge and link to any resources you used.
  • Use the tags.
    These help users identify questions they can potentially answer faster.


Don't:

  • Post without showing any effort on your part to solve the problem.
    Someone else is giving up their time to try and help you. Include some details about what you have already tried and why it didn't work.
  • Post comments as an answer.
    It will get removed. Instead try and gain sufficient reputation so you can use the commenting system.
  • Post answers as comments.
    Comments are considered transient. They can drop off the page if there are several other interactions occurring. Also, voting on comments only serves to reorder them, nothing else.

Salesforce Developer forums

The Salesforce Developer Forum predates the SFSE. It still offers a broad trove of knowledge in the various threads.

The difficulty here is finding the nuggets of current information. It suffers from a number of problems that Stack Overflow was designed to address with Q&A sites: lots of duplicates, stale posts that can't be edited, a limited notification system for interacting on multiple threads, painfully bad hyperlinks, and so on.


Twitter

If you've got problems that can be expressed in 140 characters, then the #askforce hashtag can provide 140-character solutions.

Also follow and interact with MVPs and UG leaders.

Local User Groups

Check if there is already a local user group that you could attend. This is a great way to network, attend presentations, and share ideas.

Salesforce blogs

Many members of the Salesforce developer community also maintain blogs that run the gamut of products. Rather than trying to compile such a list here, I'll refer you to Johan Yu's compiled list of Salesforce blogs.


Trailhead

First off, you got me, this isn't really a community where you can interact with other developers directly. At least not on the face of it. Primarily it is an excellent learning resource for Salesforce. You can take modules on topics of interest and then complete some challenges to help cement what you have just learned. However, the more time you spend with it, the more you find there is a whole community of people behind it as well. See #Trailhead. Also, it is not uncommon to see Trailhead-specific events at user groups.


IdeaExchange

Suggest new enhancements or vote on existing ideas to see the products changed.

Wednesday, August 24, 2016

Dreamforce 2016 Session Picks and General Tips

Here are some of my current picks for Dreamforce 2016 sessions. They are mostly development or architecture focused. I'll be refining the list as more information becomes available. As with previous years, it's likely that I won't actually get to all of these and will need to prioritize activities that can only happen at Dreamforce vs. things that are being recorded.

If you haven't already registered for Dreamforce you can use the Developer Discount code D15DEV999F


Packaging and Deployment

Something is coming in the packaging/change set/metadata deployment area. The first session in this list is definitely worth a visit. Here's hoping for some sort of source-control-integrated deployments.

Mocking and Testing

The Winter '17 Release notes include the section - Build a Mocking Framework with the Apex Stub API (Pilot). I'm led to believe that the first of the talks below will have some more details on using the System.StubProvider interface and System.Test.createStub() method. If only because Aaron Slettehaugh from Salesforce is also presenting with Jesse Altman.

Meet The *'s

The Meet the Developers session on the last day of the conference is always an interesting one and might not be recorded. This year I see two additional variations.


  • Developer Keynote - Thursday Oct 6th 11-12pm
  • Main Dreamforce Keynote - Wednesday Oct 5th 1-3pm
  • Mark & Exec Q&A - Friday Oct 7th 2-3pm
  • Something on the new Einstein AI product? Salesforce Einstein Keynote: AI for Everyone


It goes without saying that everything will be either Lightning or Trailhead based this year. Probably both.

Custom Metadata


Platform Events

Winter '17 Release Notes:

Use platform events to deliver secure and scalable custom notifications within Salesforce or from external sources. Define fields to customize your platform event. Your custom platform event determines the event data that the Platform can produce or consume.


General Tips

Hopefully you've been an adult for long enough by now to know if you're going to do a lot of walking you probably need to wear something comfortable on your feet. Seems like an odd thing to have to remind people about. Then again, all my shoes are comfortable. Why are people buying shoes that aren't comfortable?

  • Don't bring your laptop? Ignore that advice. In previous years other channels were promoting travelling light with just a cellphone and maybe a tablet. I say bring a small laptop to the developer zone in Moscone West. Seen something cool in a session and want to try it out straight away? A laptop gives you full access to Salesforce and your favorite developer tools. I've never tried to code Apex on my phone or tablet, but I'm pretty sure it would be a frustratingly slow experience. With Trailhead being such a big part of the developer zone this year, it could be useful to knock out a few modules on the go. There are also the development "Mini Hacks" to be completed. It's easier to have your own machine on hand than to wait for a community machine of unknown configuration.
  • Following on from that, create a blank dev org. Maybe a prerelease org. This gives you a blank canvas to experiment from.
  • Bring a power bank type device to charge your cellphone so you can avoid being tied to a powerpoint. You can probably pick several of these up from vendors as giveaways if need be.
  • Talk to the person next to you, find out what they do for a living with Salesforce. Find out what sessions they liked so far and what they intend to attend.
  • If you get a good photo of a presenter during a session, share it with them. The session audio and slides are often recorded, but there may be no other visual proof that they presented at Dreamforce.
  • Be mindful of who you let scan your badge. By all means, if you want to hear from them again scan away. Otherwise, is it worth giving your contact details to a vendor for some shiny trinkets to take home to the kids?
  • A developer can mostly stay within the Moscone West building and find plenty of suitable sessions and vendors to visit. It will be full of technical sessions and activities. That's not to say that an excursion out to Moscone North for the main expo isn't worth it. (Expo map)
  • Be adaptable with your scheduling. The majority of the sessions are recorded. It's sad for the presenters who have put so much effort into creating their sessions, but focus on things that you can't catch up on later in the recordings.
  • Stop by the Admin Zone. In previous years they have offered professional headshots (Headshot studio?). Do this early in the conference before lack of sleep starts to catch up with you.
  • Get your twitter handle and avatar on your badge. I spend all year interacting with people via twitter, then struggle to identify them in real life if they don't resemble their abstract avatar.
  • Fleet Week San Francisco is on October 3-10. If you like planes, the airshow is worth a detour or an extended stay if you can manage it.
  • Track Dreamforce Shuttle Locations in Real-Time:

International Traveler

  • Plan on having an extra bag on the way back in case you pick up some oversized swag.
  • Get a local sim card. Have a plan if you previously relied on SMS 2 factor authentication. Update apps like Uber with your new temp contact details.
  • Switch your Calendar to PST.
  • If you can time it right, drop ship things to the FedEx office at 726 Market St. It is only a quick walk from the conference and you can get a "Hold at FedEx location" when shipping.

See also:

Tuesday, August 16, 2016

Using Two-Factor Authentication in Salesforce with Windows 10 Mobile

As part of the Trailhead User Authentication - Secure Your Users' Identity module I enabled Two-Factor authentication for a user in my developer org.

Upon logging in with the user required to use 2FA I now get the following prompt to download the "Salesforce Authenticator from the App Store or Google Play":

As a Windows Phone / Windows 10 Mobile user this wasn't really an option for me.

Happily, Salesforce uses the IETF RFC 6238 time-based one-time password (TOTP) algorithm. Because it is a standard, we can substitute in another app that is available, such as Microsoft's Authenticator.

  • Use the "Choose Another Verification Method" link at the bottom of the "Connect Salesforce Authenticator" page.
  • Choose "Use verification codes from an authenticator app"
  • Start the Authenticator app on your phone. Use the "add" app bar button. Use the "Scan" button.
  • Optionally tap the new entry to give it a more meaningful name.
  • Use the generated code to complete the authentication process.
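Because TOTP is an open standard, the code generation itself is simple enough to sketch in Apex. This is purely illustrative: it assumes the shared secret has already been decoded to a Blob (handling the Base32 string that Salesforce displays is out of scope), and it has not been validated against a real authenticator.

```apex
public class TotpSketch {
    // Returns the current 6-digit TOTP code for the given shared secret.
    public static String generate(Blob secret) {
        Long counter = DateTime.now().getTime() / 30000; // 30-second time steps
        // Encode the counter as an 8-byte big-endian value via a hex string
        String counterHex = '';
        Long c = counter;
        for (Integer i = 0; i < 16; i++) {
            Integer nibble = (Integer) Math.mod(c, 16);
            counterHex = '0123456789abcdef'.substring(nibble, nibble + 1) + counterHex;
            c = c / 16;
        }
        Blob mac = Crypto.generateMac('hmacSHA1',
            EncodingUtil.convertFromHex(counterHex), secret);
        String macHex = EncodingUtil.convertToHex(mac); // 40 hex chars for SHA-1
        // Dynamic truncation (RFC 4226): offset is the low 4 bits of the last byte
        Integer offset = hexValue(macHex.substring(39, 40));
        Long binCode = 0;
        for (Integer i = 0; i < 4; i++) {
            Integer b = 16 * hexValue(macHex.substring((offset + i) * 2, (offset + i) * 2 + 1))
                + hexValue(macHex.substring((offset + i) * 2 + 1, (offset + i) * 2 + 2));
            if (i == 0) b = Math.mod(b, 128); // mask the high bit
            binCode = binCode * 256 + b;
        }
        return String.valueOf(Math.mod(binCode, 1000000)).leftPad(6, '0');
    }

    private static Integer hexValue(String hexChar) {
        return '0123456789abcdef'.indexOf(hexChar);
    }
}
```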

See also:

Wednesday, August 3, 2016

Integrating UAV/drone remote data acquisition with Salesforce to enhance logistics

Late last year I did a brief interview at the Auckland Salesforce APAC tour event and talked about capturing sensor data from quadcopters in flight and the potential of integrating this data with Salesforce.

One of the first scenarios that I explored was one that is topical here in New Zealand.

New Zealand has no native terrestrial mammals (except for bats and seals).
[Biodiversity of New Zealand and DOC Native animals]

If something is running around in the bush and it isn't an insect or bird (or a very confused native bat or seal) then it is an introduced species.

A number of introduced species, such as rats, possums, and mustelids, survive by predation of native species. And if they aren't directly eating the natives, they are competing for the same food and resources. Possums and feral cats can also spread diseases. All in all they are unwelcome visitors in the local ecosystem. Monitoring and trapping programs are used by various conservation groups to help control the spread and population of introduced species.

A large number of traps and monitoring stations are deployed out in the wild and on farms in locations where it can be time consuming to check them; either because of their remoteness and distribution over many hectares and/or because of their numerousness. Staff or volunteers need to physically visit each site regularly to check if the trap needs any attention. The labour involved in checking and maintaining the traps can be a significant percentage of the overall cost [Source] and a limiting factor in how many traps can be deployed.

[Figure: Decline (attenuation) in radio signal strength through forest with increasing distance from a transmitter at four frequencies, compared to transmission through free space (i.e. with no vegetation). [Source]]

There are existing solutions that utilize a sensor and wireless connection on each trap. These sensors rely on having either an internet, cellular, or satellite link to communicate the trap status back. Transmitting through dense forest also reduces radio signal strength.

Which brings me back to Salesforce and UAVs. Can I cut the cost of the equipment deployed with each trap by keeping the sensor and wireless link very basic and then relying on UAV flights in the area to collect the data periodically? The end goal is to allow trap checkers to focus their attention where they will be most productive, and to expand the area of operation.

High level plan:

  • Put a small short range transmitter on each trap that sends out a periodic signal when the trap needs attention. Ideally the sensor on each trap will be very basic and will be able to transmit for a sufficient period of time once triggered. The transmission range should be able to reach a height above the canopy where the UAV can pass by.
  • Record the geolocation where each trap is deployed using a record in Salesforce.
  • Periodically dispatch a UAV to fly a circuit of the traps in the area. The paths for these flights can be determined from the geolocation data in Salesforce.
  • The UAV will carry the receiver and a small computer (Raspberry Pi, Arduino or similar) to capture the signal data plus additional telemetry (GPS location at time of signal).
  • When the UAV has an internet connection relay the collected data back into Salesforce.
  • Use the collected data to notify which traps need attention.
  • Run analytics over the gathered data to identify gains, such as finding areas that would benefit from having more traps deployed.

There are a lot of moving parts here (some more literally than others). So before I get too far ahead of myself we'll start with some of the basics. I'll cover all the parts in a number of subsequent blog posts, as there is lots to cover.

Trap tracking in Salesforce

A good place to start will be how the trap records are stored in Salesforce. In the simplest case this can just be a custom record with fields to include the applicable details, such as when and where the trap was deployed. A geolocation Compound field is particularly useful for the latter part as it brings native support for calculations of distances around a latitude and longitude pair.
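Sketching that out with a hypothetical Trap__c custom object (the object and field names here are mine, not from any real package):

```apex
// Hypothetical Trap__c custom object with a Deployed_On__c date field and a
// Deployed_Location__c geolocation compound field. In Apex and SOQL the
// compound field is accessed via its __Latitude__s / __Longitude__s components.
Trap__c trap = new Trap__c(
    Name = 'Trap 042',
    Deployed_On__c = Date.today(),
    Deployed_Location__Latitude__s = -41.264268,
    Deployed_Location__Longitude__s = 173.291987
);
insert trap;
```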

I'll take a slight detour here from the immediate scenario above to explore something similar that still covers a number of important points. Another introduced pest species here in Nelson is the great white butterfly. The key difference is that much of the trapping for the great white butterfly occurs in a suburban environment around residential addresses. This allows the use of the new automatic geocoding for addresses that became available in Summer '16.

Before the auto geocoding will occur you need to review and activate the clean rules. I activated them for Leads as a starting point.
Setup > Administer > Administration > Clean > Clean Rules

I also found it useful to add additional formula fields for the geolocation fields (latitude, longitude and accuracy) as they can't otherwise be directly exposed on the page layout.

Now, with just the street address details for the properties of interest entered against Leads in Salesforce, a SOQL query can be used to find the points I need to fly to within my operational area, using the GEOLOCATION function to define the takeoff point and the DISTANCE function to search for sites of interest within the operating area.

SELECT Id, LastName, Latitude, Longitude
FROM Lead
WHERE DISTANCE(Address, GEOLOCATION(-41.264268, 173.291987), 'km') < 2

Unfortunately the native Visualforce mapping controls aren't available in developer edition orgs. Instead I'll export the locations of interest in the Keyhole Markup Language (KML) used by Google Earth. I was going to use the Google Earth API to embed it in a web page, but it was deprecated by Google (Boo!).

A Visualforce page can be used to generate the KML file from the SOQL query. This is a minimal initial version. I'll parameterize the origin location for the search and likely use it as the starting point for the flight.

Visualforce to generate KML

Controller to generate KML
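A rough sketch of what that controller might look like (class and method names are hypothetical; the Visualforce page would iterate the sites and emit a <Placemark> per Lead with the appropriate KML contentType):

```apex
// Hypothetical controller sketch backing the KML-generating Visualforce page.
public with sharing class KmlExportController {
    // Leads within 2 km of a hard-coded takeoff point. A later revision
    // would parameterize the origin location.
    public List<Lead> getSites() {
        return [SELECT Id, LastName, Latitude, Longitude
                FROM Lead
                WHERE DISTANCE(Address,
                    GEOLOCATION(-41.264268, 173.291987), 'km') < 2];
    }
}
```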

That's sufficient to export the KML file into Google Earth for the points of interest.

Visualforce for Google Maps loading KML

An additional Visualforce page can be set up to embed a Google Map and directly load the KML file in. I've kept the Google API key in a custom setting. The KML file needed to be publicly accessible to the Google Mapping servers, so I set up Sites to expose it and allowed the Public Access Settings profile access to Leads and the Geolocation fields. It appears Google was caching the KML content. Adding a random query string was sufficient to get it updating.

There is still lots to explore here, with the next pressing part being finding a route between all the sites that need to be flown. If you've ever taken a computer algorithms or AI course, you'll know this as the Travelling Salesman Problem - how to find the optimal (or close to optimal) path that visits every node.

I'll come back to that shortly, as it is way too interesting not to try something like a genetic algorithm or nearest neighbor algorithm in Apex to look for some solutions.
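As a taste of where that could go, here is a quick nearest neighbour heuristic sketched in Apex. It uses the standard System.Location class and its getDistance method, and makes no claim to an optimal route - just a plausible visiting order:

```apex
public class RouteSketch {
    // Greedy nearest neighbour ordering: from the start point, repeatedly
    // visit the closest unvisited site. Simple, fast, and usually "good
    // enough" as a first cut, but not optimal.
    public static List<Location> nearestNeighbourRoute(Location start, List<Location> sites) {
        List<Location> route = new List<Location>();
        List<Location> remaining = new List<Location>(sites);
        Location current = start;
        while (!remaining.isEmpty()) {
            Integer nearestIdx = 0;
            Double nearestDist = current.getDistance(remaining[0], 'km');
            for (Integer i = 1; i < remaining.size(); i++) {
                Double d = current.getDistance(remaining[i], 'km');
                if (d < nearestDist) {
                    nearestDist = d;
                    nearestIdx = i;
                }
            }
            current = remaining.remove(nearestIdx);
            route.add(current);
        }
        return route;
    }
}
```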

In the meantime...


There are laws and regulations on where and when you can fly a UAV, quadcopter, kite, helium balloon on a really long string, and so on. You will need to educate yourself about the laws and regulations that may be applicable in your country, state, province, or locality.

Within New Zealand the Airshare website is a good starting point for the rules defined by the Civil Aviation Authority.

Tuesday, August 2, 2016

Preventing trigger recursion and handling a Workflow field update

Like a good developer, I've included recursion protection in my managed package after update triggers to prevent interactions with other triggers in an org from creating an infinite update loop that would ultimately end in a "maximum trigger depth exceeded" exception. The recursion protection mechanism is fairly basic. It uses a class level static Set of processed record Ids. The first thing the trigger does is skip additional processing from any record ID already in the Set. After the record is first processed by the trigger its Id is added to the static Set.
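A minimal sketch of that static-set guard pattern (illustrative only, not the actual packaged code):

```apex
public class RecursionGuard {
    // Record Ids already processed in this transaction.
    private static Set<Id> processedIds = new Set<Id>();

    // Returns only the records not yet processed this transaction,
    // marking them as processed along the way.
    public static List<Opportunity> filterUnprocessed(List<Opportunity> records) {
        List<Opportunity> result = new List<Opportunity>();
        for (Opportunity opp : records) {
            if (!processedIds.contains(opp.Id)) {
                processedIds.add(opp.Id);
                result.add(opp);
            }
        }
        return result;
    }
}
```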

The functionality for the triggers in question is dynamic in nature. Admins who install the managed package can configure a list of fields of interest on an Opportunity that will be dynamically mapped to custom records related to the Opportunity. E.g. They may map the Opportunity field with the API name "Description" into a custom record related to the Opportunity. This is then used for further processing when integrating with an external system. The important part is that it is entirely dynamic. Users of the managed package should be able to configure any Opportunity API field name and it will be mapped by the trigger to the custom record for further processing.

This setup works well, with one exception: what if a subsequent trigger or workflow field update rule makes further changes to one of the mapped fields? Per Triggers and Order of Execution, workflow rules execute after the triggers. The workflow rule will cause the trigger to fire again for the field update, but the current recursion protection will prevent any further processing from occurring.

12. If the record was updated with workflow field updates, fires before update triggers and after update triggers one more time (and only one more time), in addition to standard validations. Custom validation rules, duplicate rules, and escalation rules are not run again. [Source]

I needed a mechanism that detects if one of the dynamically mapped fields has subsequently changed and to run the trigger auto mapping again. In the simplest case where I was only interested in a single field changing a Map from the record ID to the last processed field value could be used (See How to avoid recursive trigger other than the classic 'class w/ static variable' pattern?). The challenge here is that the fields of interest are dynamic in nature so they can't be predefined in a Map.

In my case the trigger field mapping functionality was idempotent. So while it is important that it didn't run recursively if nothing had changed on the base record, I didn't need to be exact in which fields were changing. Given this, I went with storing the System.hashCode(obj) for the Opportunity at the time it was last processed. The hash code helps here as any change to a field on the Opportunity will change the hash code, making it ideal to detect if there has been any field changes on the Opportunity.

The following example was put together directly by hand, so it might contain syntax errors etc...

trigger DynamicOpportunityFieldTrigger on Opportunity (after update) {
    OpportunityFieldMapper ofm = new OpportunityFieldMapper();
    ofm.mapFields(trigger.new, trigger.oldMap);
}

public class OpportunityFieldMapper {
    private static Map<Id, Integer> visitedRecordIdsToLastSeenHashMap = new Map<Id, Integer>();

    // List of applicable Opportunities. Only includes records that still
    // need processing in this transaction.
    private List<sObject> recordsToMapFieldsFor = new List<sObject>();

    public void addOpportunity(Opportunity opp) {
        if (visitedRecordIdsToLastSeenHashMap.containsKey(opp.Id)) {
            Integer lastSeenHash = visitedRecordIdsToLastSeenHashMap.get(opp.Id);
            Integer currentHash = System.hashCode(opp);

            if (lastSeenHash == currentHash) {
                System.debug(LoggingLevel.Debug, 'OpportunityFieldMapper.addOpportunity skipping visited OpportunityId: ' + opp.Id + '. Unchanged hash');
                return;
            }
            System.debug(LoggingLevel.Debug, 'OpportunityFieldMapper.addOpportunity Hash for OpportunityId: ' + opp.Id + ' changed from ' + lastSeenHash + ' to ' + currentHash + '. Not skipping due to change');
        }
        visitedRecordIdsToLastSeenHashMap.put(opp.Id, System.hashCode(opp));
        // Queue for later field mapping.
        recordsToMapFieldsFor.add(opp);
    }

    public void mapFields(List<Opportunity> triggerNew, Map<Id, Opportunity> triggerOldMap) {
        for (Opportunity opp : triggerNew) {
            addOpportunity(opp);
        }
        // Use the recordsToMapFieldsFor collection to perform the actual mappings
    }
}

Thursday, June 30, 2016

Monitoring your Salesforce API usage

This seems to be a fairly common request on the Salesforce forums that developers frequent.

  • What is the REQUEST_LIMIT_EXCEEDED: TotalRequests Limit exceeded. error about?
  • How can I find out what caused me to hit it?
  • How do I determine what is making the API calls?
  • How can I get insight into an API call that took place on a certain date for one of our connected apps?

First, some context. This is the Total API Requests limit. It is a rolling limit for an organization over the last 24 hours. This means an API call made just now will count towards that limit until 24 hours from now. Don't expect the limit to reset back to zero at midnight.

The exact size of this limit depends on the Salesforce Edition and the number of per user licenses of a given type you have. It is possible to purchase additional API calls without needing more user licenses.

Basic monitoring

There are two locations where it is easy to check the current API usage.

Under Setup > Company Profile > Company Information there is an API Requests field. This will show you the current API call count and the maximum you can reach.

Then, within Reports > Administrative Reports there is the API Usage Last 7 Days report.

This provides slightly finer detail, such as the username that held the session and the Client Id that was used to make the call. The Client Id can be useful, as it can identify which external app was consuming the API calls.

Receiving a warning

You can configure an email alert to a user when the API requests exceed a percentage of your maximum requests. This is a RateLimitingNotification record that you create from Setup > Administration Setup > Monitoring > API Usage Notifications.

Monitoring via the API

The REST API has a Limits resource and a specific Organizations Limits resource that includes the "DailyApiRequests".
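As a quick sketch, you can read that resource from Apex itself. This assumes a Remote Site Setting exists for your instance URL, and uses API version v37.0 (current as of Summer '16):

```apex
// Sketch: read the DailyApiRequests entry from the REST Limits resource.
HttpRequest req = new HttpRequest();
req.setEndpoint(Url.getSalesforceBaseUrl().toExternalForm()
    + '/services/data/v37.0/limits/');
req.setMethod('GET');
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
HttpResponse res = new Http().send(req);

Map<String, Object> limits =
    (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
Map<String, Object> apiRequests =
    (Map<String, Object>) limits.get('DailyApiRequests');
System.debug('DailyApiRequests Remaining: ' + apiRequests.get('Remaining')
    + ' of Max ' + apiRequests.get('Max'));
```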

The SOAP APIs have a LimitInfoHeader that can be used to monitor API usage.

Event Monitoring API

The Event Monitoring API can provide much finer details about API calls over a wider history. With this (paid feature) you can see exactly what API calls were made in a time period.

E.g. From the developer console query editor

select Id, EventType, LogDate, LogFileLength from EventLogFile where EventType = 'API' and LogDate = 2016-02-20T00:00:00.000Z

Look for the EventType of API and the LogDate for the UTC day of interest. Unfortunately there isn't a single comprehensive EventType that will allow you to monitor all events that contribute to the limit. There are also the Bulk API, Metadata API Operation, and REST API event types.

You can then pull down the single LogFile data, which is a base64-encoded CSV with all the API calls for that day.

select Id, LogFile from EventLogFile where ID = '0AT700000005WDaGAM'

The FuseIT SFDC Explorer has an Event Log tab that uses the same API calls and will extract the file for you and export it to a CSV.

For the LogFile, look for the USER_ID, CLIENT_IP, and CLIENT_NAME to help identify which app is making the calls.
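The decode step can also be done in anonymous Apex, as a quick spot check. Keep in mind a large log file can exceed Apex heap limits, so this sketch is only reasonable for small files:

```apex
// Sketch: pull down one day's API event log and decode the CSV text.
EventLogFile elf = [SELECT Id, LogFile
                    FROM EventLogFile
                    WHERE EventType = 'API'
                    AND LogDate = 2016-02-20T00:00:00.000Z
                    LIMIT 1];
String csv = elf.LogFile.toString();
// The header row names the columns, including USER_ID, CLIENT_IP
// and CLIENT_NAME.
System.debug(csv.split('\n')[0]);
```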

See also:

Tuesday, June 28, 2016

Taming the size of a Salesforce Canvas

For an artist, facing a blank canvas can be a real challenge. For a Salesforce developer a Canvas app can be challenging for an entirely different reason - how to even define what size the Canvas is to start with?

In the ideal world you could just follow the docs and Automatically Resize the Canvas App to suit the content size. In practice this doesn't work so well for all scenarios.

I'm looking to embed an existing web application that was previously located in a Web Tab iframe into a Canvas Visualforce page using <apex:canvasApp />. My ideal goal is that the external web application blends in with Salesforce. There shouldn't be any jarring scrollbars on the iframe that make it look out of place.

The Web Tab approach worked well in that the width of the iframe didn't need to be defined, so it would get an <iframe width="100%">. The iframe can then shrink and grow to follow along with any changes to the browser size, and the nested app can correspondingly adjust its width to suit the available dimensions. The downside to this approach is that the iframe's height needs to be specified. This is more problematic, and requires a fixed height that is sufficient to hold the majority of content. E.g. 3000px. Ech! Crazy vertical scroll bar!

Back to the Canvas iframe and the problem with dimensions. Firstly, a plain default Canvas app won't size past 2000px in height and 1000px in width. You need to explicitly set the maxHeight and maxWidth attributes to "infinite" (documented in some places but easy to miss in others). With the browser at full width on a 1080p screen the default 1000px width limit is way too low. Sadly there are currently no corresponding minWidth/minHeight attributes.
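For reference, a minimal page sketch that lifts those caps (the application name and height value are placeholders, and I make no promise the "100%" width behaves the same in every container):

```apex
<apex:page showHeader="false" standardStylesheets="false">
    <!-- maxWidth/maxHeight default to 1000px/2000px; "infinite" removes the cap -->
    <apex:canvasApp applicationName="My_Canvas_App"
                    width="100%"
                    height="900px"
                    maxWidth="infinite"
                    maxHeight="infinite"
                    scrolling="no"/>
</apex:page>
```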

Now that we've pulled the limits off how big the iframe can get, how do we correctly size it to both the browser window and the content within? As mentioned above, the auto resize should be just the thing here. Unfortunately it doesn't play so well with content that dynamically scales to the available space. I found it would either default to the minimum page width defined by the content or, worse still, shrink as the iframe content used JavaScript to resize and then reacted to the auto resizing in an unending loop. If there were a way to define the minimum height and to let the iframe width stay at "100%", it would be infinitely more useful.

The autogrow() resizing script appears to come from canvas-all.js. It is basically a timer to periodically call resize. I haven't gone through the fine details, but I believe part of the code is for communicating with the parent iframe so that it can be resized accordingly.

How can I size a Canvas apps iframe in Visualforce to be the full window width with a height to fit the content?

At this stage I'm experimenting with either using a custom canvas-all.js implementation or manually calling Sfdc.canvas.client.resize(). I'll update this page when I get somewhere workable.

Another potential problem with the height is dynamic content, such as modal dialogs, that may not register as affecting the page height.

See also: