Wednesday, August 24, 2016

Dreamforce 2016 Session Picks and General Tips

Here are some of my current picks for Dreamforce 2016 sessions. They are mostly development or architecture focused. I'll be refining the list as more information becomes available. As with previous years, it's likely that I won't actually get to all of these and will need to prioritize activities that can only happen at Dreamforce vs. things that are being recorded.

If you haven't already registered for Dreamforce, you can use the Developer Discount code D15DEV999F.

Artifacts

Something is coming in the packaging/change set/metadata deployment area. The first session in this list is definitely worth a visit. Here's hoping for some sort of source control integrated deployments.

Mocking and Testing

The Winter '17 Release notes include the section - Build a Mocking Framework with the Apex Stub API (Pilot). I'm led to believe that the first of the talks below will have some more details on using the System.StubProvider interface and System.Test.createStub() method. If only because Aaron Slettehaugh from Salesforce is also presenting with Jesse Altman.
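
Based on the release note description, the shape of the pilot API is a System.StubProvider implementation plus a Test.createStub() call that manufactures the stubbed instance. The sketch below is my guess at minimal usage, written before getting hands-on with the pilot; the DeliveryService class is a made-up example, not anything shipped by Salesforce.

// The class being stubbed; an instance method's behaviour gets replaced by the stub.
public class DeliveryService {
    public Integer estimateDays(String postcode) {
        // Imagine a callout or an expensive calculation here in the real implementation.
        return 10;
    }
}

// StubProvider implementation that supplies canned behaviour for the stub.
@isTest
public class DeliveryServiceMockProvider implements System.StubProvider {
    public Object handleMethodCall(Object stubbedObject, String stubbedMethodName,
            Type returnType, List<Type> listOfParamTypes,
            List<String> listOfParamNames, List<Object> listOfArgs) {
        if (stubbedMethodName == 'estimateDays') {
            return 3; // canned response instead of the real calculation
        }
        return null;
    }
}

// In a test method the stub replaces the real instance:
// DeliveryService mock = (DeliveryService) Test.createStub(
//     DeliveryService.class, new DeliveryServiceMockProvider());
// System.assertEquals(3, mock.estimateDays('7010'));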

Meet The *'s

The Meet the Developers session on the last day of the conference is always an interesting one and might not be recorded. This year I see two additional variations.

Keynotes

  • Developer Keynote - Thursday Oct 6th 11-12pm
  • Main Dreamforce Keynote - Wednesday Oct 5th 1-3pm
  • Mark & Exec Q&A - Friday Oct 7th 2-3pm
  • Something on the new Einstein AI product? Salesforce Einstein Keynote: AI for Everyone

Lightning

It goes without saying that everything will be either Lightning or Trailhead based this year. Probably both.

Custom Metadata

IoT

Platform Events

Winter '17 Release Notes:

Use platform events to deliver secure and scalable custom notifications within Salesforce or from external sources. Define fields to customize your platform event. Your custom platform event determines the event data that the Force.com Platform can produce or consume.
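
I haven't had a chance to try the pilot, so the snippet below is only my guess at how publishing an event from Apex might look. Order_Shipped__e and its Order_Number__c field are hypothetical, and the EventBus.publish() call is an assumption about the eventual Apex API rather than confirmed syntax.

// Sketch only: Order_Shipped__e / Order_Number__c are made-up names, and the
// publish call is an assumption about how the Apex API will be exposed.
Order_Shipped__e shippedEvent = new Order_Shipped__e(Order_Number__c = 'ORD-1234');
Database.SaveResult publishResult = EventBus.publish(shippedEvent);
if (!publishResult.isSuccess()) {
    for (Database.Error err : publishResult.getErrors()) {
        System.debug(LoggingLevel.ERROR, 'Publish failed: ' + err.getMessage());
    }
}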

Miscellaneous


General Tips

Hopefully you've been an adult for long enough by now to know if you're going to do a lot of walking you probably need to wear something comfortable on your feet. Seems like an odd thing to have to remind people about. Then again, all my shoes are comfortable. Why are people buying shoes that aren't comfortable?

  • Don’t bring your laptop. In previous years other channels were promoting travelling light with just a cellphone and maybe a tablet. I say bring a small laptop to the developer zone in Moscone West. Seen something cool in a session and want to try it out straight away? A laptop gives you full access to Salesforce and your favorite developer tools. I've never tried to code Apex on my phone or tablet, but I'm pretty sure it would be a frustratingly slow experience. With Trailhead being such a big part of the developer zone this year, it could be useful to knock out a few modules on the go. There are also the development "Mini Hacks" to be completed. Easier to have your own machine on hand than have to wait for a community machine of unknown configuration.
  • Following on from that, create a blank dev org. Maybe a prerelease org. This gives you a blank canvas to experiment from.
  • Bring a power bank type device to charge your cellphone so you can avoid being tied to a power point (wall outlet). You can probably pick several of these up from vendors as giveaways if need be.
  • Talk to the person next to you, find out what they do for a living with Salesforce. Find out what sessions they liked so far and what they intend to attend.
  • If you get a good photo of a presenter during a session, share it with them. The session audio and slides are often recorded, but there may be no other visual proof that they presented at Dreamforce.
  • Be mindful of who you let scan your badge. By all means, if you want to hear from them again scan away. Otherwise, is it worth giving your contact details to a vendor for some shiny trinkets to take home to the kids?
  • A developer can mostly stay within the Moscone West building and find plenty of suitable sessions and vendors to visit. It will be full of technical sessions and activities. That's not to say that an excursion out to Moscone North for the main expo isn't worth it. (Expo map)
  • Be adaptable with your scheduling. The majority of the sessions are recorded. It's sad for the presenters who have put so much effort into creating their sessions, but focus on things that you can't catch up on later in the recordings.
  • Stop by the Admin Zone. In previous years they have offered professional headshots (Headshot studio?). Do this early in the conference before lack of sleep starts to catch up with you.
  • Get your Twitter handle and avatar on your badge. I spend all year interacting with people via Twitter, then struggle to identify them in real life if they don't resemble their abstract avatar.
  • Fleet Week San Francisco is on October 3-10. If you like planes, the airshow is worth a detour or an extended stay if you can manage it.
  • Track Dreamforce Shuttle Locations in Real-Time: http://www.dreamforcebuses.com/
  • The Salesforce Events App: iPhone and Android
  • Dreamforce BART promotion for round trip from SFO to downtown SF.
    Get a Clipper card for Muni, BART, & Caltrain to get around! FYI, 511.org brings Bay Area transit all together in one site.

International Traveler

  • Plan on having an extra bag on the way back in case you pick up some oversized swag.
  • Get a local SIM card. Have a plan if you previously relied on SMS two-factor authentication. Update apps like Uber with your new temporary contact details.
  • Switch your Calendar to PST.
  • If you can time it right, drop ship things to the FedEx office at 726 Market St. It is only a quick walk from the conference and you can select "Hold at FedEx location" when shipping.

See also:

Tuesday, August 16, 2016

Using Two-Factor Authentication in Salesforce with Windows 10 Mobile

As part of the Trailhead User Authentication - Secure Your Users' Identity module I enabled Two-Factor authentication for a user in my developer org.

Upon logging in as the user required to use 2FA, I now get the following prompt to download the "Salesforce Authenticator from the App Store or Google Play":

As a Windows Phone / Windows 10 Mobile user this wasn't really an option for me.

Happily, Salesforce is using the IETF RFC 6238 time-based one-time password (TOTP) algorithm. Because it's a standard, we can substitute in another app that is available on the platform - such as Microsoft's Authenticator.
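
For the curious, RFC 6238 is simple enough to sketch out in Apex. The following is a rough hand-written illustration of how the six digit codes are derived. It assumes the shared secret is already hex encoded; authenticator apps normally exchange it Base32 encoded, which I've skipped for brevity.

// Rough sketch of RFC 6238 TOTP code generation in Apex.
// Assumes the shared secret is supplied hex encoded (Base32 decoding omitted).
public class TotpSketch {

    public static String generateCode(String secretHex) {
        // 30 second time step counted from the Unix epoch
        Long counter = DateTime.now().getTime() / 1000 / 30;

        // Encode the counter as an 8 byte big-endian value (16 hex characters)
        String counterHex = '';
        Long remaining = counter;
        for (Integer i = 0; i < 16; i++) {
            Integer nibble = Math.mod(remaining, 16L).intValue();
            counterHex = '0123456789abcdef'.substring(nibble, nibble + 1) + counterHex;
            remaining = remaining / 16;
        }

        // HMAC-SHA1 of the counter using the shared secret
        Blob hmac = Crypto.generateMac('hmacSHA1',
            EncodingUtil.convertFromHex(counterHex),
            EncodingUtil.convertFromHex(secretHex));
        String hmacHex = EncodingUtil.convertToHex(hmac);

        // Dynamic truncation: the low nibble of the last byte selects a 4 byte window
        Integer offset = hexToInt(hmacHex.right(1));
        Integer binCode = ((hexToInt(hmacHex.mid(offset * 2, 2)) & 127) << 24)
            | (hexToInt(hmacHex.mid(offset * 2 + 2, 2)) << 16)
            | (hexToInt(hmacHex.mid(offset * 2 + 4, 2)) << 8)
            | hexToInt(hmacHex.mid(offset * 2 + 6, 2));

        // Six digit, zero padded code
        String code = String.valueOf(Math.mod(binCode, 1000000));
        while (code.length() < 6) {
            code = '0' + code;
        }
        return code;
    }

    private static Integer hexToInt(String hexValue) {
        Integer result = 0;
        for (Integer i = 0; i < hexValue.length(); i++) {
            result = result * 16 + '0123456789abcdef'.indexOf(hexValue.mid(i, 1).toLowerCase());
        }
        return result;
    }
}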

  • Use the "Choose Another Verification Method" link at the bottom of the "Connect Salesforce Authenticator" page.
  • Choose "Use verification codes from an authenticator app"
  • Start the Authenticator app on your phone. Use the "add" app bar button. Use the "Scan" button.
  • Optionally tap the new entry to give it a more meaningful name.
  • Use the generated code to complete the authentication process.

See also:

Wednesday, August 3, 2016

Integrating UAV/drone remote data acquisition with Salesforce to enhance logistics

Late last year I did a brief interview at the Auckland Salesforce APAC tour event and talked about capturing sensor data from quadcopters in flight and the potential of integrating this data with Salesforce.

One of the first scenarios that I explored was one that is topical here in New Zealand.

New Zealand has no native terrestrial mammals (except for bats and seals).
[Biodiversity of New Zealand and DOC Native animals]

If something is running around in the bush and it isn't an insect or bird (or a very confused native bat or seal) then it is an introduced species.

A number of introduced species, such as rats, possums, and mustelids, survive by preying on native species. And if they aren't directly eating the natives, they are competing for the same food and resources. Possums and feral cats can also spread diseases. All in all, they are unwelcome visitors in the local ecosystem. Monitoring and trapping programs are used by various conservation groups to help control the spread and population of introduced species.

A large number of traps and monitoring stations are deployed out in the wild and on farms in locations where it can be time consuming to check them; either because of their remoteness and distribution over many hectares, or simply because of how many there are. Staff or volunteers need to physically visit each site regularly to check if the trap needs any attention. The labour involved in checking and maintaining the traps can be a significant percentage of the overall cost [Source] and a limiting factor in how many traps can be deployed.

[Figure: Decline (attenuation) in radio signal strength through forest with increasing distance from a transmitter at four frequencies, compared to transmission through free space (i.e. with no vegetation). [Source]]

There are existing solutions that utilize a sensor and wireless connection on each trap. These sensors rely on having either an internet, cellular, or satellite link to communicate the trap status back. Transmitting through dense forest also reduces radio signal strength.

Which brings me back to Salesforce and UAVs. Can I cut the cost of the equipment deployed with each trap by keeping the sensor and wireless link very basic and then relying on UAV flights in the area to collect the data periodically? The end goal is to allow trap checkers to focus their attention where they will be most productive, and to expand the area of operation.

High level plan:

  • Put a small short range transmitter on each trap that sends out a periodic signal when the trap needs attention. Ideally the sensor on each trap will be very basic and will be able to transmit for a sufficient period of time once triggered. The transmission range should be able to reach a height above the canopy where the UAV can pass by.
  • Record the geolocation where each trap is deployed using a record in Salesforce.
  • Periodically dispatch a UAV to fly a circuit of the traps in the area. The paths for these flights can be determined from the geolocation data in Salesforce.
  • The UAV will carry the receiver and a small computer (Raspberry Pi, Arduino or similar) to capture the signal data plus additional telemetry (GPS location at time of signal).
  • When the UAV has an internet connection, relay the collected data back into Salesforce (see the sketch after this list).
  • Use the collected data to notify which traps need attention.
  • Run analytics over the gathered data to identify gains, such as finding areas that would benefit from having more traps deployed.
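
To make the relay step a bit more concrete, here is one way the Salesforce side could look: a minimal Apex REST endpoint that the UAV's companion computer POSTs its captured signals to once it is back in coverage. The Trap_Signal__c object and its fields are placeholders I've made up for the sketch, not a finished design.

// Rough sketch only - object and field names are placeholders.
@RestResource(urlMapping='/trapSignals/*')
global with sharing class TrapSignalRestService {

    // Shape of each captured signal in the POSTed JSON body.
    global class SignalReading {
        public String trapIdentifier;   // identifier broadcast by the trap's transmitter
        public Decimal uavLatitude;     // UAV position when the signal was received
        public Decimal uavLongitude;
        public DateTime capturedAt;
    }

    @HttpPost
    global static String receiveSignals() {
        List<SignalReading> readings = (List<SignalReading>) JSON.deserialize(
            RestContext.request.requestBody.toString(), List<SignalReading>.class);

        List<Trap_Signal__c> records = new List<Trap_Signal__c>();
        for (SignalReading reading : readings) {
            records.add(new Trap_Signal__c(
                Trap_Identifier__c = reading.trapIdentifier,
                Captured_At__c = reading.capturedAt,
                UAV_Position__Latitude__s = reading.uavLatitude,
                UAV_Position__Longitude__s = reading.uavLongitude));
        }
        insert records;
        return 'Received ' + records.size() + ' signal readings';
    }
}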

There are a lot of moving parts here (some more literally than others). So before I get too far ahead of myself we'll start with some of the basics. I'll cover off all the parts in a number of subsequent blog posts as there is lots to cover.

Trap tracking in Salesforce

A good place to start will be how the trap records are stored in Salesforce. In the simplest case this can just be a custom object with fields for the applicable details, such as when and where the trap was deployed. A geolocation compound field is particularly useful for the latter as it brings native support for distance calculations around a latitude and longitude pair.

I'll take a slight detour here from the immediate scenario above to explore something similar that still covers a number of important points. Another introduced pest species here in Nelson is the Great white butterfly. The key difference here is that much of the trapping for the great white butterfly is occurring in a suburban environment around residential addresses. This allows the use of the new automatic geocoding for addresses that became available in Summer '16.

Before the auto geocoding occurs, you need to review and activate the Data.com Clean rules. I activated them for Leads as a starting point.
Setup > Administer > Data.com Administration > Clean > Clean Rules

I also found it useful to add additional formula fields for the geolocation fields (latitude, longitude and accuracy) as they can't otherwise be directly exposed on the page layout.

Now, with just the street address details for the properties of interest entered against Leads in Salesforce, a SOQL query can be used to find the points I need to fly to within my operational area. The GEOLOCATION function defines the takeoff point and the DISTANCE function searches for sites of interest within the operating area.

SELECT Id, LastName, Latitude, Longitude
FROM Lead
WHERE DISTANCE(Address, GEOLOCATION(-41.264268, 173.291987), 'KM') < 2

Unfortunately the native Visualforce mapping controls aren't available in developer edition orgs. Instead I'll export the locations of interest in the Keyhole Markup Language (KML) used by Google Earth. I was going to use the Google Earth API to embed it in a web page, but it was deprecated by Google (Boo!).


A Visualforce page can be used to generate the KML file from the SOQL query. This is a minimal initial version. I'll parameterize the origin location for the search and likely use it as the starting point for the flight.

Visualforce to generate KML

Controller to generate KML
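
In rough outline, the controller boils down to running the SOQL query above and emitting a KML Placemark per result. The sketch below is hand written for this post and illustrative only (TrapKmlController and friends aren't the exact code in the gist); a page along these lines would simply render {!kml} with the contentType set to application/vnd.google-earth.kml+xml.

// Illustrative sketch only - names like TrapKmlController are made up for this post.
public with sharing class TrapKmlController {

    // Leads within 2 km of the (currently hard coded) takeoff point.
    public List<Lead> sites { get; private set; }

    public TrapKmlController() {
        sites = [SELECT Id, LastName, Latitude, Longitude
                 FROM Lead
                 WHERE DISTANCE(Address, GEOLOCATION(-41.264268, 173.291987), 'km') < 2];
    }

    // KML document with a Placemark per site, bound to the page as {!kml}
    public String getKml() {
        String kml = '<?xml version="1.0" encoding="UTF-8"?>'
            + '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>';
        for (Lead site : sites) {
            kml += '<Placemark><name>' + site.LastName.escapeXml() + '</name>'
                + '<Point><coordinates>' + site.Longitude + ',' + site.Latitude
                + ',0</coordinates></Point></Placemark>';
        }
        return kml + '</Document></kml>';
    }
}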

That's sufficient to export the KML file into Google Earth for the points of interest.

Visualforce for Google Maps loading KML


An additional Visualforce page can be set up to embed a Google Map and directly load the KML file in. I've kept the Google API key in a custom setting. The KML file needed to be publicly accessible to the Google mapping servers, so I set up Sites to expose it and allowed the Public Access Settings profile access to Leads and the geolocation fields. It appears Google was caching the KML content; adding a random query string was sufficient to get it updating.


There is still lots to explore here, with the next pressing part being finding a route between all the sites that need to be flown. If you've ever taken a computer algorithms or AI course, you'll know this as the Travelling Salesman Problem - how to find the optimal (or close to optimal) path that visits every node.

I'll come back to that shortly, as it is way too interesting not to try something like a genetic algorithm or nearest neighbor algorithm in Apex to look for some solutions.
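
As a taste of the nearest neighbour approach, a greedy pass over the Lead geolocations could look something like the hand-written sketch below. It won't find the optimal tour, but it gives a cheap baseline to compare anything fancier against.

// Greedy nearest neighbour routing sketch - illustrative only.
public class NearestNeighbourRoute {

    // Returns the sites reordered as a greedy tour starting from the origin.
    public static List<Lead> plan(Location origin, List<Lead> sites) {
        List<Lead> remaining = new List<Lead>(sites);
        List<Lead> route = new List<Lead>();
        Location currentLocation = origin;

        while (!remaining.isEmpty()) {
            Integer nearestIndex = 0;
            Double nearestDistance = null;
            for (Integer i = 0; i < remaining.size(); i++) {
                Location candidate = Location.newInstance(remaining[i].Latitude, remaining[i].Longitude);
                Double distanceKm = Location.getDistance(currentLocation, candidate, 'km');
                if (nearestDistance == null || distanceKm < nearestDistance) {
                    nearestDistance = distanceKm;
                    nearestIndex = i;
                }
            }
            Lead next = remaining.remove(nearestIndex);
            route.add(next);
            currentLocation = Location.newInstance(next.Latitude, next.Longitude);
        }
        return route;
    }
}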

In the meantime...


Caveats

There are laws and regulations on where and when you can fly a UAV, quadcopter, kite, helium balloon on a really long string, etc...
You will need to educate yourself about the laws and regulations that may be applicable in your country, state, province or locality.

Within New Zealand the Airshare website is a good starting point for the rules defined by the Civil Aviation Authority.

Tuesday, August 2, 2016

Preventing trigger recursion and handling a Workflow field update

Like a good developer, I've included recursion protection in my managed package's after update triggers to prevent interactions with other triggers in an org from creating an infinite update loop that would ultimately end in a "maximum trigger depth exceeded" exception. The recursion protection mechanism is fairly basic. It uses a class level static Set of processed record Ids. The first thing the trigger does is skip additional processing for any record Id already in the Set. After a record is first processed by the trigger its Id is added to the static Set.

The functionality for the triggers in question is dynamic in nature. Admins who install the managed package can configure a list of fields of interest on an Opportunity that will be dynamically mapped to custom records related to the Opportunity. E.g. they may map the Opportunity field with the API name "Description" into a related custom record. This is then used for further processing when integrating with an external system. The important part is that it is entirely dynamic. Users of the managed package should be able to configure any Opportunity field API name and have it mapped by the trigger to the custom record for further processing.
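
Because the field API names only exist as configuration data, the values are read with SObject.get() rather than compile-time field references. Something along the lines of this simplified, hand-written illustration (names made up for the post):

// Simplified illustration of the dynamic field access; the real mapping also
// writes the values to the related custom records used by the integration.
public class DynamicFieldReader {
    public static Map<String, Object> readConfiguredFields(Opportunity opp, Set<String> configuredFieldApiNames) {
        Map<String, Object> values = new Map<String, Object>();
        for (String fieldApiName : configuredFieldApiNames) {
            // e.g. opp.get('Description') when the admin has configured the Description field
            values.put(fieldApiName, opp.get(fieldApiName));
        }
        return values;
    }
}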

This setup works well with one exception. What if a subsequent trigger or workflow field update rule makes further changes to one of the mapped fields? In the Triggers and Order of Execution documentation, the workflow rules execute after the triggers. The workflow rule will cause the trigger to fire again for the field update, but the current recursion protection will prevent any further processing from occurring.

12. If the record was updated with workflow field updates, fires before update triggers and after update triggers one more time (and only one more time), in addition to standard validations. Custom validation rules, duplicate rules, and escalation rules are not run again. [Source]

I needed a mechanism that detects if one of the dynamically mapped fields has subsequently changed and runs the trigger auto mapping again. In the simplest case, where I was only interested in a single field changing, a Map from the record Id to the last processed field value could be used (See How to avoid recursive trigger other than the classic 'class w/ static variable' pattern?). The challenge here is that the fields of interest are dynamic in nature so they can't be predefined in a Map.

In my case the trigger field mapping functionality was idempotent. So while it was important that it didn't run recursively if nothing had changed on the base record, I didn't need to be exact about which fields were changing. Given this, I went with storing the System.hashCode(obj) for the Opportunity at the time it was last processed. The hash code helps here as any change to a field on the Opportunity will (barring an unlikely hash collision) change the hash code, making it ideal for detecting whether there have been any field changes on the Opportunity.

The following example was put together directly by hand, so it might contain syntax errors etc...


trigger DynamicOpportunityFieldTrigger on Opportunity (after update) {
    OpportunityFieldMapper ofm = new OpportunityFieldMapper();
    ofm.mapFields(Trigger.new, Trigger.oldMap);
}

public class OpportunityFieldMapper {
    // Record Id to the System.hashCode of the Opportunity as it was last processed
    private static Map<Id, Integer> visitedRecordIdsToLastSeenHashMap = new Map<Id, Integer>();

    // Opportunities that still need field mapping in this transaction.
    // Records whose hash is unchanged since they were last processed are excluded by addOpportunity.
    private List<Opportunity> recordsToMapFieldsFor = new List<Opportunity>();

    public void addOpportunity(Opportunity opp) {
        if (visitedRecordIdsToLastSeenHashMap.containsKey(opp.Id)) {
            Integer lastSeenHash = visitedRecordIdsToLastSeenHashMap.get(opp.Id);
            Integer currentHash = System.hashCode(opp);

            if (lastSeenHash == currentHash) {
                System.debug(LoggingLevel.DEBUG, 'OpportunityFieldMapper.addOpportunity skipping visited OpportunityId: ' + opp.Id + '. Unchanged hash');
                return;
            }

            System.debug(LoggingLevel.DEBUG, 'OpportunityFieldMapper.addOpportunity Hash for OpportunityId: ' + opp.Id + ' changed from ' + lastSeenHash + ' to ' + currentHash + '. Not skipping due to change');
        }

        visitedRecordIdsToLastSeenHashMap.put(opp.Id, System.hashCode(opp));

        // Queue for later field mapping.
        recordsToMapFieldsFor.add(opp);
    }

    public void mapFields(List<Opportunity> triggerNew, Map<Id, Opportunity> triggerOldMap) {
        for (Opportunity opp : triggerNew) {
            addOpportunity(opp);
        }

        // Use the recordsToMapFieldsFor collection to perform the actual mappings
    }
}