Monday, 21 January 2019

Unsaved Changes in Spring 19



As I mentioned in my last post, it’s often the little things in a release that make all the difference. Another small change in the Spring 19 release is the lightning:unsavedChanges aura component (yes aura, remember that in Spring 19 Lightning Web Components are GA and Lightning Components are renamed Aura Components). The release notes are somewhat understated - "Notify the UI about unsaved changes in your component” - but what this means is that I can retire a bunch of code/components that I’ve written in the past to stop the user losing their hard earned changes to a record. Even better, I’m wiring in to the standard Lightning Experience behaviour, so I don’t need to worry if Salesforce change this behaviour - I’ll just pick it up automatically.


My example component is a simple form with a couple of inputs - first and last name:

<aura:component implements="flexipage:availableForAllPageTypes">
    <aura:attribute name="firstname" type="String" />
    <aura:attribute name="lastname" type="String" />
    <aura:handler name="change" value="{!v.firstname}" action="{!c.valueChanged}"/>
    <aura:handler name="change" value="{!v.lastname}" action="{!c.valueChanged}"/>

    <lightning:card variant="narrow" title="Unsaved Example">
        <div class="slds-p-around_medium">
            <lightning:input type="text" label="First Name" value="{!v.firstname}" />
            <lightning:input type="text" label="Last Name" value="{!v.lastname}" />
            <lightning:button variant="brand" label="Save" title="Save" onclick="{!c.save}" />
        </div>
    </lightning:card>

    <lightning:unsavedChanges aura:id="unsaved"
                ondiscard="{!c.discard}" />
</aura:component>

I have change handlers on the first name and last name attributes, so that I can notify the container that there are unsaved changes. I achieve this via the embedded lightning:unsavedChanges component, which exposes a method named setUnsavedChanges. In my change handler I invoke this method:

valueChanged : function(component, event, helper) {
    var unsaved = component.find("unsaved");
    unsaved.setUnsavedChanges(true, { label: 'Unsaved Example' });
}

So now if I add this component to a Lightning App page, enter some text and then try to click on another tab, I get a nice popup telling me that I may be making a big mistake. It also displays the label that I passed to the setUnsavedChanges method, so that if there are multiple forms on the page then the user can easily identify which one it is referring to.

Screenshot 2019 01 20 at 11 20 02

Of course, my Evil Co-Worker immediately spotted that they could pass the label of a different component and keep the user in a loop where they believe they have saved the changes but keep getting the popup. If I’m honest I expected something a bit more evil from them, like calling a method that clears the unsaved changes without saving the record when the user clicks the Save button - they’re clearly getting lazy as well as Evil in their old age.

What’s even nicer is that if I click the Discard Changes or Save buttons, my controller methods to discard or save the record are invoked, because I specified these on the component:

<lightning:unsavedChanges ... onsave="{!c.save}" ondiscard="{!c.discard}" /> 

so I have full control over what happens.
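For completeness, a minimal sketch of what those controller methods might look like follows - the actual record persistence is omitted, so treat the bodies as hypothetical; only the setUnsavedChanges(false) call to clear the unsaved state comes from the component itself:

```javascript
({
    save : function(component, event, helper) {
        // Persist the record here (server-side action omitted from this sketch),
        // then tell Lightning Experience the changes are saved
        var unsaved = component.find("unsaved");
        unsaved.setUnsavedChanges(false);
    },
    discard : function(component, event, helper) {
        // Throw away the user's edits and clear the unsaved flag
        component.set("v.firstname", "");
        component.set("v.lastname", "");
        var unsaved = component.find("unsaved");
        unsaved.setUnsavedChanges(false);
    }
})
```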

This kind of interaction with the user is possible when I click on a link that is managed by the Lightning Experience, but not so much if I reload the page or navigate to a bookmark. In this scenario I’m at the mercy of the beforeunload event, which is browser specific and much more limited.
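For reference, the full page reload case relies on something along the lines of the snippet below - this is plain browser JavaScript rather than anything Salesforce-specific, the hasUnsavedChanges flag is a hypothetical dirty-state variable of your own, and note that modern browsers ignore any custom message and show their own generic prompt:

```javascript
window.addEventListener('beforeunload', function (event) {
    if (hasUnsavedChanges) {          // your own dirty-state flag
        event.preventDefault();       // standard way to request the prompt
        event.returnValue = '';       // some browsers require this as well
    }
});
```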

Screenshot 2019 01 20 at 11 36 40

This behaviour can’t be customised - the idea being that you can’t hold a user on your page against their will, regardless of whether you are doing it for their own good.

On another note, when I created the original lightning app page for this I used a single column, but the inputs then spanned the whole page. Thanks to the Spring 19 feature of changing lightning page templates, I was able to switch to header, body and right sidebar in a few seconds, rather than having to create a whole new page. I sense that feature is the gift that keeps giving.



Saturday, 12 January 2019

Change Lightning Page Template in Spring 19



Sometimes it’s the little things that make all the difference, and in the Spring 19 release of Salesforce there’s a small change that will save me a bunch of time - support for changing the template of a Lightning Page. I seem to spend a huge amount of time recreating Lightning pages once I add a component that needs more room than is available. Often this is historic Bob's fault, because I’ve written a custom component that needs full width, but when I originally created the page I didn’t foresee this and chose a template that is now obviously unsuitable.

Changing Template

Here’s my demo page - as you can see the dashboard is a bit squashed as I’ve gone for main region and sidebar:

Screenshot 2019 01 12 at 16 13 44

Changing template is pretty simple - in the Lightning App builder (it can’t be long before this is renamed the Lightning Page builder can it?) edit the page and find the new option on the right hand side:

Screenshot 2019 01 12 at 16 15 13

Clicking this opens the change template dialog - initially, much like creating a page I get to choose from a list. I’ve gone for header and right sidebar so that my dashboard can take the full width header section:

Screenshot 2019 01 12 at 16 15 51

The next page is a departure from previous experience, in that I need to tell the dialog how to map the components from the old to the new template - I’ve left it as the defaults aside from the main region (which has my dashboard) which I add to the new header region:

Screenshot 2019 01 12 at 16 16 10

Once I’ve saved the page, I can view it and see the dashboard in its full width glory:

Screenshot 2019 01 12 at 16 16 41

Deploying Changes

While it’s great that we can do this through the UI, if I change the template of a Lightning Page in a sandbox I’d like to be able to deploy it via the Metadata API to production. When I tried this initially I had my API version set to 44.0 in my package.xml, which meant that I wasn’t hitting the Spring 19 Metadata API, but the Winter 19 version, which doesn’t support template changes. Once I updated that, it all worked fine. If you are wondering how I can make such an obvious mistake, allow me to point you at this post, which explains everything.




Saturday, 1 December 2018

Apex Snippets in VS Code




As I’ve written about before, I switched over to using Microsoft Visual Studio Code as soon as I could figure out how to wire the Salesforce CLI into it for metadata deployment. I’m still happy with my decision, although I do wish that more features were also supported when working in the metadata API format rather than the new source format. When using an IDE with plugins for a specific technology, such as the Salesforce Extension Pack, it’s easy to focus on what the plugins provide and forget to look at the standard features. Easy for me anyway. 


Per the docs, "Code snippets are small blocks of reusable code that can be inserted in a code file using a context menu command or a combination of hotkeys". What this means in practice is that if I type in 'try', I’ll get offered the following snippets, which I can move between with the up and down arrow (cursor) keys:

Screenshot 2018 12 01 at 16 34 09

Note the information icon on the first entry - clicking this shows me the documentation for each snippet and what will be inserted if I choose the snippet. This is particularly important if I install an extension containing snippets from the marketplace that turns out to have been authored by my Evil Co-Worker - I can see exactly what nefarious code would be inserted before I continue:

Screenshot 2018 12 01 at 16 34 17

Which is pretty cool - with the amount of code I write, saving a few keystrokes here and there can really add up.

User Defined Snippets

The good news is that you aren’t limited to the snippets provided by the IDE or plugins - creating user defined snippets is, well, a snip (come on, you knew I was going there). On MacOS you access the Code -> Preferences -> User Snippets menu option:

Screenshot 2018 12 01 at 16 41 46

and choose the language - Apex in my case - and start creating your own snippet.


Here’s an example from my apex snippets:

	"SObject_List": {
		"prefix": "List<",
		"body": [
			"List<${1:object}> ${2:soList}=new List<${1}>();"
		],
		"description": "List of sObjects"
	}

Breaking up the JSON:

  • The name of my snippet is “SObject_List”.
  • The prefix is “List<” - this is the sequence of characters that will activate offering the snippet to the user.
  • The body of the snippet is "List<${1:object}> ${2:soList}=new List<${1}>();"
    • ${1} and ${2} are tabstops, which are cursor locations. When the user chooses the snippet, their cursor will initially be placed in ${1} so they can enter a value, then they hit tab and move to ${2} etc.
    • ${1:object} is a tabstop with a placeholder - in this case the first tabstop will contain the value “object” for the user to change.
    • If you use the same tabstop in several places, when the user updates the first instance this will change all other references.
  • The description is the detail that will be displayed if the user has hit the information icon.
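As a further illustration, here’s a hypothetical second snippet using the same linked-tabstop trick - typing ‘forsobj’ would expand to a SOQL for loop where the sObject type only needs entering once ($0 marks the final cursor position once all tabstops are done):

```json
"Soql_For": {
    "prefix": "forsobj",
    "body": [
        "for (${1:Contact} ${2:rec} : [select Id from ${1}]) {",
        "\t$0",
        "}"
    ],
    "description": "For loop over a SOQL query"
}
```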

The following video shows the snippet in use while creating an Apex class - note how I only type ‘Contact’ once, but both instances of ‘object’ get changed.

Nice eh? And all done via a configuration file.




Sunday, 18 November 2018

Wrapping the Salesforce CLI




(This post is aimed at beginners to the command line and scripting - all of the examples in this post are for MacOS)

Anyone who has read my blog or attended my recent talks at the London Salesforce Developers or Dreamforce will know I’m a big fan of the Salesforce CLI. I use it for pretty much everything I do around metadata and testing on Salesforce, and increasingly for things that don’t involve directly interacting with Salesforce. I think everyone should use it, but I also realise that not everyone is that comfortable with the command line, especially if their career didn’t start with it!

The range of commands and number of options can be daunting - for example, to deploy local metadata to production and execute local tests, waiting for up to 2 minutes for the command to complete:

sfdx force:mdapi:deploy -d src -w 2 -u keir.bowden@sfdx.deploy -l RunLocalTests

If you are a developer chances are you’ll be executing commands like this fairly regularly, but for infrequent use, it’s quite a bit to remember. If you have colleagues that need to do this, consider creating a wrapper script so that they don’t have to care about the detail.

Wrapper Script

A wrapper script typically encapsulates a command and a set of parameters, simplifying the invocation and easing the burden of remembering the correct sequence. A wrapper script for the Salesforce CLI can be written in any language that supports executing commands - I find that the easiest to get started with is bash, as it’s built in to MacOS.

A Bash wrapper script to simplify the deploy command is as follows:


#!/bin/bash

echo "Deploying to production as user $1"

sfdx force:mdapi:deploy -d src -w 2 -u $1 -l RunLocalTests

Taking the script a line at a time:


#!/bin/bash

This is known as a shebang - it tells the interpreter to execute the '/bin/bash' command, passing the wrapper script as a parameter.

echo "Deploying to production as user $1"

This outputs a message to the user, telling them the action that is about to be taken. The ‘$’ character accesses the positional parameters, or arguments, passed to the script. ‘$0' is set to the name that the script is executed with, '$1' is the first argument, ‘$2' the second and so on. The script expects the ‘targetusername' for the deployment to be passed as the first argument, and wastes no time checking if this is the case or not :)

sfdx force:mdapi:deploy -d src -w 2 -u $1 -l RunLocalTests

This executes the embedded Salesforce CLI command, once again accessing the parameter at position 1 via ‘$1' to set the ‘targetusername’ of the command.
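Since the script trusts its caller completely, a slightly hardened sketch might add a guard clause for the missing-argument case. This is an illustration rather than the original script - it's wrapped in a function, and the sfdx command is echoed rather than executed so it can be tried without a configured org:

```shell
#!/bin/bash

# Hardened sketch of deployprod: refuse to run without a target username.
# The sfdx command is echoed rather than executed in this example.
deployprod() {
    if [ -z "$1" ]; then
        echo "Usage: deployprod <targetusername>"
        return 1
    fi
    echo "Deploying to production as user $1"
    echo "sfdx force:mdapi:deploy -d src -w 2 -u $1 -l RunLocalTests"
}

deployprod keir.bowden@sfdx.deploy
```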

Executing the Wrapper Script

The script assumes it is being executed from the project directory (the parent directory of src), so I’ve put it there, named as ‘deployprod’.

To execute the script, I type ‘./deployprod’ in the terminal - the ‘./‘ prefix simply tells the interpreter that the command is located in the local directory.

Attempting to execute the script after I create it shows there is still some work to do:

> ./deployprod keir.bowden@sfdx.deploy
-bash: ./deployprod: Permission denied

In order to allow the wrapper script to be used as a command, I need to make it executable, via the chmod command:

> chmod +x deployprod

Re-running the command then produces the expected output:

> ./deployprod keir.bowden@sfdx.deploy

Deploying to production as user keir.bowden@sfdx.deploy
2884 bytes written to /var/folders/tn/q5mzq6n53blbszymdmtqkflc0000gs/T/ using 60.938ms
Deploying /var/folders/tn/q5mzq6n53blbszymdmtqkflc0000gs/T/


So in future, when the user wants to deploy to production, they simply type:

./deployprod keir.bowden@sfdx.deploy

rather than

sfdx force:mdapi:deploy -d src -w 2 -u keir.bowden@sfdx.deploy -l RunLocalTests

which is a lot less for them to remember and removes any chance that they might specify the wrong value for the -d or -l switches.

Of course there is always the chance that my Evil Co-Worker will update the script for nefarious purposes (to carry out destructive changes, for example) and the user will be none the wiser unless they inspect the output in detail (which they never will unless they see an error, trust me), but the risk of this can be mitigated by teaching users good security practices around allowing access to their machines. And reminding them regularly of the presence of the many other evil people that they don’t happen to work with.

All Bash, all the time?

I created the original command line tools on top of the Force CLI for deployment of our BrightMedia appcelerator using Bash, and it allowed us to get big quick. However, there were a couple of issues:

  1. Only I knew how to write bash scripts
  2. Bash isn’t the greatest language for carrying out complex business logic
  3. When the Salesforce CLI came along I wanted to parse the resulting JSON output, and that is quite a struggle in Bash.

Point 1 may or may not be an issue in your organisation, though I’d wager that you won’t find a lot of bash scripting experience outside of devops teams these days. Points 2 and 3 are more important - if you think you’ll be doing more than simple commands (or blogs about those simple commands!) then my advice would be to write your scripts in Node JS. You’ll need to be comfortable writing JavaScript, and you have to do a bit more in terms of installation, but in the long run you’ll be able to accomplish a lot more.
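To illustrate point 3, parsing the --json output of a CLI command in Node is trivial. The payload below is a cut-down, hypothetical stand-in for real deploy output - actual responses contain many more fields - but the status/result envelope is the part you'd typically inspect:

```javascript
// Simulate capturing the output of `sfdx force:mdapi:deploy ... --json`.
// The sample payload is hypothetical; real output has many more fields.
const rawOutput = JSON.stringify({
    status: 0,
    result: { success: true, numberTestsCompleted: 12 }
});

// One call turns the CLI output into a structure we can interrogate -
// the part that is a real struggle in bash.
const response = JSON.parse(rawOutput);
if (response.status === 0) {
    console.log('Deploy succeeded, tests completed: ' +
                response.result.numberTestsCompleted);
} else {
    console.error('Deploy failed with status ' + response.status);
}
```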

Bash does allow you to get somewhat complex quite quickly, so you’ll likely be some way down the wrong path when you realise it - don’t be tempted to press on. The attitude that “we’ve invested so much in doing the wrong thing that we have to see it through” never pays off!


Monday, 5 November 2018

Situational Awareness



This is the eighth post in my ‘Building my own Learning System” series, in which I finally get to implement one of the features that started me down this path in the first place - the “Available Training” component.

The available training component has situational awareness to allow it to do its job properly. Per Wikipedia, situational awareness comprises three elements:

  • Perception of the elements in the environment - who the user is and where they are in the application
  • Comprehension of the situation - what they are trying to do and what training they have taken
  • Projection of future status - if there is more training available they will be able to do a better job

Thus rather than telling the user that there is training available, regardless of whether they have already completed it, this component tells the user that there is training, how much of it they have completed, and gives them a simple way to continue. Push versus pull if you will.

Training System V2.0

Some aspects were already in place in V1 of the training system - I know who the user is based on their email address, for example. However, I only knew what training they had taken for the current endpoint, so some work was required there. Polling all the endpoints to find out information seemed like something that could be useful to the user, which led to sidebar search. Sidebar search allows the user to enter terms and search for paths or topics containing those terms across all endpoints:

Screen Shot 2018 11 05 at 15 47 18

Crucially the search doesn’t just pull back the matching paths - it also includes information about the progress of the currently logged in user for those paths. In this case, like so much of my life, I’ve achieved nothing:

Screen Shot 2018 11 05 at 15 47 24

In order to use this search, the available training component needs to know what the user is trying to do within the app. It’s a bit of a stretch for it to work this out itself, so it takes a topic attribute. 

I also want the user to have a simple mechanism to access the training that they haven’t taken, so the available training component also takes an attribute of the URL of a page containing the training single page application.


Hopefully everyone read this in the voice of Jules from Pulp Fiction, otherwise the fact that there’s only a single example means it doesn’t make a lot of sense.

An example of using the new component is from my BrightMedia dev org. It’s configured in the order booking page as follows:

<c:TrainingAvailable modalVisible="{!v.showTraining}" 
    topic="Order Booking"
    trainingSPAURL="{!v.trainingSPAURL}" />

where trainingSPAURL is the location of the single page application.

Also in my booking page I have an info button (far right): 

Screen Shot 2018 11 05 at 16 12 45

clicking this toggles the training available modal:

Screen Shot 2018 11 05 at 16 15 10


Which shows that there are a couple of paths that I haven’t started. Clicking the ‘Open Training’ button takes me to the training page with the relevant paths pre-loaded:

Screen Shot 2018 11 05 at 16 15 24



Saturday, 13 October 2018

All Governor Limits are Created Equal




Not all Salesforce governor limits inspire the same fear in developers. Top of the pile are DML statements and SOQL queries, followed closely by heap size, while the rest are relegated to afterthought status, typically only thought about when they breach. There’s a good reason for this - the limits that we wrestle with most often bubble to the top of our thoughts when designing a solution. Those that we rarely hit get scant attention, probably because on some level we assume that if we aren’t breaching these limits regularly, we must have some kind of superpower to write code that is inherently defensive against them. Rather than what it probably is - either the limit is very generous or dumb luck.

This can lead to code being written that is skewed to defending against a couple of limits, but will actually struggle to scale due to the lack of consideration for all limits. To set expectations, the example that follows is contrived - a real world example would require a lot more code and shift the focus away from the point I’m trying to make.

The Example

For some reason, I want to create two lists in my Apex class - one that contains all leads in the system where the last name starts with the letter ‘A’ and another list containing all the rest. Because I’m scared of burning SOQL queries, I query all the leads and process the results:

List<Lead> leads=[select id, LastName from Lead];
List<Lead> a=new List<Lead>();
List<Lead> btoz=new List<Lead>();
for (Lead ld : leads) {
    String lastNameChar1=ld.LastName.toLowerCase().substring(0,1);
    if (lastNameChar1=='a') {
        a.add(ld);
    }
    else {
        btoz.add(ld);
    }
}

System.debug('A size = ' + a.size());
System.debug('btoz size = ' + btoz.size());

The output of this shows that I’ve succeeded in my mission of hoarding as many SOQL queries as I can for later processing:

Screen Shot 2018 10 13 at 16 20 14

But look towards the bottom of the screenshot - while I’ve only used 1% of my SOQL queries, I’ve gone through 7% of my CPU time limit. Depending on what else needs to happen in this transaction, I might have created a problem now or in the future. But I don’t care, as I’ve satisfied my requirement of minimising SOQL queries.

If I fixate a bit less on the SOQL query limit, I can create the same lists in a couple of lines of code but using an additional query:

List<Lead> a=[select id, LastName from Lead where LastName like 'a%'];
List<Lead> btoz=[select id, LastName from Lead where ID not in :a];
System.debug('A size = ' + a.size());
System.debug('btoz size = ' + btoz.size());

Because the CPU time doesn’t include time spent in the database, I don’t consume any of that limit:

Screen Shot 2018 10 13 at 16 23 59

Of course I have consumed another SOQL query though, which might create a problem now or in the future. There’s obviously a trade-off here and fixating on minimising the CPU impact and ignoring the impact of additional SOQL queries is equally likely to cause problems.


When designing solutions, take all limits into account. Try out various approaches and see what the impact of the trade-offs is, and use multiple runs with the same data to figure out the impact on the CPU limit, as my experience is that this can vary quite a bit. There’s no silver bullet when it comes to limits, but spreading the load across all of them should help to stretch what you can achieve in a single transaction. Obviously this means the design time takes a bit longer, but there’s an old saying that programming is thinking not typing, and I find this to be particularly true when creating Salesforce applications that have to perform and scale for years. The more time you spend thinking, the less time you’ll spend trying to fix production bugs when you hit the volumes that everybody was convinced would never happen.
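One way to put numbers against those trade-offs is to capture the Limits class counters around the code under test. The sketch below uses the standard System.Limits methods and could be run as anonymous Apex - the queries themselves are just the ones from the example above:

```apex
// Snapshot the counters before the code under test
Integer queriesBefore = Limits.getQueries();
Integer cpuBefore = Limits.getCpuTime();

List<Lead> a = [select Id, LastName from Lead where LastName like 'a%'];
List<Lead> btoz = [select Id, LastName from Lead where Id not in :a];

// Report what this approach actually cost
System.debug('SOQL queries used: ' + (Limits.getQueries() - queriesBefore));
System.debug('CPU time used (ms): ' + (Limits.getCpuTime() - cpuBefore));
```

Running each candidate approach through something like this, several times with the same data, gives a much better basis for the design decision than instinct about which limit matters most.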



Saturday, 6 October 2018

Dreamforce 2018


IMG 4447

2018 marked my 9th Dreamforce, although the first of these was staffing a booth in the Expo for 4 days so it’s difficult to count that. In a change of pace, the event started on day -1 with a new initiative from Salesforce.

Monday - CTA Summit

IMG 4432

The CTA Summit was pretty much my favourite part this year - an audience of CTAs and travelling bands of Product Managers giving us lightning (the pace, not the technology, although we did get some of that as well!) presentations and answering difficult questions - for me the questions were as informative as the presentations, especially if it was an area that I haven’t done a lot of work in. Nothing like knowing what others are struggling with so you can be ahead of the game.

The format was one that I’m reasonably familiar with, having been lucky enough to attend 3 MVP summits over the years, especially the constant reminders that you are under NDA and can’t share any of the content. Sorry about that! One thing I can share is that Richard Socher (Salesforce Chief Scientist) is an excellent speaker and if you get a chance to attend one of his talks, grab it. Some of the sessions were hosted at the Rincon Centre, where I saw a couple of outfits that looked really familiar.

IMG 4444

Tuesday - Keynote

Early start as the Community Group Leaders State of the Union session was a breakfast presentation at the Palace Hotel from 7am, then more CTA Summit sessions before heading over to Moscone Center.

For the first (no pun intended) time that I can remember, Marc Benioff’s keynote took place on Day 1. As an MVP I’m lucky enough to get reserved seating for the keynote and made sure I was there in plenty of time - queueing with Shell Black prior to security opening put us in the first row of reserved seating, three rows back from the stage.

IMG 4464

The big announcements from the keynote were Einstein Voice (conversational CRM + voice bots) and Customer 360. If you want to know more about these, here’s a slide from the BrightGen Winter 19 release webinar with links to the appropriate keynote recordings (if you can’t read the short links, you can access the original deck here).


Screen Shot 2018 10 06 at 15 57 48

The keynote wasn’t the end of the day though - with CTA and MVP receptions on offer I was networking and learning late into the evening.

Wednesday - Speaking

After a couple of days listening to other people talking, it was my turn. First up was the Climbing Mount CTA panel session over at the Partner Lodge, where I was up the front in some stellar company. We even had parity between the male and female CTAs on the panel, which is no mean feat when you look at the stats - something the Ladies Be Architects team are working hard to address.



After this I headed back to Trailhead in Moscone West, showing incredible self-control as the partner reception was just starting. I limited myself to a single bite sized corn dog and ignored the voice in my head telling me that a couple of beers would loosen me up nicely, and went straight over to the speaker readiness room to remind myself of the content for my Developer Theatre session “Quickstart Templates with the Salesforce CLI” (picture courtesy of my session owner, Caroline Hsieh).

IMG 1632

Once the talk was over I continued in a tradition started last year and went out for a beer with some of the contingent from Sweden - Johan (aka Kohan, Mohan) Karlsteen, the creator of Top Trailblazers, and one of his colleagues. As is also traditional, we took a picture and included a snapshot of the guy who couldn’t make it so that he didn’t feel left out :)



Thursday - More Keynotes and the Return of @adamse

Thursday was the second highlight of the event for me - the Developer Keynote, closely followed by the Trailhead keynote. A big surprise at the Dev Keynote was the presence of Adam Seligman, formerly of Salesforce but now with Google.


And I had pretty good seats in the second row for this keynote, better than Dave Carroll in fact. Did I mention I’m an MVP?



The gap between the Dev and Trailhead keynotes was only 30 minutes, but as they are literally over the road from each other I made it in about 20 (there’s a few people attend Dreamforce, so crossing the road isn’t as simple as it sounds!). I typically sit a bit further back in this one as there is huge amount of exuberance in the front few rows and I try to avoid displaying any excitement or enthusiasm in public if I can. 

After the keynotes I caught a couple more sessions before heading over to Embarcadero for the annual BrightGen customer dinner. I’d said to my colleagues when I bumped into them on Monday night that I’d see them again on Thursday. Some of those that hadn’t been before thought I was joking, but they weren’t laughing when I showed up at 7pm.

I also got to catch up with the winner of the UK and Ireland social media ambassador at Dreamforce competition - long time BrightGen customer, Cristina Bran, who told me all about her amazing week.


Friday - Salesforce Tower Trip

I’d been lucky enough to have my name picked out of the hat to get to visit the Ohana floor of the Salesforce Tower. It was a bit of a foggy day (what are the odds) but the views were still pretty spectacular.

IMG 4517

and that was Dreamforce over for another year!

Vegas Baby!

The BrightGen contingent traveled home via Las Vegas for a little unwinding and team building. Like the CTA Summit, what happens in Vegas stays in Vegas, so I can only show this picture of me arriving at the airport, still sporting some Salesforce branding.

IMG 4541