Thursday, 13 September 2018

Background Utility Items in Winter 19


Introduction

The Winter 19 release of Salesforce introduces the concept of Background Utility Items - Lightning Components that are added to the Lightning Experience utility bar but don’t take up any real estate, can’t be opened and have no user interface. This is exactly what I was looking for when I put together my Toast Message from a Visualforce Page blog post - the utility bar component that receives the notification to show a toast message doesn’t need to interact with the user, but still has an entry in the bar. The user can also click on the item and receive a lovely empty popup window:

Screen Shot 2018 09 08 at 08 09 52

Not the worst user experience in the world, but not the best either. I guess in production I’d probably put up a message that this item isn’t user configurable. One item like this isn’t so bad, but imagine if there were half a dozen - a large chunk of the utility bar would be taken up with items that only serve to distract the user. I’d definitely look to combine them all into a single item if I could.

Refresher

In case you haven’t committed the original blog post to memory (and I’m not going to lie, that hurts), here’s how it works:

[Diagram: toast message flow between the Lightning component, the Visualforce page, and the utility bar component]

 

I enter a message in my Lightning component, which fires a toast event (1). This is picked up by the event handler in the Visualforce JavaScript, which posts a message (2) that is received by the Lightning component in the utility bar. This fires its own toast event (3) which, as it is executing in the one.app container, displays a toast message to the user.
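A minimal sketch of step (2), assuming the approach from the original post - the origin and message shape here are illustrative rather than the actual code:

// Visualforce page JavaScript - relay the toast details to the Lightning
// component in the utility bar (origin and payload shape are illustrative)
function notifyUtilityBar(message) {
    var lexOrigin = "https://kabprerel-dev-ed.lightning.force.com";
    window.parent.postMessage({type: "toast", message: message}, lexOrigin);
}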

Implement the Interface

Removing the UI aspect is as simple as implementing an interface - lightning:backgroundUtilityItem. Once I’ve updated my component definition with this (and changed the domain references to match my pre-release org):

<aura:component 
    implements="flexipage:availableForAllPageTypes,lightning:backgroundUtilityItem" 
    access="global" >

    <aura:attribute name="vfHost" type="String"
             default="kabprerel-dev-ed--c.gus.visual.force.com"/>
    <aura:handler name="init" value="{!this}" action="{!c.doInit}"/>
</aura:component>

When I open my app now, there’s nothing in the utility bar to consume space or attract the user, but my functionality works the same:

[Screenshot: the toast message displayed with no item visible in the utility bar]

You can find the updated code at my Winter 19 Samples github repo.


Tuesday, 4 September 2018

Callable in Salesforce Winter 19



Introduction

The Winter 19 Salesforce release introduces the Callable interface which, according to the docs:

Enables developers to use a common interface to build loosely coupled integrations between Apex classes or triggers, even for code in separate packages. 

Upon reading this I spent some time scratching my head trying to figure out when I might use it. Once I stopped thinking in terms of I and started thinking in terms of we - specifically a number of distributed teams - it made a lot more sense.

Scenario

The example scenario in this post is based on two teams working on separate workstreams in a single org, the Core team and the Finance team. The Core team create functionality used across the entire org, for the Finance team and others.

The Central Service

The Core team have created a Central Service interface, defining key functionality for all teams (try not to be too impressed by the creativity behind my shared action names):

public interface CentralServiceIF {
    Object action1();
    Object action2();
}

and an associated implementation for those teams that don’t have specific additional requirements:

public class CentralServiceImpl implements CentralServiceIF {
    public Object action1() {
        return 'Interfaced Action 1 Result';
    }
    public Object action2() {
        return 'Interfaced Action 2 Result';
    }
}

The Finance Implementation

The Finance team have specific requirements around the Central Service, so they create their own implementation - in the real world this would likely delegate to the Central Service and enrich with finance data, but in this case it returns a slightly different string (artist at work eh?) :

public class CentralServiceFinanceImpl implements CentralServiceIF {
    public Object action1() {
        return 'Finance Action 1 Result';
    }
    public Object action2() {
        return 'Finance Action 2 Result';
    }
}

The New Method

Everything ticks along quite happily for a period of time, and then the Core team updates the interface to introduce a new function - the third action that everyone thought was the stuff of legend. The interface now looks like:

public interface CentralServiceIF {
    Object action1();
    Object action2();
    Object action3();
}

and the sample implementation:

public class CentralServiceImpl implements CentralServiceIF {
    public Object action1() {
        return 'Interfaced Action 1 Result';
    }
    public Object action2() {
        return 'Interfaced Action 2 Result';
    }
    public Object action3() {
        return 'Interfaced Action 3 Result';
    }
}

This all deploys okay, but when the Finance team next trigger processing via their implementation, there’s something rotten in the state of the Central Service:

Line: 58, Column: 1 System.TypeException: 
Class CentralServiceFinanceImpl must implement the method:
Object CentralServiceIF.action3()

Now obviously this was deployed to an integration sandbox, where all the code comes together to make sure it plays nicely, so the situation surfaces well away from production. However, if the Core team have updated the Central Service interface in response to an urgent request from another team, then the smooth operation of the workstreams has been disrupted. The interface is a shared resource that is resistant to change - updating it requires coordination across all teams.

The Callable Implementations

Implementing the Core Central Service as a callable:

public class CentralService implements Callable {
   public Object call(String action, Map<String, Object> args) {
       switch on action {
           when 'action1' {
               return 'Callable Action 1 result';
           }
           when 'action2' {
               return 'Callable Action 2 result';
           } 
           when else {
               return null;
           }
       }
   }
}

and the Finance equivalent:

public class CentralServiceFinance implements Callable {
    public Object call(String action, Map<String, Object> args) {
        switch on action {
            when 'action1' {
                return 'Finance Callable Action 1 result';
            }
            when 'action2' {
                return 'Finance Callable Action 2 result';
            } 
            when else {
                return null;
            }
        }
    }
}
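What neither implementation shows is the calling side; a hedged sketch of how a consumer might resolve an implementation at runtime (in practice the class name would more likely come from configuration, such as a custom metadata type, than a string literal):

// Resolve the implementation by name at runtime - no compile-time
// dependency on a shared interface type, just the System.Callable contract
Callable service = (Callable) Type.forName('CentralService').newInstance();
Object result = service.call('action1', new Map<String, Object>());
System.debug(result); // 'Callable Action 1 result'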

Now when a third method is required in the Core implementation, it’s just another entry in the switch statement:

switch on action {
    when 'action1' {
        return 'Callable Action 1 result';
    }
    when 'action2' {
        return 'Callable Action 2 result';
    } 
    when 'action3' {
        return 'Callable Action 3 result';
    } 
    when else {
        return null;
    }
}

while the Finance implementation can remain blissfully unaware of the new functionality until it is needed, or a task to provide support for it can be added into the Finance workstream.

Managed Packages

I can also see a lot of use cases for this if you have common code distributed via managed packages to a number of orgs. You can include new functions in a release of your package without requiring every installation to update their code to the latest interface version - as you can’t update a global interface once published, you have to shift everything to a new interface (typically using a V<release> naming convention), which may cause some churn across the codebase.

Conclusion

So is this something that I’ll be using on a regular basis? Probably not. In the project that I’m working on at the moment I can think of one place where this kind of loose coupling would be helpful, but it obviously makes the code more difficult to read and understand, especially if the customer’s team doesn’t have a lot of development expertise.

My Evil Co-Worker likes the idea of building a whole application around a single Callable class - everything else would be a thin facade that hands off to the single call() method. They claim it's a way to obfuscate code, but I think it's just to annoy everyone.


Monday, 27 August 2018

Lightning Emp API in Winter 19


Introduction

It’s August. After weeks of unusually sunny days, the schools in the UK have broken up and the weather has turned. As I sit looking out at cloudy skies, my thoughts turn to winter. Winter 19 to be specific - the release notes are in preview and some of the new functionality has hit my pre-release org. The first item that I’ve been playing with is the new Emp API component, which takes away a lot of the boilerplate code that I have to copy and paste every time I create a component that listens for platform events.

From the preview docs, this component

Exposes the EmpJs Streaming API library which subscribes to a streaming channel and listens to event messages using a shared CometD connection. This component is supported only in desktop browsers. This component requires API version 44.0 and later.

What I Used to Do

Previously, to connect to the streaming API and start listening for events, I’d need code to:

  • Download the cometd library and put it into a static resource
  • Add the static resource to my component
  • Instantiate org.cometd.CometD
  • Call the server to get a session id
  • Configure cometd with the session id and the Salesforce endpoint
  • Carry out the cometd handshake
  • Subscribe to my platform event channel
  • Wait for messages

What I Do Now

  • Add the lightning:empApi component inside my custom component
  • Add an error handler in case anything goes wrong
  • Subscribe to my platform event channel
  • Wait for a message

In practice this means my controller code has dropped from 70-odd lines to around 20.

Example

The first thing I need for an example is a platform event - I’ve created one called Demo_Event__e, which contains a single field named ‘Message__c’. This holds the message that I’ll display to the user.

My example component (Demo Events) actually uses a couple of standard components - the Emp API and the notifications library - the latter is used to show a toast message when I receive an event:

<lightning:empApi aura:id="empApi" /> 
<lightning:notificationsLibrary aura:id="notifLib"/>

The controller handles all the setup via a method invoked when the standard init event is fired. Before I can do anything I need a reference for the Emp API component:

var empApi = component.find("empApi");

Once I have this I can subscribe to my demo event channel - I’ve chosen a replayId of -1 to say start with the next event published. Note that I also capture the subscription object returned by the promise so that I can unsubscribe later if I need to (although my sample component doesn’t actually do anything with it).

var channel='/event/Demo_Event__e';
var sub;
var replayId=-1;
empApi.subscribe(channel, replayId, callback).then(function(value) {
      console.log("Subscribed to channel " + channel);
      sub = value;
      component.set("v.sub", sub);
});

I also provide a callback function that gets invoked whenever I receive a message. This simply finds the notification library and executes the showToast aura method that it exposes.

var callback = function (message) {
	component.find('notifLib').showToast({
      	"title": "Message Received!",
        "message": message.data.payload.Message__c
	}); 
}.bind(this);
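Should the component need to stop listening later, the stored subscription comes into play - a minimal sketch, assuming the subscription was saved to the v.sub attribute as above:

// Unsubscribe using the subscription object captured when subscribing
var empApi = component.find("empApi");
empApi.unsubscribe(component.get("v.sub"), function(response) {
    // response holds the server's acknowledgement of the unsubscribe
    console.log("Unsubscribed: " + JSON.stringify(response));
});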

On the server side I have a class exposing a single static method that allows me to publish a platform event:

public class PlatformEventsDemo 
{
    public static void PublishDemoEvent(String message)
    {
        Demo_Event__e event = new Demo_Event__e(Message__c=message);
        Database.SaveResult result = EventBus.publish(event);
        if (!result.isSuccess()) 
        {
            for (Database.Error error : result.getErrors()) 
            {
                System.debug('Error returned: ' +
                             error.getStatusCode() + ' - ' +
                             error.getMessage());
            }
        }
    }
}

Running the Example

I’ve added my component to a lightning page - as it doesn’t have any UI you’ll have to take my word for it! Using the execute anonymous feature of the dev console, I publish a message:

Screen Shot 2018 08 25 at 13 51 12
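(The screenshot boils down to a one-line call - the message text is illustrative:)

PlatformEventsDemo.PublishDemoEvent('Test message from execute anonymous');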

 

And on my lightning app page, shortly afterwards I see the toast message:

 

Screen Shot 2018 08 25 at 13 49 55

 


Sunday, 5 August 2018

Putting Your TBODY on the Line



Introduction

This week I’ve been working on a somewhat complex page built up from a number of Lightning components. One of the areas of the page is a table showing the paginated results of a query, with various sorting options available from the headings, and a couple of summary rows at the bottom of the page. The screenshot below shows the last few rows, the summary info and the pagination buttons.

Screen Shot 2018 08 04 at 17 34 06

The markup for this is of the following format:

<table>
    <thead>
        <tr>
            <aura:iteration ...>
                <th> _head_ </th>
            </aura:iteration>
        </tr>
    </thead>
    <tbody>
        <tr>
            <aura:iteration ...>
                <td> _data_ </td>
            </aura:iteration>
        </tr>
        ...
        <tr>
            <td>_summary_</td>
            <td>_summary_</td>
        </tr>
        <tr>
            <td>_summary_</td>
            <td>_summary_</td>
        </tr>
    </tbody>
</table>

So pretty much a standard HTML table with a couple of aura:iteration components to output the headings and rows. Obviously there’s a lot more to it than this - styling and so on - but for the purposes of this post those are the key details.

The Problem

Once I’d implemented the column sorting (and remembered that you need to return a value from an inline sort function, otherwise it’s deemed to mean that all the elements are equal to each other - see the sketch after the screenshots below!), I was testing by mashing the column sort buttons, and after a few sorts something odd happened:

Screen Shot 2018 08 04 at 17 35 10

The values inserted by the aura:iteration were sandwiched in between the two summary rows that should appear at the bottom.  I refreshed the page and tried again and this time it got a little worse:

Screen Shot 2018 08 04 at 17 33 32

This time the aura:iteration values appeared below both summary rows. I tested this on Chrome and Firefox and the behaviour was the same for both browsers. 
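As an aside, the sort trap mentioned earlier looks like the following - the collection and field names are illustrative:

// Broken: no return statement, so the comparator yields undefined for
// every pair, which is treated as the elements being equal
rows.sort(function(a, b) { a.name.localeCompare(b.name); });

// Fixed: return the comparison result
rows.sort(function(a, b) { return a.name.localeCompare(b.name); });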

The Workaround

I’ve hit a few issues around aura:iteration in the past, although usually it’s been with the body of those components rather than the surrounding ones, and I recalled that often the issue could be solved by separating the standard Lightning components with regular HTML. I could go with <tfoot>, but according to the docs this indicates that if the table is printed the summary rows should appear at the end of each page, which didn’t seem quite right.

I already had a <tbody>, but looking at the docs a table can have multiple <tbody> tags, to logically separate content, so another one of these sounded exactly what I wanted. Moving the summary rows into their own “section” as follows:

<table>
    <thead>
        <tr>
            <aura:iteration ...>
                <th> _head_ </th>
            </aura:iteration>
        </tr>
    </thead>
    <tbody>
        <tr>
            <aura:iteration ...>
                <td> _data_ </td>
            </aura:iteration>
        </tr>
        ...
    </tbody>
    <tbody>
        <tr>
            <td>_summary_</td>
            <td>_summary_</td>
        </tr>
        <tr>
            <td>_summary_</td>
            <td>_summary_</td>
        </tr>
    </tbody>
</table>

worked a treat. Regardless of how much I bounced around and clicked the headings, the summary rows remained at the bottom of the table as they were supposed to.

I haven’t been able to reproduce this with a small example component - it doesn’t appear to be related to the size of the list backing the table, as I’ve tried a simple variant with several thousand members and the summary rows stick resolutely to the bottom of the table. Given that I have a workaround I’m not sure how much time I’ll invest in digging deeper, but if I do find anything you’ll read about it here.


Saturday, 21 July 2018

Exporting Folder Metadata with the Salesforce CLI



Introduction

In my earlier post, Exporting Metadata with the Salesforce CLI, I detailed how to replicate the Force.com CLI export command using the Salesforce CLI. One area that neither of these handle is metadata inside folders: reports, dashboards and email templates. Since then I’ve figured out how to do this, so the latest version of the CLIScripts Github repo has the code to figure out which folders are present and include the contents of each of these in the export.

Identifying the Folders

This turned out to be a lot easier than I expected - I can simply execute a SOQL query on the Folder sobject type and process the results:

let query="Select Id, Name, DeveloperName, Type, NamespacePrefix from Folder where DeveloperName!=null";
let foldersJSON=child_process.execFileSync('sfdx', 
    ['force:data:soql:query',
        '-q', query, 
        '-u', this.options.sfdxUser,
        '--json']);

Note that I’m not entirely sure what it means when a folder has a DeveloperName of null - I suspect it indicates a system folder, but as the folders I was interested in appeared, I didn’t look into this any further.

I then created a JavaScript object containing a nested object for each folder type:

this.foldersByType={'Dashboard':{},
                    'Report':{}, 
                    'Email':{}}

and then parsed the resulting JSON, adding each result into the appropriate folder type object as a property named for the folder Id. The property contains another nested object wrapping the folder name and an array where I will store each entry from the folder:

var foldersForType=this.foldersByType[folder.Type];
if (foldersForType) {
    if (!foldersForType[folder.Id]) {
        foldersForType[folder.Id]={'Name': folder.DeveloperName, 'members': []};
    }
}
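For completeness, the surrounding parse loop is along these lines (a sketch - the CLI’s --json output nests the query rows under result.records):

// The --json response wraps the query rows in result.records
var folders = JSON.parse(foldersJSON.toString()).result.records;
for (var folder of folders) {
    // per-folder processing as shown above
}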

Retrieving the Folder Contents

Once I have all of the folders stored in my complex object structure, I can query the metadata for each folder type - the dashboards in this instance - and build out my structure modelling all the folders and their contents:

let query="Select Id, DeveloperName, FolderId from Dashboard";
let dashboardsJSON=child_process.execFileSync('sfdx', 
    ['force:data:soql:query',
        '-q', query, 
        '-u', this.options.sfdxUser,
        '--json']);

I then iterate the results and add these to the members for the specific folder:

var foldersForDashboards=this.foldersByType['Dashboard'];
var folderForThisDashboard=foldersForDashboards[dashboard.FolderId];
if (folderForThisDashboard) {
    folderForThisDashboard.members.push(dashboard);
}

Adding to the Manifest

As covered in the previous post on this topic, once I’ve identified the metadata required, I have to add it to the manifest file - package.xml. 

I already had a method to add details of a metadata type to the package, so I extended that to include a switch statement to change the processing for those items that have folders. Using dashboards as the example again, I iterate all the folders and their contents, adding the appropriate entry for each:

case 'Dashboard':
    var dbFolders=this.foldersByType['Dashboard'];
    for (var folderId in dbFolders) {
        if (dbFolders.hasOwnProperty(folderId)) {
            var folder=dbFolders[folderId];
            this.addPackageMember(folder.Name);
            for (var dbIdx=0; dbIdx<folder.members.length; dbIdx++) {
                var dashboard=folder.members[dbIdx];
                this.addPackageMember(folder.Name + '/' + dashboard.DeveloperName);
            }
        }
    }
    break;

In the case of our BrightMedia appcelerator, the package.xml ends up looking something like this:

<types>
    <members>BG_Dashboard</members>
    <members>BG_Dashboard/BrightMedia</members>
    <members>BG_Dashboard/BrightMedia_digital_dashboard</members>
    <members>Best_Practice_Service_Dashboards</members>
    <members>Best_Practice_Service_Dashboards/Service_KPIs</members>
    <members>Sales_Marketing_Dashboards</members>
    <members>Sales_Marketing_Dashboards/Sales_Manager_Dashboard</members>
    <members>Sales_Marketing_Dashboards/Salesperson_Dashboard</members>
    <name>Dashboard</name>
</types>

Exporting the Metadata

One thing to note is that the export is slowed down a bit as there are now six new round trips to the server - three for the folder types, and three more to retrieve the metadata for each type of folder. Exporting the metadata using the command:

node index.js export -u <username> -d output

creates a new output folder containing the zipped metadata. Unzipping this shows that the dashboard metadata has been retrieved as expected:

> ls -lR dashboards

total 24
drwxr-xr-x 5 kbowden staff 160 21 Jul 15:29 BG_Dashboard
-rw-r--r-- 1 kbowden staff 154 21 Jul 14:27 BG_Dashboard-meta.xml
drwxr-xr-x 5 kbowden staff 160 21 Jul 15:29 Best_Practice_Service_Dashboards
-rw-r--r-- 1 kbowden staff 174 21 Jul 14:27 Best_Practice_Service_Dashboards-meta.xml
drwxr-xr-x 6 kbowden staff 192 21 Jul 15:29 Sales_Marketing_Dashboards
-rw-r--r-- 1 kbowden staff 180 21 Jul 14:27 Sales_Marketing_Dashboards-meta.xml

dashboards//BG_Dashboard:
total 48
-rw-r--r-- 1 kbowden staff 2868 21 Jul 14:27 BrightMedia.dashboard
-rw-r--r-- 1 kbowden staff 6917 21 Jul 14:27 BrightMedia_digital_dashboard.dashboard


dashboards//Best_Practice_Service_Dashboards:
total 88
-rw-r--r-- 1 kbowden staff 8677 21 Jul 14:27 Service_KPIs.dashboard

dashboards//Sales_Marketing_Dashboards:
total 96
-rw-r--r-- 1 kbowden staff 10317 21 Jul 14:27 Sales_Manager_Dashboard.dashboard
-rw-r--r-- 1 kbowden staff 8046 21 Jul 14:27 Salesperson_Dashboard.dashboard

One more thing

I also fixed the export of sharing rules, so rather than specifying SharingRules as the metadata type, it specifies the three subtypes of sharing rule (SharingCriteriaRule, SharingOwnerRule, SharingTerritoryRule) required to actually export them!


Saturday, 14 July 2018

Lightning Component Attributes - Expect the Unexpected


 

Introduction

A couple of weeks ago my BrightGen colleague, Firoz Mirza, described some unexpected (by us at least) behaviour when dealing with Lightning Component attributes. I was convinced that there must be something else going on, so I spent some time creating a couple of simple tests, only to realise that he was absolutely correct and attributes work a little differently to how I thought - although if I’m honest, I’d never really thought about it.

Tales of the Unexpected 1

The basic scenario was assigning a JavaScript variable to the result of extracting the attribute via component.get(), then calling another method that also extracted the value of the attribute, but crucially also changed it and set it back into the component. A sample component helper that works like this is:

({
    propertyTest : function(cmp) {
        var testVar=cmp.get('v.test');
        alert('Local TestVar = ' + JSON.stringify(testVar));
        this.changeAttribute(cmp);
        alert('Local TestVar after leaving it alone = ' + JSON.stringify(testVar));
    },
    changeAttribute : function(cmp) {
        var differentVar=cmp.get('v.test');
        differentVar.value1='value1';
        cmp.set('v.test', differentVar);
    }
})

The propertyTest function gets the attribute from the component and assigns it to testVar. After showing the value, it then invokes changeAttribute, which gets the attribute and assigns it to a new variable, updates it and then sets it back into the component.

Executing this shows the value of the property is empty to begin with, as expected:

Screen Shot 2018 07 14 at 13 49 09

but after the attribute has been updated elsewhere, my local variable reflects this:

Screen Shot 2018 07 14 at 13 50 27

 

Tales of the Unexpected 2

After a bit more digging, I came across something that I expected even less - updating a local variable that stored the result of the component.get() for the attribute changed the value in the component. So I updated my sample component to test this:

propertyTest2: function(cmp) {
    var test2=cmp.get('v.test');
    test2.value2='value2';
    alert('Local test2 = ' + JSON.stringify(test2));
    this.checkAttribute(cmp);
},
checkAttribute: function(cmp) {
    var otherVar=cmp.get('v.test');
    alert('Other var = ' + JSON.stringify(otherVar));
}

Executing this shows my local property value being updated as expected:

Screen Shot 2018 07 14 at 14 01 51

but then retrieving this via another variable showed that it appeared to have been updated in the component:

Screen Shot 2018 07 14 at 14 02 28

“Maybe it’s just because everything is happening in the same request” I thought, so after this request completed I executed the first test. Extracting the value in a separate request (transaction) showed that the attribute had indeed been updated in a way that persisted across requests:

Screen Shot 2018 07 14 at 14 03 44

Checking the Docs

Convinced that I must have missed this in the docs, I went back to the Lightning Components Developer Guide, Working with Attribute Values in JavaScript section, but only found the following:

component.get(String key) and component.set(String key, Object value) retrieves and assigns values associated with the specified key on the component.

which didn’t add anything to what I already knew. 

Asking the Experts

I then asked the question of Salesforce, with my sample components, and received the following response:

The behavior you’re seeing is expected. 

component.set() is a signal to the framework that you’ve changed a value. The framework then fires change handlers, rerenders, and so forth. 

The underlying value is a JavaScript object, array, primitive, or otherwise. JavaScript object’s and arrays are mutable. Aura’s model requires that you send that signal when you mutate an object or array.

Conclusion

So there you have it - a local variable assigned the result of component.get() gives you a live hook to the component attribute, and if it’s mutable then any changes you make to the local variable update the attribute regardless of whether you call component.set() or not. Good to know!

I think the docs could certainly use some additional information to make this clear. It’s written down here, at least, so hopefully more people will know going forward!
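As a footnote, if you need a genuinely detached copy to work on locally, one common approach is a round trip through JSON - a minimal sketch, assuming the attribute holds JSON-serialisable data (no functions or cyclic references):

// Deep copy so local mutations don't leak back into the component attribute
var detached = JSON.parse(JSON.stringify(cmp.get('v.test')));
detached.value1 = 'local only';
// cmp.get('v.test') is unaffected until you explicitly cmp.set() the copy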


Saturday, 7 July 2018

Exporting Metadata with the Salesforce CLI



Introduction

Regular readers of this blog are all too aware that I’m a big fan of the Salesforce CLI. I rarely use anything else to interact with Salesforce orgs, outside of the main UI. Prior to this I used the Force.com CLI, which is another excellent tool, though it didn’t give me quite enough information about operations to switch to it exclusively.

One area where the Force.com CLI still beats the default functionality of the Salesforce CLI is the export command, which pulls back most of the metadata from an org (except reports, dashboards, and email templates, where you have to jump through a few hoops to get folder names first). Switching back to the Force.com CLI to take a backup of an org started to get a bit repetitive, so I decided to write something to replicate the functionality using the Salesforce CLI.

To Plugin or Not to Plugin

After some consideration, I decided not to plugin. I need to do some work and then execute a deployment via the Salesforce CLI, and there’s currently no way to execute a command from a plugin. I could shell out of the plugin and execute the CLI as a new child process, but that seemed a bit clunky. Plus where does it end? We’d end up with plugins creating child sfdx commands that were plugins that created child sfdx commands - a house of cards made of processes and just as fragile. I’d also seen Dave Carroll call this approach an anti-pattern on Twitter, and I didn’t want him mad at me, so I went with a node script that wraps everything up into a single command from the user’s point of view.

The Node Code

As I plan to do more of this sort of thing, rather than having loads of different scripts to execute I went with one script that offers subcommands. Kind of like a CLI all of its own, but very lightweight.

In order to support subcommands I used the commander module - I’ve used this a few times in the past and it’s really straightforward. I just configure it with the subcommands and their options and it figures out which one to execute and parses the arguments for me:

var exportCmd=program.command('export')
    .option("-u, --sfdx-user [which]", "Salesforce CLI username")
    .option("-d, --directory [which]", "Output directory")
    .action(function(options) {new ExportCommand(options).execute()});

program.parse(process.argv);

The export command is responsible for generating a package.xml manifest file containing details of all the metadata to retrieve and then executing a metadata API retrieve.

 

It contains a list of the names of the metadata components to extract:

const allMetadata=[
    "AccountSettings",
    "ActivitiesSettings",
    "AddressSettings",
    "AnalyticSnapshot",
    "AuraDefinitionBundle",
    ...
    "CustomObject",
    ...
    "ValidationRule",
    "Workflow"
];

The one wrinkle here is the standard objects. If I wildcard the CustomObject metadata type, this will just retrieve my custom objects. To include the standard objects, I have to explicitly name each of them. Luckily the Salesforce CLI provides a way for me to extract these - force:schema:sobject:list, so I can pull these back and turn them into a list of names:

var standardObjectsObj=child_process.execFileSync('sfdx', 
                    ['force:schema:sobject:list', 
                        '-u', this.options.sfdxUser, 
                        '-c', 'standard']);

var standardObjectsString=standardObjectsObj.toString();
standardObjects=standardObjectsString.split('\n');

for (var idx=standardObjects.length-1; idx>=0; idx--) {
    if ( (standardObjects[idx].endsWith('__Tag')) ||
         (standardObjects[idx].endsWith('__History')) ||
         (standardObjects[idx].endsWith('__Tags')) ) {
         standardObjects.splice(idx, 1);
    }
}

Then I can generate the package.xml by iterating the list of metadata names and adding a wildcard entry for each of them, and append the standard objects when the metadata name is CustomObject:

for (var idx=0; idx<allMetadata.length; idx++) {
    var mdName=allMetadata[idx];
    this.startTypeInPackage();
    if (mdName=='CustomObject') {
        for (var soIdx=0; soIdx<standardObjects.length; soIdx++) {
            this.addPackageMember(standardObjects[soIdx]);
        }
    }
    this.addPackageMember('*');
    this.endTypeInPackage(mdName);
}
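The startTypeInPackage, addPackageMember and endTypeInPackage helpers are thin string builders - a hypothetical sketch of how they might look (the real implementations are in the repo; note that the name element comes after the members, matching the manifest format):

// Hypothetical sketches - accumulate the manifest into this.packageXml
startTypeInPackage() {
    this.packageXml+='    <types>\n';
}
addPackageMember(name) {
    this.packageXml+='        <members>' + name + '</members>\n';
}
endTypeInPackage(mdName) {
    this.packageXml+='        <name>' + mdName + '</name>\n    </types>\n';
}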

Once the package.xml is written, I can then execute a force:mdapi:retrieve to pull back all of the available metadata:

child_process.execFileSync('sfdx', 
                ['force:mdapi:retrieve', 
                    '-u', this.options.sfdxUser, 
                    '-r', this.options.directory, 
                    '-k', this.packageFilePath]);

Export all the Metadata!

(The full source code is available at the Github repo. If you want to try this out yourself, clone the repo, change into the directory containing the clone and run npm install to pull down the dependencies.)

To export the metadata I execute:

node index.js export -u keir.bowden@train.bb -d output

Where -u specifies the username that I’ve previously authorised with the Salesforce CLI (I’m using the client dev org for my training system), and -d is the directory where the package.xml will be created and the metadata extracted to. If this directory doesn’t exist it will be created.

The output is just a couple of lines telling me it succeeded:

Exporting metadata
Metadata written to output/unpackaged.zip

The contents of the output subdirectory shows the package.xml and zip file retrieved from Salesforce:

$ ls -lrt
total 928
-rw-r--r-- 1 kbowden staff 20110 7 Jul 12:49 package.xml
-rw-r--r-- 1 kbowden staff 453236 7 Jul 12:50 unpackaged.zip

Unzipping the unpackaged.zip file creates an unpackaged directory containing the retrieved metadata:

$ unzip unpackaged.zip
Archive: unpackaged.zip
inflating: unpackaged/labels/CustomLabels.labels
   …

$ cd unpackaged
$ ls -l
total 48
drwxr-xr-x 16 kbowden staff 512 7 Jul 12:52 applications
drwxr-xr-x 4 kbowden staff 128 7 Jul 12:52 assignmentRules
    …
drwxr-xr-x 7 kbowden staff 224 7 Jul 12:52 tabs
drwxr-xr-x 4 kbowden staff 128 7 Jul 12:52 triggers

and hey presto, my org is exported without having to revert back to the Force.com CLI.


Friday, 29 June 2018

Adding Signature Capture to a Lightning Flow



Introduction

As regular readers of this blog know, a few years ago I wrote a Lightning Component for the launch of the Component Exchange called Signature Capture which, in a shocking turn of events, allows a user to capture a signature image and attach it to a Salesforce record. Over the years it’s received various enhancements and when these are notable I write a blog post about it. The latest version satisfies this requirement as it adds support for inclusion in Lightning flows. Note that the rest of this post assumes that you are working with V1.31 of Signature Capture or higher.

The Scenario

I’m going for a pretty simple scenario - a custom button on a Salesforce record starts a flow that contains just the Signature Capture component. Once the user is happy with the signature and saves it, finishing the flow returns them to the record that they started from.

Creating the Flow

If you are a metadata type, you can install the demo flow from the Signature Capture Samples Git repository - don’t forget to activate it.

If not, carry out the following steps to create the flow:

  1. Open the flow designer
  2. Click on the ‘Resources’ tab and on the resulting list click ‘Variable'

    Screen Shot 2018 06 28 at 16 51 15

  3. Create a new variable named ‘recordId’ and click OK - this will contain the Id of the record that the user wishes to capture a signature against.

    Screen Shot 2018 06 28 at 16 47 32
     
  4. Click the ‘Palette’ tab and on the resulting list drag ‘Screen’ onto the canvas:

    Screen Shot 2018 06 28 at 16 52 41

  5. Fill in the ‘Name’ as desired - I went for ‘Signature Capture’ as I’m a creative type:

    Screen Shot 2018 06 28 at 16 56 54

  6. Click the ‘Add a Field’ tab and scroll to the bottom of the list and choose ‘Lightning Component’:

    Screen Shot 2018 06 28 at 16 57 27

  7. Click the ‘Field Settings’ and configure as shown below - note that I had to remove the entry that was automatically added to the ‘Outputs’ tab before I could save. Make sure to set the value of the 'Record Id' attribute to the 'recordId’ variable - this ensures that the captured signature is stored against the correct Salesforce record.

    Screen Shot 2018 06 28 at 16 58 23

  8. Set the screen as the start element
  9. Save the flow
  10. Activate the flow

And that’s it! Straightforward but the flow isn’t very useful as it’s not attached to anything.

Creating the Custom Button

Using the URL of the flow seems pretty clunky, but what can we do?

  1. Open the flow and copy the URL value:

    Screen Shot 2018 06 28 at 17 02 40

  2. Navigate to the setup page for the object you want to be able to launch the flow from (I’ve gone for Account) and click ‘Buttons, Actions, and Links’
  3. On the resulting page, click ‘New Button or Link'
  4. Configure the button as follows, replacing the ‘Button or Link URL’ with the concatenation of your flow URL (copied in step 1) and the appropriate record id for the sobject type. The retURL parameter is the secret sauce to send the user back to the record when they finish the flow:

    Screen Shot 2018 06 28 at 17 19 11

  5. After saving, click the ‘Page Layouts’ link from the sObject setup page.
  6. Choose the layout to add the button to.
  7. On the resulting page builder, select the 'Mobile & Lightning Actions’ item from the palette:

    Screen Shot 2018 06 28 at 17 08 23

  8. And drag your new custom button on to the 'Salesforce Mobile and Lightning Experience Actions’ section:

    Screen Shot 2018 06 28 at 17 08 26

  9. Save the page layout

Enable Lightning Flow

Flows containing Lightning components require the Lightning runtime for flows to be enabled.

  1. Navigate to Setup -> Process Automation -> Process Automation Settings
  2. Check the ‘Enable Lightning runtime for flows’ box:


    Screen Shot 2018 06 28 at 17 12 53

  3. Save the settings.

Take it for a Spin

Once everything is in place you can launch the flow by navigating to a record of the appropriate sObject type and clicking the custom button. The video below shows the flow configured above being launched and completed from an account record.


Sunday, 24 June 2018

Sharing your Salesforce App on Github



Introduction

Over the years there’s been a few ways to give away Salesforce apps. When I first started 10 years ago there was a Google Code Share accessible from the Developerforce site, but that star waned some time ago. There’s also unmanaged packages, but that typically requires someone to install your package into an org before they can look at the contents of the app, which is fine if they know they want to use it in its entirety, but less useful if they want to use your app as a starting point or example for something that they are doing.

Over the last few years, Github has become the de-facto way to share and collaborate on code. As well as providing (free for public) Git repository hosting, it has a bunch of other cool features to allow people to collaborate:

  • Pull requests, to allow changes to be reviewed, discussed and refined before they are merged.
  • Issue tracking, to capture ideas, feature requests, bugs
  • Releases, so that you can wrap and ship specific versions

A collection of useful applications in Github is also a signal to potential employers that you are passionate about app creation and I’m sure won’t do your career any harm. It’s also a great way to share a sample application after presenting at a user group or World Tour event.

I’m using my Summer 18 samples codebase as the example Salesforce application for this post. Note that this post assumes you have created a Github account and completed Git setup.

But Github is Microsoft now

It is, but under Satya Nadella we’re seeing a new Microsoft, particularly around open source. Even if you are nervous about what might happen to Github in the future, you are putting your application out there for everyone to use, so it’s not like it’s some big secret that Microsoft might help themselves to. I can see how a Microsoft competitor with a business account might be worried about security, but for public applications it’s not something I’d be concerning myself with.

Metadata

The most common mechanism of storing an application outside of Salesforce is extracting the metadata via the Metadata API (SalesforceDX Scratch Org format is the new kid on the block, but at the moment is more likely to be of interest to ISVs in my opinion). There are various mechanisms for extracting metadata, requiring various degrees of setup, such as:

  • An IDE, such as the Force.com IDE, where you can choose from a subset of metadata to retrieve
  • The Force.com Migration Tool (ant), where you’ll need to define a manifest file (package.xml) describing the metadata you want to retrieve
  • The Salesforce CLI, via the force:mdapi:retrieve subcommand - again you’ll need a manifest file
  • The Force.com CLI, via the export and fetch commands.

In my view the Force.com CLI is still the easiest tool to use to export metadata - I wrote a blog post explaining how to do this a while ago.

Once you have your metadata exported you will have a structure similar to the following:

Screen Shot 2018 06 24 at 10 27 04

 

Creating a Local Repository

Github uses Git to manage the set of files that make up your application, stored in a data structure known as a repository. This enables tracking of changes to files (and much more besides). In order to create a repository, however, you need to extract your application from Salesforce.

Once your metadata is extracted from Salesforce it’s in a state where you can easily migrate it to other orgs, but it isn’t set up to be managed by Git. To turn your codebase into a local Git repository, you use the git init command:

$ git init .

Initialized empty Git repository in /Users/kbowden/GithubBlog/.git/

You can then run git status and see that you have indeed turned your app into a repository:

$ git status 

On branch master
No commits yet
Untracked files:

  (use "git add ..." to include in what will be committed)

       src/

nothing added to commit but untracked files present (use "git add" to track)

So you have a repository, but none of your applications files are being tracked by it. To achieve this, execute git add to stage the files and then git commit to commit them to the repository:

$ git add src

$ git commit
[master (root-commit) 35fa327] Created.
31 files changed, 202 insertions(+)
create mode 100644 src/aura/Navigator/Navigator.auradoc
create mode 100644 src/aura/Navigator/Navigator.cmp
      ...

create mode 100644 src/classes/UtilityController.cls
create mode 100644 src/classes/UtilityController.cls-meta.xml

Note that the git commit command will open an editor for your commit message.

Create the Remote Repository

Now that the code is being tracked by Git, the final step is to create the remote Github repository and push your local changes to it.

There are various ways to set up a remote repository, I use the following steps as it reminds me to set up the README.md file that serves as the “home page” for the repository on Github:

  1. Login to Github
  2. Click the + button on the top right and on the resulting dropdown menu, select New Repository

    Screen Shot 2018 06 24 at 11 02 08
  3. Fill out the resulting form, making sure to tick the box titled 'Initialize this repository with a README’, and click the Create Repository button

    Screen Shot 2018 06 24 at 11 03 45

  4. On the resulting page, click the ‘Clone or Download’ button and copy the Web URL:

    Screen Shot 2018 06 24 at 11 06 44

  5. Add the remote repository, using the Web URL copied in the step above. I’ve named the remote ‘origin’ as this is going to be the main repository going forward:
    $ git remote add origin https://github.com/keirbowden/GithubBlog.git
  6. Verify the remote:
    $ git remote -v

    origin https://github.com/keirbowden/GithubBlog.git (fetch)
    origin https://github.com/keirbowden/GithubBlog.git (push)

Pushing Local to Remote

The remote Github repository is now created and associated with the local repository. There’s nothing in the remote repository apart from the README.md file, so the next step is to push the contents of the local repository. Before this can be done, the contents of the remote repository must be merged in, as it has changes (README.md) that aren’t in the local repo.

The git pull command fetches the changes from the remote repository and merges them into the named branch in a single action. As the local and the remote repository have been created separately, use the '--allow-unrelated-histories' option to allow everything to be coerced into a single set of files:

$ git pull origin master --allow-unrelated-histories

From https://github.com/keirbowden/GithubBlog
* branch master -> FETCH_HEAD
Merge made by the 'recursive' strategy.
README.md | 2 ++
1 file changed, 2 insertions(+)
create mode 100644 README.md

Finally, push the consolidated repository to Github via the git push command:

$ git push origin master

Counting objects: 30, done.
Delta compression using up to 8 threads.
Compressing objects: 100% (24/24), done.
Writing objects: 100% (30/30), 4.71 KiB | 1.57 MiB/s, done.
Total 30 (delta 2), reused 0 (delta 0)
remote: Resolving deltas: 100% (2/2), done.
To https://github.com/keirbowden/GithubBlog.git
f956682..be01baf master -> master

All Done

Accessing the repository on Github shows the entire Salesforce application is now available:

Screen Shot 2018 06 24 at 11 28 15


Thursday, 14 June 2018

Building my own Learning System - Part 7



Introduction

Since I last blogged about my learning system I’ve started using it at BrightGen, for example to onboard developers onto our BrightMedia accelerator, where it has been generally well received. Not least by me, as it means that I don’t need to spend multiple hours with every developer, going through the same content which they may or may not be listening to. I’m also seeing some customers indicating that they’d like to pick up development tasks in BrightMedia projects. Thanks to my original design decision of distributing content across multiple endpoints, I can expose content to upskill the customers without giving away internal BrightGen content, or them even knowing that other content exists.

I’ve added in some features that are useful, to me at least, including:

  • Filtering by topic
  • Executing as a user (the github repo has the concept of running as a specific email address, as it’s unauthenticated)
  • Adding an info modal, detailing the recent changes, which I keep forgetting to update - so if you use the unmanaged package to install, it will likely be at least one version out of date :)

The great thing about building my own learning system though, is that I am learning by building it.

Learning by Building

As I’m adding features to my learning system I’m also learning more. Not just the Salesforce features that I expected to learn about, such as using the latest standard lightning components, but other areas that aren’t even Salesforce related.

Most recently I’ve been learning more about how to operate on Github. Up until now I’ve typically created a repo when I wrote a blog post and added to it when I wrote another post on the same topic. Now that I’ve got a project that I’m trying to keep a bit more organised, I’m:

  • Adding details of fixes/features into CHANGELOG.md (I’m pretty good on this one - much better than my modal. I probably need to automate updating the modal from the change log).
  • Creating releases
  • Adding diffs to CHANGELOG.md that show all changes between releases 

This wasn’t something that I expected when I started working on my learning system and, once again, it shows the value of side projects. I knew that I’d learn, but I surprised even myself.

Related Information

All previous posts about the learning system have moved to a dedicated page on my blog, and the code is available in the associated Github repositories.

 

 

Tuesday, 12 June 2018

Accessing Lightning Component URL Parameters


Introduction

One of the more common challenges when working with Lightning Components is how to access parameters passed on the URL. In a Lightning App it’s straightforward - simply define an attribute with the parameter name:

<aura:application >
    <aura:attribute name="param1" type="String" />
        Param1 = {!v.param1}
</aura:application>

and then add that parameter to the URL for the app:

    https://kabtutorial-developer-edition.lightning.force.com/KAB_TUTORIAL/UAApp.app?param1=test

and this is picked up and displayed in the resulting page:

Screen Shot 2018 06 11 at 08 07 30

However, Lightning Apps are rarely the solution - not least because they execute outside the Lightning Experience.

Accessing via JavaScript

In a Lightning Component’s JavaScript controller or helper, the parameters can be accessed directly from the window location, or with the locker service enabled for a component at API 40 and higher, the secure window location:

getURLParameter : function(param) {
    var result=decodeURIComponent
        ((new RegExp('[?|&]' + param + '=' + '([^&;]+?)(&|#|;|$)').
            exec(location.search)||[,""])[1].replace(/\+/g, '%20'))||null;
    console.log('Param ' + param + ' from URL = ' + result);
    return result;
}

In case you were wondering, I didn’t come up with the regular expression myself. If it weren’t for Stack Overflow I suspect I’d be using indexOf and working around errors for some time after.

There’s nothing wrong with this approach, but it feels a little .. clunky.

isUrlAddressable Interface

Summer 18 introduces the lightning:isUrlAddressable interface - I wrote a post about its navigational capability a short while ago. Near the bottom of the documentation for this component is an example of how to replace the deprecated force:navigateToComponent with a custom component implementing isUrlAddressable. This example includes an attribute that I hadn’t come across before - v.pageReference - which allows access to the parameters on the URL via the state property.

I can now create a component that accesses the parameters as easily as an app:

<aura:component implements="flexipage:availableForAllPageTypes,lightning:isUrlAddressable" 
                access="global" >
    <lightning:card>
        Param1 = {!v.pageReference.state.param1}
    </lightning:card>
</aura:component>
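To reach a component like this programmatically, the lightning:navigation service can build the URL from a pageReference - a sketch, assuming a target component named c:Target and a <lightning:navigation aura:id="navService"/> element in the markup:

// Navigate to a URL-addressable component, passing param1 via state
var navService = component.find("navService");
var pageReference = {
    type: "standard__component",
    attributes: { componentName: "c__Target" },
    state: { param1: "test" }
};
navService.navigate(pageReference);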

Really? That easy?

Yes and no. It’s that easy if you are navigating directly to a component that implements the isUrlAddressable interface, but not if you are:

  • Opening a Lightning Application Page created with the Lightning App Builder
  • Inside another component, even if that component implements the isUrlAddressable interface

In these situations, and any other where the component accessing the v.pageReference attribute isn’t the component that you navigated to, the v.pageReference attribute will be undefined and you’ll have to fall back on accessing via JavaScript, as shown earlier.

So definitely a step in the right direction, but not the silver bullet I’m still looking for.


Sunday, 27 May 2018

Building a Salesforce CLI Plugin

 

Introduction


As regular readers of this blog will know, I’m a big fan of the command line in general and the Salesforce CLI in particular. Up until now I’ve been layering functionality onto the CLI by wrapping it in bash or node scripts, so I was interested to read about the Plugin Development beta program as this seemed like a neater way to extend it.

The examples I’ve seen to date are mainly focused on extracting information from the org, or performing local processing on files that have already been generated by another CLI command, so I wanted to push things a little further and change something in the org. In the command line tool that I’ve written to manage our BrightMedia projects we update the org with a bunch of information, including the Git commit id, after a successful deployment, and this felt like a good challenge to start with.

Creating the Plugin

Simply following the instructions from the plugin generator Github repo gave me the initial version of the plugin. This creates a fairly hefty folder structure, but the most interesting bit is the src/commands folder - anything I put in here becomes a command for my plugin, and the folder structure from here down gives me the topic/subtopic etc.

I have the structure:

src/commands/bbuzz/gitstamp.ts

which gives me the topic bbuzz and the command gitstamp. This concluded the simple part of creating my plugin.

More to Learn

The example plugin that the generator creates is coded in TypeScript, a typed superset of JavaScript that compiles down to plain old JavaScript. I hadn’t used this before, but it’s not too different to JavaScript so I was able to get going reasonably quickly, with the compiler telling me what I’d missed or broken.

The Command

A plugin command needs to extend SfdxCommand - I inadvertently left the name of mine as Org, which is the name for the command created by the plugin generator, but thus far it hasn’t caused me any problems.

export default class Org extends SfdxCommand {

and as I will be connecting to an org I’ll need to enable that. I also want this to be executed from a SalesforceDX project, as the name of the custom setting and the specific field to write the commit id to can be overridden through the sfdx-project.json file:

 protected static requiresUsername = true;
 protected static requiresProject = true;

I then fetch the configuration from the project configuration file, supplying defaults if no config is found:

const projectJson = await this.project.retrieveSfdxProjectJson();
const settingConfig = _.get(projectJson.get('plugins'),'bb.gitSetting');
const settingName=settingConfig.settingName || 'Git_Info__c';
const commitIdField=settingConfig.commitIdField || 'Commit_Id__c';

I then query the setting from the org - if there is already one, I need to update it using its Id:

const query = 'Select Id, SetupOwnerId, ' + commitIdField + ' from ' + settingName + ' where SetupOwnerId = \'' + orgId + '\'';

interface Setting{
  Id? : string,
  SetupOwnerId : string,
  [key: string] : string
}

// Query the setting to see if we need to insert or update
const result = await conn.query<Setting>(query);

let settingInstance : Setting;
settingInstance={"SetupOwnerId": orgId};

if (result.records && result.records.length > 0) {
  settingInstance.Id=result.records[0].Id
}

Note that I define a typescript interface for the custom setting. As I don’t know the name of the field that the commit id will be stored in, I just define the generic [key:string] rather than a specific name.

Then I get the current commit id and fill that in on the settingInstance:

const util=require('util');
const exec=util.promisify(require('child_process').exec);

const {error, stdout, stderr} = await exec('git rev-parse HEAD');

const commitId=stdout.trim();
settingInstance[commitIdField]=commitId;

Then I had to figure out how to insert or update the custom setting record. Checking the Org class of the Salesforce DX Core library didn’t show me many options, but after a short while I realised that the core library is built on top of jsforce, and sure enough Connection extends jsforce.Connection which had the CRUD commands that I needed:

let opResult;
if (settingInstance.Id) {
  opResult=await conn.sobject(settingName).update(settingInstance);
}
else {
  opResult=await conn.sobject(settingName).insert(settingInstance);
}

this.ux.log('Done');

if (!opResult.success) {
  this.ux.log('Error updating commit id' + JSON.stringify(opResult.errors));
}
else {
  this.ux.log('Stamped the Git commit id in the org');
}

and finally I output a JSON format result string so that if I need to use it in a script I can:

return { orgId: this.org.getOrgId(), 
         commitId: commitId, 
         success: opResult.success, 
         errors: opResult.errors };
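A hedged sketch of what an invocation might look like (the username is illustrative; the CLI wraps the returned object in its standard JSON envelope when --json is passed):

$ sfdx bbuzz:gitstamp -u myuser@example.org --json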

Something I wasn’t expecting was that every time I made code changes, simply executing the plugin caused it to be recompiled, which was a real time saver.

Publishing

Once I had my plugin working, I then had to figure out how to publish to the npm.js registry - once again I hadn’t done this before so it was another opportunity to learn.  Luckily this is well documented so it wasn’t as hard as I was expecting. There’s a little bit of hoop jumping around creating the releases on Github, but I’d recently been through at least this aspect with my training system.

Take it for a Spin

To try out the plugin, follow the instructions on the npm page. If you’d like to dig into the code, it's available at the Github repo.


Saturday, 19 May 2018

Lightning Utility Bar Click Handling in Summer 18



Introduction

As I’ve written about before, I’m a big fan of the lightning utility bar. It allows me to add components to an application that persist on the screen regardless of which tab/record the user is working in/on. One area that has always been a bit of a let down is that I have no way of knowing that the user has opened the utility bar item. In our BrightMEDIA appcelerator I have some utility items that can be added to applications, whose contents can change during the application’s lifecycle as components register and deregister. While I can hook in to the original rendering, the item doesn’t fire a rerender when it is opened, so there is nowhere for me to latch on. In this case I usually put up a message that the information may be out of date, and a ‘Refresh’ button - very familiar if the user has viewed Salesforce dashboards in the past.

Enter Summer 18

This problem goes away in Summer 18 (assuming this feature goes live - forward looking statement, safe harbor etc) as the lightning:utilityBarAPI component gets a new event handler - onUtilityClick(). As per the docs, this allows you to register an event handler that is called whenever the utility is clicked. The utility bar API is a headless service component, which is clearly the direction Salesforce are heading for this kind of functionality. The only slightly tricksy aspect is that the headless component needs to be rendered in the utility before it can be accessed and a handler registered. I tried initialising the utility component at startup, via the standard checkbox, but everything went null. For this reason I register the event handler from a custom renderer when my utility component handles the render event.

There’s a Sample, right?

As usual I’ve created a sample component to demonstrate this new functionality - a utility item that shows the total open opportunity value for my pre-release org user. It executes a server-side method whenever the utility is clicked to make it visible. The short video below shows how the utility refreshes itself when it is made visible and displays the correct amount, even if I’ve just edited the opportunity I’m on.

The component includes the lightning utility bar API:

<lightning:utilityBarAPI aura:id="utilitybar" />

This is then located in the custom renderer to register the handler (a function from the helper):

var utilityBarAPI = component.find("utilitybar");
var eventHandler = function(response) {
    helper.handleUtilityClick(component, response);
};
// Register the handler with the utility bar API
utilityBarAPI.onUtilityClick({ eventHandler: eventHandler });

and the helper method checks if the component is visible, via the panelVisible function of the response parameter passed to it by the platform, and makes a call to the server to recalculate the open opportunity value and display it:

handleUtilityClick : function(cmp, response) {
    if (response.panelVisible) {
        this.recalculate(cmp);
    }
},
recalculate : function(cmp) {
    cmp.set('v.message', 'Calculating ...');
    var action = cmp.get("c.GetOpenOpportunityValue");

    var self = this;
    action.setCallback(self, function(response) {
        if (response.getState() === 'SUCCESS') {
            var result = response.getReturnValue();
            cmp.set('v.message', 'Open opps = ' + result);
        }
    });
    $A.enqueueAction(action);
}
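For completeness, the custom renderer that triggers the registration isn’t shown above; it could look something like the following sketch, assuming the registration code lives in a helper function (registerClickHandler is an assumed name, not from the original component):

// Sketch of a custom renderer - registers the click handler once the
// component has rendered inside the utility bar.
// registerClickHandler is an assumed helper wrapping the onUtilityClick call.
({
    render : function(component, helper) {
        var ret = this.superRender();
        helper.registerClickHandler(component, helper);
        return ret;
    }
})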

You can access the entire component, and its Apex controller, at the Github repository.

What else can this API do?

Check out Andy Fawcett’s blog for more details.

Related Posts

 

 

Friday, 11 May 2018

Lightning Component Navigation in Summer 18

Lightning Component Navigation in Summer 18

Introduction

In my last but one post (Toast Message from a Visualforce Page) I explained how I needed to work around the force:createRecord event not respecting overrides, instead always opening the standard modal-style form, and the fact that there was no GA mechanism to navigate to a component, force:navigateToComponent still being in beta.

A couple of days later, the Summer 18 release notes came out in preview, and towards the end I saw that there was finally a solution for this.

lightning:isUrlAddressable Interface

The new lightning:isUrlAddressable interface allows me to expose a Lightning Component via a dedicated URL. I simply implement this on my component and it is available at the (current) URL:

   https://<instance>.lightning.force.com/lightning/cmp/<namespace>__<component name>
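For example, the WebinarOverride component used later in this post would be available at the following (assuming the default c namespace; the instance name is illustrative):

   https://myinstance.lightning.force.com/lightning/cmp/c__WebinarOverride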

However, one thing we can be certain of with Lightning Components is change, so rather than hardcoding the URL to my component, there’s a way to get the platform to dynamically generate it.

lightning:navigation Component

The new lightning:navigation component is a headless (i.e. does not have a user interface aspect) service component that exposes methods to help with navigation. The method that I’m interested in is navigate(PageReference). Using this I can create a PageReference JavaScript object that describes the location I want to navigate to and leave it up to the lightning:navigation component to figure out the appropriate URL to hit. 

The beauty of using this mechanism is that my components are decoupled from the URL format. If the Lightning Components team decide to change this, it has no impact on me - the lightning:navigation component gets updated and everything still works as it always did.

Example

Note - as these features are part of Summer 18, at the time of writing they aren’t available outside of pre-release orgs, which was where I created my example.

I’ve created a component (WebinarOverride) that implements lightning:actionOverride, so that I can use it to override the default new action (just to show that force:createRecord ignores the override), and lightning:isUrlAddressable so that I can navigate to it.

<aura:component implements="force:appHostable,flexipage:availableForAllPageTypes,force:hasRecordId,lightning:actionOverride,lightning:isUrlAddressable">
    <lightning:card iconName="standard:event" title="Override">
        <lightning:formattedText class="slds-p-left_small"
                                 value="This is the webinar override component."/>
    </lightning:card>
</aura:component>

Note that it doesn’t do anything in terms of functionality - it’s just intended to make it obvious that we’ve ended up at this component.

I then have a component that can navigate to the override called, creatively, NavigationDemo:

<aura:component implements="force:appHostable,flexipage:availableForAllPageTypes">
    <lightning:navigation aura:id="navService"/>
    <lightning:button label="Force Navigate" onclick="{!c.forceNavigate}" />
    <lightning:button label="Lightning Navigate" onclick="{!c.lightningNavigate}" />
</aura:component>

This has two buttons - one that uses force:createRecord:

var createRecordEvent = $A.get("e.force:createRecord");
createRecordEvent.setParams({
    "entityApiName": "Webinar__c"
});
createRecordEvent.fire();

and one that uses the lightning:navigation component:

var navService = cmp.find("navService");
var pageReference = {
    "type": "standard__component",
    "attributes": {
        "componentName": "c__WebinarOverride" 
    }, 
    "state": {}
};

navService.navigate(pageReference); 

Note that as the navigation aspect is provided by a headless service component, I have to locate the component via cmp.find() and then execute the method on it. Note also my pageReference object, which has a type of standard__component (even though this is a custom component - go figure) and supplies the component name as an attribute.
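The state object is empty in my example, but it can be used to pass custom query parameters to the target component; these must carry a namespace prefix (c__ for the default namespace), and a component implementing lightning:isUrlAddressable can read them from its pageReference attribute. A quick sketch (the c__message parameter name is illustrative, not from the original code):

// Sketch: passing custom state to the target component.
// Custom state keys must be namespace-prefixed - c__ for the default namespace.
var pageReference = {
    "type": "standard__component",
    "attributes": {
        "componentName": "c__WebinarOverride"
    },
    "state": {
        "c__message": "Came from NavigationDemo"
    }
};
navService.navigate(pageReference);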

I’ve overridden the Lightning Experience actions for the Webinar custom object with my WebinarOverride component:

Screen Shot 2018 05 06 at 10 14 56

So clicking the New button on the webinar home page takes me to the override. If I now open my NavigationDemo component:

Screen Shot 2018 05 06 at 10 18 27

clicking the Force Navigate button is disappointing as always:

Screen Shot 2018 05 06 at 10 18 37

however, clicking on the Lightning Navigate button takes me to my WebinarOverride component:

Screen Shot 2018 05 06 at 10 18 55

You can find the full component code at the Github repository.

Conclusion

The lightning:navigation component also retires the force:navigateToComponent() function, although it’s not clear if there was a WWE-style ladder match to determine who retired. This function has had a troubled existence - leaking out before it was ready and then being withdrawn, causing wailing and gnashing of teeth in the developer community, then entering a short beta before being deprecated. Farewell little buddy, you gave it your best shot but couldn’t cut it.

The headless component mechanism feels like the way that new functionality like this will be added, rather than adding functions/events etc to the framework. I guess this keeps the framework as lightweight as it can be and doesn’t force functions on applications that don’t care about them.

One additional point to note - if you forget the lightning:isUrlAddressable interface on your target component, the lightning:navigation component will still happily generate the URL, but when you navigate to it you will get a not particularly helpful error message that the page isn’t supported in Lightning Experience.

Finally, I’d still prefer force:createRecord to respect overrides. While I can make my new object component configurable via a custom setting or custom metadata type, from a low code perspective I don’t want to have to change two places to use a different one. Still, we’re in a better place than we have been for a couple of years.

Related Posts