Friday 30 December 2016

JavaScript Promises in Lightning Components

Introduction

Promises have been around for a few years now, originally in libraries or polyfills but now natively in JavaScript for most modern browsers (excluding IE11, as usual!).

The Mozilla Developer Network provides a succinct definition of JavaScript promises:

A Promise is a proxy for a value not necessarily known when the promise is created

Promises can be in one of three states:

  • pending - not fulfilled or rejected (Heisenberg, for Breaking Bad fans!)
  • fulfilled - the asynchronous code successfully completed
  • rejected - an error occurred executing the asynchronous code

For me, the key advantage with Promises is that they allow asynchronous JavaScript code to be written in a way that looks somewhat like synchronous code, and is thus easier for someone new to the implementation to understand.

Creating a Promise

var promise = new Promise(function(resolve, reject) {

  // asynchronous code goes here

  if (success) {
     /* everything worked as expected */
     resolve("Excellent :)");
  }
  else {
    /* something went wrong */
    reject(Error("Bogus :("));
  }
});
 

The Promise constructor takes a callback function as a parameter. This callback function contains the asynchronous code to be executed (to retrieve a record from the Salesforce server, for example). The callback function in turn takes two parameters that are also functions - note that you don’t have to write these functions, just invoke them based on the outcome of the asynchronous code:

  • resolve - this function is invoked if the asynchronous code executes successfully. Executing this function moves the Promise to the fulfilled state.
  • reject - this function is invoked with an Error if anything goes wrong in the asynchronous code. Executing this function moves the Promise to the rejected state.
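To make this concrete, here's a minimal sketch (the wait function is purely illustrative) that wraps the browser's setTimeout in a Promise:

var wait = function(ms) {
    return new Promise(function(resolve, reject) {
        if (ms < 0) {
            /* can't wait for a negative amount of time */
            reject(Error("Bogus delay :("));
        }
        else {
            /* fulfil the promise once the timer fires */
            setTimeout(function() { resolve(ms); }, ms);
        }
    });
};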

Handling the Result

Thus far all well and good, but pretty much all of the asynchronous code that I write is carrying out a remote activity and returning the result, so I need some way to be notified when the asynchronous code has completed. Enter the Promise.then() function:

promise.then(function(data) {
                alert('Success : ' + data);
             },
             function(error) {
                alert('Failure : ' + error.message);
             });

Promise.then() takes two functions as parameters - the first is a success callback, invoked if and when the promise is resolved, and the second is an error callback, invoked if and when the promise is rejected.

A Real World Example

When building Lightning Components, asynchronous interaction with the Salesforce server is typically carried out via an action. The following function takes an action and creates a Promise around it:

executeAction: function(cmp, action) {
    return new Promise(function(resolve, reject) {
        action.setCallback(this, function(response) {
            var state = response.getState();
            if (state === "SUCCESS") {
                resolve(response.getReturnValue());
            }
            else if (state === "ERROR") {
                var errors = response.getError();
                if (errors && errors[0] && errors[0].message) {
                    reject(Error("Error message: " + errors[0].message));
                }
                else {
                    reject(Error("Unknown error"));
                }
            }
        });
        $A.enqueueAction(action);
    });
}

The executeAction function instantiates a Promise that defines the action callback handler and enqueues the action. When the action completes, the callback handler determines whether to fulfil or reject the Promise based on the state of the response.

This function can then be used to create a Promise to retrieve an account: 

var accAction = cmp.get("c.GetAccount");
var params={"accountIdStr":accId};
accAction.setParams(params);
        
var accountPromise = this.executeAction(cmp, accAction);

and callback handlers provided to process the results:

accountPromise.then(
        $A.getCallback(function(result){
            // We have the account - set the attribute
            cmp.set('v.account', result);
        }),
        $A.getCallback(function(error){
            // Something went wrong
            alert('An error occurred getting the account : ' + error.message);
        })
     );

Note that the success and error callbacks are encapsulated in $A.getCallback functions, as they are executed asynchronously and therefore outside of the Lightning Components lifecycle. Note also that if you forget to do this, quite a lot of the promise functionality will still work, which makes it all the harder to track down exactly what the problem is!

Chaining Promises

A callback supplied to the Promise.then() function can return another Promise, setting up a chain of asynchronous operations that each complete in turn before the next one starts. Repurposing the above example to retrieve a Contact once the Account is available:

accountPromise.then(
        $A.getCallback(function(result){
            // We have the account - set the attribute
            cmp.set('v.account', result);

            // return a promise to retrieve a contact
            var contAction = cmp.get("c.GetContact");
            var contParams={"accountIdStr":accId};
            contAction.setParams(contParams);
            var contPromise=self.executeAction(cmp, contAction);
            return contPromise;
        }),
        $A.getCallback(function(error){
            // Something went wrong
            alert('An error occurred getting the account : ' + error.message);
        })
   )
   .then(
        $A.getCallback(function(result){
            // We have the contact - set the attribute
            cmp.set('v.contact', result);
        }),
        $A.getCallback(function(error){
            // Something went wrong
            alert('An error occurred getting the contact : ' + error.message);
        })
     );
    

However, there is a side effect here - the second then() executes regardless of the success/failure of the first. If the first Promise was rejected, its error callback runs and, as that returns nothing, the success callback of the second then() executes with an undefined value. While this is benign in the above example, it's probably not the behaviour that is desired in most cases. What would be better is for the second then() to execute only if the first one is successful. Enter the Promise.catch() function.

Catching Errors

The Promise.catch() function is invoked if a Promise is rejected, but the then() function didn’t provide an error callback:

promise.then(function(data) {
                alert('Success : ' + data);
             })
        .catch(function(error) {
                alert('Failure : ' + error.message);
             });

When chaining Promises, the catch() function becomes more powerful, as if a Promise is rejected and the then() function did not provide an error callback, control moves forward to either the next then() function that does provide an error callback, or the next catch() function.
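As a generic sketch (stepOne, stepTwo and stepThree are hypothetical functions that each return a Promise), a rejection anywhere in the chain skips the remaining success callbacks and lands in the catch():

stepOne()
    .then(stepTwo)      // skipped if stepOne rejects
    .then(stepThree)    // skipped if stepOne or stepTwo rejects
    .catch(function(error) {
        console.log('Failure : ' + error.message);
    });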

Refactoring the example again:

accountPromise.then(
        $A.getCallback(function(result){
            // We have the account - set the attribute
            cmp.set('v.account', result);

            // return a promise to retrieve a contact
            var contAction = cmp.get("c.GetContact");
            var contParams={"accountIdStr":accId};
            contAction.setParams(contParams);
            var contPromise=self.executeAction(cmp, contAction);
            return contPromise;
        })
   )
   .then(
        $A.getCallback(function(result){
            // We have the contact - set the attribute
            cmp.set('v.contact', result);
        })
   )
   .catch(
        $A.getCallback(function(error){
            // Something went wrong
            alert('An error occurred : ' + error.message);
        })
     );

The second then() function is now only executed if the account retrieval is successful. An error occurring retrieving either the account or the contact immediately jumps forward to the catch() function to surface the error.

Going back to the original point made in the introduction, I now have two asynchronous operations, with the second dependent on the success of the first, but coded in a readable fashion.

Further Reading

Promises can be tricky to wrap your head around, and it's certainly worth spending some time learning the basics and playing around with examples. I've found the following resources very useful:


Sunday 11 December 2016

Lightning Design System in Visualforce Part 1 - Getting Started

Introduction

A short while ago I wrote a Medium post containing tips for Visualforce developers moving over to Lightning Components. A pre-requisite for developing Lightning Components is being able to write JavaScript to some level, which isn’t something that every existing Visualforce developer has, especially those from a functional background who have gained enough of an understanding of Visualforce and Apex to become productive, but aren’t in a position to quickly train themselves up on a new programming language.

If you are in this boat, the good news is that even without JavaScript you can still create a custom user interface matching the Lightning Experience look and feel - simply carry on building Visualforce pages, but use the Salesforce Lightning Design System (LDS) to style them.

Get the LDS

LDS is automatically included in various Lightning Component scenarios, but to use it with Visualforce you still need a static resource. Previously it was possible just to download a zip file, but now a custom scope is required in the CSS so you have to generate your own download via the CSS customizer tool. I chose BBLDS - if you go with something else, remember to update the CSS style classes in the sample code. Once you’ve generated the file, upload it to Salesforce as a static resource - I’ve gone with the name BBLDS_W17, again if you change this remember to update the references in the sample code.

Create the Basic Page

I create a new Visualforce page, making sure to check the box so that my page is available in LEX:

[Screenshot: new Visualforce page dialog with the Lightning Experience availability checkbox selected]

and change the <apex:page /> tag to the following:

<apex:page showHeader="false" sidebar="false" standardStylesheets="false"
           standardController="Account" applyHtmlTag="false">

Note that I have to turn off all aspects of standard Visualforce styling - no header, no sidebar and no standard stylesheets. The example page displays some basic details of an Account so I specify the Account standard controller. Finally, the applyHtmlTag attribute is set to false - this gives me control over the <html>, <head> and <body> tags, which is necessary to use the SVG icons in the LDS. Next, I add the <html> tag, specifying the SVG namespace information, followed by the <body> tag:

<html xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<body>

Next, I include the LDS - as noted above, if you’ve chosen a different name to BBLDS_W17, remember to change the name (and remember I die a little inside when my recommendations are rejected, but I accept that everyone has free will):

<apex:stylesheet value="{!URLFOR($Resource.BBLDS_W17,
                         'assets/styles/salesforce-lightning-design-system-vf.min.css')}" />

 Then I add the Lord of the Divs - one Div to rule them all - that wraps all content and specifies the LDS namespace. Again, if you haven’t used the same name as me (which hurts, it’s like a knife to my heart) remember to update the code:

<div class="BBLDS">

then I close everything off - the <div>, <body>, <html> and <apex:page> tags.
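For completeness, the closing markup looks something like this (a sketch based on the structure above):

    </div>
  </body>
</html>
</apex:page>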

Add LDS Components

The next task is to add some LDS components. Here I use the time-honoured approach of finding something in the LDS samples that looks appropriate, copying the sample and hacking it around, and I’d certainly recommend that approach when you first start out.

The first element I’ve added is the Record Home Page Header - I’m not going to cover the code in detail here, save for a couple of key points.

First - this is a Visualforce page so I can just drop in merge syntax to display details of the Account, for example:

<li class="slds-page-header__detail-block">
  <p class="slds-text-title slds-truncate slds-m-bottom--xx-small" title="Description">Description</p>
  <p class="slds-text-body--regular slds-truncate" title="{!Account.Description}">{!Account.Description}</p>
</li>

 gives me:

[Screenshot: the Description field rendered in the page header detail block]

Second, SVG icons require me to specify the location of the icon in the static resource file:

<div class="slds-media__figure">
  <svg aria-hidden="true" class="slds-icon slds-icon-standard-account">
    <use xlink:href="{!URLFOR($Resource.BBLDS_W17, 
'/assets/icons/standard-sprite/svg/symbols.svg#account')}">
</use>
</svg>
</div>

[Screenshot: the page header showing the standard account SVG icon]

Third - I can still use standard component tags to generate formatted output:

<div class="slds-form-element slds-hint-parent slds-has-divider--bottom">
  <span class="slds-form-element__label">Created By</span>
  <div class="slds-form-element__control">
    <span class="slds-form-element__static">{!Account.CreatedBy.Name},
      &nbsp;<apex:outputField value="{!Account.CreatedDate}"/>
    </span>
  </div>
</div>

 

[Screenshot: the Created By field with formatted date output]

Note that I don’t use an output field tag to generate the details of the user that created the Account, as this has a side effect of adding a hover popup to the output link, which won’t be styled correctly as I’m not using any of the default Visualforce styling. Always consider the full effects of any standard component you use, and make sure you are happy with them - you don’t want to confuse your users with broken or half-styled elements.

Boring - where’s the final page?

Here’s a screenshot of the final page - pretty simplistic, but definitely styled appropriately for LEX users:

[Screenshot: the finished page styled with the Lightning Design System]

Enough talk, show me the code

As usual with LDS posts, the code is in my LDS Samples GitHub repository. There’s also an unmanaged package available to save wasting time copying and pasting - see the README.

In Conclusion

This post is in no way intended to suggest that you avoid learning Lightning Components in favour of making your Visualforce look like LEX. Lightning Components are the future and they are in and of LEX rather than iframed in from a different server, so you’ll be able to do a lot more with them in the future. I’d also imagine there isn’t a large team developing Visualforce nowadays, more likely a small team working on essential maintenance items, so you are unlikely to see much in the way of new features.

Sunday 27 November 2016

Force CLI Part 5 - Node Wrapper

Introduction

In earlier posts in this series I shared the odd example bash script that carried out some combination of commands in conjunction with the Force CLI to achieve things like creating a zip backup of Salesforce metadata. The reason I went with bash scripts is that I know the syntax really well and was able to quickly build some tools for managing deployment of our BrightMedia Full Force solution. In essence it allowed us to get big quick.

However, there are a couple of downsides to this approach:

  • Not that many people know bash these days - I spent many years building systems on Linux so I’m pretty familiar with it, but I didn’t want to be a bottleneck
  • Bash isn’t the easiest tool for processing XML or JSON format data. You either end up running the files through additional commands such as XMLStarlet, which gives you a whole new syntax/configuration to learn, or you start parsing it using tools like sed and awk, which works for the basics but quickly becomes unusable with complex data.

I’m pushing most of our people towards JavaScript now, with the advent of Lightning Components, so JavaScript running on the Node platform seemed like the obvious choice to rewrite the scripts in. I then found out that the CLI for Salesforce DX will be built in JavaScript on Node, and will have a plugin architecture to allow us to extend it, which more than justified my choice.

JSON takes Salesforce Data

While the code behind my tools is now a lot more flexible and easier to maintain, the key use case I’ve found is extracting Salesforce record data for processing - e.g. carrying out a data fix as part of a release. The Force CLI can output the results of a query formatted as a JSON string, and JavaScript can parse this into an object using the built-in JSON.parse() function.
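As a minimal sketch (the jsonString variable stands in for whatever the CLI returned):

// parse the CLI output into an array of record objects
var records = JSON.parse(jsonString);
console.log(records.length + ' records returned');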

The following example reflects a real-world scenario I encountered a few weeks ago. I was bitten by a known issue with the Winter 17 release of Salesforce, which meant that I couldn’t login to the Lightning Experience. I needed to flip the users in this org to the classic interface, which I started doing via the data loader before creating a script to do the same thing with the Force CLI wrapped in a Node script.

Getting Started

First, download and install Node from the NodeJS org website.

Then create a new directory for the project - in this example a subdirectory named cli.

  • mkdir cli
  • cd cli

Next, create a Node package file for the project:

  • npm init

Accept the defaults for everything and add a description if you so desire.

Note that you don’t strictly need to do this, for the simple example in this post, but it’s a really good habit to get into for when you need to include dependencies on other packages.

Show Me the Code

Save the following code into a file named index.js:

var child_process=require('child_process');

var instance=process.argv[2];
if (instance===undefined) {
    instance='login';
}
var loginOpts=['login', '-i=' + instance];
child_process.execFileSync('force', loginOpts);

// extract the user records as JSON
var queryOpts=['query', 'select id, Username, UserPreferencesLightningExperiencePreferred from User',
               '--format:json'];

var uJSON=child_process.execFileSync('force', queryOpts);

// turn into a JS object
var users=JSON.parse(uJSON);

// dump the users out to the console
for (var idx=0, len=users.length; idx<len; idx++) {
    var user=users[idx];
    console.log('User = ' + JSON.stringify(user, null, 4));
}

 Executing this produces the following (edited) output: 

kbowden@Keirs-MacBook-Air cli $ node index.js
User = {
    "Id": "00524000003vqIgAAI",
    "UserPreferencesLightningExperiencePreferred": true,
    "Username": "ltg.test@bgmctest.brightgen.com",
    "attributes": {
        "type": "User",
        "url": "/services/data/v36.0/sobjects/User/00524000003vqIgAAI"
    }
}
User = {
    "Id": "00524000000FctAAAS",
    "UserPreferencesLightningExperiencePreferred": true,
    "Username": "keirbowden@bgmctest.brightgen.com",
    "attributes": {
        "type": "User",
        "url": "/services/data/v36.0/sobjects/User/00524000000FctAAAS"
    }
}

So I can see that a number of users are set up for the Lightning Experience.

Explain Yourself

Breaking down the code a little, the first line loads the child_process module, which allows a script to spawn other processes:

var child_process=require('child_process');

The next lines figure out the correct Salesforce instance to connect to - the default is login, but you can override this by passing an additional argument after index.js, such as test to connect to the sandbox login endpoint. Once this is done, a child process is spawned to execute the login. 

var instance=process.argv[2];
if (instance===undefined) {
    instance='login';
}
var loginOpts=['login', '-i=' + instance];
child_process.execFileSync('force', loginOpts);

Note that the child process function takes two parameters - the first is the command to execute (force in this case) and the second is an array of parameters to pass to the command. Note also that I use the execFileSync variant to execute the command synchronously and return the output as the result - which I promptly ignore!
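As a tiny illustration (using the whoami subcommand covered in part 1 of this series), the synchronous variant hands back the command output directly:

// capture the output of a command synchronously
var output = child_process.execFileSync('force', ['whoami']);
console.log(output.toString());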

Next, a query is executed via the Force CLI to extract details of all users in JSON format:

// extract the user records as JSON
var queryOpts=['query', 'select id, Username, UserPreferencesLightningExperiencePreferred from User',
               '--format:json'];

var uJSON=child_process.execFileSync('force', queryOpts);

and this is converted to an array of JavaScript objects:

var users=JSON.parse(uJSON);

Updating the Users

Once I verified that I could extract the users successfully, I could then update each user to use Salesforce Classic, by replacing the final loop with the following code which executes the Force CLI record update command for each returned user:

for (var idx=0, len=users.length; idx<len; idx++) {
    var user=users[idx];
    if (user.UserPreferencesLightningExperiencePreferred) {
        console.log('Reverting user ' + user.Username + ' to classic');
        var updateOpts=['record', 'update', 'user', user.Id,
                        'UserPreferencesLightningExperiencePreferred:false'];
        child_process.execFileSync('force', updateOpts);
    }
}

Running the revised script gives the following (edited) output:

kbowden@Keirs-MacBook-Air cli $ node index.js login
Reverting user ltg.test@bgmctest.brightgen.com to classic
Reverting user keirbowden@bgmctest.brightgen.com to classic

Logging in after executing this script confirms that the users have been updated correctly as I am taken straight to the classic UI rather than a sad snowman with an error message! 


Thursday 10 November 2016

Lightning Component Actions with Signature Capture Part 2

Introduction

In the first part of this blog I covered how to set up a quick action using the Lightning Component Action in the Signature Capture package. In this post I’ll show how to create a custom lightning component action that wraps the Signature Capture component and allows overriding of the default text and labels.

Messaging

The following list shows the Signature Capture message/label attributes and their default values:

  • startMsg - “Click the ‘Capture Signature’ button to begin”
  • enterMsg - “Sign here please”
  • completeMsg - “Here is the captured signature”
  • captureButtonLabel - “Capture Signature”
  • saveButtonLabel - “Save” (new in Winter 17)
  • clearButtonLabel - “Clear” (new in Winter 17)

Text Literals

The packaged component action, SignatureCaptureAction, leaves the default attribute values as-is. A custom action, on the other hand, allows some or all of these to be overridden with different text literals:

<BGSIGCAP:SignatureCapture recordId="{!v.recordId}"
          startMsg="Let's Go - Click the button to start" />

 

[Screenshot: Signature Capture showing the overridden start message]

Custom Labels

Attributes may also be defined as custom labels, which have the added advantage of automatically picking up the correct translation for the user:

[Screenshot: the SigCapSave custom label definition]

<BGSIGCAP:SignatureCapture recordId="{!v.recordId}"
         saveButtonLabel="{!$Label.c.SigCapSave}" />

[Screenshot: the Save button using the custom label]

 

Example Custom Action

Here are a few screenshots from an example custom component with all of the text content overridden:

[Screenshots: the example custom action with overridden text at the start, signing and completed stages]

The source for this can be found in the Signature Capture Samples GitHub repository.

Monday 7 November 2016

Lightning Component Actions with Signature Capture

Introduction

As some readers of this blog may be aware, a year or so ago I wrote a Lightning Component for the Component Exchange that was soon to be launched at Dreamforce. The component is called Signature Capture and unsurprisingly allows a user to capture a signature image and attach it to a Salesforce record. Up until the Winter 17 release of Salesforce, the only way to associate this with a record declaratively was to use the Lightning App Builder to include it in a custom record home page. This works fine but is a slightly odd user experience, as the component is always present even after the user has captured the signature. This all changed once Winter 17 was live thanks to Lightning Component Actions.

Lightning Component Actions

As the name suggests, Lightning Component Actions are custom actions that invoke a lightning component. Many lightning components are called, but few are chosen for custom actions. Specifically those that implement one of two interfaces:

  • force:LightningQuickAction
  • force:LightningQuickActionWithoutHeader

The difference being that if you implement force:LightningQuickActionWithoutHeader the entire user interface is down to you.

You can use Lightning Component Actions in Salesforce1 and the Lightning Experience.

SignatureCaptureAction

The latest version of the Signature Capture package contains a simple component that implements the force:LightningQuickAction interface and includes the Signature Capture component itself. Creating a quick action that uses this is a matter of a few clicks.
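A minimal sketch of such a wrapper (illustrative only - this is not the packaged source, and note that the interface names are lower-cased in component markup):

<aura:component implements="force:lightningQuickAction,force:hasRecordId">
    <BGSIGCAP:SignatureCapture recordId="{!v.recordId}"/>
</aura:component>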

Setting Up

First choose which sObject type you want to associate the quick action with - in the screenshots below I’m using account.

Navigate to the Buttons, Links, and Actions section of the setup for the sObject and choose New Action. Select ‘Lightning Component’ as the action type and ‘BGSIGCAP:SignatureCaptureAction’ as the Lightning Component. I find that a height of 500px works best, but your mileage may vary. Then enter a Label and Name and click the Save button:

[Screenshot: the New Action setup page with the Lightning Component action type]

Once the quick action is defined, it must be added to the page layout in the Salesforce1 and Lightning Experience Actions section (if you haven’t added any quick actions to this section before you’ll need to override the default via the link provided):

[Screenshot: the quick action added to the Salesforce1 and Lightning Experience Actions section of the page layout]

Take it for a Spin

After setup is complete, navigate to any account record in the Lightning Experience and you will see the quick action in the set of buttons on the top right:

[Screenshot: the quick action button on the account record page]

Clicking the button opens the action in its own dialog, allowing a signature to be drawn via mouse, trackpad etc:

[Screenshot: the quick action dialog with a captured signature]

To my mind this is a much better user experience, as the component only appears on demand, so there is no confusion around a component that is always present. The slight downside is that once the signature has been captured the user still has to close the dialog via the standard Cancel button, but that’s a minor drawback to my mind.

Mobility Included

Adding the quick action to the page layout also makes it available to the Salesforce1 application on mobile devices. A word of warning around this - the locker service currently doesn’t surface touch events to a component, so if you enable the locker service the Signature Capture component won’t work on tablets or phones.

Repo Man

The Signature Capture component has its own GitHub repo with sample apps (or app at the time of writing!). This is also where feature requests are tracked, so if there’s anything you’d like to see added, let me know and I’ll see if it can be accommodated. Just to set expectations, this is something that I work on mainly in my spare time (I’m currently on vacation, hence having time for this post) so I may take a while to respond.


Saturday 29 October 2016

Lightning Components - New Tab, New You

Introduction

This is something I’ve been meaning to blog about for a while now, but things like Dreamforce, preparing for Dreamforce and the Winter 17 release kept getting in the way. I didn’t get as far as setting up a nice example, due to the sheer volume of code involved, so I haven’t been back and tried it out recently. Therefore it’s possible that you’ll never have the same experience I did, but as it’s not exactly onerous it seemed worth putting it out there anyway. Your mileage may vary.

Signature move

While building out the MVP for our BrightMedia Lightning booking page to demonstrate at Dreamforce, I needed to change the signature of one of my Apex controller methods - to use a simplified example, from something like:

public static Account GetAccount(String idStr)

to

public static Account GetAccount(String idStr, String name)

and changed the parameters that the Lightning Component passed on the action to match:

var params={idStr:'0018000001QtQVA', name:'Blog Post Account'};

That went well!

Once I’d made the various changes and saved all the artefacts, I refreshed the page and started seeing some unexpected behaviour. The name parameter came through as null regardless of what I set it to. Debug statements in the Lightning Component showed that the name had a value (it wasn’t a hardcoded string, unlike the example above!), but some time after I added it to the action it went bad. Making it a hardcoded string didn’t help, nor did juggling the other parameters around. I even tried adding more parameters that definitely didn’t exist server side, but nothing helped.

After some time, and about a thousand debug statements later, I found that while I might be adding the new parameter to my params object, by the time that object was sent to the server the parameter had disappeared, hence the server always reported null.

Cache is king

I then tried what I should have tried as soon as this manifested itself - the browser equivalent of turning it off and on again: I closed the browser tab and opened a new one. Lo and behold, everything worked as expected. We all know that the Lightning framework aggressively caches on the front end to improve performance, and it looks like the client retained a metadata view of the server side signature after I changed it.

Now I need to stress that for once I’m not complaining about the behaviour of the framework! This is exactly the kind of thing that I’d expect when caching metadata. It’s not always possible to check everything in a cache is still valid - you quickly start to lose the benefit if you go back and check the source on every request. 

The key takeaway here is that when your application involves caching and you change something which doesn’t work as you expected, open a new tab (or window), or even a new browser. It only takes a few seconds and will save time and heartache in the long run.


Friday 14 October 2016

Visualforce Development Cookbook Second Edition

Introduction

As regular readers of this blog will know, I wrote a book three years ago - the Visualforce Development Cookbook - and I really haven't stopped banging on about it since. Well, things just got worse, as I finished the second edition in September and I'm obviously keen to shift as many copies as possible!

What’s a Second Edition?

A second edition isn't a new book. Instead it's an iteration on the previous version, minus content that is no longer relevant and including new features, so the first job is to identify what should be cut and what should be added.

One of the nice things about working on Salesforce is that over time new features are added that remove the need for custom code, so I was able to chop out a few recipes that are now handled natively by the platform. The next job is to update all existing code to the latest API version - something that typically happens several times, as Salesforce produces releases faster than I can write a book (even a second edition!).

Then it's back to the writing - making sure that the text is still appropriate for existing recipes, and creating new recipes from scratch. This was pretty compressed this time around, as Packt proposed a pretty aggressive timescale to complete before Dreamforce, which I was more than happy to sign up to as I knew that Dreamforce would take up all of my time through to mid-October. I wouldn't recommend this if you don't have some vacation time you can dedicate to writing though, as it would have been pretty tough to complete with only evenings and weekends available.

What’s changed?

The main changes in the second edition are:

  • The chapter on Force.com sites has been reworked to use the Salesforce Lightning Design System
  • The chapter on jQuery Mobile has been replaced with Salesforce1 recipes.
  • A new chapter on Troubleshooting has been added, covering debugging, managing the view state and advice on avoiding common problems.
  • All the retained recipes have been revised as appropriate.

You want me to buy it again?

Now I’m not like the music industry that expects you to buy the same music every time they think up a different format, and I can understand that for someone who already purchased the first edition, this might not represent enough of a change to justify buying the second edition (although it was for Fabrice Cathala, who is clearly going for the full set - thanks Fabrice!).

I raised this with the publishers and they gave me a couple of discount codes to use, both valid until 17th October, which you can use on the Packt website:

  • Use code VISDSE50 for 50% off the Ebook
  • Use code VISDSE15 for 15% off the print book

Note that you don't need to have purchased the first edition to use these codes, you can simply use them to scoop up a bargain.

Show me the code!

For those who just need the code and no explanation, it's available free of charge on GitHub at:

 https://github.com/PacktPublishing/Visualforce-Development-Cookbook-2e

A word of advice though - best not come to me asking for help around adapting the recipes or producing unit tests if you've decided not to purchase the book - that would be asking me to cannibalize my own sales!


Friday 23 September 2016

Force CLI Part 4 - Deploying Metadata

Introduction

In previous posts in this series I’ve covered Getting Started, Extracting Metadata and Accessing Data via the Force CLI. In this post I’ll cover the functionality that I use probably 95% of the time that I use the Force CLI - deploying metadata from the local filesystem to a Salesforce instance.

The Building Blocks

Like other tools for deploying to Salesforce (e.g. the migration tool, Eclipse IDE) there are two items required to deploy via the Force CLI:

Project Manifest (aka package.xml)

The package.xml file defines the metadata components that will be deployed - you can find out more about this in the Salesforce Metadata API Developer’s Guide.

Metadata Components

The metadata components need to appear in the directory structure expected by the metadata API - e.g. Apex classes in a directory named ‘classes’, Visualforce pages in a directory named ‘pages'

I don’t make the rules (but I do make the deployment directory)

The rule when deploying is that the components specified in the package.xml manifest file need to match those present in the metadata components directory structure. If you specify components that are missing you will get an error, and if you provide additional components that aren’t mentioned in the package.xml, you’ll get a different flavour of error.

My metadata comes from Git, so the metadata components structure of my entire project is defined there. However, I don’t always want to carry out a full deployment, so I create a new structure in a temporary directory and create a package.xml file reflecting that. I also use a naming convention for my temporary directory so that I can look back at the order that I did things in - drop_<YYYYMMDDhhmmss>, where YYYYMMDD is the year/month/day of the date and hhmmss is the hours/minutes/seconds of the time. The reason I go with this naming convention is that when I use the ls command, my directories are listed in order of creation:

$ ls -l

drwxr-xr-x  6 kbowden  wheel    204 16 Sep 14:39 drop_20160916141152
drwxr-xr-x  6 kbowden  wheel    204 17 Sep 10:11 drop_20160916184125
drwxr-xr-x  6 kbowden  wheel    204 17 Sep 12:45 drop_20160917121232
drwxr-xr-x  6 kbowden  wheel    204 18 Sep 12:56 drop_20160918092957
drwxr-xr-x  6 kbowden  wheel    204 18 Sep 19:21 drop_20160918185148
drwxr-xr-x  6 kbowden  wheel    204 20 Sep 15:39 drop_20160920152837
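On a Unix-like system, a directory following this convention can be created with a one-liner (a sketch using the standard date command):

mkdir drop_$(date +%Y%m%d%H%M%S)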

To use a very simple example, here’s the directory structure to deploy just the User object. My starting directory is /tmp/drop_20160923080432:

  /tmp/drop_20160923080432/target/objects/User.object
  /tmp/drop_20160923080432/target/package.xml

My package.xml file just contains an entry to deploy the objects directory (it’s called CustomObject, but covers both standard and custom):

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
  <types>
    <members>*</members>
    <name>CustomObject</name>
  </types>
  <version>37.0</version>
</Package>

Deploy all the Metadata

Once the directory structure and manifest are in place, use the force import command to deploy your metadata, using the -d switch to specify the directory containing the metadata:

force import -d /tmp/drop_20160923080432/target

The Force CLI will update you every five seconds on progress

Not done yet: Queued  Will check again in five seconds.
Not done yet: InProgress  Will check again in five seconds.
Not done yet: InProgress  Will check again in five seconds.
Not done yet: InProgress  Will check again in five seconds.
Not done yet: InProgress  Will check again in five seconds.
Not done yet: InProgress  Will check again in five seconds.

Once the deployment is complete, you’ll get details of successes/failures - the default behaviour is simply to output the number of each:

Failures - 0

Successes - 24
Imported from /tmp/drop_20160923080432/target

While if you specify the -v switch, you'll get full details of each success/failure:

Successes - 24
User.DFP_Id__c
	status: unchanged
	id=00N80000005st6iEAA
User.Default_Tab__c
	status: unchanged
	id=00N80000005st6kEAA
User.Department__c
	status: unchanged
	id=00N80000005st6lEAA

   ...

Deploying to Production

As everyone in the Force.com world knows, in order to deploy metadata to production you need to run some unit tests. The Force CLI provides support for this via the -l switch, which takes the same options as the migration tool:

  • NoTestRun 
  • RunSpecifiedTests
  • RunLocalTests
  • RunAllTestsInOrg
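For example, assuming the switch combines with the import command in the same way as the migration tool options:

force import -d /tmp/drop_20160923080432/target -l RunLocalTests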

Verification Only

Another very useful switch is -c. This carries out a verification only deployment and rolls back all changes upon completion. I use this as part of our continuous integration environment to make sure I can successfully deploy to a clean development environment.
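A verification-only run of the earlier example would look something like this (a sketch combining the switches described above):

force import -d /tmp/drop_20160923080432/target -c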


Sunday 28 August 2016

Stand and Deliver with Salesforce Geocoding


Introduction

The Summer 16 release of Salesforce introduced Automatic Geocodes for Addresses which, as the name suggests, allows you to geocode records with addresses automatically. To enable this, just go to Setup, type ‘Clean’ into the search and click on Clean Rules in the resulting setup nodes. Then activate one or more of the clean rules and wait a few minutes. Much easier than setting up a batch job to integrate with Google and then finding yourself in breach of the license terms!

In conjunction with the Geolocation API, this new feature allows you to build some cool applications without too much effort. In this post I’ll show how to combine these two technologies to build an application that assigns a delivery to the nearest user (driver, bike messenger etc) to the collection location.

User Check In

The functionality to allow users to check in (record their location) is provided by a Lightning Component. The component includes the Lightning Design System (yeah, I know it's part of the platform now, but including it myself means that I can also use this component in Lightning Out) and, after the CSS has loaded, uses the geolocation API to capture the current location:

    doInit : function(cmp, ev) {
        var self=this;
        if (navigator.geolocation) {
            navigator.geolocation.getCurrentPosition(
                function(result) {self.gotGeoCode(result, cmp, self);},
                self.error,
                {
                maximumAge: 0,
                timeout:30000,
                enableHighAccuracy: true
                }
            );
        }
    },

The parameters passed to the geolocation function are as follows:

  • maximumAge - the maximum age of a cached position. Setting this to zero ensures that a new position is captured
  • timeout - the maximum number of milliseconds to wait for a response
  • enableHighAccuracy - requests that the most accurate position is captured. A request specifying this typically takes longer and consumes more power, and may be denied by the hardware. It’s just a suggestion really.

Once captured, a server-side Apex action is invoked to store the position in a custom field on the user record, along with a timestamp of when it was captured:

    @AuraEnabled
    public static String StoreUserLocation(Decimal latitude, Decimal longitude)
    {
        String result='SUCCESS';
        try
        {
            User u=new User(id=UserInfo.getUserId(),
                            Location__Latitude__s=latitude,
                            Location__Longitude__s=longitude,
                            Location_Timestamp__c=System.now());
            update u;
        }
        catch (Exception e)
        {
            result=e.getMessage();
        }

        return result;
    }

Note that I don’t need to retrieve the user record from the database to update it, I can just instantiate a new User object in memory and specify the id based on the UserInfo global - always worth saving a SOQL call if possible.

The component in action is shown in the screen capture video below:

[Video: the check in component capturing the user’s location]

Delivery Request

The flip side of the user check in is the delivery request. This is where a customer requires something to be collected from one (their) account and delivered to another. As this isn’t something that is likely to be requested on the fly, this is handled by a Visualforce page, although still using the Lightning Design System to style a Visualforce form.

To create a new delivery, the customer (me in this case) specifies the From and To account and any specific delivery instructions. I’m sending a package to Universal Containers as they seem to have a new social media team and I am keen to engage with them:

[Screenshot: the new delivery request form]

The package is being sent from Sea Palling Beach, where myself and the Jobs Admin have recently checked in:

[Screenshot: recent user check-ins at Sea Palling Beach]

When the record is saved, the controller sets the owner to the user who has most recently checked in to a location within 50 miles - if no such user can be found, the owner will be left as the user that created the record:

 

    public PageReference save()
    {
        Account acc=[select ShippingLatitude, ShippingLongitude
                     from Account
                     where id=:delivery.From__c];
        List<User> users=[SELECT id, Location__c, Location_Timestamp__c
                          FROM User
                          WHERE DISTANCE(Location__c, GEOLOCATION(:acc.ShippingLatitude, :acc.ShippingLongitude), 'mi') < 50
                          ORDER BY Location_Timestamp__c DESC];

        if (users.size()>0)
        {
            delivery.OwnerId=users[0].id;
        }

        insert delivery;
        ApexPages.StandardController std=new ApexPages.standardController(delivery);
        return std.view();
    }

 So even though I created the delivery request, it was assigned to the job admin user as they checked in more recently:

[Screenshot: the delivery record assigned to the jobs admin user]

That’s It?

That is indeed all there is to it - a couple of custom fields on the user object, a custom object, one Lightning Component, one Visualforce page and two Apex classes. If you are so inclined, you can view the full code in the GitHub repository.

 

Saturday 16 July 2016

Force CLI Part 3 - Accessing Data

Introduction

In part 2 of this series, Extracting Metadata, I covered how to extract configuration data from your Salesforce instance, which is something pretty much every other deployment tool offers. A key difference of the Force CLI is that you aren’t limited to configuration - you can retrieve and update your Salesforce data from the command line or a script. Very useful if you have to carry out some data migration as part of a deployment, for example.

NOTE: In the examples below the commands are split over multiple lines for readability, but should be entered on a single line if you are trying them out for yourself.

Executing SOQL

The command for executing SOQL is force query, followed by the SOQL string enclosed in quotes:

> force query "select id, Name, Industry from Account 
order by CreatedDate limit 5"

The default format is ascii table style:

 Id                 | Industry       | Name
--------------------+----------------+---------------------------------
 0018000000Q9v1VAAR | Construction   | Pyramid Construction Inc.
 0018000000Q9v1WAAR | Consulting     | Dickenson plc
 0018000000Q9v1XAAR | Hospitality    | Grand Hotels & Resorts Ltd
 0018000000Q9v1YAAR | Transportation | Express Logistics and Transport
 0018000000Q9v1ZAAR | Education      | University of Arizona
 (5 records)

which is fine for displaying data on the screen, but not that easy for processing. Luckily the output format can be changed via the -format switch.

CSV Formatting

Specifying csv as the value of the format switch outputs the results of the query in the familiar comma separated format, along with a list of headers:

> force query "select id, Name, Industry from Account 
order by CreatedDate limit 5" -format:csv
"Id","Industry","Name"
"0018000000Q9v1VAAR","Construction","Pyramid Construction Inc."
"0018000000Q9v1WAAR","Consulting","Dickenson plc"
"0018000000Q9v1XAAR","Hospitality","Grand Hotels & Resorts Ltd"
"0018000000Q9v1YAAR","Transportation","Express Logistics and Transport"
"0018000000Q9v1ZAAR","Education","University of Arizona"

While this is more useful for processing, it doesn’t work well with subqueries:

 
> force query "select id, Name, (select id, FirstName from Contacts) 
from Account order by CreatedDate limit 5" -format:csv
"Contacts","Id","Name" "[map[Id:0038000000asT8TAAU FirstName:Pat]]","0018000000Q9v1VAAR","Pyramid Construction Inc." "[map[Id:0038000000asT8UAAU FirstName:Andy]]","0018000000Q9v1WAAR","Dickenson plc" "[map[Id:0038000000asT8VAAU FirstName:Tim] map[Id:0038000000asT8WAAU FirstName:John]]","0018000000Q9v1XAAR","Grand Hotels & Resorts Ltd" "[map[Id:0038000000asT8ZAAU FirstName:Babara] map[FirstName:Josh Id:0038000000asT8aAAE]]","0018000000Q9v1YAAR","Express Logistics and Transport" "[map[Id:0038000000asT8bAAE FirstName:Jane]]","0018000000Q9v1ZAAR","University of Arizona"

JSON Formatting

Specifying json as the format switch outputs the result in JavaScript Object Notation:

force query "select id, Name, (select id, FirstName from Contacts) 
from Account order by CreatedDate limit 5" -format:json
 
[{"Contacts":{"done":true,"records":[{"FirstName":"Pat","Id":"0038000000asT8TAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8TAAU"}}],"totalSize":1},"Id":"0018000000Q9v1VAAR","Name":"Pyramid Construction Inc.","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1VAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Andy","Id":"0038000000asT8UAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8UAAU"}}],"totalSize":1},"Id":"0018000000Q9v1WAAR","Name":"Dickenson plc","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1WAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Tim","Id":"0038000000asT8VAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8VAAU"}},{"FirstName":"John","Id":"0038000000asT8WAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8WAAU"}}],"totalSize":2},"Id":"0018000000Q9v1XAAR","Name":"Grand Hotels \u0026 Resorts Ltd","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1XAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Babara","Id":"0038000000asT8ZAAU","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8ZAAU"}},{"FirstName":"Josh","Id":"0038000000asT8aAAE","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8aAAE"}}],"totalSize":2},"Id":"0018000000Q9v1YAAR","Name":"Express Logistics and Transport","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1YAAR"}},{"Contacts":{"done":true,"records":[{"FirstName":"Jane","Id":"0038000000asT8bAAE","attributes":{"type":"Contact","url":"/services/data/v36.0/sobjects/Contact/0038000000asT8bAAE"}}],"totalSize":1},"Id":"0018000000Q9v1ZAAR","Name":"University of Arizona","attributes":{"type":"Account","url":"/services/data/v36.0/sobjects/Account/0018000000Q9v1ZAAR"}}]

suitable for processing by a myriad of tools.

Prettified JSON

For the best of both worlds - JSON formatted for automated processing but laid out so that humans can read it, specify json-pretty as the format switch:

> force query "select id, Name, (select id, FirstName from Contacts) 
from Account order by CreatedDate limit 5" -format:json-pretty
[ { "Contacts": { "done": true, "records": [ { "FirstName": "Pat", "Id": "0038000000asT8TAAU", "attributes": { "type": "Contact", "url": "/services/data/v36.0/sobjects/Contact/0038000000asT8TAAU" } } ], "totalSize": 1 }, "Id": "0018000000Q9v1VAAR", "Name": "Pyramid Construction Inc.", "attributes": { "type": "Account", "url": "/services/data/v36.0/sobjects/Account/0018000000Q9v1VAAR" } }
<removed for clarity>
]

Working with Records

If you know the ID(s) of the records you need to work with, the force record command allows you to access and manipulate record instances.

Retrieving Records

To retrieve a record, execute force record get <type> <id> - this brings back all fields:

> force record get Contact 0038000000asT8TAAU
AccountId: 0018000000Q9v1VAAR
AssistantName: Jean Marie
AssistantPhone: (014) 427-4465
Birthdate:
CreatedById: 00580000001ju2CAAQ
CreatedDate: 2009-04-19T11:27:45.000+0000
Department: Finance
Description:
Fax: (014) 427-4428
FirstName: Pat

<removed for clarity>
SystemModstamp: 2013-01-10T16:17:39.000+0000
Title: SVP, Administration and Finance
attributes:
    type: Contact
    url: /services/data/v36.0/sobjects/Contact/0038000000asT8TAAU

Updating Records

To update a record, execute force record update <type> <id> <field>:<value>

> force record update Contact 0038000000asT8TAAU FirstName:"Patrick"
Record updated

I can then confirm that the record has been updated by retrieving it again and extracting the FirstName field:

>force record get Contact 0038000000asT8TAAU | grep FirstName
FirstName: Patrick

(The grep command extracts any lines in the output containing the search term - FirstName in this case.)

Creating Records

You aren’t just limited to existing records with the Force CLI - you can also create them, either individually or en masse from a file.

To create a single record execute force record create <type> <field> <field> … :

> force record create Contact FirstName:"Keir" LastName:"Bowden"
Record created: 00380000023TUCxAAO

> force record get Contact 00380000023TUCxAAO
AccountId:

<removed for clarity>
FirstName: Keir
LastModifiedById: 00580000001ju2CAAQ
LastModifiedDate: 2016-07-16T10:48:38.000+0000
LastName: Bowden
 <removed for clarity>

To create multiple records execute force record create:bulk <type> <file> - the default format is CSV, but you can specify the content type of the file if it is different. Thus the following file, named contacts.csv:

"FirstName","LastName"
"First","CLI Blog"
"Second","CLI Blog"
"Third","CLI Blog"

can be loaded as follows

> force record create:bulk Contact contacts.csv

Batch 1 of 1 added with Id 75180000006NK0pAAG
Job created ( 75080000004fh7wAAA ) - for job status use
 force bulk batch 75080000004fh7wAAA 75180000006NK0pAAG

and I can verify the load using the query command introduced earlier

> force query "select FirstName, LastName from Contact where LastName ='CLI Blog'"

 FirstName | LastName
-----------+----------
 First     | CLI Blog
 Second    | CLI Blog
 Third     | CLI Blog
 (3 records)


Sunday 26 June 2016

Force CLI Part 2 - Extracting Metadata

Introduction

In Part 1 of this series, Getting Started, I covered downloading and installing the Force CLI, and logging in to Salesforce. This post covers extracting metadata from your Salesforce org.

Why do I need the Metadata?

This is a very good question - a lot of the time the answer is that you don’t. If you do, it’s highly likely that your IDE of choice will handle this for you - pulling down the metadata components that you are working on and uploading the changes when you save or deploy. The classic use case for this is making a backup of your implementation, typically on a regular schedule in an unattended fashion.

EXTRACT ALL THE METADATA

At first glance, this looks about as straightforward as it could be. Executing force help shows there is a command that looks tailor made:

export    Export metadata to a local directory

By default, this command dumps the metadata to a subdirectory of your current directory named (fittingly enough) metadata, although you can supply a different location for the output as an additional parameter.

After logging in, I can export the metadata by executing the following command:

force export

However, at the time of writing this doesn’t quite bring everything back. Here’s the contents of my metadata directory - spot the deliberate mistake:

analyticSnapshots	homePageComponents	quickActions
applications		homePageLayouts		remoteSiteSettings
approvalProcesses	labels			reportTypes
assignmentRules		layouts			roles
autoResponseRules	networks		samlssoconfigs
callCenters		objectTranslations	scontrols
classes			objects			sharingRules
communities		package.xml		sites
components		pages			staticresources
connectedApps		permissionsets		tabs
dataSources		portals			translations
flows			profiles		triggers
groups			queues			workflows

Spotters Badge for those eagle-eyed readers who wondered where the Lightning Components metadata directory (aka aura) is.

There’s currently no way to influence the metadata elements that are included by the export command, but luckily there is a mechanism to extract individual metadata component types.

Fetch! Good boy!

The fetch subcommand provides granularity around metadata extraction, not just down to a single type of metadata, but as far as a single metadata component. I’m not going down to that level, but if you are interested then execute force help fetch which gives full details of the options.

The fetch command takes a -t switch to identify which metadata type you want to extract. For all types you can specify the name of the metadata type as detailed in the Metadata API Developer’s Guide, so to extract the Force.com Sites metadata, for example, you would execute:

force fetch -t CustomSite

The metadata name for Lightning Components is AuraDefinitionBundle, but you can also specify Aura as the type, which will extract the Lightning Component metadata via the REST API - I tend to use the latter version as I find it is slightly faster than the metadata route.

Executing 

force fetch -t Aura

extracts the Lightning Components to the metadata directory, and the entire Salesforce configuration is now in one place:

analyticSnapshots	homePageComponents	remoteSiteSettings
applications		homePageLayouts		reportTypes
approvalProcesses	labels			roles
assignmentRules		layouts			samlssoconfigs
aura			networks		scontrols
autoResponseRules	objectTranslations	sharingRules
callCenters		objects			sites
classes			package.xml		staticresources
communities		pages			tabs
components		permissionsets		translations
connectedApps		portals			triggers
dataSources		profiles		workflows
flows			queues
groups			quickActions

Putting these commands together in a bash script, with a dash of compression, gives me a handy backup file containing my metadata:

#!/bin/bash
# log in, extract the metadata (including Lightning Components)
# and compress the metadata directory into a backup file
force login
force export
force fetch -t Aura
zip -r backup.zip metadata

Note - if you run the above commands directly from the command line, ignore the first (shebang) line and the comments. The shebang tells the OS which interpreter to use to execute the script, which makes no sense for individual commands.

Version Control to Major Tom

Where this can be particularly powerful is if you integrate it with version control - you can use a script to periodically extract the latest metadata and merge it into your Git repository, for example.
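
A minimal sketch of such a script follows - it assumes the working directory is already a clone of your repository with a remote named origin, the branch and commit message are illustrative, and anything production-grade would want error handling around each step:

#!/bin/bash
# extract the full metadata, including Lightning Components
force login
force export
force fetch -t Aura
# stage whatever has changed, commit and push to the remote
git add metadata
git commit -m "Metadata backup $(date +%F)"
git push origin master

Note that force login opens a browser for OAuth, so for a truly unattended run you’d use the username/password login covered in the Getting Started post below.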


Sunday 29 May 2016

Force CLI Part 1 - Getting Started


Introduction

In one of my earlier blog posts around Lightning Components deployment (Deploying Lightning Components - Go Big or Go Missing) I made a passing reference to the fact that I use the Force CLI for automated deployments. A few people contacted me after this and said that while they were vaguely aware of the CLI, they’d never spent any time with it, so I decided to show this unsung hero some love with a series of blog posts.

A Command Line Interface to Force.com!

As someone who spent 20 or so years developing on Unix and Linux, I’m pretty comfortable with command line tools, and in many cases (resolving git conflicts!) I actually prefer them to a GUI. So when the CLI was released at Dreamforce 13, I couldn’t wait to get hold of it.

Downloading

The first thing to do is download the CLI - you can do this from the home page at https://force-cli.heroku.com/ - this has pre-built binaries for Mac OS X, Windows and Debian/Ubuntu. I’m a Mac user, so any instructions/examples are based on OS X.

[Screenshot: the Force CLI home page, with download links for each platform]

Clicking on the ‘Download for Mac OS X 64’ link downloads the ‘force’ application to my ‘Downloads’ directory. I clear this directory out regularly (yeah right - regularly being when I run out of disk space), so the first thing I need to do is move this to a safe place. I create a tools directory underneath my home directory and drag the file into that.
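
If you’d rather do this from the terminal than by dragging in the Finder, the equivalent commands are:

> mkdir ~/tools
> mv ~/Downloads/force ~/tools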

Trying it Out

Via the terminal (if you aren’t familiar with the terminal then I’d recommend following this tutorial from MacWorld) I can then test whether the application works correctly:

> cd ~/tools
> ./force

and of course it doesn’t:

-bash: ./force: Permission denied

This is because files are downloaded without execute permission. To rectify this, execute:

> chmod +x ./force

Executing again now shows the (long) list of available options:

> ./force

Usage: force <command> [<args>]
 
Available commands:
   login     force login [-i=<instance>] [<-u=username> <-p=password>]
   logout    Log out from force.com
   logins    List force.com logins used
   active    Show or set the active force.com account
   whoami    Show information about the active account
   describe  Describe the object or list of available objects
   sobject   Manage standard & custom objects
   bigobject  Manage big objects
   field     Manage sobject fields
   record    Create, modify, or view records
   bulk      Load csv file use Bulk API
   fetch     Export specified artifact(s) to a local directory
   import    Import metadata from a local directory
   export    Export metadata to a local directory
   query     Execute a SOQL statement
   apex      Execute anonymous Apex code
   trace     Manage trace flags
   log       Fetch debug logs
   eventlogfile  List and fetch event log file
   oauth     Manage ConnectedApp credentials
   test      Run apex tests
   security  Displays the OLS and FLS for a give SObject
   version   Display current version
   update    Update to the latest version
   push      Deploy artifact from a local directory
   aura      force aura push -resourcepath=<filepath>
   password  See password status or reset password
   notify    Should notifications be used
   limits    Display current limits
   help      Show this help
   datapipe  Manage DataPipes
 
Run 'force help [command]' for details.

Access All Areas

By executing ‘./force’ I’m asking the operating system to execute the force binary in the local directory (that’s the ./ part). This is going to be pretty unwieldy if I have to provide the location each time, so the last thing to do to get set up is to add the tools directory to my path - if you chose a different directory name than tools, just alter the following commands to match.

The default shell on OS X is bash (the Bourne again shell, so named as it was a replacement for the venerable Bourne shell - this is both a blog post and a history lesson!). When you log in (or open a new terminal window on Mac OS X) the bash shell executes the .bash_profile script from your home directory, if one exists, so this is the easiest way to update your path.

Using your favourite editor (mine is vim, but that’s a bit like saying the developer console is your favourite IDE), open the .bash_profile file in your home directory (/Users/<yourname>), or create one if it isn’t already there, and add the following at the bottom:

export PATH=$PATH:~/tools

Save the file, then execute the .bash_profile script to update your current context:

> . ~/.bash_profile

Next, change to your home directory and execute the force command without a path to check all is good:

> cd
> force

and once again you should see the full output.

Congratulations - now you can run the force executable from anywhere without worrying about where it has been installed.

Logging In

Now that everything is set up, you can log in to Salesforce via the CLI by executing:

> force login

This opens up a browser window for you to enter your credentials: 

[Screenshot: Salesforce login page]

and authorise access via OAuth:

[Screenshot: OAuth authorisation prompt]

(Note that this process involves a Heroku app communicating with a local web server to return the OAuth token to the command line, which may be problematic if you are behind a firewall that you can’t configure, or are using NAT. See the end of this post for an alternative, non-OAuth approach.)

This is the number one reason why I use the Force CLI for deployments - it removes the need to store usernames and passwords on disk, which has always been my issue with the Force Migration Tool.

If you have a number of orgs it’s quite easy to forget which one you have logged into, which is where the logins command comes in handy, as it will show you all the orgs that you have logged in to and which login is currently active:

> force logins

[Screenshot: output of force logins, showing the active login]

Non-OAuth Authentication

If you can’t use the oauth mechanism described above, you can always fall back on supplying the username and password to the CLI via the following command:

> force login -u=<username> -p=<password>

While this is slightly better than storing credentials on disk, it does mean that if someone were to compromise your machine they’d be able to see the username and password in your command history. For example, after executing:

> force login -u=fake@fake.fake -p=Faking1t

If I then look at my command history

> history

This shows the credentials:

  685  force login -u=fake@fake.fake -p=Faking1t
  686  history

Luckily I can clean up afterwards and remove the command from my history. First, delete the entry:

> history -d 685

then write the history to disk:

> history -w

Viewing the history again shows that entry 685 has been replaced with the entry that was at 686:

685  history
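
An alternative is to stop the credentials reaching the history in the first place - bash can be told not to record commands that start with a space:

> export HISTCONTROL=ignorespace
>  force login -u=fake@fake.fake -p=Faking1t

Note the leading space on the login command - with HISTCONTROL set this way, that command is never written to the history at all.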

Logging in to a Sandbox

The login command supports production orgs, sandboxes and custom domains. To view the options for this, or any other command, execute ‘force <command> -help’:

> force login -help

Usage: force login
  force login [-i=<instance>] [<-u=username> <-p=password> <-v=apiversion>]
  Examples:
    force login
    force login -i=test
    force login -u=un -p=pw
    force login -i=test -u=un -p=pw
    force login -i=na1-blitz01.soma.salesforce.com -u=un -p=pw

That’s all Folks

This concludes getting started with the Force CLI. Assuming all went well you now have the CLI installed, in your path and you’ve been able to login to an org. In the next post I’ll take a look at interacting with your metadata.


Saturday 7 May 2016

Salesforce - It's not just for Ladies


Introduction

This week I (and many, many others - check the picture above!) attended the “An Evening with Ladies who Salesforce” event at Salesforce Tower in London, organised by the team behind the London Salesforce Women in Tech - Freya Crawshaw, Louise Lockie and Jodi Wagner. This was a vertical journey for me, as the BrightGen offices are on the 18th floor!

Stemettes

The keynote talk was from Anne-Marie Imafidon, Head Stemette. Stemettes are doing an amazing job of enabling the next generation of girls into STEM (Science, Technology, Engineering and Maths). Note that I’ve chosen the word enabling rather than encouraging deliberately - part of Anne-Marie’s slot included a video from the Outbox Incubator, and it’s clear that these girls already have an enormous amount of interest and enthusiasm (and energy!) for technology and development.

Panel Time

The second part of the evening was a panel discussion on career pathing in the Salesforce ecosystem, with an impressive panel lineup including Salesforce MVP Jenny Bamber, Certified Technical Architect Doina Figural (the first female CTA outside of the US and one of only three worldwide) and Salesforce EVP and GM of App Cloud, Adam Seligman.

I found this an enlightening discussion, as it covered a lot of areas that I’ve never really thought about before - planning where you want to be and structuring an approach to get there, for example. My own “career” (and it has taken me a long time to accept that I have one, as opposed to a series of jobs) has tended to be more about following the tech that I’m interested in, so I’ve worked for a number of small companies that either haven’t made it or have turned into very different places from the one I joined. When that has happened I’ve moved on to the next one, usually taking a pay cut and dropping down from a senior to a mid-level position while I learn the ropes.

Only once or twice, when this has happened in the midst of a recession for example, have I actually worried about whether I’d be able to find another position. So it was very interesting to hear from people who have had to take a very different approach to their career, purely because, through an accident of birth, they happen to be a different gender.

One of the questions from the audience was around how companies can attract more women to apply for their open positions - something very close to my heart, as I’ve been on a long journey to attract the first female developer to BrightGen, one that finally came to an end in early 2016. It wasn’t that we were interviewing women and rejecting them - we were getting very few CVs, and understandably many of those that we did interview weren’t keen on being the one and only female writing code. One panelist’s answer was around what you tell recruitment agents about the diversity you want in the CVs they send you. This resonated enormously, as it had never occurred to me that I should do anything other than ask agents to send me their best CVs.

It’s not just for Ladies!

So what was I doing there? The clue is in the name - An Evening with Ladies who Salesforce, not An Evening for Ladies who Salesforce! I wasn’t the only man either - there were three or four others besides Adam on the panel and various Salesforce representatives. It was unusual, shall we say, to be at a Salesforce community event in London where I was very much in the minority, but how dull would life be if everything was always the same?

So I’d recommend these events to any men in the Salesforce ecosystem, especially those hoping to lead - it’s vital to understand things from the viewpoint of others, and the best way to achieve this is to listen to them. If you are worried about being the only man there, tough! We expect the women that attend developer events to be able to handle being in the minority (and at times flying solo), so we should be able to handle it when the roles are reversed.