Using Lightning Web Components with Visualforce Pages

Have you ever had an older Visualforce page that a client doesn’t want to pay to have rebuilt, but they need extended functionality that would be difficult to duplicate in Visualforce? I recently had a client running an older Bootstrap-based site page ask me to update a page that displays a data table. The original page used the older <apex:pageBlockTable> to render the data. See the screenshot below:

While the page looks good and works, once the list grows past a dozen or so rows we’d need to paginate the data table… fine on a PC but not so nice on a mobile device. Enter the Lightning Web Components data table. This is a good place to replace the <apex:pageBlockTable> with an LWC data table.

How do we accomplish this? It’s not as straightforward as you might think. To make this work we first need an Aura container that we can place on the Visualforce page, and then we can load the LWC component inside that container. Open the Developer Console and create a new Lightning Application. This creates an Aura application container we can host our LWC component inside of.

Since our component needs the data returned by the existing Apex class, we can annotate that class with @AuraEnabled so the component can access the list of orders being returned. There are a couple of things to do here. First, the original Apex class was designed around a wrapper class, so we have to expose the wrapper’s public members with @AuraEnabled. We are currently using a custom object, WebOrder__c, to handle the web-based orders.

Our code might look something like this:

public class OrdersDataForGrid
{
    @AuraEnabled
    public String Id {set;get;}
    @AuraEnabled
    public String Name {set;get;}
    @AuraEnabled
    public String Account {set;get;}
    @AuraEnabled
    public String Product {set;get;}
    @AuraEnabled
    public Double Price {set;get;}
    @AuraEnabled
    public String Dosage {set;get;}
    @AuraEnabled
    public Integer Quantity {set;get;}
    @AuraEnabled
    public Double Total {set;get;}
    @AuraEnabled
    public String Status {set;get;}
    @AuraEnabled
    public String Patient {set;get;}
    @AuraEnabled
    public String Prescription {set;get;}
    @AuraEnabled
    public String PatientName {set;get;}
    @AuraEnabled
    public String userId {set;get;}
    @AuraEnabled
    public Date OrderDate {set;get;}

    public OrdersDataForGrid( WebOrder__c ord )
    {
        Id = ord.Id;
        Name = ord.Name;
        Account = ord.Account__c;
        Product = ord.Pharma_Products__r.Name;
        Price = ord.Pharma_Products__r.Price__c;
        Dosage = ord.Dosage__c;
        Quantity = Integer.valueOf(ord.Quantity__c);
        Total = ord.Total_Price__c;
        Status = ord.Status__c;
        Patient = ord.Patient__c;
        Prescription = ord.Patient_Prescription__c;
        PatientName = ord.Patient_Prescription__r.Patient_Name__c;
        userId = ord.PortalUser__c;
        OrderDate = ord.CreatedDate.date();
    }
}

Now we need to update the original controller class. It still returns a List<OrdersDataForGrid>, so the return type doesn’t need to change. However, we now need to annotate the method with @AuraEnabled and make sure it’s a public static method so it can be imported into our LWC component.

public without sharing class Orders_GridController
{
    @AuraEnabled(cacheable=true)
    public static List<OrdersDataForGrid> getOrders(String OwnerId, String filter)
    {
        String qry;

        if( OwnerId == null ) OwnerId = ApexPages.currentPage().getParameters().get('id');
        if( filter == null ) filter = 'All';

        PortalUser__c p = [SELECT First_Name__c, Last_Name__c FROM PortalUser__c WHERE Id = :OwnerId];

        qry = 'SELECT Id, Name, Account__c, Pharma_Products__r.Name, Pharma_Products__r.Price__c, Dosage__c,';
        qry += ' Quantity__c, Total_Price__c, Status__c, Patient__c, Patient_Prescription__c, Patient_Prescription__r.Patient_Name__c, PortalUser__c, CreatedDate FROM WebOrder__c ';
        // Escape the incoming values since they are concatenated into dynamic SOQL
        qry += 'WHERE PortalUser__c = \'' + String.escapeSingleQuotes(OwnerId) + '\' ';

        if( filter != 'All')
        {
            qry += 'AND Status__c = \'' + String.escapeSingleQuotes(filter) + '\'';
        }

        List<WebOrder__c> listOfWebOrders = Database.query( qry );

        // Put the records into a List<> of the wrapper class
        List<OrdersDataForGrid> orderList = new List<OrdersDataForGrid>();
        for( WebOrder__c o : listOfWebOrders )
        {
            orderList.add( new OrdersDataForGrid(o) );
        }

        return orderList;
    }
}


Once that’s done it’s time to create our Lightning Web Component. Open Visual Studio Code or The Welkin Suite (or whatever editor you prefer) and create the LWC component; let’s call it orders_Grid. When you create the component bundle you’ll have the .html, .js, .css and .js-meta.xml files in your project. Starting with the HTML file, let’s create the template we need for the data table.

<template>
    <div style="height: 300px;">
        <lightning-datatable
            key-field="Id"
            data={orders}
            columns={columns}>
        </lightning-datatable>
    </div>
</template>
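
For completeness, a minimal orders_Grid.js-meta.xml for the bundle might look like the following (the API version is just an example, and targets aren’t required since we only reference the component from our own Aura app):

<?xml version="1.0" encoding="UTF-8"?>
<LightningComponentBundle xmlns="http://soap.sforce.com/2006/04/metadata">
    <apiVersion>54.0</apiVersion>
    <isExposed>true</isExposed>
</LightningComponentBundle>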

Now for the fun part. We have to actually “wire” everything together (see what I did there). Open the .js file and let’s start by importing the Apex class we’ll need.

import {LightningElement, wire, api} from 'lwc';
import {CurrentPageReference} from 'lightning/navigation';

import getOrderList from '@salesforce/apex/Orders_GridController.getOrders';

We import the method from the Apex class using an import statement. Once imported we can call the Apex method as if it were a native JavaScript function. Next we need to declare the column headers for the data table.

const columns = [
{ label: 'Status', fieldName: 'Status' },
{ label: 'Patient', fieldName: 'PatientName'},
{ label: 'Product', fieldName: 'Product'},
{ label: 'Price', fieldName: 'Price', type: 'currency' },
{ label: 'Quantity', fieldName: 'Quantity', },
{ label: 'Dosage', fieldName: 'Dosage'},
{ label: 'Order Total', fieldName: 'Total', type: 'currency'},
{ label: 'Date Of Order', fieldName: 'OrderDate', type: 'date' }
];

Now that we have our data table columns we need to bind them to the actual data. We do this like so:

export default class Orders_Grid extends LightningElement {
    orders = [];
    columns = columns;
    portalUserId = null;
    filter = 'All';

    @wire(getOrderList, { OwnerId: '$portalUserId', filter: '$filter' })
    wiredList({ error, data }) {
        if (data) {
            console.log('results = ' + JSON.stringify(data));
            this.orders = data;
        } else if (error) {
            console.log('Something went wrong:', error);
        }
    }
}

In the component class we use @wire to call the Apex method and process the returned data, assigning the result to the orders JavaScript array that feeds the data table. Now we are almost ready to update the Visualforce page. Before we can do that we must set up the Lightning Application to load our component. Open the Lightning Application and select the “Application” block on the right. Inside the application we add the component <c:orders_Grid/>, which loads the component as part of the Lightning App.

<aura:application extends="ltng:outApp" access="GLOBAL" implements="ltng:allowGuestAccess">
<!-- Define the dependencies for your LWC component -->
<c:orders_Grid/>
</aura:application>
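
With the Lightning Out app in place, the Visualforce page itself just needs to load the app and create the component. A minimal sketch might look like the following; the app name (c:OrdersGridApp) and the container div id are assumptions, so substitute whatever names you actually used:

<apex:page>
    <apex:includeLightning />

    <!-- The Aura app will render our LWC into this div -->
    <div id="ordersGridContainer"></div>

    <script>
        // Load the Lightning Out application, then create the component in the div above
        $Lightning.use("c:OrdersGridApp", function() {
            $Lightning.createComponent(
                "c:orders_Grid",          // the LWC wrapped by the Aura app
                {},                       // no attributes passed in
                "ordersGridContainer",    // target div id
                function(cmp) {
                    // component created and rendered
                }
            );
        });
    </script>
</apex:page>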

Now that we’ve done that, let’s see what our final grid page looks like.

Finally, we have one more item to address. Typically, passing parameters into an Apex class or component is fairly straightforward because the ApexPages reference lets you capture them… however, because the Lightning Web Component is running inside an Aura container, the URL parameters aren’t accessible in the same manner. CurrentPageReference looks for parameters in the context of the Aura container, so it returns null values. How do we fix this?

Good old-fashioned JavaScript. Generally speaking, your Apex class can use ApexPages.currentPage().getParameters().get(<value>), but that doesn’t help if you need the value inside your LWC. Enter the connectedCallback() function. We can use connectedCallback() to invoke a plain JavaScript function that loads the URL parameters into an object. I used dozens of these in the earlier years of Visualforce pages when I needed values on the page without calling Apex.

Using a block of code like the one below we can simply walk the URL, grab all the query string values, and put them into a parameters object.

getQueryParameters() {
    let params = {};
    let search = location.search;

    // Strip the leading "?" if present
    search = search.startsWith('?') ? search.substring(1) : search;

    if (search) {
        // Walk each key=value pair and store it on the params object
        search.split('&').forEach(element => {
            const pair = element.split('=');
            const key = pair[0];
            const value = pair.length > 1 ? pair[1] : '';
            params[key] = decodeURIComponent(value);
        });
    }

    return params;
}

Then in connectedCallback() we simply call this function and store the values, like so:

connectedCallback() {
    let urlParams = this.getQueryParameters();
    this.myId = urlParams['Id'];
}

Just like that you have access from inside the LWC component to any of the outside URL parameters. In our orders grid this is where we would assign the captured value to portalUserId so the @wire re-provisions with the correct owner.

Combining the new Flow Data Table with Loops to process multiple records

Ever need a data table for a simple process and think, “Man, I wish I could do this with a Flow”? Well, now you can, thanks to the new Data Table component in Flows. Let’s say you want to display a list of employees saved as Contacts for your company and select which ones should receive a Christmas bonus. We start by creating the Flow and adding a Get Records element to retrieve the Account record.

First we create a special variable named recordId (spelled exactly like this), which is automatically passed into the Flow when it is launched from a record page.

Once we have created the variable it’s now available for us to use in our GetRecords Element. Select the “+” sign to add an element and select “Get Records” from the Data element section.

Now we need to filter the Account query to select only the Account record we were sitting on when we launched the Flow; that record’s Id is automatically assigned to the special variable recordId. We do this using the Condition Requirements, which act like a WHERE clause in a SOQL query.

Simple enough, eh? Now that we have the Account record, we need to select all the Contacts for this Account that have the Contact Type value set to “Employee”. You could do this with a record type, but for this simple demo I created a custom picklist field on Contact with an “Employee” value. You can select all the fields, or if you don’t need them all, choose the “Choose fields and let Salesforce do the rest” option.

Now we have a record collection we can bind to the Data Table. Next we add a Screen element and set the values for the form, header, footer, buttons, etc.

Next we add the Data Table Element to the Form.

First we set the Data Table Source Collection. This is the filtered GetRecords element.

For the purpose of this demo we’ll use the multiple selection value of the Data Table to allow us to select multiple Rows. I will also set the minimum and maximum display values.

Once we have set those values we need to add the columns for displaying the employees. There is no need to add a selection column; the Flow adds that automatically because we selected multiple-row selection. Create each column by selecting the field (the available fields come from the Get Records object).

Now that we have our fields we can save and debug the flow to see the list appear.

After we select a few employees they are saved in the “selectedRows” attribute of the Data Table. Now we just need to loop through the selected employees so we can give them a bonus. To do this we add a Loop element to the Flow and set its parameters as shown below.

Now you just need to add your Employee Entry form in the For Each section of the Loop. In the form add a Display Text to show you the employee Name and a Currency field to enter their bonus amount.

Save and select Debug. Run it and you’ll see the form displayed once for each employee you selected.

And just like that you can loop through a collection of Records and set values. No LWC or Aura components required.

Intro to the Salesforce Tooling API using REST

I was recently trying to get a list of LWC components and found that the object is named LightningComponentBundle. If you want to see the components you can open the Developer Console, select the “Use Tooling API” checkbox in the Query Editor, and issue the following query:

SELECT ApiVersion,CreatedDate,LastModifiedDate, DeveloperName,MasterLabel,NamespacePrefix 
FROM LightningComponentBundle

This is what it would look like in the Developer Console.

This looks simple enough, but from regular Apex SOQL you can’t query objects like this; they are only exposed through the Tooling API. To get around this we can call the Tooling API’s REST endpoint from Apex. Consider the following code for executing a query using a Tooling API REST callout.

HttpRequest req = new HttpRequest();
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionID()); //Get user Session ID
req.setHeader('Content-Type', 'application/json');
String SFdomainUrl=URL.getSalesforceBaseUrl().toExternalForm();
String query='Select+Id,ApiVersion,CreatedDate,LastModifiedDate,MasterLabel+from+LightningComponentBundle+Limit+50';

req.setEndpoint(SFdomainUrl + '/services/data/v45.0/tooling/query/?q=' + query);

req.setMethod('GET');

Http h = new Http();
HttpResponse response = h.send(req);

system.debug(response.getBody());

Map<String, Object> payload1 = (Map<String, Object>)JSON.deserializeUntyped(response.getBody());
system.debug('List of Records ==>' + payload1.get('records'));
List<Object> items = (List<Object>)payload1.get('records');
    
for (Object item : items) 
{    
	Map<String, Object> i = (Map<String, Object>)item;    
	
	DateTime dt = (DateTime)Json.deserialize('"'+i.get('LastModifiedDate')+'"', DateTime.class);
	Date lastModified = date.newinstance(dt.year(), dt.month(), dt.day());

	String label = (String)Json.deserialize('"' + i.get('MasterLabel') + '"', String.class);
	System.debug('MasterLabel ==>' + label);
	System.debug('DateTime ==>' + lastModified.format());
}

A couple of points of interest: 1) because this is executing inside the Salesforce org, passing the current session Id as a Bearer token is enough to authenticate the REST call. 2) Always get the Salesforce base URL so you have the correct domain for the Tooling call. 3) Make sure the version of the Tooling API you want to access (in this case 45.0) is part of the URL. 4) Remember that Apex callouts, even to your own org, require the endpoint to be allowed via a Remote Site Setting or a Named Credential.

So when the call is made, the query is executed and the JSON response is sent back. In the response we have an array of records. I cast that to a List<Object> so I can iterate through it. To pull the values I use the JSON.deserialize() call to deserialize the string data for the proper field and cast it to its respective type.
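
If you prefer strongly typed access over walking untyped maps, a small sketch like this would also work, reusing the response variable from the callout above (the wrapper class names are hypothetical, not part of the Tooling API; declare them as inner classes or in a separate class in your org):

// Hypothetical wrapper classes that mirror the shape of the Tooling API query response
class ToolingQueryResult
{
    public Integer size;
    public Boolean done;
    public List<BundleRecord> records;
}

class BundleRecord
{
    public String Id;
    public Decimal ApiVersion;
    public String MasterLabel;
    public DateTime LastModifiedDate;
}

// Deserialize the whole body in one call instead of walking untyped maps
ToolingQueryResult result =
    (ToolingQueryResult) JSON.deserialize(response.getBody(), ToolingQueryResult.class);

for (BundleRecord rec : result.records)
{
    System.debug(rec.MasterLabel + ' (v' + rec.ApiVersion + ') modified ' + rec.LastModifiedDate.date().format());
}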

Pretty nifty, eh? But let’s say we want to do something more drastic, such as execute an anonymous Apex block. The previous code can be modified by changing the endpoint to executeAnonymous/?anonymousBody=<code>

HttpRequest req = new HttpRequest();
req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionID()); //Get user Session ID
req.setHeader('Content-Type', 'application/json');
String SFdomainUrl = URL.getSalesforceBaseUrl().toExternalForm();

req.setMethod('GET');

Http h = new Http();

String body = 'System.debug(';
body += '\'==> Test\'';
body += ');';


req.setEndpoint(SFdomainUrl + '/services/data/v45.0/tooling/executeAnonymous/?anonymousBody=' + EncodingUtil.urlEncode(body, 'UTF-8'));
Httpresponse response = h.send(req);
system.debug('Anonymous Execute ==>' + response.getBody());

This doesn’t do much, but it shows how to use the executeAnonymous callout. For large blocks of code you’d probably want to use the SOAP version, since the REST approach is limited by the length of the query string.

Which brings us to another example, this time using the SOAP API. To use the SOAP API you would normally generate the WSDL and import it into Java/.NET or whatever language you are using. However, if you just want to give a class in your current org the ability to execute anonymous code, you can build the SOAP XML request yourself and send it with a POST.

public String executeAnonymous(String apexString, Boolean enableDebug) 
{
    String result = '';
    if ( apexString != null && apexString.length() > 0 ) {
                 
        Boolean debugEnabled = enableDebug==null?false:enableDebug;
		String SFdomainUrl = URL.getSalesforceBaseUrl().toExternalForm();
		String endPoint = SFdomainUrl + '/services/Soap/s/54.0/'; 
		Http h = new Http();

        HttpRequest req = new HttpRequest(); 

        req.setEndpoint( endpoint );
        req.setHeader( 'User-Agent', 'SFDC Apex Callout Service' ); 
        req.setHeader( 'SOAPAction', '""' );
        req.setHeader( 'Accept', 'text/xml, application/soap+xml' );
        req.setHeader( 'Content-Type', 'text/xml; charset=UTF-8' ); 

        String soapXML = '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://soap.sforce.com/2006/08/apex"><soapenv:Header><SessionHeader>'+
                            '<sessionId>'+ UserInfo.getSessionID() +'</sessionId>'+
                            (debugEnabled?'</SessionHeader><DebuggingHeader xmlns="http://soap.sforce.com/2006/08/apex"><categories><category>Apex_code</category><level>Finest</level></categories><debugLevel>Debugonly</debugLevel></DebuggingHeader>':'</SessionHeader>')+
                            '</soapenv:Header><soapenv:Body><executeAnonymous><String>'+apexString+'</String></executeAnonymous></soapenv:Body></soapenv:Envelope>';
        System.debug( LoggingLevel.WARN, 'SOAP XML: ' +  EncodingUtil.urlEncode( soapXML, 'UTF-8' ) );
        req.setBody(soapXML);
        req.setMethod( 'POST' );

        HttpResponse res = h.send(req); 
        result = res.getBody();
        System.debug( LoggingLevel.DEBUG, 'HTTP RESPONSE: ' + res.toString()+ ' '+  res.getBody() );       
        if ( res.getStatusCode() == 500 ) 
		{
            System.debug( LoggingLevel.ERROR, 'HTTP SOAP ERROR: '+res.getBody()==null?'': res.getBody());           
        }        
    }
    return result;
}

While the above code isn’t super pretty, it demonstrates how to build the SOAP request and deal with the response. To see it in action, paste the code into the Execute Anonymous window of the Developer Console and append the following lines after the function declaration to invoke it.

String body = 'System.debug(';
body += '\'==> Test\'';
body += ');';

executeAnonymous( body, false ); 

Here is a related project worth checking out: https://github.com/JitendraZaa/Anonymous-Apex-Executer/tree/master/Anonymous%20Apex%20Executer

Apex API Version Checking

With the upcoming 2022 Salesforce release, API versions 7 through 30 are being deprecated. You can check Setup -> Apex Classes and sort by API version to see which classes are out of date, but that only shows part of the problem: you also need to check your Triggers. And how do you turn that into a report? Fortunately we have anonymous Apex. We can use the Execute Anonymous window to run statements and get results. Let’s look at what we have to do for the Apex classes.

Internal to Salesforce is the ApexClass Object. This object contains all the information about the Apex code in your organization. So to extract what we need we just query the ApexClass Object using SOQL like so:

SELECT Id, ApiVersion, Name, NamespacePrefix FROM ApexClass WHERE Status='Active'

This returns a list of ALL the active Apex classes, including the NamespacePrefix. Why is the NamespacePrefix important? Because a value in this column (generally speaking) indicates packaged code rather than your own internal code. Now that we have the list, let’s think about how to refine it further. We only want classes below API version 31, and the ApiVersion field doesn’t lend itself to an inline conversion in SOQL… so let’s just use the IN clause.

If we are going to use the IN clause we need the list of API versions… we could type it out manually OR we could cheat and use anonymous Apex. By outputting the set in a System.debug() message we have our list of API versions.

Set<Decimal> setOfVersions = new Set<Decimal>();
for( Integer i = 7; i < 31; i++ )
{
    // convert to a Decimal and add it to the set of versions to check
    setOfVersions.add( Decimal.valueOf(i) );
}
System.debug('===> list:' + setOfVersions);

Once it runs we can simply copy the list from the debug log and append it to the SOQL query using WHERE ApiVersion IN (list), like so:

SELECT Id, ApiVersion, Name, NamespacePrefix FROM ApexClass WHERE Status='Active' AND ApiVersion IN('7','8','9'...'30');

Thus the query will now return all classes that are version 30 or below.

Next we simply repeat the process with the other Object ApexTrigger.

SELECT Id, ApiVersion, Name, NamespacePrefix FROM ApexTrigger WHERE Status='Active' AND ApiVersion IN('7','8','9'...'30');

This gives us both Apex classes and triggers at version 30 and below. Note that while you can see all of this in the Salesforce UI, you would need to filter it manually, and you have no way of creating a report for management (short of cut & paste).

Now let’s revisit one of my earlier posts, which explained how to create a CSV file and save it to your instance. By combining Execute Anonymous with that prior example we can create a CSV file containing a list of all Apex classes and triggers at version 30 and below.

Set<Decimal> setOfVersions = new Set<Decimal>();
for( Integer i = 7; i < 31; i++ )
{
    // convert to a Decimal and add it to the set of versions to check
    setOfVersions.add( Decimal.valueOf(i) );
}

List<ApexClass> listOfClasses = [
    SELECT Id, ApiVersion, Name, NamespacePrefix
    FROM ApexClass
    WHERE Status = 'Active'
    AND ApiVersion IN :setOfVersions
];

// Create a CSV file
String csv = 'Id,ApiVersion,Name,Type,Name Space\n';
for( ApexClass a : listOfClasses )
{
    System.debug('Class ==> ' + a.Name + ' : ' + a.ApiVersion);
    csv += a.Id + ',' + a.ApiVersion + ',' + a.Name.escapeCsv() + ',Class,' + a.NamespacePrefix + '\n';
}
System.debug('===> Total Classes : ' + listOfClasses.size());

List<ApexTrigger> listOfTriggers = [
    SELECT Id, ApiVersion, Name, NamespacePrefix
    FROM ApexTrigger
    WHERE Status = 'Active'
    AND ApiVersion IN :setOfVersions
];

for( ApexTrigger t : listOfTriggers )
{
    System.debug('Trigger ==> ' + t.Name + ' : ' + t.ApiVersion);
    csv += t.Id + ',' + t.ApiVersion + ',' + t.Name.escapeCsv() + ',Trigger,' + t.NamespacePrefix + '\n';
}

System.debug('===> Total Triggers : ' + listOfTriggers.size());

ContentVersion file = new ContentVersion(title = 'ApexClassesAndTriggersBelowV30.csv',
                                         versionData = Blob.valueOf( csv ),
                                         pathOnClient = '/ApexClassesAndTriggersBelowV30.csv');

insert file;
System.debug('===> Content URL : /' + file.Id);

And just like that… you have a list of all the classes and triggers that need to be updated. You can download it and use Excel to filter and work with the list. Enjoy!

Spring CM – An Admin’s Guide

Once you have a login for Spring CM and a connection established with Salesforce (if you do not have a connection, setup instructions can be found here – https://support.docusign.com/en/guides/SpringCM-for-Salesforce-Administrator-Guide), you will need to do several things:

  • Upload the Word Document Template
  • Create the Form to fill in the Word Document
  • Edit the Word Document to have the field tags
  • Download and install Spring CM Edit (which is just a bridge to open Word and download/upload a version of the Document)

Step 1 – Uploading a Word Document.  NOTE – you cannot perform ANY of this if you are not configured to be Super Administrator on Spring CM.  To verify this click on the Icon of the person in the upper right.

NOTE: this user is a Super Administrator and the owner of the organization.  The value after the “-” (in this case 11510) is the User Code you will need if you call Tech Support.

From the main home screen you can create a folder and upload a Word document to the folder.

Select the Folder and then it will open to display any files currently in the folder.

To upload a file click on the File menu in the upper left below the DocuSign Logo

Select “Upload” and the file will be transferred.

Step 2 – Create the Form.  Now that the document has been uploaded it’s time to create the form.  The form is essentially nothing more than a definition of the custom tags (merge fields in Salesforce speak) that will be embedded into the Word document.  Select the “Admin” option from the main menu bar.

This will open the Document Generation (by Default the first time) and display any “Configurations”.

Configurations can be confusing because you need a form before you can have a configuration.  Also the documentation talks about using XML documents to control the fields rather than explaining that the XML document is created by creating a Form.  To create a Form click on the “Forms” tab next to “Configurations”.

Select “Create Form”

Next fill in the New Document Generation Form options.  Give it a Name.

Then select “Use this form in Salesforce” and Select the Salesforce.com Object (this is the object you connected in the setup found in the Admin guide)

This will open a blank canvas for you to add fields to.

Select the Add Fields button, then choose the type of field to add.  The field type you select is important: if you choose a Text field, numeric Salesforce fields will not appear in your list of available fields.  When you select the field type and add it to the form you will see the field options.

Fill in the Field Name.  If the field is one you will fill from Salesforce, select “Link to Salesforce Field” in the Field Value section.  Then click on the cloud icon and it will open the field selection form.

Notice that related objects will appear, but ONLY field types that match the selected field type (text in this example) will show up under the related objects.  The documentation is also not clear about which relationship types work: so far Lookup relationships work, but Master-Detail generally does not (if the object is the child side it might work, but usually it doesn’t).  Once you select the field it is added to the form and linked to Salesforce.  Fields added to the form with no Salesforce link become plain text-entry fields, which can still be used to copy data into a document, but that data is lost after the merge.

When the form is completed you will have something that looks like the following :

Step 3 – Copy the Merge Tags.  Now that the fields have been created and Saved the form is ready to be linked to a document.  Seems simple enough, but not quite.  In order to link the form we must return to the configuration screen.  Click on the Configuration Tab. 

Select “Add Configuration”. Once you open the Add Configuration screen you will need to make some entries.  Most are self-explanatory, but a few may not be.

First set the “Where to use this Configuration” option to “In Salesforce”.  Then select the linked Object.  Now you have the configuration set to tie back to Salesforce, so we need to add the fields and word document to use to the configuration.  Select the “Add Template” option.

Select the “Create a Document” option.  This form can be a bit confusing, as you would think this is where you would upload a document.  However, the Upload option here refers to a manually created XML document or sxformconfig file, not to the Word document that will be used as the template.

Name your template.  Then you must fill in the Configuration Files section.  The first entry is the name of the XML configuration file to use.  This is another confusing entry, but don’t worry: it is really the Form we created earlier.  Select the “Change File” button in the “Form Configuration File” section.

You will then be shown a screen to choose an XML or sxformconfig file.  Under the Forms option on the right the name of the form you created earlier will appear.  Select that form and Choose Apply.  This will link the form to the configuration.

Next click on the “Change File” option for the “Document Template File” entry.  This will open the Choose a PDF or Word Template dialog.

In this Choose PDF or Word Template window navigate to the folder where you uploaded the Word Document back in Step 1.  Select the Word Document you uploaded and click Apply.

Finally your configuration will be complete.  This is what the final Document Generation Configuration screen should look like

Select the “Copy URL” menu option to copy the URL generated by the configuration.  Now we are ready to add the button to Salesforce.

Log in to Salesforce and go to the Object.  I use Classic as it is easier to navigate to the Object by going to the tab (Case Batch Requests in this example) and selecting the pop out menu on the form:

Select the “View Object” option.  This will open the Setup form.  Scroll down through the Object configuration to the Buttons, Links and Actions Section.   Click “New Button or Link”.

Fill in the Label, set the Type to Detail Page Button, set the Behavior to “Display in new window”, and set the Content Source to URL.  Paste the copied URL into the source text box.  Click “Save” and the button is created.  This does not add the button to the page layout; to do that we must now locate the “Page Layouts” section.

Select the Page layout and click the “Edit” hyperlink to open it.  When the form editor opens select the “Buttons” section.

Then drag the button onto the form.

Once the button is added to the layout, clicking it launches the DocuSign engine with the Form you created.

Step 4 – Download and Run Spring CM Edit.  All the information on how to configure Spring CM Edit can be found here: https://support.docusign.com/en/guides/SpringCM-Edit-User-Guide

You can go to this location to download the Spring CM package:

https://docs.docusign.com/supportdocs/SpringCM/Content/scm-edit-download.htm

It’s also important to note that development of the Spring CM application in this example is being done in a Spring CM UAT sandbox, so to point Spring CM Edit at UAT (or back at production) follow the instructions found here:

https://docs.docusign.com/supportdocs/SpringCM/Content/scm-edit-toggle-uat-prod.htm

Dynamic Programming

Interesting title, right?  What do I mean by dynamic programming?  I mean using Apex to create code that can handle a variety of dynamic situations without being reprogrammed.  For example, let’s say we need to export data from an object, but we don’t know which object… it could be Account, Contact, Opportunity, or Campaign.  Usually we’d just stick those values in a list, right?  Then select the one we want.

Well, let’s further complicate the issue: allow the user to select whichever object they wish, then select the fields for that object.  So far so good?

Next let’s tackle the easy part.  We can use the Schema class in Apex to get a list of all the standard objects (or custom objects, or a combination of both), and from the selected object type we can then pull all of its fields from the same Schema describe. Below are a few code snippets to use in your controller to get the master list of objects and the fields for a selected object.



//=========================================
// Snippet - Get the Master List of Objects
//=========================================

//initialize the lists
List<SelectOption> options = new List<SelectOption>();

// Get all the Objects
for (Schema.SObjectType ObjType : Schema.getGlobalDescribe().Values())
{
    String apiName = ObjType.getDescribe().getName();
    options.add( new SelectOption(apiName, apiName) );
}  

//===============================================================
// Snippet - Code for getting the fields for each selected object
//===============================================================

List<SelectOption> listOfFields = new List<SelectOption>();

// Get the Object Type for the Selected Object
SobjectType objType = Schema.getGlobalDescribe().get(selectedObject);
			
for(Schema.SobjectField strFld: objType.getDescribe().fields.getMap().Values())
{
   if(strFld.getDescribe().isAccessible())
   {
      listOfFields.add( new SelectOption(strFld.getDescribe().getName(), strFld.getDescribe().getName()) );		 
   }
}

So let’s throw this UI together and see how she looks:

Console

I know… I didn’t publish the code for the UI or how to fill everything up and link it together…because that’s not the purpose behind this article… you can find all the code to do what I’ve outlined above in numerous places…what you don’t see much of, is the next section on dynamic development.

So when a user selects an object… say Campaign… and then selects the fields (or we default to all fields), how do we handle the fact that we don’t know the object ahead of time or how to query it?  Fortunately for us, Apex has a nifty feature built into the database engine: Database.query().  You can build a dynamic SOQL string and simply pass it into the Database.query() method.



// Call this function like so:
// String qry = getSOQL('Account', listOfFields, 0);   <-- return all rows
// String qry = getSOQL('Account', listOfFields, 20);  <-- limit to 20 rows
public static String getSOQL(String ObjectType, List<String> listOfFields, Integer rowLimit)
{
   String qry = 'SELECT ';
   for(String fld : listOfFields)
   {
      qry += fld + ',';
   }
   // Strip the trailing ","
   qry = qry.substring(0, qry.length() - 1);
   qry += ' FROM ' + ObjectType;

   if(rowLimit > 0)
   {
      qry += ' LIMIT ' + String.valueOf(rowLimit);
   }
   return qry;
}

This solves part of the problem. But now we are faced with another problem…since we don’t know the Object Type of the resulting query how do we store the results? The answer lies in the Type.forName() method. We can actually write code which transforms the string value of the Object Type into an actual data type.

So if we take the query built in the previous function (getSOQL) we can now build a string describing the return type. To do this we must use a list of the base SObject type; we use SObject because it can hold any of the concrete sObject types.

Continuing our example, if the object type is Opportunity we would build the string 'List<Opportunity>', use Type.forName() to create an instance of that type, and hold it in a variable declared as List<SObject>.


// SObject container list
List<SObject> sourceObjects;

// dynamicObject holds the API name of the selected object;
// cast sourceObjects to the formal data type
sourceObjects = (List<SObject>)Type.forName('List<' + dynamicObject + '>').newInstance();

// Put the query results into the list
sourceObjects = Database.query(qry);

Utilizing this approach we now have a dynamic selection engine. You can select any Object Type, any list of fields and generate a list result set dynamically. For the final piece .. we need to generate the CSV output file. In an earlier blog (Using the Developer Console for Salesforce Administrators) I showed how to use the ContentVersion object to create the file. With a few tweaks we can make this code also generate the CSV.

A couple of assumptions have to be made since I have not built all the code… (1) a variable named “castType” stores the Selected Object Type, (2) a variable named “selectedFields” is a List of all the selected Fields.



// create the header from the List of fields
String header = String.join(selectedFields, ',') + '\n';
String csv = header;

List<SObject> resultsObjects;

// Cast resultsObjects to the formal data type
resultsObjects = (List<SObject>)Type.forName('List<' + castType + '>').newInstance();

// Put the query results into the list
resultsObjects = Database.query(qry);

// Loop through the results
for(Integer i = 0; i < resultsObjects.size(); i++)
{
   SObject objData = resultsObjects[i];

   String row = '';

   // Loop through all the selected fields for this record
   for(String fName : selectedFields)
   {
      // String.valueOf handles non-text field types and null values
      String fData = String.valueOf( objData.get(fName.trim()) );

      if(fData != null && fData.contains(','))
      {
         // Put the value in double quotes
         row += '"' + fData + '",';
      }
      else
      {
         row += (fData == null ? '' : fData) + ',';
      }
   }

   // remove the trailing ","
   row = row.substring(0, row.length() - 1);
   csv += row + '\n';
}

// finally save this into the ContentVersion
ContentVersion file = new ContentVersion(title = 'ExportedData.csv',
                                        versionData = Blob.valueOf( csv ),
                                        review_date__c = date.today(),
                                        pathOnClient = '/ExportedData.csv');
insert file;
System.debug('Content URL : /' + file.Id);

Now you have the knowledge and the means to export any Object Type and any collection of fields to a CSV file.

Handling ‘List has no rows for Assignment’ Error

How many times have we seen this error? What does it mean? Usually it means you’ve assigned the result of a SOQL query to a single record variable and the query returned zero rows.

The simplest way is always the best I always say. Consider the following:


Account a = [SELECT Id, Name FROM Account WHERE Id= :myId];

This should return an Account record. BUT what if myId contains a value that no longer matches a record (for whatever reason)? Then you’ll get the “List has no rows for assignment to SObject” error. We have two ways to solve this.

The first doesn’t require wrapping every SOQL query in a try..catch. If you always assign the query result to a List<> you can check its size() to determine whether the query found any results.


List<Account> aList = [SELECT Id, Name FROM Account WHERE Id = :myId];
Account a;

if( aList.size() > 0 )
{
   a = aList[0];
}
else
{
   System.debug('Empty Results');
}

While this works great you do need to put an if() statement everywhere. Another way is to wrap the code in a try..catch…


try
{
   Account a = [SELECT Id, Name FROM Account WHERE Id = :myId];
}
catch( QueryException qe )
{
   // Handle the exception ("List has no rows for assignment to SObject" is a QueryException)
}

The advantage of the second approach is that we can get creative and build a simple utility class to test any query, so you don’t have to repeat the code everywhere.


public class QueryChecker
{
    public static Boolean QueryHasResults( String q )
    {
       List<SObject> sobjList = Database.query( q );
       return ( sobjList.size() > 0 );
    }
}

Now you can test whether the query returns any rows before you rely on the result.


Boolean isValidQuery = QueryChecker.QueryHasResults('SELECT Id, Name FROM Account WHERE Id = \'XXXXXX\'');

if( !isValidQuery )
{
   // code to do something
}

There you have it. A simple solution to handling an annoying error.

 

Text Replacements in Formula Fields

So I get asked this quite often: how do I replace text in a formula field?  Normally, as a programmer, I’d lean toward using a regular expression (regex).  Then I got to thinking… how would an administrator handle this? As a developer I have a whole set of tools at my disposal that they don’t have… so I sat down and figured out how to handle it, and this is how I did it.

Consider the following problem: you have a text field (Company Name) that you need to use to create a unique file name.  You can of course append date/time and other values to the field value to produce a unique name… BUT what if that name contains a special character?  For example: Wallace & Sons.  What do you do now?  The filename can’t contain the ampersand, or “&”, sign…

Enter the custom formula field.  Formula fields let us create a value based on a formula applied against any number of fields, both on the target object and on related objects.  At first I decided to use the FIND() function to determine if the text was present:


IF( FIND("&", Company_Name__c) > 0, SUBSTITUTE(Company_Name__c, "&", ""), Company_Name__c )

Which works well for locating a single special character… and yes you can string them together with IF() statements and nest them… BUT this leads to issues in that you must account for both finding and NOT finding the search text for every character.

Upon closer inspection, though, I noticed that while SUBSTITUTE only accepts three parameters, it ALWAYS returns the original text unaltered if the search string is not found.  So we can simply nest SUBSTITUTE calls to handle multiple characters.  The following removes both the ampersand (“&”) and the percent sign (“%”):


SUBSTITUTE(  SUBSTITUTE( Company_Name__c, "%", ""), "&", "" )

Therefore, creating a more complex replacement is as simple as adding an outer SUBSTITUTE() function call.  The following formula removes:
dashes (-),
parentheses ( "(" and ")" ),
spaces (" "),
ampersands (&),
periods (.),
percent signs (%),
pound signs (#),
forward slashes (/),
semi-colons (;),
and colons (:)


SUBSTITUTE(
SUBSTITUTE(
SUBSTITUTE(
SUBSTITUTE(
SUBSTITUTE(
SUBSTITUTE(
SUBSTITUTE(
SUBSTITUTE(
SUBSTITUTE(
SUBSTITUTE(
SUBSTITUTE(Company__r.Name, "(", ""), ")", ""), " ", ""), "-", ""), "&", ""), ".", ""), "%", ""), "#", ""), "/", ""), ";", ""), ":", "")

And there you have it: the regular-expression-free, formula-field way to replace text in a field.

Using the Developer Console for Salesforce Administrators

Salesforce.com administrators are often asked to perform tasks that are not easily accomplished in the Salesforce user interface or reporting environment.  To that end, I’ll teach you (the Salesforce administrator) a few tricks of the trade, including how to use the Developer Console.

First log into Salesforce.  Then click your name and from the Dropdown select “Developer Console”

Menu

For these examples I will use the User and Profile objects. Here are some queries you can keep and modify for handling users.  The first is a SOQL query that gives you a count of users by profile; this is known as an aggregate query.


SELECT ProfileId, Profile.Name, COUNT(Id) myCount FROM User WHERE IsActive = True
GROUP By ProfileId, Profile.Name

To execute this query you’ll select the “Query Editor” tab of the Developer Console and paste this query into the window:

Console

Next click the “Execute” button and view the Results

Console Query Results

Now if you’d like to see the users in each profile, you can use another SOQL query that uses a subquery to filter for a particular profile, like so:


SELECT Id, Name, UserType, LastLoginDate, Profile.Name FROM User
WHERE ProfileId IN (SELECT Id FROM Profile WHERE Name LIKE '%Partner%')
AND IsActive = True
ORDER BY Profile.Name, LastLoginDate

Several things to note here.  First, the WHERE clause makes use of a sub-SELECT that returns the Id column from the Profile object for all profiles containing the word “Partner”.  Second, we use the ORDER BY clause to group the users of each profile name together, and then order by last login date.

Console Query Results - Sub Query

To reuse this query to find users in other Profiles you change %Partner% to %[name]% (where [name] represents the Profile Name) in order to get the users of a profile or group of Profiles.  For example if you wanted all users in an Administration Profile Group change it to %Admin%

Console Query Results - Sub Query - 2

Now that you’ve seen how to use the SOQL Query Editor, let’s talk about something more useful… dumping the results to a text file. To do that you’ll need to open the Execute Anonymous window.

Execute Anonymous Menu

Paste this code into that window



List<User> listOfUsers = [SELECT Id, Name, LastLoginDate, Profile.Name
                          FROM User
                          WHERE ProfileId
                          IN (SELECT Id FROM Profile WHERE Name LIKE '%Partner%')
                          AND IsActive = True
                          ORDER BY LastLoginDate, Profile.Name];

String csv = 'Id,Name,ProfileName,LastLoginDate\n';
for(User u : listOfUsers)
{
   csv += u.Id + ',' + u.Name.escapeCsv() + ',' + u.Profile.Name + ','
        + u.LastLoginDate + '\n';
}

// review_date__c is a custom field in my org; remove it if your org doesn't have one
ContentVersion file = new ContentVersion(title = 'PartnerUsersActive.csv',
                                        versionData = Blob.valueOf( csv ),
                                        review_date__c = date.today(),
                                        pathOnClient = '/PartnerUsersActive.csv');

insert file;
System.debug('Content URL : /' + file.Id);

This generates a CSV file you can open in Excel.  Check the Open Log option at the bottom right of the Execute Anonymous window (next to the Execute button) so that when the code completes it opens the log file.

Execute Anonymous Code

You’ll see the URL in the Log file

Debug Log

Double click on the line in the Log file and it will open up a new window with the details

URL To File

Copy the “/” and Id following it.

Then go back to the browser tab that has https://naXX.salesforce.com in the URL and duplicate it.  In the duplicated tab, paste the “/” + Id at the end of the URL (example: https://na28.salesforce.com/0681A000002sAxaQAE)

This will jump you right to the content file.  Click the Download Link

Content Download

Then after you download it, delete the file from SFDC.
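
If you’d rather clean up from the Execute Anonymous window instead of the UI, a small sketch like this works (the Id shown is the example ContentVersion Id from above, so substitute your own):

// Look up the generated file by its ContentVersion Id (copied from the debug log)
ContentVersion cv = [SELECT Id, ContentDocumentId
                     FROM ContentVersion
                     WHERE Id = '0681A000002sAxaQAE'
                     LIMIT 1];

// Deleting the parent ContentDocument removes the file and all of its versions
delete [SELECT Id FROM ContentDocument WHERE Id = :cv.ContentDocumentId];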

If you need to create a CSV file from aggregate results (from a SELECT COUNT() query), use the following code in the Execute Anonymous window and follow the same steps as above to open the content record and download the CSV.


AggregateResult[] arResults = [SELECT ProfileId, Profile.Name pName,
                                COUNT(Id) myCount
                                FROM User
                                WHERE IsActive = True
                                GROUP BY ProfileId, Profile.Name];

String csv = 'ProfileId,Profile Name,Count\n';
for(AggregateResult ar : arResults)
{
    Integer iCnt = (Integer)ar.get('myCount');

    String line = String.valueOf( ar.get('ProfileId') ) + ',';

    line += (String) ar.get('pName') + ',';
    line += String.valueOf(iCnt) + '\n';
    csv += line;
}

// review_date__c is a custom field in my org; remove it if your org doesn't have one
ContentVersion file = new ContentVersion(title = 'ProfileUserCount.csv',
                                    versionData = Blob.valueOf( csv ),
                                    review_date__c = date.today(),
                                    pathOnClient = '/ProfileUserCount.csv');

insert file;
System.debug('Content URL /' + file.Id);

 

This will produce a CSV with the total number of users in each profile. For any queries that utilize Aggregate functions you’ll want to use this template.

Understanding CRUD and FLS

The Force.com platform allows for multiple ways to access data and to control that access at different levels. The levels at which access can be controlled are:

1. Object Level
2. Record Level
3. Field Level

In this article I’m going to talk about Object and Field Level security and the different techniques you can use to work with them when creating custom code.

So what exactly is CRUD? CRUD stands for Create-Read-Update-Delete access. CRUD settings are applied at the profile level in Salesforce.com. To see an example, go to Setup -> Profiles and select a profile (in my case I cloned the default “Standard User” profile and named it “Sample CRUD”), then select Object Settings. This brings up a list of all the objects in your organization and the permissions for each one. If I select “Accounts” I will be presented with the object permissions and the field permissions.

CRUD - Figure 1.PNG

I can edit or change the permissions for this user by clicking on the “Edit” button.

CRUD - Figure 2.PNG

For my custom Sample CRUD profile I can change the object access by turning Read, Create, Edit, and Delete on or off. On the same screen you’ll see Field Level Security (FLS) for every field in the object, and I can turn access off for each individual field as well.

CRUD and FLS are automatically enforced when the developer references an object (like Account) and the object’s fields directly in a Visualforce page. For example, if a user without FLS visibility to the Phone field of the Account object were to view the page below, phone numbers would be automatically removed from the table.


<apex:page standardcontroller="Account" recordsetVar="Acct" sidebar="false">

<apex:form >
   <apex:sectionHeader title="CRUD Example One" />
   <apex:pageBlock title="Accounts">
      <apex:pageBlockTable value="{!Acct}" var="item">
          <apex:column value="{!item.Name}"/>
          <apex:column value="{!item.BillingState}"/>
          <apex:column value="{!item.Phone}"/>
          <apex:column value="{!item.WebSite}"/>
      </apex:pageBlockTable>
   </apex:pageBlock>
</apex:form>
</apex:page>

The same thing will happen in VisualForce if you are rendering an Edit page for an object. For example if we have a custom VisualForce edit page for Account and the user does not have access to the Phone Number it will not appear in the edit form. Along that same line, apex:inputField tags will be rendered as read-only elements for fields that are set to read-only through FLS.

However, using other input tags such as apex:inputText or apex:inputTextArea with sObject fields indicates to Visualforce that the values should not be treated as sObject fields, which prevents the platform from automatically enforcing FLS.
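
To make the distinction concrete, here is a minimal sketch assuming a page bound to the standard Account controller:

<!-- inputField is bound to the sObject field, so FLS is enforced automatically -->
<apex:inputField value="{!Account.Phone}"/>

<!-- inputText treats the value as a plain string, so FLS is NOT enforced -->
<apex:inputText value="{!Account.Phone}"/>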

There are often cases where developers use VisualForce pages to display data derived from an Object field in an indirect or processed form. For instance, a page controller may use internal logic to determine the appropriate value to display. A simple example of this would be a page that displays a random Contact Name from a list of Contacts:


<apex:page controller="RandomContactController">
   <apex:outputText value="{!getRandomName}" />
</apex:page>

public with sharing class RandomContactController 
{
    public String getGetRandomName() 
    {
        // Check if the user has read access on the Contact.Name field
        if (!Schema.sObjectType.Contact.fields.Name.isAccessible())
        {
          return '';
        }
         
        Contact [] myList = [SELECT Name FROM Contact LIMIT 1000];
        
        // Pick a list entry at random
        Integer index = Math.mod(Math.abs(Crypto.getRandomInteger()),myList.size());
        Contact selected = myList.get(index);
        return selected.Name;
    }
}


This example indirectly displays the Name field of the Contact object by using a custom get method that returns a string (the name) value. Because Visualforce only sees the return value as a string and not as an sObject field, CRUD and FLS are not automatically enforced, and it is necessary to call the isAccessible() method on the appropriate DescribeFieldResult to manually check the user’s access. The isAccessible() method also checks that the user has the corresponding CRUD access to the object type.

With the same approach we now have a mechanism for checking CRUD and FLS access on any object’s fields. First we find the sObject in question (say it’s Contact), and then we check the privileges. For handling the Create and Update cases we can build a utility class (CRUD_Checker) that can check the status of any field on any object.



public class CRUD_Checker
{
    public Boolean IsUpdateAllowed(String obj, String fld)
    {
        Boolean bResult = true;

        // First get the object's describe and its field map
        SObjectType objType = Schema.getGlobalDescribe().get(obj);
        Map<String, Schema.SObjectField> m = objType.getDescribe().fields.getMap();

        // Check if the user has update access on the field
        if (!m.get(fld.toLowerCase()).getDescribe().isUpdateable())
        {
            System.debug('Insufficient access for field : ' + fld + ' in ' + obj);
            bResult = false;
        }

        return bResult;
    }

    public Boolean IsCreateAllowed(String obj, String fld)
    {
        Boolean bResult = true;

        // First get the object's describe and its field map
        SObjectType objType = Schema.getGlobalDescribe().get(obj);
        Map<String, Schema.SObjectField> m = objType.getDescribe().fields.getMap();

        // Check if the user has create access on the field
        if (!m.get(fld.toLowerCase()).getDescribe().isCreateable())
        {
            System.debug('Insufficient access for field : ' + fld + ' in ' + obj);
            bResult = false;
        }

        return bResult;
    }
}


Finally we look at the case of Delete. Generally speaking, applications don’t delete individual fields; a delete applies to the whole record, which means the access check is at the object (table) level rather than the field level. So for Delete, our CRUD utility checks the object’s delete permission.


public Boolean IsDeleteAllowed(String obj)
{
    Boolean bResult = true;
    
    // First get the Object     
    SObjectType objType = Schema.getGlobalDescribe().get(obj);
    if(! objType.getDescribe().isDeletable())
    	bResult = false;
        
    return(bResult);	
}
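
Here is a quick, hypothetical usage sketch of the utility guarding a DML statement (the Contact fields are just examples, and it assumes IsDeleteAllowed is added to the same class):

CRUD_Checker checker = new CRUD_Checker();

// Only insert when the running user can create the fields we are about to set
if ( checker.IsCreateAllowed('Contact', 'LastName') && checker.IsCreateAllowed('Contact', 'Phone') )
{
    insert new Contact(LastName = 'Sample', Phone = '555-0100');
}
else
{
    System.debug('Current user lacks create access on one of the Contact fields');
}

// Object-level check before removing records
if ( checker.IsDeleteAllowed('Contact') )
{
    // safe to delete Contact records for this user
}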

NOTE : If you are using controller extensions and intend to delete the active record, another option is to call the standard controller’s delete() function instead of deleting the object within the controller extension. The standard controller will automatically check CRUD access before performing the operation.
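
A minimal sketch of that pattern, using a hypothetical extension on Account, might look like this:

public with sharing class AccountDeleteExtension
{
    private final ApexPages.StandardController std;

    public AccountDeleteExtension(ApexPages.StandardController controller)
    {
        this.std = controller;
    }

    public PageReference removeRecord()
    {
        // Deferring to the standard controller's delete() lets the platform
        // check CRUD delete access on Account before performing the operation
        return std.delete();
    }
}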

So there you have it, the basics of handling CRUD and FLS in your Apex classes and Visualforce pages.