
Build for Failure
When approaching a JavaScript solution to a problem, it is important to consider that there are many different variables that could cause your features to break. In fact, between browser compatibility issues and upgrades to BigMachines, it is unreasonable to treat anything in the HTML of a page as constant or immutable.
To deal with this uncertainty, design code that fails gracefully. This means different things in different situations. For instance, consider the following code:
function getPrice() {
    var value = document.getElementById("netPrice_quote").value;
    return value;
}
If this code is run on a page that doesn't have "netPrice_quote" available, then it will throw a TypeError, because
getElementById will return null. This means that changes to the layout of the page will cause the entire
BigMachines application to break.
You can deal with this by using jQuery to select the element. If netPrice_quote isn't on the layout, then this code will
not throw an error:
function getPrice() {
    var value = jQuery("#netPrice_quote").val();
    return value;
}
But just because the code doesn't throw an error doesn't necessarily mean it is failing gracefully. It's possible that not having the net price will create serious problems for the application, such as data loss or corrupted quotes. If this is the case, then a loud, complete failure may be appropriate:
function getPrice() {
    var value = jQuery("#netPrice_quote").val();
    return value;
}
function usePriceForCalculation() {
    var price = getPrice();
    //if net price is not available, don't allow this script to run and warn the user
    if (price === undefined) {
        jQuery("body").prepend("<h2>There is an error on this quote. Please contact your system administrator.</h2>");
        throw new Error("Custom work per Case #00012345. Net Price was not available for a critical calculation.");
    }
    // ...otherwise use it
}
Don't Use Inline JavaScript
JavaScript is largely an event-driven language. You can bind behavior to a web page in the form of loading events,
click events, change events, and more.
Many resources will recommend that you use "inline" JavaScript to accomplish this. For example:

<body onload="myJavascriptFunction()">

<input id="my-button" type="button" onclick="myJavascriptFunction()"/>

<select id="my-select" onchange="myJavascriptFunction()"/>

<a id="my-link" href="javascript:myJavascriptFunction()">


This method of event binding violates industry best practice, and creates code that is difficult to maintain and
modify. The preferred way to accomplish this task is to select the elements you want to fire an event on, and then
bind the event from within your code. Using jQuery, the above examples would be accomplished like this:

jQuery(document).ready(myJavascriptFunction);

jQuery("#my-button").click(myJavascriptFunction);

jQuery("#my-select").change(myJavascriptFunction);

jQuery("#my-link").click(myJavascriptFunction);
This approach has the clear benefit of separating your logic from your markup, which makes it easier to maintain, and also allows you to centralize all of your code into a single file. It also means that the function being called doesn't have to be globally available.
Errors and Exceptions: Avoid, Don't Suppress
It can be tempting to use try/catch blocks as a way to shield the user from errors. This can make it very difficult to
locate the source of a problem some time in the future.
(function() {
    try {
        my_array[3].obj.value = "Uh-oh, this is dangerous.";
    } catch(e) { /* Do nothing - just hide it */ }
}());
The problem with try/catch is that it is often far too broad; it can cover up errors that you didn't anticipate would
occur. Instead of throwing away exceptions, take steps to avoid specific errors. If other issues come up, then it will
be much easier to single them out and deal with them.

(function() {
    if (my_array.length < 4) { return; }
    if (!my_array[3].obj) { return; }
    my_array[3].obj.value = "This is more specific, and easier to troubleshoot.";
}());
Use === and !==
JavaScript has two types of comparison operators: double (==) and triple (===). Briefly, the difference is that the double operator attempts to coerce the types of the compared values, while the triple operator does not. This can lead to some surprises. For instance:
false == '0'; //true
'' == '0'; //false
0 == '0'; //true
0 == ''; //true
null == false; //false
null == undefined; //true
The rules behind this type-casting are unusual, and not very memorable. Use the === and !== operators to
remove ambiguity:
false === '0';// false
'' === '0'; //false
0 === '0'; //false
0 === ''; //false
null === false; //false
null === undefined; //false
Test Your Code
Your JavaScript will be running in a lot of different environments, and it's important to test it in every environment
that your users will be running.
The Framework provides QUnit as a way to test your code. To learn more, read the Framework Quickstart article.
TROUBLESHOOTING
Developer Tools
In Firefox, make sure that you have Firebug available. In Chrome and IE, the developer tools are useful for
debugging JavaScript.
You will want to use the "Script" tab in these tools, and then set break points within your code. When you reload the
page, you will be able to step through the code at the point you chose.
I Can't Call my Code from the Console!
Code within the Framework is non-global, which means that directly calling the function from the console is not an
option. To expose the code, you have two options. You can call "require" from the console:
//directly in the console...
require(["my_module"], function(module) {
    //test module directly:
    module.some_function();
});
Or you can "export" your code to the global namespace from within the File Manager. Be sure to add a flag as this is
only appropriate for testing purposes:
//within the file manager...
require([], function() {
    var addresses = {}, debug = true;
    addresses.parse_data = function() {...}
    ...
    if (debug === true) {
        window.addresses = addresses;
    }
});
The above example will make "addresses" globally available when the debug variable is set to true. This will allow
you to do this directly in the console:
addresses.parse_data();
I Can't Call my Code from <a onclick="...">
This is intentional. Please see the above section "Don't use Inline JavaScript."

A configurable BOM (Bill of Materials) is a set of parts specific to a certain product configuration.
For each configuration, where a product has a different set of parts, you can create a BOM.
When a BOM triggers, it appears to buyers on commerce document line item pages. If the parts contained in the
BOM link up with the parts in the parts database, then parts pricing also appears on line item pages. Buyers see
multiple BOMs when several apply to a given configuration.
A BOM rule has a condition and an action. The values of the attribute(s) selected as the condition attribute(s) determine the result of the condition, which, when true, triggers an action.
Step 1:
Using your Configuration Quick Links, navigate to your Product Family > List of Bill of Material Rules.
Step 2:
Click Add to create a new rule.
Step 3:
Enter the basic Properties of the rule. For more information, see .
Step 4:
Create your Conditions by selecting Simple Condition. Since this rule should run ONLY when a user selects the pizza described above, those conditions must be met when placing the order.

Step 5:
Select Simple Bill of Materials because you know the part numbers you would like to add: DD001, MS001 and
IS001. To write an advanced rule, click Define Function to launch the script editor.
Step 6:
Add the part numbers and quantities of each.

Step 7:
Verify the rule is running by checking the Pipeline Viewer in Configuration. In order to verify, you will need to select Size = Medium, Crust Type = Deep Dish and Toppings = Pepperoni.

Creating Configuration and Selection Rules with Array Attributes



Recommendations are available as both Conditions and Actions.


The advanced action function of a recommendation rule can return an array that is smaller than the array attribute passed in as a parameter. In that case, only a subset of the recommended values is displayed, and if the user selects those values the recommendation disappears.
Recommendation rules that force-set values do not enforce the force-set if any index in the array is blank. Therefore, if a recommendation rule force-sets values for certain indexes in an array but leaves the other indexes blank, the force-set is not enforced. In this case, all fields in the array can be edited by the user.
If a recommendation rule returns the correct value for certain indexes of a single-select menu type attribute, but incorrect values for other indexes of the same array, all the values set by that recommendation rule are discarded.
Multiple values can be recommended for a text type array attribute using the "|^|" delimiter.
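
As a quick illustration of that delimiter, an advanced recommendation function could build the recommended values for a text array attribute like the sketch below. This is illustrative only: the values are invented, and the surrounding return format depends on how your rule is defined.

// Sketch only: build three recommended entries for a text type array attribute.
// "|^|" separates the entries, per the note above.
recommendedValues = "Pepperoni" + "|^|" + "Mushroom" + "|^|" + "Onion";
return recommendedValues;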

Configuration Cache
The Configuration cache feature is enabled by your BigMachines system administrator.

The Configuration cache stores configuration data in the server's memory and improves performance on a BigMachines site. It does this by minimizing trips to and from the database server, and fewer calls to the server mean faster load times. However, this is just one of the ways a BigMachines site can be optimized for better performance. To learn more methods, see NOTE:1660099.1 - Implementation Tips and Tricks.
After the Configuration cache has been turned on, all areas of the BigMachines site that relate to configuration will have faster access times once the cache is built.
Remember, once the Configuration cache is enabled, any changes made in Configuration must be deployed before they become visible. This deployment process is exactly like deploying Commerce, but it is a SEPARATE deployment. The areas that must be deployed when the Configuration cache is turned on include rules, products, and flows within a product family.
NOTE: each product family must be deployed separately.

Setting Up Line Level Goal Seek


Step 1:
Add the Boolean attribute Override Net Each (overrideNetEach_line). Set the default to False.
Step 2:
Add a new column (Ovr..Net Ea) to the Line Item Grid Layout. Place it after Ext. Discount.

Step 3:
Make the Net Ea columns Editable.

Step 4:
Modify the Commerce Library Pricing Function. Do the following:

Add the attribute Override Net Each (overrideNetEach_line).

Initialize the Variable


overrideNetPriceDict = dict("float"); //key:document number

Add value into dictionary for both parts and models (this code is added in two places) when a
Net Price has been specified by the user.

if(line.overrideNetEach_line){
put(overrideNetPriceDict, line_document_number, line.netPriceEach_line);
}

Set the temporary variable Net Price if override is TRUE for Goal Seek. This code is added in two places,
both for model and parts code.

//Sets Net Price if Override (Goal Seek)


if(containsKey(overrideNetPriceDict, eachDocNum)){
    netPriceEach = get(overrideNetPriceDict, eachDocNum);
    discountAmtOvrd = listPrice - netPriceEach;
    if(discountType == "%"){
        if(listPrice <> 0){
            discount = (discountAmtOvrd / listPrice) * hundredPercent;
        }
    }
    else{
        discount = discountAmtOvrd;
    }
}

=============================
Selector Popups provide a way to improve the performance of the Search Results page. They introduce the ability to invoke calculations on an as-needed basis on the Search Results page.
A link/button is displayed to the user that can be used to launch a popup with the desired calculations. The calculations can be tailored to individual model/product lines returned as search results. The calculations can access dynamic values on the search results page.
Step 1:
Navigate to the Search Flow Rule Editor page. Select Function to Evaluate Result In Use in the Search Flow Results
Action section.

Step 2:
Define the function and include a call to the popup function. See Syntax of Popup Function below.
Step 3:
Choose the Define Function option in the Popup Window Function section.
Step 4:
Define the functions.
Inputs to the Popup Function
All the attributes available to the Function to Evaluate Result Set are available to the popupFunction
Dynamic values present in the search results page can be passed as a parameter to the popupFunction
Output of the Popup Function
The popupFunction returns a string
Syntax of Popup Function
popupFunction (search_results_string) where,
search_results_string: This is an optional string parameter that is used to pass values computed in the search results function to the popupFunction. It enables the BigMachines admin to pass dynamic values from the search results page. The maximum size of this string is 1000 characters.
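
For illustration, a Popup Window Function body might look something like the sketch below. This is only a sketch: it assumes the returned string is rendered as HTML in the popup, and the parsing of search_results_string (treated here as a "~"-delimited list) is an assumption, not a documented format.

// Hypothetical popupFunction body.
// Assumes search_results_string was built as a "~"-delimited list of values
// by the Function to Evaluate Result Set; adjust the delimiter to match yours.
values = split(search_results_string, "~");
html = "<h3>Calculated Details</h3>";
for v in values {
    html = html + "<div>" + v + "</div>";
}
return html;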

An administrator modified the HTML for a configuration flow template. Now I want to go back to using the UI to modify the layout, but how do I get rid of the red Custom Template in Use message?
SOLUTION
Navigate to the configuration flow in question, then List Tabs, and click Layout next to the appropriate tab. Now
click Define Advanced Template, then click Delete and then OK to confirm your choice. You will now be able to use
the UI to control the layout again.

The final step to manually adding a table is to make sure it is deployed. All tables that have been
added or modified and are NOT deployed, will be clearly marked in Red.

Post-calculations are used to customize the search results page. They can be used to provide values for configurable attributes that can be carried to the configuration flow. Post-calculations are required for end node type search flow rules, as they instruct the selection engine how to use buyer-submitted values.

There are four post-calculation options available:

1. None - No post-calculations are specified.
2. Send Results to External URL (Enter URL) - The search results page is obtained by posting the values of the attributes in the search flow (the current display attributes) to a pre-defined selector engine. Select the Feed Values into Pre-Defined Engine (Enter URL) option and specify the selector engine URL to which the attribute values are posted; the engine returns a search results page, which can also contain a graph. The search flow current display attributes need to be mapped to the equivalent selector engine attributes. On the user side, the search results are computed from the values of the search flow attributes and displayed along with the graph. Using a third-party selection engine, buyer-defined values are passed to an outside engine that performs the search. This enables you to use more complex mappings than are available through system-supported functional mapping.
3. Function to Evaluate Result In Use - This option enables you to define a function, which involves creating a selection script.
4. Pop-Up Window Function.
=======================================
I ran my Parts Integration, then checked the log to see that several Records failed.
None of my Parts have that Record number. What is this Record number and how can I find which parts are throwing
these integration errors?
Step 1:
In Admin, click Download under Bulk Data Services

Step 2:
Select Price Book Associations, then click Next.

Step 3:
Select CSV, then click Download.
Step 4:
In the Download Status page, click Refresh until the Status of your PriceBookAssoc download changes from
'Pending' to 'Completed'.

Step 5:
Click on PriceBookAssoc and save/ open the file.
Step 6:
The Record number in the Parts Integration log refers to the part found in the Excel file at that record number plus 3 rows.
i.e. Because of the first 3 extra rows, Excel row number = Record number + 3.
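For example, a failure reported for Record 57 corresponds to row 60 (57 + 3) of the downloaded file, which is where you will find the part that caused the error.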

===========================
How do Mass Updates affect Data Cube Exports?
The Mass Update touches all quotes in the database and updates the date value that the
Data Cube checks against. Therefore every quote will be included in the export after a
Mass Update has been run.
On Demand Tab Loading
This feature will only load data on the active tab in configuration. When a user opens any other tab, the tab will load
on demand. Upon tab change, a loading page may appear as the next set of data is being loaded. When enabled, on demand tab loading improves performance and rendering speed on larger configuration flows.
To turn on this feature, navigate through: Admin > Catalog Definition > Product Families > Configuration Flows.
Once on the Configuration Flow page, under the Flow Properties section, you will see Tab Loading Behavior.
"Load all tabs on page load" is the default selection when creating a new configuration flow. Change this to "Load
only active tab on page load" to enable this feature.
Why should I use this feature?
If you have a low connection speed, which often causes configurations to load slowly, or if your users are on mobile browsers, it is recommended that this feature be enabled to improve loading times.

If you encounter the following message when performing an action in BigMachines that triggers an integration with SalesForce: "An error occurred while communicating with the partner site."

You will want to check the integration associated with the action, and take a look at the SOAP Generator XSL file. Let's say we have the following code in the XSL:
<xsl:variable name="MNamount" select="$sub_doc/relatedMaintenance"/>
<Maintenance_Amount__c>
<xsl:value-of select="$MNamount"/>
</Maintenance_Amount__c>
If the value for relatedMaintenance does not exist in the XML document, MNamount will be undefined, and the generated SOAP XML input will be: <Maintenance_Amount__c></Maintenance_Amount__c>
Salesforce does not accept SOAP XML with elements containing no value. Therefore we will receive the error "An error occurred while communicating with the partner site."
To resolve this, we just need to put a check to make sure it is defined:
<xsl:if test="$MNamount">
<Maintenance_Amount__c xmlns="">
<xsl:value-of select="$MNamount"/>
</Maintenance_Amount__c>
</xsl:if>
or
<xsl:choose>
<xsl:when test="$MNamount">
<Maintenance_Amount__c xmlns="">
<xsl:value-of select="$MNamount"/>
</Maintenance_Amount__c>
</xsl:when>
<xsl:otherwise>
<Maintenance_Amount__c xmlns="">
<!-- Default value -->
</Maintenance_Amount__c>
</xsl:otherwise>
</xsl:choose>
You can schedule deployments for the following Event Types:

Deploy - This event ensures that a commerce process is ready for use on the commerce side.
Clone - This event creates a copy of the selected process on the admin side. The cloned process must be deployed
for it to become active on the commerce-side.
Note: this event is visible based on a commerce global setting under Admin Home Page > Global System Settings
and Utilities > Commerce.
Repopulate Column Data - This event makes changes made to the Data Columns of a deployed process available on the commerce side. It applies primarily to Transaction Manager data.
Diff - This event checks for differences between the last deploy and the commerce process and sends an email, if
an email address is specified.
Remove Transactions - This event deletes all transactions from the commerce-side. USE WITH CAUTION.

Navigation Path: Admin Home Page > Commerce and Documents > Process
Definition > Select a Process > Deployment Center

This feature allows the admin user to set up a BML script which can be used to update all existing commerce transactions.
For example, if a sales rep quits their job and all of their existing quotes have to be reassigned to
another sales rep, then the admin user could write a mass update BML script to replace the old sales
rep's name with the new one.
The BML script will run on ALL quotes on the site (which can take a while depending on the site), but conditionals
can be used so certain blocks of code only execute if a quote has certain properties.
This script is defined in the Deployment Center of every process.
The return type for this function is: document_number~variable_name~value[|document_number~variable_name~value]*
The process must be deployed before the mass update function is scheduled to run.
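As a hedged sketch of that return format, a mass update function that reassigns quotes from one sales rep to another might look like the following. The attribute variable name (salesRep_quote), the rep names, and the use of document number 1 for the main document are illustrative assumptions only.

// Sketch only: salesRep_quote is a hypothetical main-document attribute.
// Each returned entry is document_number~variable_name~value, separated by "|".
updateString = "";
if(salesRep_quote == "Old Rep") {
    updateString = "1~salesRep_quote~New Rep";
}
return updateString;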
The mass update does not show up in quote history.
1. Click "Define Function" and edit/debug your mass update script in the window that appears. Save the script and close the window.
2. Use the "Add Event" button to schedule the mass update at the time specified (you might have to deploy commerce first if you get an error box). If the time isn't changed after the page loads, the scheduled event should run immediately. Note that this is server time, so it may not match your local time.
3. Once the mass update is scheduled, it should appear in the events section of the page.
4. Pressing "Refresh" allows you to see the current status of the mass update.
Customer receives the same files from the site's daily extract (DataCube) every day from their SFTP. The files have the same revision date and file size every day.
Cause
The cause of getting the same file every day for the daily extract is that the run time for the DataCube is not set on the site. To navigate to the Reporting screen, go to Admin > Process Definition > Reporting.
The Status field on this screen can show three different keywords: Inactive, Idle and Running.
The process of running the DataCube has three parts:
1) The DataCube starts running at the time specified by the customer.
2) After the DataCube runs, there is a scheduled time for the DataCube to be sent to the FTPS location for pick-up.
3) The customer picks up the file for use.
The server retains the last copy of a completed DataCube. That file stays in the system at the location from which it is pushed to the SFTP location. When the next DataCube completes, the retained copy of the DataCube gets deleted and is replaced by the new copy.
The scheduled time to send the DataCube to the SFTP location will send whatever file is sitting on the server, which is the file retained from the most recently finished DataCube. If no run time is set for the DataCube, the same file will be sent every day.

Commerce order of operations:

Copy Line Items (User Defined)
Order:
1) Initialization (tab) (new line items only)
2) Update Line Item Modify (tab) for sub document
3) Update Line Item Modify (tab) for main document
4) Update Line Item Advanced Validation
5) Recalculate (source line items)
6) Copy Line Items Modify (tab)
7) Copy Line Items Advanced Modification
8) Recalculate (copied line items)
9) Copy Line Items Advanced Validation

Other User Defined Actions (Main Doc)
Order:
1) Update Line Items Advanced Modification
2) Main Doc Auto Update
3) Modify (tab)
4) Advanced Modification
5) Advanced Validation

Other User Defined Actions (Sub Doc)
Order:
1) Modify (tab)
2) Advanced Modification
3) Advanced Validation
How to set up data cube export
If you have a third party reporting or warehouse tool, you may want to export all of the data from BigMachines to
this tool, so you can run reports. You can compare BM data with other data in your 3rd party system. In other
words, Data Cube Export gives you a way to transmit BigMachines data, as a text file, to another location.
Note: This is an add-on service. Please contact your Project Manager or Account Executive to discuss.
HOW IT WORKS
-------------------------------------------------------------------------------
Step 1:
To enable this functionality, you first have to purchase it and contact BigMachines Ops to have them enable it on the corresponding site(s). Once enabled, you'll see a Reporting button at the bottom of the Process Definition page. You'll also need to let Ops know where (to which 3rd party system) to send the files (usually transmitted via SFTP). You will have to work out encryption and decryption keys.
Note: For security reasons, data cube files cannot be hosted on BigMachines servers.
Step 2:
Click the Reporting button at the bottom of the Processes list page.
Step 3:
Click the Create a New Mapping File link to download the mapping XML.
The XML will look something like this:
<mapping>
  <process var_name="TonisPizza_process">
    <document var_name="tonispizza_line">
      <attribute var_name="_model_segment_id" db_name="bm_rt_4180720" />
      <attribute var_name="_model_segment_name" db_name="bm_4180955" />
      <attribute var_name="_model_product_line_id" db_name="bm_4180801" />
      <attribute var_name="_model_product_line_name" db_name="bm_4180751" />
    </document>
  </process>
</mapping>
Step 4:
Edit the mapping file in an XML editor. This will tell you which data is included, what files it's placed into, and the
variable name of the columns.
Step 5:
Make sure to update the variable names for the database ID fields. Once this is done, it should run seamlessly
(should there be no additional changes). See the Example XML section below for an example of the relationship
between commerce attributes and the 3rd party database field ID's.

Step 6:
Save and Upload the completed mapping file back to BigMachines. To do so, click Browse next to the Upload
Mapping File button.
Data Available using BMQL Transaction is:

Date
Integer
Float
String
Currency
Summation
Boolean
Approval Comments, History, RTE, HTML and File Attachment attributes are not available through BMQL Transaction.
The BMQL Transaction function is context sensitive. This implies the following:
The function will automatically pick up the transaction id when the user arrives on the Configuration page through
Commerce. It is not possible to query a separate transaction (one that the user is not in the context of) using this
function.
Translations will be returned for menu values based on the current user's language. For other string type attribute values, the user-entered value will be returned, as translations are not defined for such inputs.
For currency attributes, the exchange rate will be applied to the returned value if the current user's session currency is not the same as the base currency of the site.
We can use either POST or GET to communicate with the external server, using the standard BML functions
urldatabypost and urldatabyget.
GET vs POST
There are a few things to consider when choosing which protocol to use:
HTTP specifies that POST is used to make changes on a server.
When avoidable, GET should not be used to make changes.
GET is the same thing as putting the url directly into your browser, with the parameters appended to the end of
the URL.
GET has a max character limit based on URL length - generally 2048 characters.
POST sends the data behind the scenes.
POST doesn't have a set max character limit, and is good for large data transfers.
Many services treat GET and POST identically.
Your choice will depend on what your needs are, as well as the specifications of the web service that you are
communicating with.
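
As a rough sketch of the two calls (the URL, payload, and error-string arguments below are illustrative assumptions; check the BML function reference on your site for the exact parameter lists):

// GET: parameters are appended to the URL itself.
getResponse = urldatabyget("https://example.com/pricing?partNumber=DD001", "GET failed");

// POST: the payload is sent in the request body, so it can carry much larger data.
payload = "partNumber=DD001&quantity=2";
postResponse = urldatabypost("https://example.com/pricing", payload, "POST failed");

return getResponse + " | " + postResponse;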
Scenario based:
You will be setting the prices for the pizzas based on their size and crust type. According to the menu, you have the
following options and prices:
Small Thin Crust - $9.00
Medium Thin Crust - $12.00
Large Thin Crust - $15.00
Medium Deep Dish - $17.00
Large Deep Dish - $20.00
You will also need to create a user message that explains to the user exactly what they are ordering.
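
A minimal BML sketch of that logic is shown below. It assumes the configurable attributes are named size and crustType and that the script returns the user message as a string; your attribute variable names and the rule type you use may differ.

// Sketch only: size and crustType are assumed attribute variable names.
price = 0.0;
if(size == "Small" AND crustType == "Thin Crust") { price = 9.00; }
elif(size == "Medium" AND crustType == "Thin Crust") { price = 12.00; }
elif(size == "Large" AND crustType == "Thin Crust") { price = 15.00; }
elif(size == "Medium" AND crustType == "Deep Dish") { price = 17.00; }
elif(size == "Large" AND crustType == "Deep Dish") { price = 20.00; }

// Build the user message describing the order.
return "You are ordering a " + size + " " + crustType + " pizza for $" + string(price) + ".";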

getconfigattrvalue(): The function getconfigattrvalue() is used in Commerce to retrieve the value of a configurable attribute in Config. If you are in Commerce and have a configurable attribute in Config that you would like to reference, this is the function to use.
Parameters:

Parameter: [documentNumber]
Data Type: Integer
Description: Represents the quote number. This is optional.

Parameter: configAttrVarName
Data Type: String
Description: Variable name of the configurable attribute from which you are retrieving data.
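
For example (illustrative only: the attribute variable name netPrice_config and the document number 2 are assumptions):

// Retrieve a Config attribute value from Commerce BML.
configNetPrice = getconfigattrvalue(2, "netPrice_config");
// The document number can be omitted, since it is optional:
configNetPrice = getconfigattrvalue("netPrice_config");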

GETARRAYSTR() This function returns the delimited string for array attributes with $,$ as the delimiter. This
function is only used for configurable attributes.
datetostr(date [, dateFormat [, timeZone]])
This function converts a date to a string. If no format is provided, the format of the returned text is MM/dd/yyyy HH:mm:ss. If no time zone is provided, the time zone will default to the server's time zone. This method expects the optional dateFormat to be in the same format as used in strtojavadate().
The syntax for this function is:
String datetostr(Date date [, String dateFormat [, String timeZone]])
The return type for this function is a string.
Example use case:
Date fields in commerce are considered string fields so to return a date to a commerce attribute, you need to
convert it to a string first.
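A small sketch (the date literal and formats are illustrative): build a Date with strtojavadate(), then convert it back to text with datetostr() so it can be returned to a commerce attribute.

// Parse a date string into a Date, then format it back to text.
shipDate = strtojavadate("12/31/2025 00:00:00", "MM/dd/yyyy HH:mm:ss");
shipDateText = datetostr(shipDate, "MM/dd/yyyy");
return shipDateText;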
=================================
Syntax of calling a custom Util function:
variable = util.Function_Name(Attribute_Name);
The attribute is the input for the util function.

===================
SFDC:
7. A custom object contains some records; now my requirement is to create a field in this object with a master-detail relationship. Can we create a master-detail relationship in this case?
No, we cannot directly create a master-detail relationship if the custom object contains existing records.

Following are the steps to create a master-detail relationship when records are available in the custom object.
1. First create the field with a lookup relationship.
2. Then associate the lookup field with a parent record for every record.
3. Next, change the data type of the field from lookup to Master-Detail.
6. How can I create a Many-to-Many relationship?
Lookup and Master-Detail relationships are one-to-many relationships. We can create a many-to-many relationship by using a junction object. A junction object is a custom object with two master-detail relationships.
What are Apex Governor Limits?
Governor limits are runtime limits enforced by the Apex runtime engine. Because Apex runs in a
shared, multitenant environment, the Apex runtime engine strictly enforces a number of limits to
ensure that code does not monopolize shared resources. Types of limits that Apex enforces are
resources like memory, database resources, number of script statements to avoid infinite loops, and
number of records being processed. If code exceeds a limit, the associated governor issues a
runtime exception.
What is an S-Control?
Ans: S-Controls were the predominant salesforce.com widgets, completely based on JavaScript. They are hosted by Salesforce but executed on the client side. S-Controls have now been superseded by Visualforce.
8. What is a Wrapper Class in S.F? A wrapper class is a class whose instances are collections of other
objects.

Creating Progress Bar field using Formula| Without any coding


In this post we will discuss creating a Progress Bar field without any coding, just by using a formula field. Initially it seems like we can't create a dynamic progress bar field using only a formula in Salesforce. There are some approaches where developers store 4 images at 25% intervals, or 10 images at 10% intervals, and display an image with the help of a CASE or IF statement.
We will create a Progress Bar field which will reflect every percentage.
Progress Bar using Formula field in Salesforce

To create the Progress Bar field, we will need two images:
1. Empty image with border (Download Sample)
2. Filled rectangular image (Download Sample)
Upload the above two images as a Static Resource. Once they are uploaded, create a formula field.
In the formula field we append both images, scaling the filled image according to the percentage field, as shown in the formula below.
Formula for Progress Bar Image
IMAGE('/resource/1398170360000/BlueProgressBar', 'Test', 10, (percField__c * 100)) &
IMAGE('/resource/1398170333000/ProgressBorder', 'Test', 10, (100 - percField__c * 100)) &
' ' &
TEXT(percField__c * 100) &
'%'

41. Can we convert the lookup relationship to Master Detail relationship?


Ans:
We can convert a lookup relationship to a master-detail relationship if and only if all existing records have a valid value in the lookup field.
45. What is the difference between database.insert and insert ?
Ans:
insert is the DML statement, which is the same as database.insert. However, database.insert gives more flexibility, such as rollback, default assignment rules, etc. We can achieve the database.insert behavior with insert by using the method setOptions(Database.DMLOptions).
Important Difference:

If we use the DML statement (insert), then in a bulk operation, if an error occurs, execution stops and Apex throws an error, which can be handled in a try/catch block.
If the database method (Database.insert) is used, then if an error occurs, the remaining records will still be inserted/updated, meaning a partial DML operation is done.

107. Consider 90k records present in Salesforce and you have used the count() method of SOQL. What will the output be?
Ans: It will throw an error like "Too many query rows: 50001", as the record limit for SOQL queries is 50,000.
Although count() returns only one row, it still processes each record and thus hits the governor limit.

164. Can we mass delete reports using Apex (Anonymous Apex) ?


Ans:
Salesforce has not exposed any API for Reports, so the best way is:
1. Move all reports that need to be deleted into a new folder.
2. Inform everyone that the reports will be deleted after some time, for example 30 days.
3. Import the reports folder into Eclipse, including all reports to be deleted, and then delete the reports folder in Eclipse. This deletes all the reports at once.

176 : Explain Skinny table.


Ans :
Salesforce creates skinny tables to contain frequently used fields and to avoid joins, and it keeps the
skinny tables in sync with their source tables when the source tables are modified. To enable skinny
tables, contact salesforce.com Customer Support.
For each object table, Salesforce maintains other, separate tables at the database level for standard
and custom fields. This separation ordinarily necessitates a join when a query contains both kinds of
fields. A skinny table contains both kinds of fields and does not include soft-deleted records.

144. How to get the total number of Child records in a Lookup relationship?

Ans: As the Rollup Summary field is only supported in Master-Detail relationships, we cannot use it for Lookup. There are the following two ways (if anyone has any other idea, please comment):
1. Inline Visualforce page.
2. Trigger on the Child Object, which will update a field in the Parent record when a child record is inserted, deleted or undeleted.

127: We have a Time Based Workflow and there is an action scheduled to be executed. Can we delete that workflow?
Ans: If a workflow has any pending time dependent action, then we cannot delete the workflow.
External ID
This is a field that usually references an ID from another (external) system. For instance, if the
customer has an Oracle Financials system that they will be linking with salesforce.com, it may be
easier for them to be able to refer to the Oracle ID of account records from within salesforce. So they
would create an external ID in salesforce.com and they would load the Oracle ID into that field for
each account. They can then refer to that ID field, rather than the salesforce.com id.

Additionally, if you have an external ID field, the field becomes searchable in the sidebar search. You can also use the upsert API call with the external ID to refer to records.
You can have multiple records with the same external ID (though it is not recommended, as it defeats the purpose of the external ID).
External ID is available for Text, Number and Email field types.
External ID is used in upsert operations:
If the external ID is absent or not matched, then an insert happens.
If the external ID is matched once, then the record will be updated.
If the external ID is matched multiple times, then an error occurs.

77. What is the use of floating report header?


Ans: If you have a long tabular report, you can make the column header visible on each page as you scroll by enabling floating report headers.

78. How to enable floating report header?


Ans :
Go to Setup | App Setup | Customize | Report and Dashboard | User Interface Settings.
Click on the checkbox Enable Floating Report Headers.

153. How can you lock a record using SOQL so that it cannot be modified by another user?
Ans: We need the FOR UPDATE clause of SOQL.
42. In how many ways can we invoke an Apex class?
Ans:
1. Visualforce page
2. Trigger
3. Web Services
4. Email Services
27. What is the difference between WhoId and WhatId in the Data Model of Task?
Ans:
WhoId refers to people things, so that would typically be a Lead ID or a Contact ID.
WhatId refers to object-type things; that would typically be an Account ID or an Opportunity ID.
If we don't write the keyword with sharing in a class declaration, then the class runs in system mode, so why was the keyword without sharing introduced in Apex?
Ans:

Let's take an example: classA is declared using with sharing and it calls a classB method. classB is not declared with any keyword, so by default with sharing is applied to that class, because the originating call is made through classA. To avoid this, we have to explicitly define classB with the keyword without sharing.
What are annotations, and why is the @future annotation used?
What are outbound messages?
