10 Best Practices to Stay Within Governor Limits in Salesforce

By Chinmaya

Introduction

Salesforce operates on a multi-tenant architecture, where resources like memory, database, and processing power are shared across multiple organizations.
To ensure fairness and avoid system overload, Salesforce enforces governor limits.

These limits are essential for platform stability, but they can pose challenges for developers working with large data sets or complex logic.

In this blog, we will explore best practices that keep your code within these governor limits and help you build efficient, scalable Salesforce applications.

If you’re unfamiliar with Governor Limits in Salesforce, I recommend checking out the post linked below before proceeding with this one.

Governor Limits in Salesforce : Writtee.com

What Are Governor Limits?

Governor limits are runtime restrictions enforced by Salesforce to ensure efficient use of shared resources. These limits apply to areas like:

    • Data Storage: Limits on the amount of data you can store.

    • CPU Time: Limits on the amount of processing time your code can use.

    • SOQL Queries: Limits on the number of queries you can execute.

    • DML Operations: Limits on the number of records you can insert, update, or delete.

    • Heap Size: Limits on the amount of memory your code can use.

Exceeding these limits results in errors like “Too many SOQL queries”, “CPU time limit exceeded”, or “Too many DML statements”, which can break your application.

Types of Governor Limits:

There are 3 types of Governor Limits in Salesforce:

1. Execution Limits:

These limits control how your code executes during a transaction. They include restrictions on:

    • The number of SOQL queries you can perform.

    • The number of records you can process.

    • The amount of CPU time your code can consume.

Example: You can execute at most 100 SOQL queries and 150 DML statements in a single synchronous transaction.
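
As a quick way to see where a transaction stands against these caps, the System.Limits class exposes both the current consumption and the per-transaction limit for each resource. A minimal sketch you could run as anonymous Apex:

// Compare current usage against the per-transaction caps
System.debug('SOQL queries used: ' + Limits.getQueries() + ' of ' + Limits.getLimitQueries());
System.debug('DML statements used: ' + Limits.getDmlStatements() + ' of ' + Limits.getLimitDmlStatements());
System.debug('CPU time used (ms): ' + Limits.getCpuTime() + ' of ' + Limits.getLimitCpuTime());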

2. Resource Limits:

These limits apply to the usage of system resources, such as:

    • Data storage (e.g., the amount of data you can store in objects).

    • File storage (e.g., the space available for files and documents).

    • The number of emails you can send.

Example: You can send a maximum of 5,000 single emails or 1,000 mass emails per day.

3. Concurrent Limits:

These limits govern the number of simultaneous operations that can occur, such as:

    • The number of concurrent transactions.

    • The number of API requests.

    • The number of batch jobs that can run at the same time.

Example: At most 5 batch jobs can be queued or active concurrently in an org.

Why Do Governor Limits Matter?

    1. Shared Resources:
      Salesforce is a multi-tenant platform, meaning multiple organizations share the same infrastructure. Governor limits ensure fair usage.

    2. Performance:
      Efficient code and processes lead to faster, more reliable applications.

    3. Scalability:
      Adhering to limits ensures your application can scale as your business grows.

    4. Cost Efficiency:
      Staying within limits reduces the need for additional licenses or resources.

Best Practices to Stay Within Governor Limits

Now, let's look at some best practices developers should follow to stay within Governor Limits in Salesforce.

1. Bulkify your code

One of the most important principles in Salesforce development is bulkifying your code.
Bulkification ensures that your code can handle multiple records at once rather than processing a single record at a time.

This helps you stay within the limits on SOQL queries, DML statements and other resources.

    • Handle Multiple Records: Always write code that can process multiple records at once, as triggers and batch jobs often operate on batches of records.

    • Avoid Hardcoding IDs: Hardcoded record IDs differ between environments and can cause failures when processing multiple records.

Example of Non-Bulkified code

				
for (Account acc : Trigger.new) {
    Account existingAcc = [SELECT Id FROM Account WHERE Name = :acc.Name];
    update existingAcc;
}


Example of Bulkified code

				
List<String> accNames = new List<String>();
for (Account acc : Trigger.new) {
    accNames.add(acc.Name);
}
List<Account> existingAccs = [SELECT Id, Name FROM Account WHERE Name IN :accNames];

for (Account existingAcc : existingAccs) {
    // Process bulk records
}


Why Bulkification Matters?

Processing records in bulk reduces the number of SOQL queries and DML operations, which helps you stay within Salesforce governor limits.

2. Avoid SOQL and DML Statements Inside 'FOR' Loops

A common mistake developers make is writing SOQL queries and performing DML operations inside a FOR loop, which can cause your transaction to exceed limits and fail.

Example of Inefficient Code

				
for (Account acc : accountsList) {
    Account existingAcc = [SELECT Id FROM Account WHERE Name = :acc.Name];
    update existingAcc;
}


As we know, Salesforce enforces a Governor Limit of 100 SOQL queries per synchronous transaction. This means a maximum of 100 SOQL queries can be executed within a single transaction.

However, if you look at the example above, there is a SOQL query being executed for every record inside the loop. This approach can lead to exceeding the 100 SOQL queries limit per transaction.

Example of Efficient Code

				
List<String> accNames = new List<String>();
for (Account acc : accountsList) {
    accNames.add(acc.Name);
}

// Single SOQL query for all matching accounts
List<Account> existingAccs = [SELECT Id, Name FROM Account WHERE Name IN :accNames];

List<Account> accountsToUpdate = new List<Account>();
for (Account existingAcc : existingAccs) {
    // Apply field changes here, then add to the list of accounts to update
    accountsToUpdate.add(existingAcc);
}

// Single DML statement for all records
update accountsToUpdate;


In the above example, the SOQL query runs only once, retrieving all Accounts whose names are in the ‘accNames’ list, and a single DML statement updates them.

Why this matters?

By moving SOQL queries and DML operations outside of the FOR loop, we reduce the number of statements executed, which helps us stay within Governor Limits.

3. Use SOQL 'For Loops' for Large Datasets

    • Avoid Nested Loops:
      Nested loops can quickly consume CPU time and heap size.

    • Use Maps for Lookups:
      Instead of looping through lists multiple times, use maps to store and retrieve data efficiently.

				
// Bad: Nested loops
for (Account acc : accounts) {
    for (Contact con : contacts) {
        if (con.AccountId == acc.Id) {
            // Process contact
        }
    }
}

// Good: Using a map
Map<Id, List<Contact>> accountToContactsMap = new Map<Id, List<Contact>>();
for (Contact con : contacts) {
    if (!accountToContactsMap.containsKey(con.AccountId)) {
        accountToContactsMap.put(con.AccountId, new List<Contact>());
    }
    accountToContactsMap.get(con.AccountId).add(con);
}


Why this matters?

Using maps avoids nested-loop CPU consumption, and SOQL for loops fetch records in manageable chunks, helping you stay within the 6 MB heap size limit for synchronous transactions.
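
A SOQL for loop iterates directly over a query in the for statement, so the platform returns records in batches of 200 through an internal query cursor instead of loading the whole result set into the heap at once. A minimal sketch (the filter is just an example):

// Records are fetched in chunks of 200, keeping heap usage low for large result sets
for (List<Account> accChunk : [SELECT Id, Name FROM Account WHERE Industry = 'Technology']) {
    for (Account acc : accChunk) {
        // Process each account in the current chunk
    }
}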

4. Use Efficient SOQL Queries

    • Limit the Number of Queries:
      Salesforce allows 100 SOQL queries per transaction. Use techniques like bulkification to minimize the number of queries.

    • Select Only Required Fields:
      Fetch only the fields you need instead of querying every field on the object.

    • Use Selective Queries:
      Add filters (e.g., WHERE clauses) to make queries more efficient and avoid full table scans.

    • Leverage Relationships:
      Use parent-child relationships (e.g., Parent__r.Field__c) to fetch related data in a single query (see the relationship sketch after the example below).

				
// Bad: Querying every field (note: SOQL has no literal SELECT *; shown only for illustration)
List<Account> accounts = [SELECT * FROM Account];

// Good: Querying only the required fields with a selective filter
List<Account> accounts = [SELECT Id, Name FROM Account WHERE Industry = 'Technology'];
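
To illustrate the relationship point, here is a minimal sketch using the standard Account–Contact relationship (the Industry filter is just an example):

// Parent-to-child subquery: Accounts and their Contacts in a single SOQL query
List<Account> accountsWithContacts = [
    SELECT Id, Name, (SELECT Id, LastName FROM Contacts)
    FROM Account
    WHERE Industry = 'Technology'
];

// Child-to-parent: reference parent fields directly from the child record
List<Contact> contactsWithAccountName = [SELECT Id, LastName, Account.Name FROM Contact];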
				
			

5. Use Asynchronous Processing

Asynchronous Apex (such as @future methods, Batch Apex, and Queueable Apex) allows developers to perform operations outside the context of the current transaction.
Asynchronous transactions run with higher governor limits for resources such as CPU time and heap size.

    • Use Batch Apex:
      For large data operations, use Batch Apex to process records in chunks.

    • Queueable Apex:
      Use Queueable Apex for background processing to avoid hitting synchronous limits.

    • Scheduled Jobs:
      Schedule jobs to run during off-peak hours.

Example of using Asynchronous Apex - @future method

				
@future
public static void updateAccounts(Set<Id> accountIds) {
    // @future methods accept only primitives or collections of primitives,
    // so pass record Ids and query the records inside the method
    List<Account> accountsToUpdate = [SELECT Id FROM Account WHERE Id IN :accountIds];
    // Apply field changes here
    update accountsToUpdate;
}
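
Assuming the method above lives in a class called AccountService (the class name is only for illustration), it can be called from a trigger or any other synchronous context:

// The update runs later in its own transaction, with its own governor limits
AccountService.updateAccounts(Trigger.newMap.keySet());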
				
			

Example of Batch Apex - for processing large sets of data

				
public class AccountBatch implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }
    public void execute(Database.BatchableContext bc, List<Account> scope) {
        // Process each batch of records
        for (Account acc : scope) {
            // Logic here
        }
    }
    public void finish(Database.BatchableContext bc) {
        // Final logic here
    }
}
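
The bullet list above also mentions Queueable Apex. A minimal sketch of a Queueable job (class and variable names are illustrative):

public class AccountProcessingJob implements Queueable {
    private Set<Id> accountIds;

    public AccountProcessingJob(Set<Id> accountIds) {
        this.accountIds = accountIds;
    }

    public void execute(QueueableContext context) {
        List<Account> accounts = [SELECT Id, Name FROM Account WHERE Id IN :accountIds];
        // Heavy processing here, under the higher asynchronous limits
        update accounts;
    }
}

// Enqueued from synchronous code:
// System.enqueueJob(new AccountProcessingJob(accountIds));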
    
   
				
			

Why this matters?

Moving heavy operations to asynchronous processing keeps them out of the synchronous transaction, where limits are tighter, because asynchronous Apex has higher limits for CPU time, heap size, and other resources.

For example, in an asynchronous transaction the CPU time limit increases to 60,000 milliseconds, compared to 10,000 milliseconds in synchronous Apex.

If you’re unfamiliar with Asynchronous Apex in Salesforce, such as Future Method, Batch Apex or Queueable Apex, I recommend checking out the post linked below before proceeding with this one.

6. Use Collections for DML Operations

Developers should always use collections such as Lists or Sets to handle multiple records at once when performing DML operations (such as insert, update, or delete) or querying records in Salesforce.

Example of incorrect code:

The code below inserts 4 accounts in a single transaction using 4 separate DML statements.

This is not the correct way of performing DML operations: if there were a requirement to insert 200 Accounts, there would be 200 DML statements, and the transaction would fail because Salesforce allows only 150 DML statements per transaction.

				
insert account1;
insert account2;
insert account3;
insert account4;


Corrected Code:

Instead of writing the above code, we can insert the same 4 Accounts with the help of a single DML statement.

				
List<Account> accounts = new List<Account>{account1, account2, account3, account4};
insert accounts;


Why this matters?

By performing DML operations on multiple records at once, you reduce the number of DML statements and stay within the governor limit of 150 DML statements per transaction.

7. Minimize DML Operations

    • Batch DML Statements: Salesforce allows 150 DML statements per transaction. Group records into lists and perform DML operations in bulk.

    • Use Upsert Instead of Insert/Update: The upsert operation can handle both inserts and updates, reducing the need for separate DML statements.

Example:

				
// Bad: Multiple DML statements
insert newAccounts;
update updatedAccounts;

// Good: Single DML statement (upsert inserts records without an Id and updates records that already have one)
upsert accountsToUpsert;
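
Upsert can also match on an external ID field instead of the record Id. A minimal sketch, assuming a hypothetical External_Id__c custom field on Account:

// Matches existing Accounts on External_Id__c (hypothetical field); unmatched records are inserted
upsert accountsToUpsert Account.Fields.External_Id__c;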
				
			

8. Use Collections to reduce SOQL Queries

When working with multiple records, developers should always bind collections (like Lists or Sets) in their SOQL queries to retrieve data in bulk.
This keeps the number of SOQL queries in a transaction to a minimum.

For Example:

				
Set<Id> accountIds = new Set<Id>();
for (Opportunity opp : Trigger.new) {
    accountIds.add(opp.AccountId);
}
List<Account> accounts = [SELECT Id, Name FROM Account WHERE Id IN :accountIds];


Why this matters?

By retrieving multiple records at once, a developer can reduce the number of queries and stay within the governor limit of 100 SOQL queries per transaction.

9. Limit the use of recursive triggers

Triggers that update records and cause other triggers (or the same trigger) to fire again can create recursion, which may result in exceeding Governor Limits.
To avoid this, implement logic to stop recursion in the trigger, typically with a static flag (a sketch of the flag class follows the example below).

				
// recursionFlag is a static Boolean on a helper class (see the sketch below)
if (!RecursionHandler.recursionFlag) {
    RecursionHandler.recursionFlag = true;
    // Your logic here
}
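
For completeness, a minimal sketch of the helper class holding the flag (the class name RecursionHandler is only for illustration):

public class RecursionHandler {
    // Static variables live only for the duration of one transaction,
    // so the flag is automatically false again in the next transaction
    public static Boolean recursionFlag = false;
}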
				
			

Why this matters?

By preventing recursion, developers avoid repeatedly consuming SOQL queries, DML statements, and CPU time in the same transaction.

If you’re not familiar with recursion in Salesforce Triggers, I highly recommend exploring the post linked below. Understanding this concept is crucial, as it is a commonly asked interview question.

How to Stop Recursion in Salesforce Triggers : Writtee.com

10. Monitor and Optimize CPU Time

    • Avoid Complex Calculations:
      Simplify logic to reduce CPU usage.

    • Use Limits Methods:
      Use methods like Limits.getCpuTime() to monitor CPU usage in real-time.

Example:

				
if (Limits.getCpuTime() > 8000) { // 80% of the 10-second CPU limit
    // Optimize or exit the process
}


Conclusion

Salesforce governor limits ensure that the platform remains scalable and efficient for all users, but they can be restrictive if not handled properly.

By following best practices like bulkifying your code, using asynchronous processing, and writing efficient SOQL queries, you can stay well within governor limits and build more scalable, efficient applications.

Always test your code with larger data volumes to ensure that it scales properly under governor limits, and use tools like debug logs (which report cumulative limit usage) and the System.Limits methods to track resource consumption.

Chinmaya is working as a Senior Consultant with a deep expertise in Salesforce. Holding multiple Salesforce certifications, he is dedicated to designing and implementing cutting-edge CRM solutions. As the creator of Writtee.com, Chinmaya shares his knowledge on educational and technological topics, helping others excel in Salesforce and related domains.