How to Resolve the “Too many query rows” Error?
Correcting this error is a matter of adjusting your data-fetching strategy: either fetch fewer records, or process large record volumes in a way that respects the platform's limits.
Solution 1: Use Selective WHERE Clauses
This is the most important fix. Always filter your SOQL queries so they retrieve only the records you actually need, following the guidance in the Salesforce help article on making SOQL queries selective. Filter on Indexed Fields: Use indexed fields in the WHERE clause (for example, Id, Name, OwnerId, CreatedDate, external IDs, and indexed custom fields). This lets the database locate the relevant records much faster and more efficiently.
Be Specific: Make your conditions as specific as possible to narrow down the result set.
You can also ask Salesforce Support to index a custom field to improve SOQL query performance.
Inefficient Example:
List<Account> accounts = [SELECT Id FROM Account WHERE Name LIKE 'A%']; // Still potentially large
Optimized Example:
// Filter by a specific record type AND an indexed field (OwnerId)
Id specificRecordTypeId = '012.....';
List<Account> accounts = [
    SELECT Id
    FROM Account
    WHERE RecordTypeId = :specificRecordTypeId
    AND OwnerId = :UserInfo.getUserId()
]; // Much more selective
Solution 2: Use LIMIT Clauses
If you only need a subset of records, or want to explicitly cap the number of records retrieved, use the LIMIT clause. This is useful for displaying preview data or when you need only a sample of records rather than all of them.
Example:
// Retrieve only the 1000 most recent Tasks, not all of them.
List<Task> recentTasks = [
    SELECT Id, Subject
    FROM Task
    ORDER BY CreatedDate DESC
    LIMIT 1000
];
Solution 3: Process Large Datasets Asynchronously with Batch Apex
For scenarios that genuinely need to process more than 50,000 records (such as nightly data updates or mass recalculations), synchronous Apex is not the right tool. Use Batch Apex instead.
Batch Apex is built to handle large data volumes by dividing the job into manageable chunks (batches); each batch runs with its own set of governor limits, including the 50,000-row query limit. The Database.QueryLocator returned from the start method can query up to 50 million records, so you won't hit the limit at that phase.
Example (Batch Apex Structure):
global class ProcessLargeAccountDataBatch implements Database.Batchable<sObject> {
    global Database.QueryLocator start(Database.BatchableContext BC) {
        // This QueryLocator can handle millions of records without hitting the 50k limit here.
        return Database.getQueryLocator('SELECT Id, Name, AnnualRevenue FROM Account WHERE Needs_Processing__c = true');
    }
    global void execute(Database.BatchableContext BC, List<Account> scope) {
        // 'scope' contains a batch of records (default 200).
        // Any queries INSIDE execute must respect the 50k limit for THIS batch transaction.
        for (Account acc : scope) {
            // Process each account...
        }
        // update scope; // DML on the batch
    }
    global void finish(Database.BatchableContext BC) {
        // Optional: Send completion email, etc.
    }
}
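Launching a batch class like the one above is a one-liner. Here is a minimal sketch (the scope size of 200 and the AsyncApexJob check are optional; the class name comes from the example above):

```
// Launch the batch; the second argument caps each execute() call at 200 records.
Id jobId = Database.executeBatch(new ProcessLargeAccountDataBatch(), 200);

// Optionally monitor progress via the AsyncApexJob record.
AsyncApexJob job = [
    SELECT Status, JobItemsProcessed, TotalJobItems
    FROM AsyncApexJob
    WHERE Id = :jobId
];
System.debug(job.Status + ': ' + job.JobItemsProcessed + '/' + job.TotalJobItems);
```

A smaller scope size reduces the work done per transaction, which helps if the execute logic itself runs additional queries or DML against other limits.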
Solution 4: Optimize Overall Logic
Sometimes, you can restructure your code to avoid querying large numbers of records altogether. Can you achieve the goal by querying fewer, more targeted records? Can you use Aggregate SOQL queries (COUNT(), SUM()) to get summarized data instead of individual rows?
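As a sketch of the aggregate approach, the query below returns one row per Industry group instead of one row per Account (the field choices and date filter are illustrative):

```
// One row per Industry group instead of one row per Account.
// Caveat: in Apex, records processed by aggregate functions still
// count toward the query row limit, so keep the WHERE clause selective.
List<AggregateResult> results = [
    SELECT Industry, COUNT(Id) total, SUM(AnnualRevenue) revenue
    FROM Account
    WHERE CreatedDate = LAST_N_DAYS:30
    GROUP BY Industry
];
for (AggregateResult ar : results) {
    System.debug(ar.get('Industry') + ': ' + ar.get('total'));
}
```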