Understanding Apex Heap Size Limit Error in Salesforce

What This Error Means

You’ll see this error when your Salesforce code tries to hold too much data in memory at once.

Salesforce gives each transaction a limited amount of memory (called the heap) to store data while the code is running. If the data you store (query results, collections, or large JSON responses) grows too big, Salesforce stops execution and throws this error.

Common Error Message

System.LimitException: Apex heap size too large

This message indicates that the transaction exceeded the memory limit Salesforce allows for that execution.

Where This Error Happens

This error most often appears when:

  • A trigger or class loads too many records at once
  • JSON or API responses are deserialized into large objects
  • Large collections (Lists, Maps) are stored in memory
  • Batch or scheduled jobs process big datasets
  • Integration logic stores large blobs or files

Why This Error Occurs

Salesforce sets limits on how much memory Apex code can use:
  • 6 MB for synchronous transactions
  • 12 MB for asynchronous transactions (Batch, Queueable, Future)

If your code holds more than this amount of data in memory at any point, Salesforce stops it to protect system performance.

Common causes include:

  • Too many records queried without limits
  • SOQL inside loops storing large collections
  • Deserializing entire JSON responses unnecessarily
  • Processing huge file or blob data without streaming
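You can watch how close a transaction is to the limit with the built-in `Limits` methods, which report heap usage in bytes:

```apex
// Log current heap consumption against the transaction's heap limit.
// Useful when narrowing down which step of the code inflates the heap.
System.debug('Heap used: ' + Limits.getHeapSize()
    + ' of ' + Limits.getLimitHeapSize() + ' bytes');
```

Sprinkling this debug line before and after suspect operations quickly shows which query, collection, or deserialization step is responsible.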

How to Fix the Apex Heap Size Limit Error

Limit Data Returned by SOQL

Only query the fields you actually need. SOQL has no SELECT *, but querying every field (for example with FIELDS(ALL), or a long explicit field list) has the same effect: each extra field makes every record larger in memory.
Instead of:
SELECT FIELDS(ALL) FROM Account LIMIT 200
Use:
SELECT Id, Name FROM Account
Fewer fields per record means less heap consumed per query.
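When you must iterate over many records, a SOQL for loop helps further: Salesforce retrieves the records in chunks of 200 instead of materializing the whole result list at once. A minimal sketch:

```apex
// Heap-heavy: the entire result set is held in one List.
List<Account> allAccounts = [SELECT Id, Name FROM Account];

// Heap-friendly: a SOQL for loop fetches and processes records
// in batches of 200, so only one chunk is in memory at a time.
for (Account acc : [SELECT Id, Name FROM Account]) {
    System.debug(acc.Name);
}
```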

Avoid Storing Unnecessary Records

If you only need record IDs or specific fields:

  • Store only those fields in a Set or small list
  • Don’t keep full records in memory when not needed
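For example, if later logic only needs the record IDs, collect them into a Set instead of keeping full SObjects around (the WHERE clause below is illustrative):

```apex
// Keep only the Ids in memory, not the full Account records.
Set<Id> accountIds = new Set<Id>();
for (Account acc : [SELECT Id FROM Account WHERE Industry = 'Banking']) {
    accountIds.add(acc.Id);
}
// accountIds is far smaller on the heap than a List<Account>
// carrying every queried field for every record.
```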

Move Heavy Logic to Asynchronous Processing

When working with large sets of records:

  • Use Batch Apex
  • Use Queueable Apex
These split processing into smaller transactions with larger limits.
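As a sketch of the Batch Apex approach (the class name and field update are hypothetical), each execute() call runs in its own transaction with a fresh 12 MB heap:

```apex
// Minimal Batch Apex skeleton: the query locator streams records,
// and each execute() chunk gets its own transaction and heap.
public class AccountCleanupBatch implements Database.Batchable<SObject> {

    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }

    public void execute(Database.BatchableContext bc, List<Account> scope) {
        for (Account acc : scope) {
            acc.Name = acc.Name.trim(); // illustrative per-record work
        }
        update scope;
    }

    public void finish(Database.BatchableContext bc) {
        // optional post-processing or notification
    }
}
```

Start the job with a chunk size suited to your heap usage, e.g. Database.executeBatch(new AccountCleanupBatch(), 200);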

Clear Variables After Use

After processing large lists or maps, set them to null so the memory they hold can be reclaimed before the rest of the transaction runs.
Example:
largeList = null;
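In context, that looks like the sketch below (processAccounts is a hypothetical helper standing in for your own logic):

```apex
// Load, use, then release a large collection before doing more heavy work.
List<Account> largeList = [SELECT Id, Name FROM Account LIMIT 50000];
processAccounts(largeList);   // hypothetical helper doing the real work
largeList = null;             // drop the reference so the heap can be reclaimed
// ...subsequent heavy processing now starts with a smaller heap footprint
```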

Optimize JSON Handling

If your integration returns large JSON:

  • Parse only the keys you need
  • Avoid deserializing entire payloads unnecessarily
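Apex's streaming JSONParser lets you pull out just the values you need without deserializing the whole payload into objects. A minimal sketch, assuming the response contains a top-level 'status' key:

```apex
// Stream through a large JSON response instead of calling
// JSON.deserialize on the entire payload.
JSONParser parser = JSON.createParser(responseBody);
String status;
while (parser.nextToken() != null) {
    if (parser.getCurrentToken() == JSONToken.FIELD_NAME
            && parser.getText() == 'status') {
        parser.nextToken();          // advance to the field's value
        status = parser.getText();   // keep only this one value on the heap
    }
}
```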

Best Practices to Prevent This Error

  • Always bulkify Apex (handle many records at once)
  • Don’t run SOQL or DML inside loops
  • Only query the fields you need
  • Use asynchronous jobs for heavy work
  • Test with actual large data volumes

Quick Fix Checklist

  • Did you limit fields in your SOQL query?
  • Are you storing only what’s needed in memory?
  • Have you moved heavy processing to Batch or Queueable?
  • Are large variables cleared after use?
  • Have you tested with large record counts?


Need help fixing this in your org?

Our Salesforce experts can debug and resolve this issue quickly.