Unable to Upload File More Than 12 MB in Salesforce Apex? Causes & Fixes

Issue

If a file upload larger than 12 MB fails when you're using Apex, it's not a bug in your code — wait, no em-dash — it's not a bug in your code; it's due to Salesforce platform limits.

Apex has strict governor limits on how much data it can consume in a single transaction. When a file exceeds that limit, the upload will be rejected.

What’s Actually Happening

When you upload a file with Apex, the file contents are first loaded into memory (the Apex heap).

Salesforce allows:

  • Up to 12 MB heap size in asynchronous Apex
  • Around 6 MB in synchronous Apex

If the file is large (especially after encoding), it quickly exceeds this limit. At that point Salesforce halts the transaction with a `System.LimitException: Apex heap size too large` error and the upload fails.
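A rough back-of-the-envelope check makes the limits above concrete. This is an illustrative sketch, not a Salesforce API: the 6 MB / 12 MB figures come from the heap limits described above, and the ~33% factor anticipates the Base64 overhead discussed below.

```python
# Illustrative heap-budget check: will a file's Base64-encoded size fit
# within the Apex heap limits described above? (Not a Salesforce API.)
SYNC_HEAP_LIMIT_MB = 6    # synchronous Apex heap limit
ASYNC_HEAP_LIMIT_MB = 12  # asynchronous Apex heap limit

def fits_in_apex_heap(file_size_mb: float, asynchronous: bool = False) -> bool:
    """Return True if the Base64-encoded file is likely to fit in the heap."""
    encoded_mb = file_size_mb * 4 / 3  # Base64 adds ~33% overhead
    limit = ASYNC_HEAP_LIMIT_MB if asynchronous else SYNC_HEAP_LIMIT_MB
    return encoded_mb < limit

print(fits_in_apex_heap(4))                      # True:  ~5.3 MB encoded
print(fits_in_apex_heap(9))                      # False: ~12 MB encoded, sync limit is 6
print(fits_in_apex_heap(9, asynchronous=True))   # False: right at the 12 MB limit
```

In practice the rest of your transaction (queries, variables, serialization) also consumes heap, so the real budget is even tighter than this estimate suggests.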

When You’ll Notice This

This problem usually shows up when:

  • Uploading files through Apex controllers (Visualforce or Lightning)
  • Handling file uploads in custom components
  • Sending files through integrations or APIs via Apex
  • Working with Base64 file data in Apex

In these scenarios, memory usage during processing is higher than the raw file size, so even files smaller than 12 MB can hit the limit.

Why It Fails Even Before 12 MB

One important thing to understand: when files are handled in Apex, they are often converted to Base64, which increases their size by around 30–35%.

So:

  • A ~9 MB file can behave like a 12 MB+ file in memory
  • This is why uploads may fail earlier than expected
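You can verify the Base64 inflation directly. This Python snippet simulates a 9 MB file and measures the encoded size:

```python
import base64

# Measure Base64 overhead: encode a simulated 9 MB file and compare sizes.
raw = b"x" * (9 * 1024 * 1024)       # 9 MB of raw bytes
encoded = base64.b64encode(raw)

print(len(raw) / 1024 / 1024)        # 9.0  (MB, raw)
print(len(encoded) / 1024 / 1024)    # 12.0 (MB, encoded — exactly 4/3 larger)
```

A 9 MB file becomes 12 MB of Base64 text in memory, which is why it already sits at the asynchronous heap limit and far beyond the synchronous one.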

How to Fix This

1. Avoid Uploading Large Files Through Apex

Apex is not built for large file uploads directly. When the file is large, do not upload it via Apex at all.

2. Use Standard Salesforce File Upload

If you’re working on UI:

  • Use the standard Salesforce file upload component
  • It supports large files and handles everything without hitting Apex limits

This is the easiest and most reliable approach.

3. Upload Files in Parts (Chunking)

Rather than submitting the entire file at once:

  • Break the file into smaller chunks
  • Upload each part separately
  • Combine them after upload

This keeps each transaction's memory use under the heap limit.
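The chunking idea can be sketched in a few lines. This is a language-agnostic illustration in Python (the chunk size is an arbitrary assumption you would tune to your own limits), not Salesforce-specific code:

```python
# Sketch of chunked upload: split a payload into fixed-size parts that each
# stay under the heap budget, then reassemble losslessly after upload.
CHUNK_SIZE = 1 * 1024 * 1024  # 1 MB per part (assumption — tune as needed)

def split_into_chunks(data: bytes, chunk_size: int = CHUNK_SIZE) -> list[bytes]:
    """Slice the payload into chunk_size pieces (last piece may be shorter)."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def reassemble(chunks: list[bytes]) -> bytes:
    """Concatenate the uploaded parts back into the original payload."""
    return b"".join(chunks)

payload = b"A" * (5 * 1024 * 1024 + 123)   # ~5 MB file
parts = split_into_chunks(payload)
print(len(parts))                          # 6 parts (5 full + 1 partial)
assert reassemble(parts) == payload        # lossless round trip
```

Each part is sent in its own request, so no single transaction ever holds the whole file in the heap.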

4. Use Direct API Upload

If you’re working with integrations:

  • Upload the file directly using Salesforce REST API
  • Skip passing the file through Apex

This helps bypass heap limitations.
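As a sketch, here is how a client could build the request for inserting a `ContentVersion` record through the REST API. The instance URL is a placeholder, and the snippet only constructs the request (it does not send it), so it stays self-contained:

```python
import base64
import json

# Sketch: build (but do not send) a REST request that creates a ContentVersion
# record directly, so the file never passes through Apex.
INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
ENDPOINT = "/services/data/v59.0/sobjects/ContentVersion"

def build_upload_request(title: str, filename: str, file_bytes: bytes) -> dict:
    """Return the target URL and JSON body for a ContentVersion insert."""
    body = {
        "Title": title,
        "PathOnClient": filename,
        "VersionData": base64.b64encode(file_bytes).decode("ascii"),
    }
    return {"url": INSTANCE_URL + ENDPOINT, "json": json.dumps(body)}

req = build_upload_request("Report", "report.pdf", b"%PDF-1.4 sample bytes")
print(req["url"])
```

Note that the JSON form still Base64-encodes the file; for very large files the REST API also accepts a multipart request with the raw binary, which avoids the encoding overhead entirely.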

5. Use External Storage for Large Files

For very large files:

  • Upload them to external storage (e.g., AWS S3, Google Drive)
  • Store only the link or reference in Salesforce

Quick Check

  • Are you uploading the entire file in one request?
  • Is the file being processed inside Apex?
  • Is Base64 conversion increasing the size?
  • Can you use standard upload instead of custom logic?

Final Note

This issue is not about incorrect code; it's about how Apex handles memory.

Salesforce simply doesn't allow large files to be processed fully inside Apex. The right solution is to change the approach: either avoid Apex for uploads or handle files in smaller parts.


Need help fixing this in your org?

Our Salesforce experts can debug and resolve this issue quickly.