How to Handle Large File Uploads in Salesforce Using Apex (Complete Guide)

Issue

When you attempt to upload a large file from your local machine to Salesforce through Apex, the upload fails or breaks midway.

This typically happens because platform limits cause Apex to run out of heap space or time out when processing a large file in one go.

What’s Actually Going On

Apex enforces strict governor limits on how much data a transaction can process.

When you upload a file:

  • The file is read from your system
  • It gets converted (often into Base64)
  • Then it is sent to Apex

All of this data sits in memory for the duration of the transaction. If the file is large, it quickly exceeds the allowed heap limit and the upload fails.

So the file itself isn’t the problem; the way we’re processing it is.

Why Direct Upload via Apex Fails

That’s usually where the trouble starts:

  • The request sends the whole file at once
  • Apex tries to hold the full file in memory
  • Base64 encoding increases the file size
  • Memory limits are exceeded

As a result, a large file upload cannot be processed in a single Apex transaction.

Salesforce File Upload Limits (Synchronous vs Asynchronous)

Understanding limits is critical before choosing your upload approach:

Synchronous Apex Limits

(When running in real-time: triggers, controllers, LWC calls)

  • Heap Size Limit: 6 MB
  • Maximum Request Size: ~6 MB
  • Maximum Response Size: ~6 MB
  • Base64 Impact: Encoding inflates the payload by roughly a third (4 output characters for every 3 input bytes)
  • Execution Time Limit: ~10 seconds of CPU time

Impact: Even a ~4–5 MB file can fail once Base64 conversion and other memory usage are accounted for.
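To see why a seemingly small file blows past the synchronous heap limit, here is a quick back-of-the-envelope calculation in plain JavaScript (the 6 MB constant is the documented synchronous heap limit; the rest is just Base64 arithmetic):

```javascript
// Base64 encodes every 3 input bytes as 4 output characters,
// so the encoded payload is ~33% larger than the raw file.
function base64Size(rawBytes) {
  return Math.ceil(rawBytes / 3) * 4;
}

const MB = 1024 * 1024;
const heapLimitSync = 6 * MB; // synchronous Apex heap limit

for (const sizeMb of [3, 4.5, 5]) {
  const encoded = base64Size(sizeMb * MB);
  console.log(
    `${sizeMb} MB file -> ${(encoded / MB).toFixed(2)} MB as Base64` +
      (encoded >= heapLimitSync ? ' (at or beyond the 6 MB sync heap)' : '')
  );
}
```

Note that the encoded string is only part of the story: the raw bytes, the encoded copy, and any intermediate buffers may all count against the heap at the same time.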

Asynchronous Apex Limits

(Future, Queueable, Batch Apex)

  • Heap Size Limit: 12 MB
  • Execution Time Limit: 60 seconds of CPU time (versus ~10 seconds synchronous)
  • Batch Processing: Able to process large data in chunks
  • Still Limited: Cannot process a very large file in a single transaction

Impact: Better than synchronous, but still not suitable for full large file uploads.

What Actually Works (Real Solutions)

1. Upload File in Chunks (Most Reliable Approach)

Rather than posting the entire file all at once:

  • Split the file into smaller segments on the client side
  • Send each part separately to Apex
  • Append or combine them in Salesforce

This means Apex is never required to hold the full file in memory at any point. This is the most common approach for handling large file uploads.
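The client-side half of this approach can be sketched as follows. The chunk size and the `uploadChunk` callback (which stands in for an `@AuraEnabled` Apex call that creates the file on the first chunk and appends on later ones) are illustrative assumptions, not a Salesforce API:

```javascript
// ~1 MB of Base64 text per request. Keep the chunk size a multiple of 4
// so each chunk is valid Base64 on its own and the server can decode it.
const CHUNK_SIZE = 1024 * 1024;

function splitIntoChunks(base64Data, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let i = 0; i < base64Data.length; i += chunkSize) {
    chunks.push(base64Data.slice(i, i + chunkSize));
  }
  return chunks;
}

async function uploadInChunks(base64Data, uploadChunk) {
  let fileId = null; // first call creates the file; later calls append to it
  const chunks = splitIntoChunks(base64Data);
  for (let i = 0; i < chunks.length; i++) {
    fileId = await uploadChunk(chunks[i], fileId, i === chunks.length - 1);
  }
  return fileId;
}
```

On the server side, each chunk then fits comfortably inside the Apex heap, because no single transaction ever sees more than one chunk.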

2. Use Standard Salesforce File Upload

If you’re building a UI:

  • Use the standard Salesforce File Upload component

It handles large files natively (up to the org’s file size limit, typically 2 GB) and bypasses Apex heap limits entirely. It is fast, easy to use, and recommended whenever it fits your use case.
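In an LWC, this looks roughly like the following (the `accept` list, the `recordId` property, and the handler name are placeholders for your own component):

```html
<!-- Standard file upload component: files go straight into Salesforce
     Files (ContentVersion) without ever passing through your Apex heap. -->
<lightning-file-upload
    label="Attach files"
    name="fileUploader"
    accept=".pdf,.png,.csv"
    record-id={recordId}
    onuploadfinished={handleUploadFinished}
    multiple>
</lightning-file-upload>
```

The `onuploadfinished` handler receives the uploaded files’ document IDs, so any follow-up processing can work with references instead of file contents.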

3. Upload Directly via API (Skip Apex)

For integrations or external systems:

  • Upload files directly using Salesforce APIs
  • Avoid routing the file through Apex

This keeps the file out of the Apex heap entirely, eliminating the memory bottleneck.
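As a minimal sketch of such a direct upload, the REST API’s `ContentVersion` endpoint accepts a JSON body with the Base64-encoded file. The instance URL, access token, and API version below are assumptions that would come from your own auth flow:

```javascript
// Build a request that creates a ContentVersion via the Salesforce REST API,
// bypassing Apex entirely. `instanceUrl` and `accessToken` come from your
// OAuth/JWT flow and are assumptions here.
function buildContentVersionRequest(instanceUrl, accessToken, fileName, base64Data) {
  return {
    url: `${instanceUrl}/services/data/v58.0/sobjects/ContentVersion`,
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${accessToken}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        Title: fileName,
        PathOnClient: fileName, // file name including extension
        VersionData: base64Data, // Base64-encoded file body
      }),
    },
  };
}

// Usage (Node 18+ or a browser):
// const { url, options } = buildContentVersionRequest(instanceUrl, token, 'report.pdf', data);
// const res = await fetch(url, options);
```

For very large files, the REST API also supports a multipart/binary upload that avoids the Base64 overhead altogether.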

4. Store Files Outside Salesforce (If Very Large)

If file size is very large:

  • Upload to an external storage (AWS, Azure, etc.)
  • Store only the file link in Salesforce

This keeps Salesforce storage light and sidesteps the platform’s file-size limits entirely.
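A sketch of the record you might keep in Salesforce under this pattern (the object and field names here are hypothetical custom fields, not standard ones):

```javascript
// Only metadata and a link live in Salesforce; the bytes stay in external
// storage (e.g. S3 or Azure Blob, uploaded via a presigned URL).
// `File_URL__c` and `Size_Bytes__c` are assumed custom fields.
function buildLinkRecord(fileName, externalUrl, sizeBytes) {
  return {
    Name: fileName,
    File_URL__c: externalUrl,
    Size_Bytes__c: sizeBytes,
  };
}

// Typical flow:
// 1. Ask your backend for a presigned upload URL.
// 2. PUT the raw file to that URL (no Base64, no Apex involved).
// 3. Insert a small record in Salesforce pointing at the stored file.
```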

When to Use Which Approach

  • Small files (up to a few MB) → Apex upload works fine
  • Medium files (roughly 5–35 MB) → Use chunked upload
  • Large files (tens to hundreds of MB) → Use the API or the standard upload component
  • Very large files (approaching or beyond 2 GB) → Use external storage

Quick Check

  • Are you sending the full file in one request?
  • Is Apex trying to process the entire file at once?
  • Can you split the file into smaller parts?
  • Can you avoid Apex and use standard upload?

Final Note

This issue is not about incorrect code; it’s about how Apex handles memory.

Apex is not designed to process large files in a single transaction, and forcing large uploads through it will keep causing problems.

The correct solution is to change the approach: split the file into chunks, use the standard upload tools, or bypass Apex entirely where possible.

Once you do, large file uploads become far more stable and easier to maintain.


Need help fixing this in your org?

Our Salesforce experts can debug and resolve this issue quickly.