Unable to Upload File More Than 6 MB in Salesforce Apex? Causes & Fixes

Issue

Being unable to upload a file larger than 6 MB in Salesforce is a common issue developers face when working with Apex. If an upload fails once the file size exceeds 6 MB, this isn’t a bug; it’s a platform limitation.

Salesforce limits the amount of data that Apex can process in a single transaction. When a file is too large, it exceeds these limits and the upload simply fails.

What This Actually Means

Instead of one static message, this error can appear in multiple forms, including:

  • System.LimitException: Apex heap size too large: <size>
  • Upload requests that fail or time out with no clear message
  • Errors while building or serializing large Base64 strings

Sometimes the transaction gets partway through before hitting the limit, which makes the failure harder to diagnose.

When This Usually Happens

Apex executes with a fixed heap size limit (the memory available to a single transaction). When you attempt to upload a file through Apex, the entire file is loaded into that memory.

If the file size is too large:

  • It exceeds the allowed heap size
  • Salesforce stops execution
  • The upload fails

This is why files larger than ~6 MB cannot be handled directly using Apex.

When This Problem Occurs

You’ll typically face this issue when:
  • Uploading files through Apex controllers
  • Handling file uploads in Lightning or Visualforce using Apex
  • Sending large files through integrations
  • Converting files into Base64 and processing them in Apex

Why This Happens

The main reasons are Salesforce governor limits:
  • Synchronous Apex has a 6 MB heap size limit
  • Base64 encoding inflates file data by roughly 33%, increasing memory usage even more
So even a file well under 6 MB can exceed the limit once it is encoded and processed.

How to Fix This Problem

1. Use Chunk Upload (Recommended)

Rather than uploading the complete file in a single shot:

  • Split the file into several smaller chunks
  • Upload each chunk separately
  • Combine them after upload

This avoids loading the full file into memory.

2. Use Salesforce File Upload Component

If you’re working on UI:

  • Use the standard lightning-file-upload component
  • It uploads files directly to Salesforce Files (up to the org’s file size limit, 2 GB by default), so the data never passes through Apex heap

This is the easiest and most reliable solution.
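A minimal usage sketch of the standard component inside an LWC template; `recordId` and the handler name are placeholders for your own component’s properties:

```html
<!-- Files chosen here are uploaded by the platform itself,
     so no Apex heap is consumed for the file data. -->
<lightning-file-upload
    label="Attach file"
    record-id={recordId}
    onuploadfinished={handleUploadFinished}
    multiple>
</lightning-file-upload>
```

The `onuploadfinished` handler receives the uploaded files’ document IDs, which you can then pass to Apex as lightweight references.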

3. Upload Directly via REST API

Instead of sending the file through Apex:

  • Upload the file directly using Salesforce REST API
  • This bypasses Apex heap limitations

4. Use External Storage (If Needed)

For very large files:

  • Upload files to external storage (AWS, Azure, etc.)
  • Store only the reference or link in Salesforce

5. Avoid Full File Processing in Apex

Do not:

  • Convert large files fully into Base64 inside Apex
  • Store large file data in variables

Process only what is necessary.

What Not to Do

  • Don’t try to increase heap size (it’s fixed by Salesforce)
  • Don’t process large files in a single Apex transaction
  • Don’t store entire files in memory

Quick Check

  • Are you trying to upload the full file in one go?
  • Is the file being processed inside Apex?
  • Can you use standard file upload instead?
  • Can the file be split or uploaded externally?

Final Note

This limitation comes from how Apex manages memory per transaction; it cannot be raised or overridden.

The best way to handle large files in Salesforce is to keep them out of Apex entirely: use chunking, standard components, or direct API uploads instead.


Need help fixing this in your org?

Our Salesforce experts can debug and resolve this issue quickly.