⚙️ Salesforce Trigger Issues with Large Data? Here’s a Scalable Solution

When working with bulk data in Salesforce, triggers and batch jobs can quickly run into governor limits. In this blog post, I’ll share a powerful pattern using Queueable Apex to safely and efficiently process large datasets and upsert Account records from a custom object LargeDataImport__c.


🧠 Use Case

Imagine you’re importing thousands of external records daily into Salesforce using a custom object like LargeDataImport__c. Each record should:

  • Be matched to an existing Account by its unique External_ID__c
  • If a match exists, update that Account
  • If no match exists, insert a new Account (a minimal upsert sketch follows this list)
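
If you haven't used external ID upserts before, the core idea fits in a few lines. This is a minimal sketch; it assumes External_ID__c is defined as an External ID field on Account, and 'EXT-001' is a hypothetical value:

// Minimal sketch of upsert-by-external-ID semantics.
// Assumes Account.External_ID__c is a custom field marked as an External ID;
// the value 'EXT-001' is hypothetical.
Account acc = new Account(Name = 'Acme Corp', External_ID__c = 'EXT-001');

// Updates the Account whose External_ID__c is 'EXT-001' if one exists,
// otherwise inserts a new Account.
upsert acc External_ID__c;

This is exactly what the Queueable class below does, just in bulk.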

Processing these records directly in a trigger is risky and can easily exceed governor limits, so we'll offload the work to Queueable Apex in chunks.


🖼️ Architecture Diagram

Here’s how the system works behind the scenes:

[Architecture diagram: Salesforce Queueable Apex for Large Data Import]

🔁 Step-by-Step Breakdown

🧷 1. Trigger – Starts the Queueable Job

trigger LargeDataImportTrigger on LargeDataImport__c (after insert) {
    // Collect the Ids of the newly inserted import records
    List<Id> ids = new List<Id>();
    for (LargeDataImport__c ldi : Trigger.new) ids.add(ldi.Id);

    // Offload the heavy processing to asynchronous Queueable Apex
    if (!ids.isEmpty()) System.enqueueJob(new AccountProcessorQueueable(ids));
}

What it does:

  • Runs only after insert
  • Collects all inserted record Ids
  • Starts a Queueable job to process them (see the usage sketch below)
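
To try the trigger out, something like this in Anonymous Apex should do. The field values are hypothetical, and it assumes Name on LargeDataImport__c is a writable text field:

// Insert a few import records; the trigger enqueues the Queueable job.
List<LargeDataImport__c> rows = new List<LargeDataImport__c>();
for (Integer i = 0; i < 5; i++) {
    rows.add(new LargeDataImport__c(Name = 'Import ' + i, External_ID__c = 'EXT-' + i));
}
insert rows;

// Confirm a Queueable job was queued for this transaction.
System.debug([SELECT Id, Status FROM AsyncApexJob
              WHERE JobType = 'Queueable'
              ORDER BY CreatedDate DESC LIMIT 1]);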

📦 2. Queueable Apex Class – Processes in Chunks

global class AccountProcessorQueueable implements Queueable, Database.AllowsCallouts {
    private List<Id> largeDataImportIds;
    private static final Integer CHUNK_SIZE = 2000;

    global AccountProcessorQueueable(List<Id> idsToProcess) {
        this.largeDataImportIds = idsToProcess;
    }

    global void execute(QueueableContext context) {
        // Split the Ids into the chunk to process now and the remainder
        // for the next chained job. (Apex List has no subList method,
        // so we split with a simple loop.)
        List<Id> chunk = new List<Id>();
        List<Id> remaining = new List<Id>();

        for (Integer i = 0; i < largeDataImportIds.size(); i++) {
            if (i < CHUNK_SIZE) {
                chunk.add(largeDataImportIds[i]);
            } else {
                remaining.add(largeDataImportIds[i]);
            }
        }

        List<LargeDataImport__c> records = [
            SELECT Id, Name, External_ID__c, Industry__c, AnnualRevenue__c 
            FROM LargeDataImport__c 
            WHERE Id IN :chunk
        ];

        Set<String> extIds = new Set<String>();
        for (LargeDataImport__c ldi : records) {
            if (ldi.External_ID__c != null) extIds.add(ldi.External_ID__c);
        }

        Map<String, Account> existing = new Map<String, Account>();
        if (!extIds.isEmpty()) {
            for (Account acc : [
                SELECT Id, External_ID__c 
                FROM Account 
                WHERE External_ID__c IN :extIds
            ]) {
                existing.put(acc.External_ID__c, acc);
            }
        }

        List<Account> upserts = new List<Account>();
        for (LargeDataImport__c ldi : records) {
            // Skip rows with no external ID: upsert on External_ID__c
            // fails for records with a null key.
            if (ldi.External_ID__c == null) continue;

            Account acc = existing.containsKey(ldi.External_ID__c)
                ? existing.get(ldi.External_ID__c)
                : new Account(External_ID__c = ldi.External_ID__c);

            acc.Name = ldi.Name;
            acc.Industry = ldi.Industry__c;
            acc.AnnualRevenue = ldi.AnnualRevenue__c; // queried above, now actually mapped
            upserts.add(acc);
        }

        if (!upserts.isEmpty()) upsert upserts External_ID__c;

        // Chain the next batch
        if (!remaining.isEmpty()) {
            System.enqueueJob(new AccountProcessorQueueable(remaining));
        } else {
            System.debug('✅ All large data import records have been processed.');
            // Optional: Send completion email or trigger next steps
        }
    }
}
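
A unit test is the quickest way to verify the end-to-end flow, since queued jobs run synchronously when Test.stopTest() is called. This is a minimal sketch that assumes the custom fields shown above exist in your org:

@isTest
private class AccountProcessorQueueableTest {
    @isTest
    static void upsertsAccountFromImportRecord() {
        Test.startTest();
        // Inserting the import record fires the trigger, which enqueues the job.
        // Keep test data below CHUNK_SIZE: chaining a Queueable from a running
        // Queueable is not allowed in test context.
        insert new LargeDataImport__c(
            Name = 'Acme Corp', External_ID__c = 'EXT-001', Industry__c = 'Energy'
        );
        Test.stopTest(); // the queued job executes synchronously here

        System.assertEquals(1, [SELECT COUNT() FROM Account WHERE External_ID__c = 'EXT-001']);
    }
}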

🎯 Benefits of This Pattern

  • 🔁 Chunked Execution: Avoids SOQL, DML, and heap limits by processing 2,000 records at a time
  • 🔄 Recursive: Automatically queues the next chunk (see the guard sketch below)
  • 🧠 Intelligent Logic: Matches on External IDs to avoid duplicate Accounts
  • 🔌 Extendable: Easy to plug in more logic, such as callouts or notifications
  • 🚀 Asynchronous: Keeps the trigger light and scalable
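
One caveat on the chaining: a running Queueable may enqueue only one child job, and each enqueue counts against per-transaction limits. If you extend the class, a defensive guard like this sketch keeps the chain safe:

// Sketch: check the Limits methods before enqueueing the next chunk.
if (!remaining.isEmpty()
        && Limits.getQueueableJobs() < Limits.getLimitQueueableJobs()) {
    System.enqueueJob(new AccountProcessorQueueable(remaining));
}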

💬 Final Thoughts

This is a clean, scalable approach to handling large-scale imports in Salesforce, built entirely from native platform tools. You can extend it to include:

  • Logging
  • Error handling (see the sketch after this list)
  • Notification emails
  • Integration with external systems
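
For instance, error handling can start with swapping the plain upsert for the partial-success Database.upsert overload, which returns per-record results instead of throwing on the first failure. A minimal sketch:

// Sketch: partial-success upsert (allOrNone = false) with per-record logging.
Database.UpsertResult[] results =
    Database.upsert(upserts, Account.External_ID__c, false);

for (Integer i = 0; i < results.size(); i++) {
    if (!results[i].isSuccess()) {
        for (Database.Error err : results[i].getErrors()) {
            System.debug(LoggingLevel.ERROR,
                'Record ' + i + ' failed: ' + err.getMessage());
        }
    }
}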

If you’re dealing with a high volume of records, this is a pattern worth saving to your toolkit.


🧾 Full Code Block

Here’s the complete Apex code (trigger + class):

// Trigger
trigger LargeDataImportTrigger on LargeDataImport__c (after insert) {
    // Collect the Ids of the newly inserted import records
    List<Id> ids = new List<Id>();
    for (LargeDataImport__c ldi : Trigger.new) ids.add(ldi.Id);

    // Offload the heavy processing to asynchronous Queueable Apex
    if (!ids.isEmpty()) System.enqueueJob(new AccountProcessorQueueable(ids));
}

// Queueable Class
global class AccountProcessorQueueable implements Queueable, Database.AllowsCallouts {
    private List<Id> largeDataImportIds;
    private static final Integer CHUNK_SIZE = 2000;

    global AccountProcessorQueueable(List<Id> idsToProcess) {
        this.largeDataImportIds = idsToProcess;
    }

    global void execute(QueueableContext context) {
        // Apex List has no subList method, so split the Ids with a loop:
        // one chunk to process now, the remainder for the next chained job.
        List<Id> chunk = new List<Id>();
        List<Id> remaining = new List<Id>();

        for (Integer i = 0; i < largeDataImportIds.size(); i++) {
            if (i < CHUNK_SIZE) {
                chunk.add(largeDataImportIds[i]);
            } else {
                remaining.add(largeDataImportIds[i]);
            }
        }

        List<LargeDataImport__c> records = [
            SELECT Id, Name, External_ID__c, Industry__c, AnnualRevenue__c 
            FROM LargeDataImport__c 
            WHERE Id IN :chunk
        ];

        Set<String> extIds = new Set<String>();
        for (LargeDataImport__c ldi : records) {
            if (ldi.External_ID__c != null) extIds.add(ldi.External_ID__c);
        }

        Map<String, Account> existing = new Map<String, Account>();
        if (!extIds.isEmpty()) {
            for (Account acc : [
                SELECT Id, External_ID__c 
                FROM Account 
                WHERE External_ID__c IN :extIds
            ]) {
                existing.put(acc.External_ID__c, acc);
            }
        }

        List<Account> upserts = new List<Account>();
        for (LargeDataImport__c ldi : records) {
            // Skip rows with no external ID: upsert on External_ID__c
            // fails for records with a null key.
            if (ldi.External_ID__c == null) continue;

            Account acc = existing.containsKey(ldi.External_ID__c)
                ? existing.get(ldi.External_ID__c)
                : new Account(External_ID__c = ldi.External_ID__c);

            acc.Name = ldi.Name;
            acc.Industry = ldi.Industry__c;
            acc.AnnualRevenue = ldi.AnnualRevenue__c; // queried above, now actually mapped
            upserts.add(acc);
        }

        if (!upserts.isEmpty()) upsert upserts External_ID__c;

        if (!remaining.isEmpty()) {
            System.enqueueJob(new AccountProcessorQueueable(remaining));
        } else {
            System.debug('✅ All large data import records have been processed.');
        }
    }
}

🔧 Need Custom Help?

If you want this adapted to your org or need help building large-scale automation, feel free to reach out. I also offer custom Salesforce solutions, LWC components, API integrations, and more.

📩 Drop your questions below in the comments!
