Introduction
In Salesforce projects, it’s common to integrate with external systems by sending large volumes of records via REST APIs. However, due to Salesforce governor limits (like heap size and callout limits), you cannot send all records in one go.
The solution? Queueable Apex with batch processing.
In this post, we’ll walk through a reusable Salesforce Apex class (JsonSender) that:
- Splits records into smaller batches (default 200 per batch)
- Converts records into JSON payloads
- Sends them via HTTP callouts
- Chains jobs automatically until all records are sent
Why Use Queueable Apex for Callouts?
- Handles asynchronous processing
- Supports chained execution (for multiple batches)
- Allows callouts when the class also implements Database.AllowsCallouts
- Helps stay within governor limits
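For reference, this is the basic enqueue-and-monitor pattern behind any Queueable job. The list, endpoint, and constructor arguments below are placeholders that mirror the JsonSender class shown in the next section:

// Enqueue the job and capture its Id
List<Account> accounts = [SELECT Id, Name FROM Account LIMIT 500];
Id jobId = System.enqueueJob(new JsonSender(accounts, 'https://api.example.com/data', 0));

// The job runs asynchronously; its progress can be checked via AsyncApexJob
AsyncApexJob job = [SELECT Status, NumberOfErrors, ExtendedStatus FROM AsyncApexJob WHERE Id = :jobId];
System.debug('Queueable job status: ' + job.Status);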
The JsonSender Class (Fixed Version)
Here’s the final, production-ready version of our JsonSender utility:
public class JsonSender implements Queueable, Database.AllowsCallouts {
    private List<SObject> records;
    private Integer startIndex;
    private String endpoint;
    private static final Integer BATCH_SIZE = 200;

    public JsonSender(List<SObject> records, String endpoint, Integer startIndex) {
        this.records = records;
        this.endpoint = endpoint;
        this.startIndex = startIndex != null ? startIndex : 0;
    }

    public void execute(QueueableContext context) {
        // Work out the slice of records this job is responsible for
        Integer endIndex = Math.min(startIndex + BATCH_SIZE, records.size());
        List<SObject> batch = new List<SObject>();
        for (Integer i = startIndex; i < endIndex; i++) {
            batch.add(records[i]);
        }

        sendBatch(batch);

        // If records remain, chain the next job starting where this one stopped
        if (endIndex < records.size()) {
            Id jobId = System.enqueueJob(new JsonSender(records, endpoint, endIndex));
            System.debug('Chained Job ID: ' + jobId);
        }
    }

    private void sendBatch(List<SObject> batch) {
        // Wrap the batch in a simple envelope before serializing
        String jsonPayload = JSON.serialize(new Map<String, Object>{
            'data' => batch,
            'count' => batch.size()
        });

        HttpRequest req = new HttpRequest();
        req.setEndpoint(endpoint);
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(jsonPayload);
        req.setTimeout(120000); // maximum allowed callout timeout (120 seconds)

        Http http = new Http();
        try {
            HttpResponse res = http.send(req);
            if (res.getStatusCode() >= 400) {
                System.debug('Error sending data. Status: ' + res.getStatusCode() +
                    ', Response: ' + res.getBody());
            } else {
                System.debug('✅ Sent ' + batch.size() + ' records successfully. Status: '
                    + res.getStatusCode());
            }
        } catch (Exception e) {
            System.debug('❌ Callout failed: ' + e.getMessage());
        }
    }

    // Helper method to start sending from the first record
    public static void send(List<SObject> records, String endpoint) {
        if (!records.isEmpty()) {
            System.enqueueJob(new JsonSender(records, endpoint, 0));
        }
    }
}
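Kicking it off is a one-liner, for example from Anonymous Apex or a service class. The query and endpoint below are placeholders:

// Query whatever records need to be pushed, then hand them to the sender
List<Account> accountList = [SELECT Id, Name, Industry FROM Account WHERE Industry != null LIMIT 1000];
JsonSender.send(accountList, 'https://api.example.com/data');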
How It Works
- Initialization: call JsonSender.send(accountList, 'https://api.example.com/data'); to kick things off
- Batching: records are chunked into batches of 200 (configurable via BATCH_SIZE)
- Serialization: each batch is converted into JSON: { "data": [...records...], "count": 200 }
- Callout: the payload is sent via HTTP POST to the external endpoint (a mocked-callout test sketch follows this list)
- Job Chaining: if more records remain, a new Queueable job is enqueued automatically
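To verify the flow without hitting a real endpoint, a callout mock can be used in a unit test. The sketch below is illustrative (the class names and record count are assumptions), and the list is kept under BATCH_SIZE because chaining Queueable jobs is not allowed in Apex test context:

@isTest
private class JsonSenderTest {

    // Fake endpoint response so no real HTTP call is made
    private class FakeResponse implements HttpCalloutMock {
        public HttpResponse respond(HttpRequest req) {
            HttpResponse res = new HttpResponse();
            res.setStatusCode(200);
            res.setBody('{"status":"ok"}');
            return res;
        }
    }

    @isTest
    static void sendsOneBatch() {
        // Keep the list below BATCH_SIZE: chained Queueable jobs throw in test context
        List<Account> accounts = new List<Account>();
        for (Integer i = 0; i < 50; i++) {
            accounts.add(new Account(Name = 'Test Account ' + i));
        }

        Test.setMock(HttpCalloutMock.class, new FakeResponse());

        Test.startTest();
        JsonSender.send(accounts, 'https://api.example.com/data');
        Test.stopTest(); // runs the enqueued job synchronously, triggering the mocked callout
    }
}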
✅ Benefits of This Approach
- Works with any SObject list
- Handles large record volumes safely
- Provides robust error handling
- Easy to extend (logging, retries, etc.; a retry sketch follows below)
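For example, retries could be bolted on with a counter. This is purely a sketch (retryCount, MAX_RETRIES, and a Boolean return value from sendBatch are assumptions, not part of the class above), and it keeps to a single System.enqueueJob call because only one Queueable job may be enqueued per execution:

// Hypothetical retry extension for JsonSender (illustrative only)
private Integer retryCount = 0;
private static final Integer MAX_RETRIES = 3;

public void execute(QueueableContext context) {
    Integer endIndex = Math.min(startIndex + BATCH_SIZE, records.size());
    List<SObject> batch = new List<SObject>();
    for (Integer i = startIndex; i < endIndex; i++) {
        batch.add(records[i]);
    }

    Boolean success = sendBatch(batch); // assumes sendBatch is changed to return success/failure

    if (!success && retryCount < MAX_RETRIES) {
        // Retry the same slice instead of moving on
        JsonSender retryJob = new JsonSender(records, endpoint, startIndex);
        retryJob.retryCount = retryCount + 1;
        System.enqueueJob(retryJob);
    } else if (endIndex < records.size()) {
        // Continue with the next slice as before
        System.enqueueJob(new JsonSender(records, endpoint, endIndex));
    }
}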
🚀 Conclusion
By combining Queueable Apex + Callouts + JSON, we can build scalable and reliable Salesforce integrations.
This pattern ensures:
- Bulk records are processed efficiently
- Salesforce limits are respected
- Callout failures are caught and logged rather than crashing the job
You can now adapt JsonSender for Accounts, Contacts, Opportunities, or any custom object that needs to be sent to an external system.
