What are Salesforce Integration Patterns? Types, Best Practices
If integrating your essential applications with Salesforce feels like a challenge, the problem is likely the integration blueprint you are using.
Salesforce integration patterns are these blueprints for solving common integration challenges between Salesforce and other systems, like legacy databases, ERP, or custom applications.
Choosing the right integration pattern is the most important technical decision you make for successful Salesforce integration.
This blog walks you through everything, from understanding integration patterns in Salesforce to their core categories and how to select the right pattern for your business needs. It will help you avoid costly mistakes and ensure your Salesforce architecture is future-proof.
What are Salesforce Integration Patterns?
Salesforce integration patterns are proven, standard methods for solving common integration challenges and connecting Salesforce with other applications. These patterns provide a framework for exchanging data and automating workflows between Salesforce and other systems to meet specific business requirements.
The MuleSoft Connectivity Benchmark Report 2025 highlights the same: 95% of IT leaders report facing challenges when integrating AI into their existing Salesforce processes, often due to flawed integration strategies.
Different Types of Integration Patterns in Salesforce
Here are the common integration patterns that serve as the foundation for connecting Salesforce with other systems.
1. Request and Reply (Synchronous)
It is a real-time, synchronous interaction where Salesforce sends a request and waits for an immediate response before proceeding. Think of it as a live, two-way conversation.
How it works:
Salesforce makes a direct API call to an external service and holds the connection open, waiting for a response. The process remains blocked until a reply (success or failure) is received and processed. This is often implemented using REST or SOAP APIs.
When to use:
- Ideal for business processes where your users need instant results to continue their task (e.g., a credit check during an order process, validating shipping rates at checkout, or fetching real-time inventory levels before creating a quote).
Example: When your sales rep verifies customer credit in real-time during an opportunity, Salesforce calls the financial system and waits for the credit score before proceeding with the deal.
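The credit-check example can be sketched in a few lines of Python. This is a minimal illustration of the pattern, not Salesforce code: the credit service below is a stand-in for a real external API, and the account IDs and threshold are invented for the example.

```python
# Request-and-reply sketch: the caller blocks until the reply arrives,
# and the business decision depends on that reply.

def credit_service(account_id: str) -> dict:
    """Stand-in for an external financial system's REST endpoint."""
    scores = {"ACC-001": 720, "ACC-002": 580}  # illustrative data
    return {"account_id": account_id, "score": scores.get(account_id, 0)}

def verify_credit(account_id: str, minimum_score: int = 650) -> bool:
    # Salesforce-side logic: call out, *wait* for the reply,
    # and only then let the deal proceed.
    reply = credit_service(account_id)          # synchronous call
    return reply["score"] >= minimum_score      # decision depends on the reply

print(verify_credit("ACC-001"))  # True: 720 clears the threshold
print(verify_credit("ACC-002"))  # False: 580 fails the check
```

In a real org the callout would be an Apex HTTP request with a timeout and error handling; the key point is that the caller does nothing else until the answer comes back.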
2. Fire and Forget (Asynchronous) and Broadcast
It is a one-way, asynchronous communication/announcement where Salesforce sends a message to an external system without waiting for a direct response.
How it works:
Salesforce publishes a message (an “event”) onto a messaging channel (e.g., using Platform Events, Kafka, or a message queue) when a key business event occurs. Any number of other systems (ERP, marketing tool, or data warehouse) can consume the message at their own pace.
When to use:
- It is ideal for non-critical updates or when you need to trigger background processes
- To broadcast a business event (e.g., OpportunityClosed).
Example: When an account is updated in Salesforce, an account updated platform event is published. A separate marketing automation system subscribes to this event and uses it to update its own customer records, while a data warehouse system subscribes to log the change for analytics.
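The broadcast behavior above can be sketched with a tiny in-memory publish/subscribe bus. This is a conceptual Python sketch, not Platform Events API code; the event name and payload fields are illustrative.

```python
# Fire-and-forget sketch: the publisher emits an event and moves on;
# any number of subscribers react independently.
from collections import defaultdict

subscribers = defaultdict(list)

def subscribe(event_name, handler):
    subscribers[event_name].append(handler)

def publish(event_name, payload):
    # The publisher does not inspect subscriber results -- it "forgets".
    for handler in subscribers[event_name]:
        handler(payload)

marketing_log, warehouse_log = [], []
# Two independent systems react to the same business event.
subscribe("AccountUpdated", lambda e: marketing_log.append(e["account_id"]))
subscribe("AccountUpdated", lambda e: warehouse_log.append(e["account_id"]))

publish("AccountUpdated", {"account_id": "001XX"})
print(marketing_log, warehouse_log)  # both systems received the event
```

With real Platform Events, the bus is managed by Salesforce and subscribers consume asynchronously, but the decoupling shown here is the same: the publisher never knows or cares who is listening.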
3. Batch Data Synchronization and Data Replication
It is a process for scheduled, bulk data transfer rather than in real-time, designed to replicate large datasets from a source system like an ERP to Salesforce to ensure data consistency.
How it works:
This integration pattern in Salesforce works through a structured, automated ETL (Extract, Transform, Load) workflow:
- Extraction and delta detection: A scheduled job uses tools like Salesforce Data Loader, Informatica, or MuleSoft to extract records from a source system (such as an ERP). Its first job is delta detection: identifying only the records that have been created or modified since the last sync, using timestamps or change logs.
- Transformation and mapping: The extracted data is then processed and transformed. This involves cleansing the data and applying business rules to ensure it aligns with the target format in Salesforce (e.g., mapping an external “CustID” to the Salesforce “Account Number”).
- Bulk loading and replication: The transformed dataset is then pushed into Salesforce in bulk. This step is the data replication itself: the target objects in Salesforce are created or updated to match the current state of the source data, providing a reliable data copy.
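The three ETL steps above can be condensed into a short Python sketch. The source rows, field names (`CustID`, `AccountNumber`), and timestamps are invented for illustration; a real job would read from the ERP and write via the Bulk API.

```python
# Batch-sync sketch: delta detection -> transformation -> bulk load.
from datetime import datetime

source_rows = [  # stand-in for an ERP extract
    {"CustID": "C-1", "Name": "Acme",   "modified": datetime(2025, 6, 2)},
    {"CustID": "C-2", "Name": "Globex", "modified": datetime(2025, 6, 5)},
]
last_sync = datetime(2025, 6, 3)  # watermark from the previous run

# 1. Delta detection: keep only rows changed since the last sync.
delta = [r for r in source_rows if r["modified"] > last_sync]

# 2. Transformation: map external field names to Salesforce fields.
def transform(row):
    return {"AccountNumber": row["CustID"], "Name": row["Name"]}

batch = [transform(r) for r in delta]

# 3. Bulk load: in a real job this batch goes to the Salesforce Bulk API.
print(batch)  # [{'AccountNumber': 'C-2', 'Name': 'Globex'}]
```

Note how the watermark (`last_sync`) keeps nightly runs cheap: only one of the two source rows survives delta detection.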
When to use:
- This process is ideal for syncing large master data sets (Products, Customers) where real-time accuracy is not critical.
- For reporting and data warehousing scenarios where data is consolidated from multiple sources into Salesforce for analysis.
Example: A financial services company runs a batch job every night at 2 AM to sync all new and updated Client records from their core banking system into Salesforce to ensure their relationship managers have the latest information each morning.
4. UI Integration
UI integration refers to embedding the user interface and functionality of an external application directly within the Salesforce Lightning interface.
How it works:
UI integration brings the external app’s interface directly into a Salesforce record page. This is typically achieved using technologies like Lightning Web Components (LWC). The external application appears as a native part of the Salesforce page, and users can interact with it without leaving Salesforce.
When to use:
- When you need to eliminate context-switching and boost user productivity.
- To provide a 360-degree view of the customer by surfacing data from legacy or specialized systems directly in Salesforce.
- When the external application is used frequently with Salesforce data.
Example: An insurance agent can view and update a customer’s specific policy details from a legacy system, all within the customer’s Account page in Salesforce. They get a complete view without toggling between both applications.
5. Data Virtualization
It refers to accessing external data in real time without storing it in Salesforce.
How it works:
Salesforce creates a live connection to the external system (like an ERP or database). When a user views a record in Salesforce, it instantly fetches the required data (on-demand) directly from the source system via an API call. The data is retrieved in real-time for viewing, but is not saved in the Salesforce database.
When to use:
- When your team needs to view real-time, frequently changing data (like live inventory or stock prices).
- For data that is too large, too sensitive, or governed by policies that prevent it from being stored in Salesforce.
- To maintain a single, authoritative source of truth in the external system.
Example: A sales rep on a Product record in Salesforce can see the current “Live Inventory Count” directly from the warehouse management system. This number updates in real-time without creating any inventory data inside Salesforce.
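The live-inventory example can be sketched as a record that stores only a key and fetches the value on every read. This is a conceptual Python sketch (in Salesforce this role is played by features like external objects); the SKU and stock numbers are invented.

```python
# Data-virtualization sketch: the "Salesforce" record stores no inventory;
# the value is fetched from the warehouse system every time it is read.

warehouse_stock = {"SKU-42": 17}  # stand-in for the external WMS

class ProductRecord:
    def __init__(self, sku):
        self.sku = sku  # only the key lives in the local record

    @property
    def live_inventory(self):
        # On-demand lookup; nothing is copied into the local record.
        return warehouse_stock[self.sku]

p = ProductRecord("SKU-42")
print(p.live_inventory)       # 17
warehouse_stock["SKU-42"] = 9 # the source system changes...
print(p.live_inventory)       # 9 -- the view reflects it immediately
```

Because no copy exists, there is nothing to go stale and nothing consuming Salesforce storage; the trade-off is that every read costs a round trip to the source system.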
6. Bi-directional Sync
This pattern creates a two-way sync of data, ensuring that changes made in either Salesforce or the external systems are automatically reflected in the other.
How it works:
This pattern establishes a continuous feedback loop between systems. It’s implemented in two ways:
- API-driven sync: Each system is configured to send an immediate API call to the other whenever a relevant record is created or updated. This is often built using the request-reply pattern for immediate consistency.
- Event-driven sync: Each system publishes an event when a change occurs; the other system subscribes to these events and updates its records accordingly. You can build this using the fire-and-forget pattern.
When to use:
Implement this Salesforce integration pattern when data is truly owned and updated collaboratively. It is essential for maintaining a single, unified view of shared entities like customers, contacts, and cases across the entire organization.
Crucial consideration:
The main challenge of this pattern is the edit collision. For example, what happens when a sales rep changes a contact’s email address in Salesforce, and at the same time, a marketer changes the same contact’s email address in the marketing platform?
To handle this, you must architect the integration with a conflict-resolution strategy: decide which system acts as the system of record (the authoritative source) when the same information is updated in both systems at the same time.
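One simple conflict-resolution strategy is a per-field ownership table: for each field, one system is declared the system of record and its value always wins. The sketch below is a hypothetical policy in Python; the field names and owners are illustrative.

```python
# Conflict-resolution sketch for bi-directional sync:
# "system of record wins", decided per field.

FIELD_OWNER = {"email": "marketing", "phone": "salesforce"}  # example policy

def resolve(field, sf_value, mkt_value):
    # The owning system's value always wins for its fields.
    return mkt_value if FIELD_OWNER[field] == "marketing" else sf_value

# Simultaneous edits to the same contact:
print(resolve("email", "old@example.com", "new@example.com"))  # new@example.com
print(resolve("phone", "555-0100", "555-9999"))                # 555-0100
```

Other common strategies include last-writer-wins (compare timestamps) and manual review queues; the important thing is that the rule is explicit and enforced in the integration layer, not left to chance.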
Quick Comparison Between Salesforce Integration Patterns
| Pattern | Best for | Data freshness |
|---|---|---|
| Request and reply | User tasks needing instant results | Real-time |
| Fire and forget | Background processes | Near real-time |
| Batch sync | Larger data volumes, scheduled data loads | Delayed (hours/days) |
| UI integration | Unified user experience | Varies (on-demand) |
| Data virtualization | Live data access without storage | Real-time |
| Bi-directional sync | Collaborative data maintenance across systems | Near real-time |
How to Choose the Right Pattern?
To execute a successful Salesforce integration, start by understanding your business process and asking yourself a few key questions about how your organization actually operates.
1. What is your specific business goal?
Firstly, you should align your integration strategy with your specific business needs:
- For data consistency across systems: Choose a pattern designed for synchronization.
- For keeping large datasets in sync on a schedule: Consider batch data synchronization.
- For real-time visibility without storing data: You should consider data virtualization.
- For process automation: Select between request-reply and fire-and-forget.
  - Use Request-Reply for step-by-step workflows where Salesforce directs the process.
  - Use Fire-and-Forget for processes where multiple systems react to the same business event independently.
- For unified user experience: You can go for UI integration as we discussed earlier, eliminating context switching for your teams.
2. What are the timeline requirements? (Real-time vs scheduled)
Choose your integration pattern based on how quickly data needs to move:
- Real-time processes: For actions that need immediate responses, like credit verification during sales or inventory checks during ordering, choose Request and Reply (synchronous); it delivers the real-time interaction your users expect.
- Scheduled processes: When your business can accommodate scheduled updates, such as daily sales reports or overnight customer data syncs, choose Batch Synchronization or Fire-and-Forget (asynchronous). These offer better performance and resilience for non-urgent operations.
3. What are the data volume and frequency requirements? (Volume & load)
Your integration architecture choice further helps in scaling business operations.
- High volume, low frequency: Batch synchronization is usually the right choice.
- Low volume, high frequency: Request-reply or Fire-and-forget work well.
- Massive, read-only data: Data virtualization avoids consuming your Salesforce data storage.
How do volume and frequency affect your architecture?
- High-volume, periodic transfers: For high-volume, periodic transfers like thousands of records updated nightly, you should go for batch data synchronization.
- Frequent, smaller interactions: For ongoing, smaller updates like individual record updates or status changes, you can choose request and reply or fire-and-forget patterns.
- Large, read-only datasets: If you frequently access massive datasets that shouldn’t be stored in Salesforce (e.g., live inventory or financial data), Data Virtualization is the best fit. It provides real-time visibility without consuming Salesforce storage.
By assessing your data volume, update frequency, and business needs, you can easily identify which integration pattern fits best.
Best Practices for Salesforce Integration Patterns Architecture
1. Design for Loose Coupling
You should build your integrations in a way that if one system goes down or changes, it doesn’t break the others. The fire-and-forget pattern can be a good choice in such situations, as a change in one system should not require redeployment of another.
2. Always have a plan B for integration failure
Assume networks fail and APIs time out. For request and reply, design fallbacks (e.g., “Credit check unavailable, proceed manually?”). For batch data sync, implement retry mechanisms and failure notifications.
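A retry mechanism with exponential backoff is the usual building block for the "plan B" above. The sketch below is a generic Python helper, not middleware-specific code; the flaky callout is simulated so the retry path is visible.

```python
# Retry-with-backoff sketch: retry a flaky callout a few times,
# then surface the failure so a notification can fire.
import time

def with_retries(operation, attempts=3, base_delay=0.01):
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries: let monitoring/notifications take over
            time.sleep(base_delay * (2 ** attempt))  # 1x, 2x, 4x, ...

# Simulated callout that fails twice, then succeeds.
calls = {"n": 0}
def flaky_callout():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("timeout")
    return "ok"

print(with_retries(flaky_callout))  # "ok" on the third attempt
```

The exponential delay matters: it gives a struggling downstream system room to recover instead of hammering it with immediate retries.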
3. Govern your APIs and data contracts
Think of your integration as a contract between Salesforce and your ERP system.
- For request and reply (API Calls): If your ERP provides an API for customer data, you must agree on a version (v1). If the ERP team changes the API from v1 to v2, Salesforce can keep using v1 until it is ready to upgrade, preventing an immediate breakdown.
- For fire and forget (Events): If Salesforce sends an “AccountUpdated” event, you must define its “shape” (e.g., it will always include AccountId and Name). This ensures every subscribing system (like your Marketing platform) knows how to read it correctly, avoiding confusion and errors.
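A lightweight way to enforce an event's "shape" is to validate required fields on the subscriber side before processing. This is a minimal Python sketch of such a contract check; the field names mirror the example above but are still illustrative.

```python
# Event-contract sketch: a subscriber rejects events that don't
# match the agreed shape before touching its own records.

REQUIRED_FIELDS = {"AccountId", "Name"}  # the agreed "shape"

def is_valid_event(event: dict) -> bool:
    # True only if every required field is present.
    return REQUIRED_FIELDS <= event.keys()

print(is_valid_event({"AccountId": "001XX", "Name": "Acme"}))  # True
print(is_valid_event({"AccountId": "001XX"}))                  # False: Name missing
```

In production this role is usually played by a formal schema (e.g., the Platform Event's defined fields, or a JSON Schema in middleware), but the principle is the same: validate the contract at the boundary, fail fast on violations.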
4. Prioritize observability over just monitoring
You should not just check whether the integration is working. Instead, instrument your integrations to track end-to-end latency, message throughput, and error rates across every system in the flow. When a batch job fails, you should know why within minutes, not hours.
5. Talk about data upfront
Before connecting systems, you must agree on a common language for both systems. Does “Customer Status” mean the same thing in both systems? Formalize this data mapping in a document to avoid messy and costly data cleanup later.
Common Mistakes When Selecting an Integration Approach
1. Point-to-Point Overload
Connecting every system directly to Salesforce in a point-to-point fashion creates a complex web where a change in one system triggers cascading failures in others.
How to avoid this? You can introduce a middleware layer like Mulesoft to act as a central nervous system, simplifying connectivity and enforcing governance.
2. Underestimating the Total Cost of Ownership (TCO)
Choosing point-to-point integration because it is fast to build creates technical debt later; the maintenance costs accumulate over the years.
How to avoid this? Evaluate integration approaches based on long-term maintenance, not just initial development speed. A loosely coupled architecture (less dependency between systems) has a higher initial cost but a much lower TCO over time.
3. Not Designating a System of Record Upfront
Failing to designate a system of record for each data entity leads to conflicts. For example, when you update the same record in both Salesforce and the ERP system at the same time, which system wins?
How to avoid this? Establish clear data ownership policies and implement them in your integration logic (for example, in a Salesforce-ERP integration, the ERP always overwrites the product description in Salesforce).
Pro tip: Navigating Salesforce integration patterns and avoiding these mistakes requires technical expertise. If your in-house team lacks bandwidth or knowledge, partnering with a Salesforce integration expert can accelerate time-to-value and ensure you build a scalable foundation through integration.
The Future of Integration: Key Trends 2026
The Rise of Event-driven Architecture
Moving beyond batch to real-time sync is becoming standard. Platform Events (a publish/subscribe model) and Change Data Capture (which lets external systems receive near real-time notifications of changes to Salesforce records) are making it easier to implement fire-and-forget patterns.
AI-powered Integration
AI is monitoring data flows and can now predict integration failure by analyzing latency trends, auto-scale resources based on load, and suggest optimal data mapping by learning from historical patterns.
Conclusion
Salesforce integration patterns are the foundational choices that determine your organization’s agility, data integrity, and operational efficiency. When implemented thoughtfully, the right pattern can turn Salesforce from a standalone CRM into the connected core of your entire business ecosystem.
Choosing and implementing these patterns correctly requires deep expertise, whether you are looking to simplify a complex existing integration or build a new, future-proof architecture.
That’s where Cyntexa comes in. As a trusted Salesforce integration company, we specialize in simplifying complex integrations and building future-ready architectures. Our team will analyze your specific business processes, map them to the optimal patterns, and provide you with a strategic roadmap to a scalable and maintainable Salesforce ecosystem.
AUTHOR
Vishwajeet Srivastava
Salesforce Data Cloud, AI Products, ServiceNow, Product Engineering
Co-founder and CTO at Cyntexa, also known as “VJ”. With 10+ years of experience and 22+ Salesforce certifications, he’s a seasoned expert in Salesforce Data Cloud & AI Products, Product Engineering, AWS, Google Cloud Platform, ServiceNow, and Managed Services. Known for blending strategic thinking with hands-on expertise, VJ is passionate about building scalable solutions that drive innovation, operational efficiency, and enterprise-wide transformation.
