AI Integration into Existing IT Systems: Connecting ERP, CRM, and PIM

Most enterprises have built their IT landscape over years: SAP as the ERP backbone, Salesforce or HubSpot as the CRM, a PIM system like Akeneo or Pimcore for product data. These systems work -- but they are silos. Data does not flow automatically, processes break at system boundaries, and manual transfers cost time and produce errors.

AI agents promise the greatest leverage precisely here. Not as replacements for existing systems, but as an intelligent integration layer that connects data, automates processes, and makes decisions based on real-time information from all systems. The challenge: How do you integrate AI into a mature, often heterogeneous IT landscape without rebuilding everything?

This article provides the technical roadmap -- from API strategy to middleware patterns to concrete integration with SAP, Salesforce, Microsoft Dynamics, and PIM systems. It is written for CTOs and IT leaders who want to deploy AI not as an isolated experiment, but as a productive component of their system landscape.

Why Integration Is the Hardest Part of AI Adoption

Most AI projects do not fail because of the AI itself. They fail because of integration. An LLM can deliver brilliant analyses -- but if it has no access to current ERP data, it remains a toy. An AI agent can handle customer inquiries -- but if it cannot read and write to the CRM, no value is created.

According to a McKinsey study, enterprises spend up to 70% of their AI project budget on integration and data preparation -- not on AI development itself. This is not a bug. It is the reality of mature IT landscapes.

The three biggest integration problems:

  • Data silos: ERP, CRM, and PIM contain complementary information that must be manually consolidated. A sales rep needs product data from the PIM, customer data from the CRM, and inventory data from the ERP -- and copies it by hand into their quote calculation.
  • Inconsistent interfaces: Each system offers different APIs, data formats, and authentication mechanisms. SAP speaks OData, Salesforce REST with SOQL, HubSpot has its own API convention. An AI agent must master all these dialects.
  • Real-time vs. batch: Some systems deliver data in real time (Salesforce Platform Events), others only through nightly exports (many custom ERPs). An AI agent using yesterday's inventory data makes wrong decisions today.

Solving these problems unlocks the full potential of AI agents in the enterprise. Ignoring them means building expensive prototypes that never reach production.

The Four Integration Patterns at a Glance

Before diving into system-specific integration, you need to make the fundamental architectural decision: How do you connect AI agents with your existing systems? There are four proven patterns that differ in complexity, flexibility, and use case.

Pattern 1: API-First (Direct Integration)

The AI agent communicates directly with the target system via REST or GraphQL APIs. No middleman, minimal latency, maximum control.

Structure:

  • Agent calls system API directly (e.g., SAP OData, Salesforce REST API)
  • Authentication via OAuth 2.0 or API keys
  • Data is queried and written synchronously

Advantages:

  • Lowest latency (typically < 200 ms)
  • Full control over data flow
  • No additional infrastructure overhead
  • Fastest implementation for individual systems

Disadvantages:

  • Tight coupling to the target system
  • Every API change requires agent adaptation
  • Complexity grows quickly with the number of systems: point-to-point connections scale roughly quadratically
  • No centralized error handling

Best for: Few systems (1-3), stable APIs, simple data flows, proof-of-concepts.
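
To make the pattern concrete, here is a minimal Python sketch of a direct API-first call: the agent queries Salesforce via REST and SOQL. Instance URL, API version, and the pre-obtained OAuth access token are illustrative assumptions, not a finished implementation.

```python
import requests

# Illustrative values: replace with your org's instance URL and a token
# obtained through your OAuth 2.0 flow (e.g. client credentials).
INSTANCE_URL = "https://your-org.my.salesforce.com"
ACCESS_TOKEN = "..."  # assumed to be supplied by your auth layer

def fetch_open_opportunities(min_amount: float) -> list[dict]:
    """Pattern 1: the agent queries Salesforce directly via REST + SOQL."""
    soql = (
        "SELECT Id, Name, Amount, StageName FROM Opportunity "
        f"WHERE IsClosed = false AND Amount > {min_amount}"
    )
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v59.0/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]
```

The tight coupling is visible immediately: the query language, API version, and error handling all live inside the agent's own tool code -- exactly what the middleware pattern below removes.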

Pattern 2: Middleware / iPaaS (Integration Platform as a Service)

An integration platform such as MuleSoft, Workato, Boomi, or n8n sits between the agent and target systems. It handles protocol conversion, data mapping, and error handling.

Structure:

  • Agent communicates with a unified middleware API
  • Middleware translates into system-specific calls
  • Central monitoring and error handling
  • Pre-built connectors for common systems (SAP, Salesforce, HubSpot)

Advantages:

  • Decoupling from target systems
  • Unified interface for the agent
  • Central logging and monitoring
  • Reusable connectors reduce development time
  • Easier maintenance when systems change

Disadvantages:

  • Additional infrastructure and costs (MuleSoft: from EUR 15,000/year, Workato: from EUR 10,000/year)
  • Higher latency due to additional hop (typically 50-150 ms overhead)
  • Middleware becomes a single point of failure
  • Vendor lock-in with proprietary platforms

Best for: Complex landscapes (4+ systems), heterogeneous APIs, regulated environments, enterprises with existing iPaaS infrastructure.
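
For comparison, a sketch of the same idea behind middleware: the agent talks to one normalized endpoint, and the platform handles the system-specific calls. The webhook URL and payload schema are hypothetical placeholders, modeled loosely on an n8n webhook.

```python
import requests

# Hypothetical middleware endpoint -- in n8n this would be a webhook workflow
# that fans out to SAP, Salesforce, etc. and returns a normalized response.
MIDDLEWARE_URL = "https://automation.example.com/webhook/customer-360"

def get_customer_360(customer_id: str) -> dict:
    """The agent sends one normalized request; the middleware handles
    protocol conversion, data mapping, and error handling per system."""
    resp = requests.post(
        MIDDLEWARE_URL,
        json={"customer_id": customer_id, "include": ["erp", "crm", "pim"]},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()
```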

Pattern 3: Direct Database Integration

Direct access to the target system's database via read replicas or CDC (Change Data Capture).

Structure:

  • Read replica of the source database
  • CDC tools like Debezium capture changes from the database changelog
  • AI agent reads from the replica, never directly from the production database

Advantages:

  • Access to all data, even when no API exists
  • No rate limits
  • High performance for bulk queries

Disadvantages:

  • Only meaningful for reads (writing bypasses business logic)
  • Tight coupling to database schema
  • Schema changes can break the integration
  • No business validation

Best for: Legacy systems without APIs, bulk extracts, analytics scenarios, data migration.
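
A minimal sketch of the read path in this pattern, using SQLAlchemy against a read replica. Connection string, table, and column names are assumptions for illustration.

```python
from sqlalchemy import create_engine, text

# Read replica only -- never point this at the production database.
engine = create_engine(
    "postgresql+psycopg2://readonly_user:PASSWORD@erp-replica.internal:5432/erp"
)

def current_stock(article_no: str) -> int:
    """Bulk-friendly read access; bypasses the API but also bypasses
    all business logic, so use it for reads only."""
    with engine.connect() as conn:
        row = conn.execute(
            text("SELECT quantity FROM stock_levels WHERE article_no = :a"),
            {"a": article_no},
        ).fetchone()
    return row[0] if row else 0
```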

Pattern 4: Event-Driven (Kafka, RabbitMQ)

Systems communicate asynchronously via events. The AI agent subscribes to relevant events (e.g., "New order in ERP") and reacts accordingly.

Structure:

  • Message broker (Apache Kafka, RabbitMQ, Azure Service Bus)
  • Systems publish events on state changes
  • Agent consumes events and triggers actions
  • Results are published as new events

Advantages:

  • Maximum decoupling between systems
  • Horizontally scalable and resilient
  • Natural support for asynchronous workflows
  • Replay capability for error handling and audit

Disadvantages:

  • Highest setup complexity
  • Eventual consistency instead of immediate data updates
  • Debugging asynchronous flows is demanding
  • Requires event-streaming expertise

Best for: Highly scaled environments, event-driven business processes, real-time requirements, microservice architectures.
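
As a sketch, an event-driven consumer with kafka-python: the agent subscribes to an order-created topic and reacts per event. Topic name, broker address, and the handler body are illustrative.

```python
import json
from kafka import KafkaConsumer

# Illustrative topic and broker -- adapt to your event schema and cluster.
consumer = KafkaConsumer(
    "erp.sales-order.created",
    bootstrap_servers="kafka.internal:9092",
    group_id="ai-agent-order-intake",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    enable_auto_commit=True,
)

def handle_order_created(event: dict) -> None:
    """Placeholder for the agent's reaction: enrich from the CRM, validate,
    then publish a result event back to the broker."""
    print(f"New order {event.get('order_id')} for customer {event.get('customer_id')}")

for message in consumer:
    handle_order_created(message.value)
```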

Comparison Table: Integration Patterns at a Glance

Criterion | API-First | Middleware / iPaaS | Direct Database | Event-Driven
Latency | Low (< 200 ms) | Medium (200-500 ms) | Low | Variable
Coupling | Tight | Loose | Very tight | Minimal
Setup Effort | Low | Medium | Medium | High
Scalability | Limited | Good | Good (reads) | Excellent
Error Handling | In agent | Centralized | Manual | Via replay
Running Costs | Low | Medium-High | Low | Medium
Monitoring | Per connection | Centralized | Per connection | Centralized
Write Access | Yes (API) | Yes (API) | Not recommended | Yes (event)
Suitable from | 1-3 systems | 4+ systems | Legacy w/o API | Event-based arch.

In practice, at IJONIS we frequently combine patterns 1 and 2: direct API integration for the one or two most critical systems, middleware for everything else. Event-driven is added when real-time reactions to business events are required. We make this architectural decision as part of our AI automation projects.

Enterprise Systems in Detail: ERP, CRM, and PIM

Each system category has its own integration characteristics. The following overview shows the key systems and their preferred integration paths.

System-Specific Integration Approaches

System | Category | Preferred API | Auth Method | Real-Time Capability | AI Integration Depth
SAP S/4HANA | ERP | OData v4 / BTP APIs | OAuth 2.0, SAP Passport | Event Mesh (BTP) | High (Read + Write)
Microsoft Dynamics 365 | ERP/CRM | Dataverse Web API (OData v4) | Azure AD OAuth | Webhooks, Event Grid | High
Oracle ERP Cloud | ERP | REST API + SOAP | OAuth 2.0 | Oracle Integration Cloud | Medium-High
Salesforce | CRM | REST API, Bulk API, SOQL | OAuth 2.0 | Platform Events | High
HubSpot | CRM | REST API v3 | API Keys, OAuth | Webhooks | Medium-High
Microsoft Dynamics CRM | CRM | Dataverse Web API | Azure AD OAuth | Webhooks | High
Akeneo | PIM | REST API | OAuth 2.0 | Webhooks (Enterprise) | Medium
Pimcore | PIM | REST API + GraphQL | API Key, Bearer Token | Event Listeners | Medium
Salsify | PIM | REST API | API Key | Webhooks | Medium

ERP Integration: SAP, Microsoft Dynamics, and Oracle

The ERP is the heart of every enterprise IT landscape. Financial data, orders, inventory levels, and production data converge here. An AI integration must be able to read this data -- and in many cases write back to it.

SAP S/4HANA -- Recommended Integration Path:

SAP offers a comprehensive integration ecosystem through its Business Technology Platform (BTP). For AI agents, we recommend:

  • SAP OData Services for structured CRUD operations on business objects (orders, customers, material masters). Batch requests and delta queries enable efficient bulk operations.
  • SAP Event Mesh (BTP) for event-driven scenarios: The AI agent subscribes to business events (e.g., "Order created", "Goods receipt posted") and reacts in real time.
  • SAP Integration Suite as an iPaaS layer when connecting systems beyond SAP.

Practical example: An AI agent analyzes incoming orders in SAP, supplements missing data from the CRM (customer segment, credit limit), checks inventory in real time, and writes the complete order back via OData. When discrepancies arise (unknown customer, invalid article code), the agent escalates to the responsible clerk -- with a summary of the problem and a proposed solution.
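
A hedged sketch of the read side of this example via OData. The service and entity set follow SAP's standard sales order API (API_SALES_ORDER_SRV), but host, authentication, and filter values should be checked against your own service catalog; OData v4 services return a "value" array instead of "d.results".

```python
import requests

S4_HOST = "https://my-s4hana.example.com"           # illustrative host
SERVICE = "/sap/opu/odata/sap/API_SALES_ORDER_SRV"  # standard sales order service

def open_sales_orders(session: requests.Session) -> list[dict]:
    """Read sales orders via OData. The session is assumed to already carry
    OAuth 2.0 credentials; the filter value is illustrative."""
    resp = session.get(
        f"{S4_HOST}{SERVICE}/A_SalesOrder",
        params={
            "$filter": "OverallSDProcessStatus eq 'A'",
            "$select": "SalesOrder,SoldToParty,TotalNetAmount",
            "$top": "50",
            "$format": "json",
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["d"]["results"]   # OData v2 envelope; v4 uses "value"
```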

Microsoft Dynamics 365:

The Dataverse Web API (OData v4) is the primary integration point. Change Tracking enables delta synchronization without full data pulls. Azure Integration Services (Logic Apps, Service Bus, Event Grid) provide enterprise-grade integration for event-driven architectures. Power Platform connectors work for simple trigger-based workflows but are too limited for complex AI agent scenarios.

Oracle ERP Cloud:

Oracle offers REST and SOAP APIs. The Oracle Integration Cloud (OIC) serves as an iPaaS layer. For AI integrations, we recommend the REST API path with OAuth 2.0. Note: Oracle APIs are often less well-documented than SAP or Microsoft. Plan additional development time accordingly.

CRM Integration: Salesforce, HubSpot, and Microsoft Dynamics CRM

The CRM is the customer interface. AI agents integrated here can automate customer communication, optimize lead scoring, and accelerate sales processes.

Salesforce -- The De Facto Standard:

Salesforce offers the most mature API ecosystem in the CRM space:

  • REST API + SOQL: Standard CRUD on all Salesforce objects with a powerful query language. Bulk API for mass operations (up to 10,000 records per batch).
  • Platform Events: Real-time event streaming. The AI agent subscribes to events like "Lead created", "Opportunity stage changed" and reacts immediately.
  • Rate Limits: Typically 100,000 API calls/day on Enterprise Edition. For intensive AI usage, this is not always sufficient -- plan caching and batching strategies accordingly.

Practical example: An AI agent analyzes new leads in Salesforce, enriches them with external data (company profile via Clearbit, industry data, company size), calculates a lead score, and writes the score, recommended next steps, and a needs-analysis summary back into the CRM. Sales sees a prioritized dashboard in the morning instead of an unsorted lead list.
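
The write-back at the end of this flow is a single PATCH against the Lead record. A sketch; the custom field API names (Lead_Score__c, Next_Best_Action__c) are hypothetical and depend on your org's schema.

```python
import requests

def write_lead_score(instance_url: str, token: str, lead_id: str,
                     score: int, next_step: str) -> None:
    """PATCH the enriched score back into Salesforce.
    Custom field API names are placeholders for your org's schema."""
    resp = requests.patch(
        f"{instance_url}/services/data/v59.0/sobjects/Lead/{lead_id}",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        json={"Lead_Score__c": score, "Next_Best_Action__c": next_step},
        timeout=30,
    )
    resp.raise_for_status()  # Salesforce returns 204 No Content on success
```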

HubSpot:

HubSpot offers the lowest barrier to entry. The v3 API is well-documented, consistently structured, and equipped with generous rate limits (100 requests/10 sec on Professional). Webhooks enable real-time notifications. Custom Objects make HubSpot extensible for individual data models. Ideal for mid-market companies seeking rapid AI integration without enterprise-level complexity.

Microsoft Dynamics CRM:

Uses the same Dataverse Web API as Dynamics 365 ERP. The advantage: if you already run Dynamics 365 as your ERP, the CRM integration is essentially included at no additional cost. Azure AD OAuth provides seamless authentication within the Microsoft ecosystem.

PIM Integration: Product Data as AI Foundation

Product Information Management systems (PIM) are often the forgotten third player. Yet they contain the most structured and richest data in the enterprise: product descriptions, technical specifications, media, classifications, translations.

PIM-specific AI use cases:

  • Automatic product descriptions: AI agent reads technical specifications from the PIM and generates SEO-optimized descriptions for various channels (web shop, Amazon, print catalog)
  • Data quality checks: Agent identifies missing mandatory fields, inconsistent attribute values, and suggests corrections -- across thousands of products in minutes instead of weeks
  • Classification and tagging: Automatic assignment of products to categories based on attributes and descriptions
  • Cross-selling recommendations: Agent analyzes product relationships and suggests bundles or accessories

Akeneo offers the best API documentation in the PIM space (REST, JSON-based). Pimcore excels with flexibility through REST API + GraphQL and its open-source license. Salsify is cloud-native and strong in the e-commerce domain.
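
For the description use case, a minimal sketch against the Akeneo REST API: read a product's attribute values, generate a draft text, and write it back as a localized value. The attribute code, locale/scope combination, and the generate_description() placeholder are assumptions.

```python
import requests

AKENEO_URL = "https://pim.example.com"   # illustrative host

def generate_description(values: dict) -> str:
    """Placeholder for the actual LLM call that turns specs into copy."""
    return "Draft description based on: " + ", ".join(values.keys())

def draft_description(session: requests.Session, identifier: str) -> None:
    """Read a product's attribute values, generate a draft, write it back.
    The session is assumed to carry a valid OAuth 2.0 bearer token."""
    product = session.get(
        f"{AKENEO_URL}/api/rest/v1/products/{identifier}", timeout=30
    )
    product.raise_for_status()
    values = product.json().get("values", {})

    session.patch(
        f"{AKENEO_URL}/api/rest/v1/products/{identifier}",
        json={"values": {"description": [
            {"locale": "en_US", "scope": "ecommerce",
             "data": generate_description(values)}
        ]}},
        timeout=30,
    ).raise_for_status()
```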

Data Synchronization Strategies: Real-Time vs. Batch

The greatest technical challenge in AI integration is not the individual API connection -- it is the consistent synchronization of data across multiple systems. If the AI agent works with stale data, it makes wrong decisions. If it receives inconsistent data from different systems, it produces contradictory results.

Real-Time Synchronization (Change Data Capture)

Changes in source systems are automatically detected and propagated to the AI agent:

  • Database level: Tools like Debezium read database changelogs and stream changes via Kafka
  • API level: Webhooks or polling with delta tokens (e.g., Salesforce Change Data Capture, Dynamics Change Tracking); a generic polling sketch follows this list
  • Advantage: Near real-time, minimal load on source systems
  • Best for: Inventory data, prices, transactions -- data where freshness is business-critical
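
At the API level, delta synchronization usually boils down to the loop sketched below: ask the source system only for changes since the last stored token. Endpoint, parameter names, and the token store are generic placeholders rather than a specific vendor API.

```python
import time
import requests

DELTA_ENDPOINT = "https://erp.example.com/api/changes"   # placeholder endpoint
token_store = {"delta_token": None}                      # persist this in real setups

def poll_changes(session: requests.Session) -> list[dict]:
    """Fetch only records changed since the last sync and remember
    the new delta token for the next call."""
    resp = session.get(
        DELTA_ENDPOINT,
        params={"since": token_store["delta_token"]},
        timeout=60,
    )
    resp.raise_for_status()
    payload = resp.json()
    token_store["delta_token"] = payload["next_token"]
    return payload["changes"]

session = requests.Session()
while True:
    for change in poll_changes(session):
        print("propagate to agent:", change)
    time.sleep(30)   # near real-time without hammering the source system
```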

Batch Synchronization (Scheduled Sync)

Regular, planned synchronization at defined intervals:

  • Time-based: Nightly full syncs or hourly delta syncs
  • Tools: Apache Airflow, dbt, custom ETL pipelines (see the Airflow sketch after this list)
  • Advantage: Simple to implement, predictable load, easy to test
  • Best for: Master data, PIM data, reporting data -- data that changes infrequently
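
A scheduled sync is then just a small orchestration job. A minimal Airflow sketch for a nightly ERP-to-PIM price sync; the sync_prices body and the schedule are placeholders.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def sync_prices() -> None:
    """Placeholder: pull current prices from the ERP and push them
    into the PIM, e.g. via the APIs sketched earlier."""
    ...

with DAG(
    dag_id="nightly_erp_to_pim_price_sync",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",   # every night at 02:00
    catchup=False,
) as dag:
    PythonOperator(task_id="sync_prices", python_callable=sync_prices)
```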

Conflict Resolution in Bidirectional Synchronization

When the AI agent writes data to multiple systems, conflicts inevitably arise: What happens when a sales rep changes a customer address in the CRM while the AI agent simultaneously updates the same address from the ERP?

Proven conflict resolution strategies:

  • Last-Write-Wins: The most recent write operation wins. Simple but risky -- can overwrite valid changes.
  • Source-of-Truth principle: For each data type, there is a leading system. Address data comes from the ERP, lead scores from the CRM. Conflicts are automatically resolved in favor of the leading system.
  • Merge with manual escalation: On conflict, both versions are stored and a human decides. Safe but does not scale.
  • Versioning with Conflict-Free Replicated Data Types (CRDTs): Mathematically guaranteed conflict freedom. Complex to implement but ideal for highly automated environments.
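
The source-of-truth principle is straightforward to encode: for every field, one system wins. A minimal sketch; the field-to-system mapping is an example, not a recommendation for your data model.

```python
# Example ownership map: which system is leading for which field.
SOURCE_OF_TRUTH = {
    "billing_address": "erp",
    "credit_limit": "erp",
    "lead_score": "crm",
    "contact_email": "crm",
}

def resolve(field: str, values_by_system: dict[str, str]) -> str:
    """Resolve a conflicting field in favor of the leading system;
    escalate (here: raise) when no owner is defined."""
    owner = SOURCE_OF_TRUTH.get(field)
    if owner is None or owner not in values_by_system:
        raise ValueError(f"No source of truth for '{field}' -- escalate to a human")
    return values_by_system[owner]

# Usage: the ERP and CRM disagree about the billing address; the ERP wins.
resolved = resolve("billing_address", {"erp": "Main St 1", "crm": "Old Rd 9"})
```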

Synchronization Strategy by Data Type

Data Type | Recommended Strategy | Update Frequency | Conflict Resolution
Inventory levels | CDC / Real-time | Seconds to minutes | Source-of-Truth (ERP)
Price data | CDC | Minutes to hours | Source-of-Truth (ERP/PIM)
Customer master data | Scheduled sync | Daily | Merge + escalation
Product data (PIM) | Scheduled sync | Daily or on release | Source-of-Truth (PIM)
Transaction data | CDC / Real-time | Real-time | Last-Write-Wins
Lead scores | On-demand | On calculation | Source-of-Truth (AI)
Reporting data | Scheduled sync | Nightly | N/A (read-only)

A solid data infrastructure is the prerequisite for every one of these strategies. Without clean pipelines and data quality checks, synchronization becomes guesswork.

Security in System Integrations

Integrating AI into existing systems multiplies the attack surface. Every API connection is a potential entry point. Every data transfer a risk. Security must be designed in from the start -- not as a retroactive audit.

Authentication and Authorization

  • OAuth 2.0 with Client Credentials for system-to-system communication. No human credentials in agent configurations.
  • API keys only as a fallback for systems that do not support OAuth. Rotate keys, store them in secrets managers (Azure Key Vault, AWS Secrets Manager, HashiCorp Vault).
  • mTLS (Mutual TLS) for highly sensitive connections: Both sides authenticate each other via certificates. Standard in banking and insurance, increasingly adopted in mid-market enterprises.
  • Principle of Least Privilege: The AI agent receives only the permissions it needs for its specific task. No admin access, no wildcard permissions.
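
In practice this means the agent pulls its client secret from a secrets manager at runtime and exchanges it for a short-lived token. A sketch using Azure Key Vault and a generic OAuth 2.0 client-credentials request; vault URL, secret name, and token URL are placeholders.

```python
import requests
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# No credentials in the agent's config -- they live in the secrets manager.
vault = SecretClient(
    vault_url="https://my-vault.vault.azure.net",
    credential=DefaultAzureCredential(),
)
client_secret = vault.get_secret("crm-agent-client-secret").value

def get_access_token(token_url: str, client_id: str) -> str:
    """OAuth 2.0 client-credentials flow for system-to-system calls."""
    resp = requests.post(
        token_url,
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```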

Network Segmentation

  • VPN or Private Link for connections between cloud AI and on-premise systems (e.g., SAP running on your own servers)
  • API Gateway as a central entry point with rate limiting, IP whitelisting, and request validation
  • Network policies: Isolate AI agents in their own network segments. Only explicitly allowed connections to target systems.
  • Zero Trust Architecture: Every request is authenticated and authorized -- even within the internal network.

Data Protection and GDPR Compliance

  • Data in transit: TLS 1.3 for all API connections. No exceptions.
  • Data at rest: Encryption of all intermediate stores and caches used by the AI agent.
  • Logging: No personally identifiable information (PII) in plain text in logs. Use pseudonymization or tokenization.
  • Data processing agreements with all involved cloud services (iPaaS, LLM provider, hosting).
  • Update the processing register (Art. 30 GDPR) -- every new AI integration is a new processing purpose.
  • Data protection impact assessment (Art. 35 GDPR) when the agent makes automated decisions about individuals.

Real-World Integration Scenarios

Scenario 1: Automated Order Processing (ERP + CRM)

Problem: Incoming customer orders via email must be manually transferred into the ERP. Customer data is looked up in the CRM, article data checked in the ERP, availability verified. A clerk needs 15-20 minutes per order.

Solution: An AI agent receives orders (email, web form), extracts structured data, matches the customer automatically in the CRM (Salesforce REST API), checks article availability in the ERP (SAP OData), and writes the order directly into the ERP when all checks pass. When issues arise (unknown customer, missing article, credit limit exceeded), it escalates with a complete analysis.

Architecture: API-First for SAP and Salesforce, middleware (n8n) for email ingestion and orchestration.

Result: Processing time per order reduced from 15 minutes to 30 seconds. 85% of orders fully automated.

Scenario 2: Intelligent Lead Routing (CRM + PIM + External Data)

Problem: New leads in HubSpot are distributed to sales reps based on simple rules (region, company size). The rules do not consider which products the lead inquired about, how well that matches the sales rep's portfolio, or whether the lead belongs to an existing account.

Solution: An AI agent analyzes each new lead: It reads product interests from the HubSpot inquiry, matches them against the product catalog in the PIM (Akeneo), checks via CRM whether the lead belongs to an existing customer, and routes the lead to the sales rep with the highest probability of closing.

Architecture: HubSpot webhooks trigger the agent, API-First for Akeneo and HubSpot.

Result: Lead response time reduced from 4 hours to 12 minutes. Conversion rate +23% through better matching.

Scenario 3: Automated Product Data Maintenance (PIM + ERP + Web Shop)

Problem: 12,000 products in Pimcore PIM, of which 3,000 lack complete descriptions, 800 have inconsistent attributes, and 200 have outdated prices from the ERP.

Solution: An AI agent synchronizes prices from the ERP (scheduled sync, daily), checks all product records for completeness and consistency, generates missing descriptions based on technical specifications, and writes the results back into the PIM as drafts. An editor reviews and approves.

Architecture: Batch sync for price data (ERP -> PIM), API-First for Pimcore read/write operations.

Result: Product data quality improved from 67% to 94%. Effort for data maintenance reduced from three full-time equivalents to half a position.

The Integration Roadmap: From Analysis to Production

Based on our project experience at IJONIS, we recommend a structured roadmap in five phases:

Phase 1: System Landscape Analysis (1-2 Weeks)

  • Inventory of all relevant systems and their interfaces
  • Assessment of API quality (documentation, stability, rate limits)
  • Identification of critical data flows and process breakpoints
  • Mapping of data models between systems
  • Assessment of data quality in source systems

Phase 2: Architecture Decision (1 Week)

  • Selection of integration pattern (API, middleware, event-driven, hybrid)
  • Definition of synchronization strategy per data type
  • Security concept and GDPR assessment
  • Definition of the technology stack
  • Build-or-buy decision for connectors

Phase 3: Connector Development (3-6 Weeks)

  • Implementation of API integrations with error handling
  • Data mapping and transformation logic
  • Retry mechanisms, circuit breakers, dead letter queues (a backoff sketch follows this list)
  • Unit and integration tests against sandbox environments
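
One building block for this is retry with exponential backoff, sketched below; circuit breakers and dead letter queues would sit on top of the same call path.

```python
import time
import requests

def call_with_retry(url: str, payload: dict, max_attempts: int = 4) -> dict:
    """Retry transient failures (network errors, 5xx) with exponential
    backoff; 4xx errors surface immediately because retrying won't help."""
    for attempt in range(max_attempts):
        try:
            resp = requests.post(url, json=payload, timeout=30)
        except (requests.ConnectionError, requests.Timeout):
            if attempt == max_attempts - 1:
                raise                     # from here: dead letter queue
            time.sleep(2 ** attempt)      # 1s, 2s, 4s, ...
            continue
        if resp.status_code >= 500 and attempt < max_attempts - 1:
            time.sleep(2 ** attempt)      # transient server error: back off
            continue
        resp.raise_for_status()
        return resp.json()
```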

Phase 4: Agent Integration (2-4 Weeks)

  • Create tool definitions for the AI agent (a sketch follows this list)
  • Enrich agent prompts with system knowledge and business rules
  • End-to-end tests with real data (read-only first)
  • Human-in-the-loop mechanisms for critical write operations
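
A tool definition in this sense is a machine-readable description of one narrowly scoped system capability the agent may call. The sketch uses an OpenAI-style function-calling schema; the tool name, parameters, and stub implementation are illustrative.

```python
# One tool = one well-scoped system capability with an explicit schema.
CHECK_STOCK_TOOL = {
    "type": "function",
    "function": {
        "name": "check_stock",
        "description": "Return the current available stock for an article "
                       "from the ERP (read-only).",
        "parameters": {
            "type": "object",
            "properties": {
                "article_no": {"type": "string", "description": "ERP article number"},
                "plant": {"type": "string", "description": "Plant / warehouse code"},
            },
            "required": ["article_no"],
        },
    },
}

def check_stock(article_no: str, plant: str = "1000") -> dict:
    """The implementation the agent runtime dispatches to, e.g. the OData
    read sketched earlier. Here: a stub with a dummy value."""
    return {"article_no": article_no, "plant": plant, "available": 42}
```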

Phase 5: Production Deployment and Monitoring (Ongoing)

  • Gradual rollout (pilot department, then expansion)
  • Monitoring dashboard for all integrations
  • Feedback loops between business departments and AI team
  • Continuous optimization based on metrics

We apply this same roadmap in our process automation with AI work -- integration into existing systems is the critical success factor there.

FAQ: AI Integration into Existing IT Systems

Do I need to replace my existing ERP to use AI?

No. AI integration works as an additional layer on top of your existing systems. Neither SAP, Dynamics, nor your custom ERP needs to be replaced or fundamentally changed. The AI agent uses the existing APIs and interfaces. In most cases, no changes to the source system are required -- this is the decisive advantage of the API-first approach.

How long does a typical AI integration take?

From analysis to productive integration, we at IJONIS estimate 8-14 weeks. A proof-of-concept with a single system (e.g., CRM integration) is often achievable in 4-6 weeks. Complexity increases with the number of systems and depth of integration. Bidirectional synchronization with conflict resolution takes longer than read-only integrations.

Which system landscapes are best suited for AI integration?

Enterprises with clearly defined business processes that span multiple systems and require manual data transfers. Typical candidates: order processing (ERP + CRM), product data management (PIM + shop + marketplaces), customer service (CRM + knowledge base + ticket system). The more manual steps between systems, the higher the automation leverage.

What are the most common technical hurdles?

Three problems dominate: (1) Outdated or poorly documented APIs in legacy systems -- the direct database approach often helps as a fallback here. (2) Inconsistent data models between systems, e.g., different customer IDs in ERP and CRM -- solvable through centralized ID mapping. (3) Rate limits that throttle the AI agent's throughput -- solvable through caching, batch operations, and intelligent request scheduling.

How do I ensure the integration is GDPR-compliant?

Four measures are critical: First, establish data processing agreements with all involved cloud services. Second, process personal data only for defined purposes and minimally -- the AI agent should receive only the data it needs for its task. Third, ensure complete logging with pseudonymized data for the audit trail. Fourth, conduct a data protection impact assessment (DPIA) when the agent makes automated decisions about individuals.

Conclusion: Integration Is the Key to Productive AI

The most powerful AI is useless if it is cut off from your company's data and processes. Integration into existing IT systems -- ERP, CRM, PIM -- is not an optional extra. It is the foundation. Without it, every AI project remains an isolated experiment.

The good news: With the right architecture, proven integration patterns, and a structured approach, integration is manageable. You do not need to replace your existing systems or execute a big-bang migration. An incremental approach -- system by system, process by process -- delivers results faster and minimizes risk.

The critical success factor is not choosing between API-first and middleware. It is the careful analysis of your system landscape, the definition of clear data flows, and a security concept that is in place from day one.

Want to integrate AI agents into your existing IT landscape? Talk to us about your integration strategy -- we analyze your system landscape and develop a concrete roadmap for productive AI integration.

