LLMs Meet API Catalogs: A Practical Path to Intelligent Integration
The core idea had been brewing in my mind for a while, but it was brought to life during a recent APIDAYS workshop. The session was led by a banking AI team that had integrated its API catalog with its LLMs. They adopted this approach in the absence of an enterprise MCP catalog, which has only recently become available.
Large enterprises often maintain an API catalog—a centralised registry of their REST, SOAP, and event-driven interfaces. Traditionally, these catalogs have been used for governance, discovery, and compliance.
But with the rise of Large Language Models (LLMs), a new integration pattern is emerging: rather than relying on a third-party MCP-style catalog, firms are linking LLMs directly to their existing API catalogs. This lets developers, analysts, and even business users query and explore APIs, and auto-generate integration flows, in natural language.
Why This Matters
Firms already have API catalogs (Apigee, Kong, MuleSoft Anypoint, or custom)
LLMs can parse OpenAPI specs and understand endpoints, parameters, and response schemas
Integration scenarios accelerate: instead of browsing the catalog manually, users can ask “How do I fetch open invoices for a given customer?” and get back the correct API and sample code
Integration Approach
Extracting APIs from the Catalog
Use Exchange APIs (e.g., MuleSoft Anypoint Exchange REST API) to pull OpenAPI specs
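A rough sketch of this step is shown below, assuming the Anypoint Exchange Experience API. The endpoint path, query parameters, and the `classifier`/`externalLink` fields are assumptions to verify against your own Exchange documentation.

```python
# Sketch: list REST API assets in Anypoint Exchange and download their OpenAPI specs.
# Endpoint paths, query parameters, and asset file fields are assumptions based on
# the public Exchange Experience API; verify them before relying on this code.
import requests
import yaml  # pip install pyyaml, for specs published as YAML

ANYPOINT_BASE = "https://anypoint.mulesoft.com"

def list_rest_api_assets(token: str, organization_id: str) -> list:
    """Return the REST API assets published in the organisation's Exchange."""
    resp = requests.get(
        f"{ANYPOINT_BASE}/exchange/api/v2/assets",
        headers={"Authorization": f"Bearer {token}"},
        params={"organizationId": organization_id, "types": "rest-api"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

def download_openapi_spec(asset: dict):
    """Fetch the OpenAPI document attached to an asset, if one is published."""
    for file in asset.get("files", []):
        if file.get("classifier", "").startswith("oas") and file.get("externalLink"):
            raw = requests.get(file["externalLink"], timeout=30)
            raw.raise_for_status()
            try:
                return raw.json()                # spec published as JSON
            except ValueError:
                return yaml.safe_load(raw.text)  # spec published as YAML
    return None
```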
Normalizing API Specs
Convert to a standard JSON schema (endpoint, verb, input, output, auth)
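A minimal normalization pass, assuming OpenAPI 3.x documents, could flatten each operation into one record per endpoint/verb pair (the field names are illustrative):

```python
# Sketch: flatten an OpenAPI 3.x document into simple (endpoint, verb, input, output, auth)
# records that are easy to index, embed, and show to an LLM.
def normalize_openapi(spec: dict, api_name: str) -> list:
    http_verbs = {"get", "post", "put", "patch", "delete"}
    # Top-level security requirements name the auth schemes (e.g. OAuth2, apiKey).
    auth_schemes = [name for req in spec.get("security", []) for name in req]
    records = []
    for path, operations in spec.get("paths", {}).items():
        for verb, op in operations.items():
            if verb not in http_verbs:
                continue  # skip path-level keys such as "parameters" or "summary"
            records.append({
                "api": api_name,
                "endpoint": path,
                "verb": verb.upper(),
                "summary": op.get("summary") or op.get("description", ""),
                "input": [p.get("name") for p in op.get("parameters", [])],
                "output": list(op.get("responses", {}).keys()),
                "auth": auth_schemes,
            })
    return records
```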
Registering with an LLM
Store in a vector DB or structured index
Feed to an LLM so it can answer API-related queries in natural language
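The sketch below keeps the index in memory so the idea stays self-contained; a production setup would normally store the vectors in a vector database, and `embed_text` is a placeholder for whatever embedding model your platform exposes.

```python
# Sketch: embed the normalized records and index them for natural-language lookup.
# embed_text() is a placeholder for your embedding model; in production the vectors
# would live in a vector DB (Chroma, pgvector, OpenSearch, ...), not an in-memory list.
import numpy as np

def embed_text(text: str) -> np.ndarray:
    raise NotImplementedError("call your organisation's embedding model here")

class ApiIndex:
    def __init__(self) -> None:
        self.records: list = []
        self.vectors: list = []

    def add(self, record: dict) -> None:
        text = f'{record["verb"]} {record["endpoint"]} {record["summary"]}'
        self.records.append(record)
        self.vectors.append(embed_text(text))

    def search(self, question: str, k: int = 3) -> list:
        """Return the k records whose descriptions are closest to the question."""
        q = embed_text(question)
        scores = [
            float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))
            for v in self.vectors
        ]
        ranked = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
        return [self.records[i] for i in ranked[:k]]
```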
Generating Code
LLM returns pseudocode, SDK snippets, or workflow definitions
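Code generation then reduces to grounding the model on the retrieved records. The prompt format below is purely illustrative, and the actual call to your approved chat-completion endpoint is omitted.

```python
# Sketch: build a grounded prompt that asks the LLM for a code snippet against
# the retrieved APIs. Send the result to whichever LLM endpoint you have approved.
import json

def build_codegen_prompt(question: str, matches: list, language: str = "Java") -> str:
    catalog_context = json.dumps(matches, indent=2)
    return (
        "You are an integration assistant. Using ONLY the APIs listed below, "
        f"answer the question and produce a short {language} snippet that calls "
        "the correct endpoint.\n\n"
        f"APIs:\n{catalog_context}\n\n"
        f"Question: {question}\n"
    )
```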
Usage
Developer Co-pilot: Ask “How do I retrieve customer order history?” → LLM returns the right API and a Java snippet (see the sketch after this list)
Business Analyst Querying: Ask in plain English → Get the API name, description, and usage instructions
Workflow Generation: LLM suggests orchestration across multiple APIs (CRM + Payments + Notifications)
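Putting the earlier sketches together, the developer co-pilot flow could be wired up as follows. Every helper name comes from the illustrative code above, not from a real SDK.

```python
# Usage sketch: developer co-pilot query over the indexed catalog.
token, org_id = "<anypoint-token>", "<organisation-id>"

index = ApiIndex()
for asset in list_rest_api_assets(token, org_id):
    spec = download_openapi_spec(asset)
    if spec:
        for record in normalize_openapi(spec, asset.get("name", "unknown-api")):
            index.add(record)

question = "How do I retrieve customer order history?"
prompt = build_codegen_prompt(question, index.search(question, k=3), language="Java")
# Send `prompt` to your approved LLM; the reply names the right API and includes a snippet.
```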
Example
The gist below shows sample registry code for an organisation's MuleSoft Exchange APIs.
Comparison: LLM + API Catalog vs MCP Catalog
| Aspect | MCP Catalog | LLM + API Catalog |
| --- | --- | --- |
| Setup & Infra | Requires separate MCP servers and new infrastructure | Reuses existing API catalog infrastructure |
| Dependencies | Third-party protocol, new runtime components | In-house development, minimal new stack |
| Security | Additional attack surface (tool poisoning, remote servers) | Security model already governed in the catalog |
| Flexibility | Rich primitives (resources, prompts, tools) | Focused on APIs, pragmatic for enterprises |
| Time to Value | Longer: requires MCP integration and governance | Faster: immediate head start with current assets |
| Control | Vendor-ecosystem driven | Fully enterprise-owned and customizable |
Verdict
| Dimension | Verdict |
| --- | --- |
| Cost & Ownership | LLM + API Catalog |
| Security & Control | LLM + API Catalog |
| AI Readiness | Both viable, but MCP richer in primitives |
| Time-to-Value | LLM + API Catalog |
Final Thoughts
If your organisation already maintains an API catalog, integrating it with an LLM offers a cost-effective, secure, and fast-start approach. MCP may bring richer primitives in the long term, but for most firms, leveraging existing assets delivers quicker ROI and avoids unnecessary dependencies.