AI Client
Quick Start
Connect your AI client to OpenAgent Core in two ways — standard direct tool calls or the recommended code execution approach.
Standard Connection
Direct tool calls via the standard MCP protocol. Point your client at the endpoint and start calling tools immediately.
Endpoint
https://your-mcp-server.workers.dev/mcp
Authentication
Authorization: Bearer <TENANT_API_KEY>
Claude Desktop Configuration
Add the following to your Claude Desktop claude_desktop_config.json:
{
  "mcpServers": {
    "openagent-core": {
      "url": "https://your-mcp-server.workers.dev/mcp",
      "headers": {
        "Authorization": "Bearer sk_live_your_tenant_api_key"
      }
    }
  }
}
The client loads all 34+ tool definitions via tools/list, then invokes them with tools/call.
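For reference, the same flow can be exercised outside Claude Desktop by POSTing JSON-RPC messages to the endpoint. This is a minimal sketch assuming the server accepts MCP's Streamable HTTP transport; the MCP initialize handshake is omitted for brevity, and the tool name get_transactions and its arguments are illustrative, not the server's actual tool catalog.
// Illustrative only: list the available tools, then call one.
// Assumes the server implements MCP's Streamable HTTP transport;
// a real client performs the initialize handshake first.
const endpoint = "https://your-mcp-server.workers.dev/mcp";
const headers = {
  "Content-Type": "application/json",
  Accept: "application/json, text/event-stream",
  Authorization: "Bearer sk_live_your_tenant_api_key",
};

// tools/list returns every tool definition the server exposes.
const listResponse = await fetch(endpoint, {
  method: "POST",
  headers,
  body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "tools/list", params: {} }),
});

// tools/call invokes a single tool by name.
// "get_transactions" is a hypothetical tool name used for illustration.
const callResponse = await fetch(endpoint, {
  method: "POST",
  headers,
  body: JSON.stringify({
    jsonrpc: "2.0",
    id: 2,
    method: "tools/call",
    params: { name: "get_transactions", arguments: { account_id: "acct_123", limit: 50 } },
  }),
});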
Limitations of this approach
Context Overflow
All tool definitions load into the agent's context upfront, consuming hundreds of thousands of tokens before any work begins.
Expensive Data Flow
Large datasets — transactions, account histories — pass through the model multiple times, increasing cost and latency.
Poor Scaling
As more tools and data are used, token usage and response times grow quickly.
Code Execution
The production-ready approach for large-scale financial data workflows with the MCP server.
What Is Code Execution?
Instead of exposing every tool as a direct tool call, you present the MCP server as a code API. The agent writes code (e.g., TypeScript) that calls the OpenAgent Core tools, loading only what it needs and processing data in the execution environment before returning results.
Tools are discovered on-demand by exploring a file tree, and large datasets are filtered or aggregated in code — never passed wholesale through the model.
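To make the pattern concrete, here is a sketch of the kind of script an agent might write in the execution environment. The file layout under ./servers/, the getTransactions import, and the transaction fields are hypothetical names used for illustration; the point is that the raw transaction list stays in the sandbox, and only the compact summary reaches the model.
// Hypothetical wrapper generated from the MCP server's tool definitions.
// Only the one tool this task needs is imported, not all 34+.
import { getTransactions } from "./servers/openagent-core/getTransactions";

// Pull a large dataset into the execution environment, not the model's context.
const transactions = await getTransactions({ accountId: "acct_123", year: 2024 });

// Filter and aggregate in code: only the small result is returned to the model.
const byCategory = new Map<string, number>();
for (const tx of transactions) {
  if (tx.amount >= 0) continue; // keep spending only
  byCategory.set(tx.category, (byCategory.get(tx.category) ?? 0) + Math.abs(tx.amount));
}

const topCategories = [...byCategory.entries()]
  .sort((a, b) => b[1] - a[1])
  .slice(0, 5);

console.log(JSON.stringify({ totalTransactions: transactions.length, topCategories }));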
Key Advantages
~98% token reduction
Load only the tools you need, not all 34+
Lower latency
Fewer model round-trips, faster responses
Handles large financial datasets
Transactions filtered and aggregated in code
On-demand tool discovery
Explore a file tree, read only what's relevant (see the sketch after this list)
Production-ready scaling
Token usage stays flat as tools and data grow
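As a sketch of what on-demand discovery looks like, the agent can list the generated file tree and read only the tool definitions relevant to the current task. The directory layout and file names below are hypothetical and depend on how the code API is generated.
import { readdir, readFile } from "node:fs/promises";

// Hypothetical layout: one file per tool, generated from the MCP server's schema.
// ./servers/openagent-core/
//   getTransactions.ts
//   getAccountBalance.ts
//   ...
const toolFiles = await readdir("./servers/openagent-core");

// Scan names first (cheap), then read only the definitions that matter.
const relevant = toolFiles.filter((name) => name.toLowerCase().includes("transaction"));
for (const name of relevant) {
  const definition = await readFile(`./servers/openagent-core/${name}`, "utf8");
  console.log(definition); // only this tool's interface enters the context
}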