Volcano SDK, a TypeScript SDK for Multi-Provider AI Agents




Build agents that chain LLM reasoning with MCP tools. Mix OpenAI, Claude, and Mistral in one workflow. Parallel execution, branching, loops. Native retries, streaming, and typed errors.

📚 Read the full documentation at volcano.dev →

⚡️ Chainable API

Chain steps with .then() and .run(). Promise-like syntax for building multi-step workflows.

✨ Automatic Tool Selection

LLM automatically selects and calls appropriate MCP tools based on the prompt. No manual routing required.

🔧 100s of Models

OpenAI, Anthropic, Mistral, Llama, Bedrock, Vertex, Azure. Switch providers per-step or use globally.

🛡️ TypeScript-First

Full TypeScript support with type inference and IntelliSense for all APIs.

🔄 Advanced Patterns

Parallel execution, conditional branching, loops, and sub-agent composition for complex workflows.
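The fan-out/branch pattern behind this feature can be sketched in plain TypeScript (this is an illustrative sketch of the pattern, not Volcano SDK's actual API; `parallel` and `demo` are made-up names):

```typescript
// Generic sketch: run independent steps in parallel, then branch on
// the combined results. Not the SDK's real API.
type Step<T> = () => Promise<T>;

async function parallel<T>(steps: Step<T>[]): Promise<T[]> {
  return Promise.all(steps.map((step) => step()));
}

async function demo(): Promise<string> {
  const [sentiment, wordCount] = await parallel<string | number>([
    async () => "positive", // stand-in for a sentiment-analysis step
    async () => 42,         // stand-in for a word-count step
  ]);
  // Conditional branching on the step results
  return sentiment === "positive" && (wordCount as number) > 10
    ? "publish"
    : "revise";
}
```

A real workflow would replace the stand-in steps with LLM or tool calls; the shape of the control flow stays the same.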

⏱️ Retries & Timeouts

Three retry strategies: immediate, delayed, and exponential backoff. Per-step timeout configuration.
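The three strategies differ only in how long they wait between attempts. A minimal sketch of that logic in plain TypeScript (illustrative only; `withRetry` is a made-up helper, not the SDK's internal implementation):

```typescript
// Sketch of immediate, delayed, and exponential-backoff retries.
type Strategy = "immediate" | "delayed" | "exponential";

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function withRetry<T>(
  fn: () => Promise<T>,
  strategy: Strategy,
  attempts = 3,
  baseDelayMs = 100
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === attempts - 1) break; // out of attempts
      if (strategy === "delayed") await sleep(baseDelayMs);
      if (strategy === "exponential") await sleep(baseDelayMs * 2 ** attempt);
      // "immediate" retries with no delay at all
    }
  }
  throw lastError;
}
```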

📡 Streaming Workflows

Stream step results as they complete using async generators. Perfect for real-time UIs and long-running tasks.
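The underlying pattern is an async generator that yields each step's output as soon as it is ready, consumed with `for await...of`. A self-contained sketch (the general pattern, not Volcano SDK's exact streaming API; `runStream` is a made-up name):

```typescript
// Yield each step's result as it completes, rather than waiting for
// the whole workflow to finish.
async function* runStream(prompts: string[]): AsyncGenerator<string> {
  for (const prompt of prompts) {
    // A real workflow would call an LLM here; we fake the output.
    yield `result for: ${prompt}`;
  }
}

async function collect(): Promise<string[]> {
  const seen: string[] = [];
  for await (const chunk of runStream(["step 1", "step 2"])) {
    seen.push(chunk); // a real-time UI would render each chunk here
  }
  return seen;
}
```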

🎯 MCP Integration

Native Model Context Protocol support with connection pooling, tool discovery, and authentication.

🧩 Sub-Agent Composition

Build reusable agent components and compose them into larger workflows. Modular and testable.

📊 OpenTelemetry Observability

Production-ready distributed tracing and metrics. Monitor performance and debug failures. Export to Jaeger, Prometheus, Datadog, or New Relic.

🔐 MCP OAuth Authentication

OAuth 2.1 and Bearer token authentication per MCP specification. Agent-level or handle-level configuration with automatic token refresh.

⚡ Performance Optimized

Intelligent connection pooling for MCP servers, tool discovery caching with TTL, and JSON schema validation for reliability.
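A tool-discovery cache with TTL can be as small as a map of entries with expiry timestamps. A minimal sketch (illustrative only; the class and method names are assumptions, not the SDK's internals):

```typescript
// Minimal TTL cache: entries expire ttlMs after being set, and stale
// entries are evicted lazily on read.
class TtlCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();
  constructor(private ttlMs: number) {}

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() >= entry.expiresAt) {
      this.store.delete(key); // expired: drop it and report a miss
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}
```

Caching discovery results this way avoids re-listing tools on every step while still picking up changes once the TTL elapses.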

Explore all features →

Install with `npm install volcano-sdk`. That's it! The package includes MCP support and all common LLM providers (OpenAI, Anthropic, Mistral, Llama, Vertex).

View installation guide →

```typescript
import { agent, llmOpenAI, mcp } from "volcano-sdk";

const llm = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY!, model: "gpt-4o-mini" });
const astro = mcp("http://localhost:3211/mcp");

const results = await agent({ llm })
  .then({
    prompt: "Find the astrological sign for birthdate 1993-07-11",
    mcps: [astro], // Automatic tool selection
  })
  .then({ prompt: "Write a one-line fortune for that sign" })
  .run();

console.log(results[1].llmOutput); // Output: "Fortune based on the astrological sign"
```
```typescript
import { agent, llmOpenAI, llmAnthropic, llmMistral } from "volcano-sdk";

const gpt = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY! });
const claude = llmAnthropic({ apiKey: process.env.ANTHROPIC_API_KEY! });
const mistral = llmMistral({ apiKey: process.env.MISTRAL_API_KEY! });

// Use different LLMs for different steps
await agent()
  .then({ llm: gpt, prompt: "Extract data from report" })
  .then({ llm: claude, prompt: "Analyze for patterns" })
  .then({ llm: mistral, prompt: "Write creative summary" })
  .run();
```

View more examples →

We welcome contributions! Please see our Contributing Guide for details.

Questions or Feature Requests?

Apache 2.0 - see LICENSE file for details.
