Show HN: MCP Kit – a toolkit for building, mocking and optimizing AI agents
MCP tooling for developing and optimizing multi-agent AI systems
A comprehensive toolkit for working with the Model Context Protocol (MCP), providing seamless integration between AI agents and various data sources, APIs, and services. Whether you're building, testing, or deploying multi-agent systems, MCP Kit simplifies tool orchestration and provides powerful mocking capabilities for development.
Targets
MCP Servers: Connect to existing MCP servers (hosted or from specifications)
OpenAPI Integration: Automatically convert REST APIs to MCP tools using OpenAPI/Swagger specs
Mock Responses: Generate realistic test data using LLM-based or random generators
Multiplexing: Combine multiple targets into a unified interface
Adapters
OpenAI Agents SDK: Native integration with OpenAI's agent framework
LangGraph: Seamless tool integration for LangGraph workflows
Generic Client Sessions: Direct MCP protocol communication
Official MCP Server: Standard MCP server wrapper
Configuration-Driven Architecture
YAML/JSON Configuration: Declarative setup for complex workflows
Response Generators
LLM-Powered Mocking: Generate contextually appropriate responses using LLMs
Random Data Generation: Create test data for development and testing
Custom Generators: Implement your own response generation logic
First, you write the proxy config:
# proxy_config.yaml""" A mocked REST API target given the OpenAPI spec using LLM-generated responses"""target:
type: mockedbase_target:
type: oasname: base-oas-serverspec_url: https://petstore3.swagger.io/api/v3/openapi.jsonresponse_generator:
type: llmmodel: openai/gpt-4.1-nano
Don't forget to set up the LLM API key:
# .env
OPENAI_API_KEY="your_openai_key"
Then we can use it like any other MCP server:
# main.py
from mcp_kit import ProxyMCP

async def main():
    # Create proxy from configuration
    proxy = ProxyMCP.from_config("proxy_config.yaml")

    # Use with MCP client session adapter
    async with proxy.client_session_adapter() as session:
        tools = await session.list_tools()
        result = await session.call_tool("get_pet", {"pet_id": "777"})
        print(result)

if __name__ == "__main__":
    import asyncio

    asyncio.run(main())
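Because the target is mocked, the call_tool("get_pet", {"pet_id": "777"}) call never reaches the real Petstore API; the response should be an LLM-generated payload shaped by the OpenAPI spec.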
Targets are the core abstraction in MCP Kit, representing the different kinds of tool providers listed above: MCP servers, OpenAPI specifications, mocked targets, and multiplexed combinations of other targets.
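For example, a multiplexed target could put a live MCP server and the mocked Petstore target behind one interface. The sketch below is illustrative only: the multiplex and mcp type names and the targets/url fields are assumptions, since the config shown above only demonstrates the oas, mocked, and llm types.

# multiplex_config.yaml (illustrative sketch; type and field names are assumed)
target:
  type: multiplex          # assumed name for the multiplexing target
  name: combined-tools
  targets:                 # assumed field: the targets to combine
    - type: mcp            # assumed name for a hosted MCP server target
      name: live-server
      url: http://localhost:8080/mcp
    - type: mocked
      base_target:
        type: oas
        name: base-oas-server
        spec_url: https://petstore3.swagger.io/api/v3/openapi.json
      response_generator:
        type: llm
        model: openai/gpt-4.1-nano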
Adapters provide framework-specific interfaces for your targets:
Client Session Adapter: Direct MCP protocol communication
OpenAI Agents Adapter: Integration with OpenAI's agent framework
LangGraph Adapter: Tools for LangGraph workflows
Official MCP Server: Standard MCP server wrapper
Response Generators create mock responses for testing and development:
LLM Generator: Uses language models to generate contextually appropriate responses
Random Generator: Creates random test data
Custom Generators: Implement your own logic
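For instance, switching the mocked Petstore target to random data should only require changing the generator in proxy_config.yaml (a sketch; the random type name is an assumption mirroring type: llm above):

# proxy_config.yaml -- random data instead of LLM-generated responses
target:
  type: mocked
  base_target:
    type: oas
    name: base-oas-server
    spec_url: https://petstore3.swagger.io/api/v3/openapi.json
  response_generator:
    type: random   # assumed type name; needs no API key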
OpenAI Agents SDK Integration
from mcp_kit import ProxyMCP
from agents import Agent, Runner, trace
import asyncio

async def openai_example():
    proxy = ProxyMCP.from_config("proxy_config.yaml")

    async with proxy.openai_agents_mcp_server() as mcp_server:
        # Use with OpenAI Agents SDK
        agent = Agent(
            name="research_agent",
            instructions="You are a research assistant with access to various tools.",
            model="gpt-4.1-nano",
            mcp_servers=[mcp_server],
        )
        response = await Runner.run(
            agent,
            "What's the weather like in San Francisco?",
        )
        print(response.final_output)

if __name__ == "__main__":
    asyncio.run(openai_example())
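This example needs the OpenAI Agents SDK (pip install openai-agents) alongside mcp-kit; the proxy is handed to the agent as a regular MCP server through mcp_servers, so the agent treats mocked tools exactly like real ones.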
LangGraph Workflow Integration
from mcp_kit import ProxyMCP
from langgraph.prebuilt import create_react_agent
import asyncio

async def langgraph_example():
    proxy = ProxyMCP.from_config("proxy_config.yaml")

    # Get LangChain-compatible tools
    client = proxy.langgraph_multi_server_mcp_client()

    async with client.session("your_server_name") as _:
        # Get the MCP tools as LangChain tools
        tools = await client.get_tools(server_name="your_server_name")

        # Create ReAct agent
        agent = create_react_agent(model="google_genai:gemini-2.0-flash", tools=tools)

        # Run workflow
        response = await agent.ainvoke({
            "messages": [{"role": "user", "content": "Analyze Q1 expenses"}]
        })

        # Extract result
        final_message = response["messages"][-1]
        print(final_message.content)

if __name__ == "__main__":
    asyncio.run(langgraph_example())
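The multi-server client here appears to follow the MultiServerMCPClient pattern from langchain-mcp-adapters: you open a session for a named server (the "your_server_name" placeholder must match a server defined by your proxy) and get_tools() returns plain LangChain tools, so the rest of the workflow stays framework-native.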