An LLM plugin that lets you use APIs described by an OpenAPI schema as tools.
Install this plugin in the same environment as LLM.
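Assuming the package is published under the name `llm-tools-openapi` (the name used throughout this README), you can install it with LLM's plugin installer:

```shell
llm install llm-tools-openapi
```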
To use this with the LLM command-line tool:
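A sketch of CLI usage, assuming the plugin registers a tool named `OpenAPI` that takes the URL of an OpenAPI schema (both the tool name and the example URL are assumptions, not confirmed by this README):

```shell
llm -T 'OpenAPI("https://example.com/openapi.json")' \
  "List the operations this API exposes" \
  --tools-debug
```

The `--tools-debug` option makes LLM print each tool call and its response as they happen.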
With the LLM Python API:
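A corresponding sketch using the Python API, under the same assumption that the plugin exposes an `OpenAPI` class from a `llm_tools_openapi` module:

```python
import llm
from llm_tools_openapi import OpenAPI  # assumed module and class name

model = llm.get_model("gpt-4.1-mini")  # any tool-capable model works
result = model.chain(
    "List the operations this API exposes",
    tools=[OpenAPI("https://example.com/openapi.json")],  # assumed schema URL
)
print(result.text())
```

`model.chain()` allows the model to make follow-up tool calls until it produces a final answer.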
To set up this plugin locally, first check out the code. Then create a new virtual environment:
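For example, assuming the repository directory is named `llm-tools-openapi`:

```shell
cd llm-tools-openapi
python -m venv venv
source venv/bin/activate
```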
Now install the dependencies and test dependencies:
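With the virtual environment active, an editable install with the test extras would look like this (assuming the project defines a `test` extra, as LLM plugin templates typically do):

```shell
python -m pip install -e '.[test]'
```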
To run the tests:
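```shell
python -m pytest
```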
The llm-tools-openapi plugin is not limited to standard OpenAPI-compatible APIs. You can also use it to interact with any MCP (Model Context Protocol) server by converting it into an OpenAPI server using mcpo.
MCP servers typically communicate over raw stdio, which is not compatible with most tools and lacks standard features like authentication, documentation, and error handling. mcpo solves this by acting as a proxy: it exposes any MCP server as a RESTful OpenAPI HTTP server, instantly making it accessible to tools and agents that expect OpenAPI endpoints.
Benefits:
- Instantly compatible with OpenAPI tools, SDKs, and UIs
- Adds security, stability, and scalability
- Auto-generates interactive documentation for every tool
- Uses standard HTTP—no custom protocols or glue code required
mcpo also supports serving multiple MCP servers from a single config file. Each tool will be available under its own route, with its own OpenAPI schema and documentation.
Example `demo/mcp-demo.json`:
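A minimal sketch of what that config might contain, using mcpo's Claude Desktop-style `mcpServers` format; the two server entries match the routes shown below, but the exact commands and arguments are assumptions:

```json
{
  "mcpServers": {
    "basic-memory": {
      "command": "uvx",
      "args": ["basic-memory", "mcp"]
    },
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```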
Start mcpo with:
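Assuming mcpo is run via `uvx`, pointing it at the config file above:

```shell
uvx mcpo --port 8000 --config demo/mcp-demo.json
```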
Each tool will be accessible at:
- http://localhost:8000/basic-memory
- http://localhost:8000/playwright
And their OpenAPI docs at:
- http://localhost:8000/basic-memory/docs
- http://localhost:8000/playwright/docs
Point the llm-tools-openapi plugin to the OpenAPI endpoint provided by mcpo. For example:
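A sketch of calling the basic-memory tools through mcpo (as above, the `OpenAPI` tool name is an assumption; mcpo serves each tool's schema under its own route):

```shell
llm -T 'OpenAPI("http://localhost:8000/basic-memory/openapi.json")' \
  "Create a note summarizing what mcpo does"
```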
You will obtain a Markdown file containing the content you asked for.
By combining llm-tools-openapi with mcpo, you can:
- Instantly convert any MCP server into an OpenAPI-compatible API
- Manage and call MCP tools using standard OpenAPI workflows
- Leverage the full power of LLM agents and automation with minimal setup
For more details, see the mcpo documentation and the Open WebUI docs.