Show HN: MCP server to connect LLM agents to any database



Turbular is an open-source Model Context Protocol (MCP) server that enables seamless database connectivity for large language models (LLMs). It provides a unified API for interacting with many database types, making it well suited to AI applications that need to work with multiple data sources.

  • 🔌 Multi-Database Support: Connect to various database types through a single API
  • 🔄 Schema Normalization: Automatically normalize database schemas to LLM-friendly naming conventions
  • 🔒 Secure Connections: Support for SSL and various authentication methods
  • 🚀 High Performance: Optimizes your LLM-generated queries before execution
  • 📝 Query Transformation: Let the LLM write queries against the normalized schema, then transform them back to their original (unnormalized) form
  • 🐳 Docker Support: Easy deployment with Docker and Docker Compose

Using Docker (Recommended)

  1. Clone the repository:

    git clone https://github.com/raeudigerRaeffi/turbular.git
    cd turbular
  2. Start the development environment:

    docker-compose -f docker-compose.dev.yml up --build
  3. Test the connection:

    ./scripts/test_connection.py
Manual Installation

  1. Install Python 3.11 or higher

  2. Install dependencies:

    pip install -r requirements.txt
  3. Run the server:

    uvicorn app.main:app --reload
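
Once the server is up, a quick smoke test from Python might look like the sketch below. The `localhost:8000` address matches uvicorn's default bind, but the `/health` path is an assumption, not taken from the Turbular API:

```python
import urllib.request

BASE_URL = "http://localhost:8000"  # uvicorn's default bind address

def api_is_up(base_url=BASE_URL, path="/health"):  # path is an assumption
    """Return True if the server answers with HTTP 200."""
    try:
        with urllib.request.urlopen(base_url + path, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False
```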

Retrieve the schema of a connected database for your LLM agent.

Parameters:

  • db_info: Database connection arguments
  • return_normalize_schema (optional): Return schema in LLM-friendly format
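
As a sketch, the two parameters above can be assembled into a JSON request body as follows. The PostgreSQL values are illustrative placeholders, and the exact endpoint path is not repeated here:

```python
import json

# Illustrative connection details; all values are placeholders.
db_info = {
    "database_type": "PostgreSQL",
    "username": "user",
    "password": "password",
    "host": "localhost",
    "port": 5432,
    "database_name": "mydb",
    "ssl": False,
}

# Request body: db_info is required, return_normalize_schema is optional.
payload = {"db_info": db_info, "return_normalize_schema": True}
body = json.dumps(payload)
```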

Optimize and then execute a SQL query on the connected database.

Parameters:

  • db_info: Database connection arguments
  • query: SQL query string
  • normalized_query: Boolean indicating if query is normalized
  • max_rows: Maximum number of rows to return
  • autocommit: Boolean for autocommit mode
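
A minimal sketch of the corresponding request body, using the parameters listed above. The connection values, the `users` table, and the query text are all illustrative:

```python
import json

# Illustrative request body for the query-execution endpoint.
payload = {
    "db_info": {
        "database_type": "PostgreSQL",
        "username": "user",
        "password": "password",
        "host": "localhost",
        "port": 5432,
        "database_name": "mydb",
        "ssl": False,
    },
    "query": "SELECT id, name FROM users",
    "normalized_query": True,   # the query was written against the normalized schema
    "max_rows": 100,            # cap the size of the result set
    "autocommit": False,
}
body = json.dumps(payload)
```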
POST /upload-bigquery-key

Upload a BigQuery service account key file.

Parameters:

  • project_id: BigQuery project ID
  • key_file: JSON key file
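
With python-requests, the upload could be prepared as in the sketch below. The endpoint path comes from the section above; the base URL, file name, and key contents are assumptions:

```python
import io
import json

# Build the multipart pieces for POST /upload-bigquery-key.
# project_id is a form field; key_file is the JSON key sent as a file part.
project_id = "my-project"  # placeholder
key_json = json.dumps({"type": "service_account", "project_id": project_id})

data = {"project_id": project_id}
files = {"key_file": ("key.json", io.BytesIO(key_json.encode()), "application/json")}

# With requests installed, the call would be (sketch, commented out):
# requests.post("http://localhost:8000/upload-bigquery-key", data=data, files=files)
```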

Upload a SQLite database file.

Parameters:

  • database_name: Name to identify the database
  • db_file: SQLite database file (.db or .sqlite)
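
To try this endpoint, you first need a SQLite file to send. A minimal sketch that creates one with the standard library (table and rows are illustrative, and the upload call itself is omitted):

```python
import os
import sqlite3
import tempfile

# Create a tiny SQLite file to upload; the schema and data are illustrative.
db_path = os.path.join(tempfile.mkdtemp(), "my_database.db")
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('Ada')")
conn.commit()
conn.close()

# Upload parameters: database_name identifies the database server-side,
# and the file at db_path would be sent as db_file.
params = {"database_name": "my_database"}
```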

Verify that the API is running.

Get a list of all supported database types.

  1. Fork and clone the repository

  2. Create a development environment:

    docker-compose -f docker-compose.dev.yml up --build
  3. The development server includes:

    • FastAPI server with hot reload
    • PostgreSQL test database
    • Pre-configured test data
  4. Access the API documentation:

We welcome contributions! Here's how you can help:

  1. Check out our contribution guidelines
  2. Look for open issues
  3. Submit pull requests with improvements
  4. Help with documentation
  5. Share your feedback

Development Guidelines

  • Follow the PEP 8 style guide
  • Write tests for new features
  • Update documentation as needed
  • Use meaningful commit messages

Roadmap

  1. Add more testing, formatting, and commit hooks
  2. Add SSH support for database connections
  3. Add APIs as data sources using Steampipe
  4. Enable local schema caching for databases the server has already connected to
  5. Add more data sources (Snowflake, MongoDB, Excel, etc.)

Run the test suite:

For development tests with the included PostgreSQL:

./scripts/test_connection.py
connection_info = {
    "database_type": "PostgreSQL",
    "username": "user",
    "password": "password",
    "host": "localhost",
    "port": 5432,
    "database_name": "mydb",
    "ssl": False,
}

connection_info = {
    "database_type": "BigQuery",
    "path_cred": "/path/to/credentials.json",
    "project_id": "my-project",
    "dataset_id": "my_dataset",
}

connection_info = {
    "type": "SQLite",
    "database_name": "my_database",
}
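
The required keys differ per backend. A minimal client-side check, with required-key sets inferred only from the examples above (not from the Turbular API itself), might look like:

```python
# Required keys per backend, inferred from the examples above.
REQUIRED_KEYS = {
    "PostgreSQL": {"database_type", "username", "password", "host",
                   "port", "database_name", "ssl"},
    "BigQuery": {"database_type", "path_cred", "project_id", "dataset_id"},
    "SQLite": {"type", "database_name"},
}

def missing_keys(info):
    """Return the required keys absent from a connection_info dict."""
    backend = info.get("database_type") or info.get("type")
    return REQUIRED_KEYS.get(backend, set()) - info.keys()

incomplete = {"database_type": "PostgreSQL", "host": "localhost"}
print(sorted(missing_keys(incomplete)))
# → ['database_name', 'password', 'port', 'ssl', 'username']
```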

This project is licensed under the MIT License - see the LICENSE file for details.

  • FastAPI for the amazing framework
  • SQLAlchemy for database support
  • Henry Albert Jupiter Hommel for his development support
  • All our contributors and users

Made with ❤️ by the Turbular Team
