TraceRoot is an open-source debugging platform that helps engineers fix production issues 10x faster by combining structured traces, logs, and source code context with AI-powered analysis.
Join us on Discord in pushing the boundaries of debugging with AI agents.
Please 🌟 Star TraceRoot on GitHub and be instantly notified of new releases.
The framework:

- Enables multi-agent systems to continuously evolve by interacting with their environments.
- Adds real-time tracing and logging to your applications.
- Uses structured logging and tracing data to improve the performance of AI agents.
- Integrates with other sources and tools, such as GitHub and Notion, for a seamless experience when using the framework in your applications.
We also provide a Cursor-like interface specialized for debugging and tracing: select the logs and traces you are interested in and ask the framework to help with the analysis.
We are a community-driven collective of engineers and researchers dedicated to advancing frontier engineering and research on using multi-agent systems to help not only humans but also AI agents with debugging, tracing, and root cause analysis.
| Status | Feature | Description |
| --- | --- | --- |
| ✅ | Multi-Agent System | A multi-agent system that can be used to solve complex tasks. |
| ✅ | Real-Time Tracing and Logging | Add real-time tracing and logging to your applications. |
| ✅ | Structured Logging | Add structured logging to your applications, which improves the performance of AI agents. |
| ✅ | Integration with Multiple Resources | Integrate with other sources and tools, such as GitHub and Notion. |
| ✅ | Developer Friendly | A Cursor-like interface specialized for debugging and tracing. |
Here is an overview of our AI Agent Framework:
Please check out the README.md in the `rest/agent` directory for more details.
The fastest and most reliable way to get started with TraceRoot is to sign up for TraceRoot Cloud.
You can install the latest version of TraceRoot with the following command:
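For example (assuming the SDK is published on PyPI under the name `traceroot`; check the repository for the exact package name):

```bash
pip install traceroot
```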
Install the dependencies locally:
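The exact command depends on the component you are setting up; a hypothetical example for a local clone of the repository is shown below (the actual dependency setup is documented in the `ui` and `rest` READMEs):

```bash
# Hypothetical example; see the ui/ and rest/ READMEs for the exact commands
pip install -e .
```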
For local usage, all of your data will be stored locally.
Run the command below to initialize environment variables.
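A hypothetical example of what this step might look like (the actual file or script names may differ; check the repository for the exact command):

```bash
# Copy the example environment file and fill in your own values
# (hypothetical file names; see the repository for the exact command)
cp .env.example .env
```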
You can use the TraceRoot framework locally by following the README.md in the `ui` directory and the README.md in the `rest` directory.
Alternatively, you can build the latest Docker image and run the container by following the README.md in the `docker` directory.
This will start the UI at http://localhost:3000 and the API at http://localhost:8000.
Before using the TraceRoot framework, you need to set up the Jaeger Docker container. It stores the traces and logs captured by our SDK, which is integrated with your applications.
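For reference, a minimal sketch of starting Jaeger's all-in-one image (the exact image tag and ports TraceRoot expects may differ; see the `docker` README for the exact command):

```bash
# Jaeger UI on 16686, OTLP ingestion on 4317 (gRPC) and 4318 (HTTP)
docker run -d --name jaeger \
  -e COLLECTOR_OTLP_ENABLED=true \
  -p 16686:16686 \
  -p 4317:4317 \
  -p 4318:4318 \
  jaegertracing/all-in-one:latest
```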
In local mode, the first step is to go to the integration page and (optionally) connect your GitHub account with a GitHub token. You also need to add your OpenAI API key on the integration page.
Our project is built on top of the TraceRoot SDK. You integrate it with your applications by adding its tracing and logging to your code, as sketched below.
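A hypothetical sketch of what that integration might look like (the package, function, and decorator names here are assumptions; the actual API is documented in the Quickstart referenced below):

```python
# Hypothetical usage sketch; the real TraceRoot SDK API may differ.
import traceroot  # assumed package name

traceroot.init()  # assumed: load .traceroot-config.yaml and start tracing/logging

logger = traceroot.get_logger()  # assumed structured-logger accessor


@traceroot.trace()  # assumed decorator that records a span for this function
def handle_order(order_id: str) -> None:
    logger.info(f"Processing order {order_id}")


if __name__ == "__main__":
    handle_order("demo-123")
```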
To use the local mode of the TraceRoot SDK, you need to create a `.traceroot-config.yaml` file in the root directory of your project.
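An illustrative sketch of such a config (the field names below are assumptions; consult the SDK Quickstart for the authoritative schema):

```yaml
# Hypothetical fields; see the SDK Quickstart for the exact schema
service_name: my-service  # name under which traces and logs are grouped
local_mode: true          # send data to the local Jaeger container instead of TraceRoot Cloud
```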
As mentioned above, you need to set up the Jaeger Docker container before letting the TraceRoot SDK capture the traces and logs from your applications.
For more details on SDK usage and examples, please check out this Quickstart.
If you find our exploratory work on TraceRoot useful in your research, please consider citing:

