Show HN: SHAI – an open-source, terminal-native AI coding assistant


shai is a coding agent, your pair-programming buddy that lives in the terminal. Written in Rust with love <3

[Shai CLI screenshot]

Install the latest stable release

Install the latest release with the following command:

curl -fsSL https://raw.githubusercontent.com/ovh/shai/main/install.sh | sh

The shai binary will be installed in $HOME/.local/bin.
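If you prefer to review the install script before executing it, you can download it first (plain curl and sh, nothing shai-specific):

curl -fsSL https://raw.githubusercontent.com/ovh/shai/main/install.sh -o install.sh
less install.sh
sh install.sh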

Install the latest unstable version with the following command:

curl -fsSL https://raw.githubusercontent.com/ovh/shai/main/install.sh | SHAI_RELEASE=unstable sh

The shai binary will be installed in $HOME/.local/bin.
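If $HOME/.local/bin is not already on your PATH, add it, for example in your shell profile:

export PATH="$HOME/.local/bin:$PATH"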

Configure a provider and Run!

By default, shai uses OVHcloud as an anonymous user, meaning you will be rate-limited! If you want to sign in with your account or select another provider, run:

shai auth

Once you have a provider set up, you can run shai:

shai

Shai can also run in headless mode, without a user interface. In that case, simply pipe a prompt into shai; it will stream events to stderr:

echo "make me a hello world in main.py" | shai

[shai headless screenshot]
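Since the events go to stderr, a plain shell redirection is enough to capture them; events.log is just an example file name:

echo "make me a hello world in main.py" | shai 2> events.log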

You can also instruct shai to return the entire conversation as a trace once it is done:

echo "make me a hello world in main.py" | shai --trace 2>/dev/null

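The trace is written to stdout (which is what makes the chaining shown next possible), so you can also redirect it to a file; trace.out is just an example name:

echo "make me a hello world in main.py" | shai --trace 2>/dev/null > trace.out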

This is handy because you can chain shai calls:

echo "make me a hello world in main.py" | shai --trace | shai "now run it!"


You can create a SHAI.md file at the root of your project containing any information you want Shai to know about the project (architecture, build steps, important directories, etc.). Shai will automatically load this file as additional context.
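For instance, a minimal SHAI.md might look like this (the project details below are purely illustrative):

cat > SHAI.md << 'EOF'
# My project
Rust workspace; the CLI entry point lives in crates/cli.
Build with `cargo build --release`, run the tests with `cargo test`.
Generated files under src/generated/ must not be edited by hand.
EOF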

Instead of a single global configuration, you can create custom agents, each in its own configuration file.

.ovh.config contains an example of a custom configuration with a remote MCP server configured.

Place this file in ~/.config/shai/agents/example.config.
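For instance, creating the agents directory if it does not exist yet:

mkdir -p ~/.config/shai/agents
cp .ovh.config ~/.config/shai/agents/example.config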

You can then list the available agents, and run shai with a specific agent using the agent subcommand.
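As a purely hypothetical sketch, assuming the agent file above is named example.config (check shai's built-in help for the actual syntax):

shai agent example   # hypothetical invocation, not verified against the actual CLI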

shai can also act as a shell assistant: when a command fails, it will propose a fix. This works by injecting a command hook that monitors your terminal output. Your last terminal output, along with the last command and its error code, is sent to the LLM provider for analysis. To start hooking your shell with shai, simply type:

For instance:

[Shai CLI screenshot]

To stop shai from monitoring your shell, you can type:

Simply build the project with cargo:

git clone git@github.com:ovh/shai.git
cd shai
cargo build --release
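Cargo puts the release binary in target/release, so you can run it from there or copy it over the installed one:

./target/release/shai
cp target/release/shai ~/.local/bin/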

Compatible OVHcloud endpoints

OVHcloud provides compatible LLM endpoints for using shai with tools. Start by creating a Public Cloud project in your OVHcloud account, then head to AI Endpoints and retrieve your API key. Once it is set in shai (via shai auth), you can use these endpoints as your provider.
