Chat Client with Tool-Calling Support
An interactive C-based command-line chat client that enables AI models to execute real-world actions through a comprehensive tool-calling system.
🚀 Quick Start
Prerequisites
Ubuntu/Debian:
```bash
sudo apt-get update
sudo apt-get install build-essential libglib2.0-dev libjson-glib-dev libsoup2.4-dev libreadline-dev
```
Fedora/RHEL:
```bash
sudo dnf install gcc glib2-devel json-glib-devel libsoup-devel readline-devel
```
macOS:
```bash
brew install glib json-glib libsoup readline pkg-config
```
Building
```bash
git clone <your-repo-url>
cd gobject
make clean && make
```
Setup API Keys
For DeepSeek API:
```bash
export DEEPSEEK_API_KEY="sk-your-key-here"
```
For Ollama (local):
```bash
# Install and start Ollama
curl -fsSL https://ollama.ai/install.sh | sh
ollama serve

# Pull a model (in another terminal)
ollama pull qwen3
```
📖 Usage
Basic Commands
```bash
# Start with DeepSeek (requires API key)
./elelem

# Use Ollama with a specific model
./elelem -p ollama -m qwen3

# Use a custom Ollama server
./elelem -p ollama -u http://remote-server:11434 -m mistral
```
Interactive Commands
| Command | Description | Example |
|---------|-------------|---------|
| /tools | List available tools | /tools |
| /history | Show conversation history | /history |
| /save | Save current conversation | /save |
| /load <file> | Load previous conversation | /load conversations/chat_2024-01-01_10-30-00.txt |
| /model <name> | Switch AI model | /model llama3.1 |
| /provider <name> | Switch between providers | /provider ollama |
| /clear | Clear conversation history | /clear |
| /exit | Exit application | /exit |
🛠️ Available Tools
The AI can execute these tools to help you:
🔍 Search Tools
- grep - Search for patterns in files recursively
  - Required: pattern
  - Optional: path, file_pattern, case_sensitive
- analyze_code - Analyze code structure and metrics
  - Optional: path (default: current), language (default: c)
📁 File Tools
- read_file - Read and display file contents
  - Required: filename
- write_file - Create or modify files
  - Required: filename, content
- list_directory - Browse directory contents
  - Optional: path (default: current directory)
💻 System Tools
- shell - Execute shell commands (sandboxed for safety)
  - Required: command
💡 Example Interactions
Code Analysis
```
> Analyze this codebase and tell me about its structure
```
[AI will use the analyze_code tool to examine your project, count lines of code, identify functions, and provide architectural insights]

File Search
```
> Find all TODO comments in C files
```
[AI will use grep tool with pattern "TODO" and file_pattern "*.c" to search recursively through your codebase]

File Operations
```
> Create a new header file called "utils.h" with basic includes
```
[AI will use write_file tool to create the header file with appropriate content]

System Operations
```
> What's in the current directory and what's the git status?
```
[AI will use list_directory and shell tools to show directory contents and run "git status"]

🏗️ Architecture
Core Components
- main.c - CLI interface and main event loop
- llm_interface.[ch] - Abstract client interface (see the sketch after this list)
- deepseek_client.[ch] - DeepSeek API implementation
- ollama_client.[ch] - Ollama local API implementation
- tool_manager.[ch] - Tool orchestration system
- tool_definition.[ch] - Tool schema definitions
- builtin_tools.c - Built-in tool implementations
- my_http_client.[ch] - HTTP client utilities
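The contents of llm_interface.[ch] are not reproduced in this README. As a rough, hypothetical sketch (names and signatures are assumptions, not the project's actual API), a provider-agnostic client can be expressed in C as a struct of function pointers that deepseek_client.c and ollama_client.c each fill in:

```c
/* Hypothetical sketch of a provider-agnostic client interface.
 * Names and signatures are illustrative; the real llm_interface.h may differ. */
#include <glib.h>

typedef struct _LlmClient LlmClient;

typedef struct {
    /* Send one user message (plus history kept by the provider) and
     * return the model's reply; caller frees, NULL on error. */
    gchar    *(*send_message) (LlmClient *self, const gchar *prompt, GError **error);

    /* Switch the active model, e.g. "qwen3" or "deepseek-chat". */
    gboolean  (*set_model)    (LlmClient *self, const gchar *model_name);

    /* Release provider-specific resources. */
    void      (*destroy)      (LlmClient *self);
} LlmClientVtable;

struct _LlmClient {
    const LlmClientVtable *vtable;   /* filled in by the concrete provider */
    gpointer               provider_data;
};
```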
Tool-Calling Flow
- User Input → User asks AI to perform a task
- AI Planning → Model decides which tools to use
- Tool Calls → AI generates JSON function calls
- Validation → System validates tool calls and parameters (see the parsing sketch after this list)
- Execution → Tools run in sandboxed environment
- Results → Tool output fed back to AI
- Response → AI provides final answer with context
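To make steps 3–5 concrete, the sketch below shows one way a tool call emitted by the model as JSON (for example {"name": "read_file", "arguments": {"filename": "main.c"}}) could be parsed and validated with json-glib before execution. It is a minimal illustration under assumed names; the actual tool_manager code may be organized differently.

```c
/* Hypothetical sketch: parse and validate a model-emitted tool call with
 * json-glib. The real tool_manager implementation may differ. */
#include <json-glib/json-glib.h>

static gboolean
parse_tool_call (const gchar *json_text, gchar **name_out, JsonObject **args_out)
{
    g_autoptr(JsonParser) parser = json_parser_new ();
    g_autoptr(GError) error = NULL;

    if (!json_parser_load_from_data (parser, json_text, -1, &error)) {
        g_warning ("Malformed tool call: %s", error->message);
        return FALSE;
    }

    JsonNode *root = json_parser_get_root (parser);
    if (!JSON_NODE_HOLDS_OBJECT (root))
        return FALSE;

    JsonObject *call = json_node_get_object (root);

    /* Validation step: every call must carry a tool name and an arguments object. */
    if (!json_object_has_member (call, "name") ||
        !json_object_has_member (call, "arguments"))
        return FALSE;

    *name_out = g_strdup (json_object_get_string_member (call, "name"));
    *args_out = json_object_ref (json_object_get_object_member (call, "arguments"));
    return TRUE;
}
```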
🔧 Configuration
System Prompt Customization
The system prompt is loaded from prompts/system_prompt.txt. You can customize it to:
- Add new tool descriptions
- Modify AI behavior
- Change response format preferences
Conversation Storage
- Conversations are auto-saved to conversations/ directory
- Files are named with timestamps: chat_YYYY-MM-DD_HH-MM-SS.txt
- Use /load command to resume previous conversations
🛡️ Security Features
- Command Filtering - Dangerous shell commands are blocked (see the sketch after this list)
- Path Validation - File operations validate and sanitize paths
- Output Limits - Large outputs are truncated to prevent memory issues
- Sandboxed Execution - Tools run with limited permissions
- Input Validation - All tool parameters are validated before execution
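As an illustration of the command-filtering idea above, a simple denylist check might look roughly like this; the filter actually used by the shell tool may apply different and stricter rules:

```c
/* Hypothetical denylist check for the shell tool; the patterns and the
 * function name are illustrative, not the project's actual filter. */
#include <glib.h>
#include <string.h>

static gboolean
command_is_allowed (const gchar *command)
{
    /* Illustrative denylist only; not an exhaustive or authoritative set. */
    static const gchar *blocked[] = { "rm -rf /", "mkfs", "dd if=", ":(){", NULL };

    for (gsize i = 0; blocked[i] != NULL; i++) {
        if (strstr (command, blocked[i]) != NULL) {
            g_warning ("Blocked potentially dangerous command: %s", command);
            return FALSE;
        }
    }
    return TRUE;
}
```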
🔨 Development
Adding New Tools
- Implement the tool handler in builtin_tools.c (a hedged sketch follows this list).
- Register the tool in tool_manager_register_builtin_tools().
- Update the system prompt in prompts/system_prompt.txt to describe the new tool.
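The handler and registration code referenced above is not shown in this README. As a hedged sketch under assumed names (the real signatures in tool_definition.h and tool_manager.[ch] may differ), a new word_count tool could look something like this:

```c
/* Hypothetical new tool; type and function names are illustrative and may
 * not match the project's actual tool_definition.h API. */
#include <glib.h>
#include <json-glib/json-glib.h>

/* Handler: receives the validated arguments object, returns a result string. */
static gchar *
handle_word_count (JsonObject *args, GError **error)
{
    const gchar *filename = json_object_get_string_member (args, "filename");
    gchar *contents = NULL;

    if (!g_file_get_contents (filename, &contents, NULL, error))
        return NULL;

    /* Count whitespace-separated, non-empty tokens. */
    gchar **tokens = g_strsplit_set (contents, " \t\n", -1);
    guint count = 0;
    for (gchar **t = tokens; *t != NULL; t++)
        if (**t != '\0')
            count++;

    g_strfreev (tokens);
    g_free (contents);
    return g_strdup_printf ("%u words in %s", count, filename);
}

/* Registration, assumed to happen in tool_manager_register_builtin_tools();
 * the call below is only a guess at the shape of that API:
 *
 *   tool_manager_register (manager, "word_count",
 *                          "Count the words in a file",
 *                          handle_word_count);
 */
```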
Building with Debug Info
```bash
make clean
CFLAGS="-g -O0 -DDEBUG" make
```
Running Tests
```bash
# Test with different providers
./elelem -p deepseek
./elelem -p ollama -m llama3.1

# Test tool functionality
echo "List the files in this directory" | ./elelem -p ollama -m llama3.1
```
🐛 Troubleshooting
Common Issues
Build Errors:
- Ensure all dependencies are installed
- Check pkg-config can find libraries: pkg-config --cflags glib-2.0
DeepSeek API Issues:
- Verify API key is set: echo $DEEPSEEK_API_KEY
- Check network connectivity and API quotas
Ollama Issues:
- Ensure Ollama server is running: curl http://localhost:11434/api/version
- Verify model is available: ollama list
Tool Execution Issues:
- Check file permissions for file operations
- Verify shell commands aren't blocked by security filters
Debug Mode
Set environment variable for verbose output:
```bash
G_MESSAGES_DEBUG=all ./elelem
```
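GLib only emits g_debug() messages when G_MESSAGES_DEBUG covers the log domain, which is why the variable above turns on verbose output; a minimal illustration (the actual log calls in this project may differ):

```c
/* Minimal illustration of GLib log levels and G_MESSAGES_DEBUG. */
#include <glib.h>

int
main (void)
{
    g_debug ("tool call dispatched");    /* printed only when G_MESSAGES_DEBUG matches */
    g_message ("chat client started");   /* printed by default */
    return 0;
}
```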
📊 Features
✅ Implemented
- Multi-provider support (DeepSeek, Ollama)
- Real-time streaming responses
- Tool-calling with 6 built-in tools
- Conversation history with save/load
- Command-line interface with readline support
- Security sandboxing and validation
- File-based system prompt configuration
🚧 Planned
- Tool result caching for performance
- Async tool execution for parallel operations
- Plugin system for third-party tools
- Tool dependency management and chaining
📄 License
GNU Affero General Public License v3
Need help? Open an issue or check the conversation history with /history to see example interactions.