Apple Intelligence for any programming language
A C library that provides access to Apple Intelligence on-device Foundation models for any programming language or application.
libai bridges Apple's FoundationModels framework through a C interface, enabling developers to integrate Apple Intelligence capabilities into applications written in C, C++, Python, Rust, Go, or any language that supports C bindings.
The library provides direct access to on-device AI models without requiring network connectivity or external API calls. All processing occurs locally on the user's device. It supports Intel Macs and Apple Silicon devices, including MacBooks, iPhones, iPads, and Apple Vision Pro.
Supported platforms:
iOS 26.0+
iPadOS 26.0+
Mac Catalyst 26.0+
macOS 26.0+ (Intel and Apple Silicon)
visionOS 26.0+
The library is built around session management: applications create isolated AI sessions, each with an independent configuration. Text generation operates in both synchronous and asynchronous modes, with streaming callbacks available for real-time interfaces.
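For a feel for the API's shape, here is a minimal synchronous round trip. This is a sketch that uses only calls appearing in the examples later in this README, with error handling omitted:

#include <stdio.h>
#include "ai.h"

int main(void) {
    ai_init();
    // A context can host multiple isolated sessions, each with its own
    // configuration and conversation history
    ai_context_t *ctx = ai_context_create();
    ai_session_id_t session = ai_create_session(ctx, NULL);

    // Synchronous generation: blocks until the model finishes
    char *response = ai_generate_response(ctx, session, "Say hello", NULL);
    printf("%s\n", response);

    ai_free_string(response);
    ai_context_free(ctx);
    ai_cleanup();
    return 0;
}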
Structured response generation ensures AI output conforms to predefined JSON schemas. The library validates responses and provides both text and structured object representations.
Tool integration enables AI models to execute functions and interact with external systems. Applications can register native C callback functions as compile-time tools or utilize external MCP servers for runtime tool integration.
Tool definitions use JSON schemas following the Claude tool format for parameter validation and documentation.
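For readability, here is the get_weather tool definition used in the tool example below, pretty-printed (tools_json takes an array of such definitions):

[
  {
    "name": "get_weather",
    "description": "Get weather for a location",
    "input_schema": {
      "type": "object",
      "properties": {
        "location": { "type": "string" }
      },
      "required": ["location"]
    }
  }
]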
System Requirements: macOS 26.0 or later with Apple Intelligence enabled.
# Download latest release
curl -L https://github.com/6over3/libai/releases/latest/download/libai.tar.gz | tar xz
# Build from source
git clone https://github.com/6over3/libai.git
cd libai
make
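The exact artifact names depend on the release archive and Makefile; assuming the build produces ai.h and a libai library in the current directory, a compile-and-link line for the examples below would look roughly like:

# Hypothetical paths: point -I and -L at wherever ai.h and libai live
cc -I. example.c -L. -lai -o example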
#include"ai.h"voidstream_callback(ai_context_t*context, constchar*chunk, void*user_data) {
if (chunk) {
printf("%s", chunk);
fflush(stdout);
} else {
printf("\n[Generation complete]\n");
}
}
intmain() {
ai_init();
ai_context_t*ctx=ai_context_create();
ai_session_id_tsession=ai_create_session(ctx, NULL);
// Stream responseai_stream_id_tstream=ai_generate_response_stream(
ctx, session, "Tell me a story", NULL, stream_callback, NULL);
// Wait for completion or cancel if needed// ai_cancel_stream(ctx, stream);ai_context_free(ctx);
ai_cleanup();
return0;
}
#include"ai.h"// Tool callback functionchar*get_weather_tool(constchar*parameters_json, void*user_data) {
// Parse parameters_json to extract location// Call weather API or return mock datareturnstrdup("{\"temperature\": 72, \"condition\": \"sunny\"}");
}
intmain() {
ai_init();
ai_context_t*ctx=ai_context_create();
// Configure session with toolsai_session_config_tconfig=AI_DEFAULT_SESSION_CONFIG;
config.tools_json="[{\"name\":\"get_weather\",\"description\":\"Get weather for a location\",\"input_schema\":{\"type\":\"object\",\"properties\":{\"location\":{\"type\":\"string\"}},\"required\":[\"location\"]}}]";
ai_session_id_tsession=ai_create_session(ctx, &config);
// Register tool callbackai_register_tool(ctx, session, "get_weather", get_weather_tool, NULL);
// Generate response that may use toolschar*response=ai_generate_response(ctx, session, "What's the weather like in San Francisco?", NULL);
printf("%s\n", response);
ai_free_string(response);
ai_context_free(ctx);
ai_cleanup();
return0;
}
Structured Response Example
#include"ai.h"intmain() {
ai_init();
ai_context_t*ctx=ai_context_create();
ai_session_id_tsession=ai_create_session(ctx, NULL);
// Define JSON schema for structured outputconstchar*schema="{\"type\":\"object\",\"properties\":{\"name\":{\"type\":\"string\"},\"age\":{\"type\":\"number\"}},\"required\":[\"name\",\"age\"]}";
// Generate structured responsechar*response=ai_generate_structured_response(
ctx, session, "Extract name and age from: John Smith is 30 years old", schema, NULL);
printf("Structured response: %s\n", response);
ai_free_string(response);
ai_context_free(ctx);
ai_cleanup();
return0;
}
Included with libai is momo, a terminal user interface that demonstrates the library's capabilities. momo works like Cursor, but in your terminal, powered by the local Apple Intelligence model.
The application provides real-time streaming responses with markdown rendering, tool calling support with both built-in utilities and MCP server integration, and multi-line input handling with syntax highlighting.
API documentation appears in the header files with parameter descriptions, return value specifications, and memory ownership requirements. See ai.h for the complete API reference.
The underlying Apple Intelligence models operate with a 4096 token context limit. While session chaining can help manage longer conversations, developers should carefully consider this constraint in their implementation design.
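One workaround is to summarize an aging session and seed a fresh one with the summary. The sketch below uses only calls shown earlier in this README; deciding when a session is near its limit is left to the application, since token accounting is not covered here. Treat this as an illustrative pattern, not library API:

#include <stdio.h>
#include "ai.h"

// Illustrative pattern, not library API: compress an old session's
// history into a summary and carry it into a new session
ai_session_id_t chain_session(ai_context_t *ctx, ai_session_id_t old_session) {
    // Ask the old session to compress its own history
    char *summary = ai_generate_response(
        ctx, old_session,
        "Summarize our conversation so far in a few sentences.", NULL);

    // Seed a fresh session by prepending the summary to the next prompt
    ai_session_id_t fresh = ai_create_session(ctx, NULL);
    char prompt[4096];
    snprintf(prompt, sizeof prompt,
             "Context from an earlier conversation: %s", summary);
    char *ack = ai_generate_response(ctx, fresh, prompt, NULL);

    ai_free_string(ack);
    ai_free_string(summary);
    return fresh;
}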
Looking forward, as on-device AI models arrive on additional platforms such as Windows and Android, the library's unified abstraction layer is designed to accommodate those ecosystems behind the same C interface.