A pure Python re-implementation of Vercel's popular AI SDK for TypeScript. Zero-configuration functions that work consistently across providers with first-class streaming, tool-calling, and structured output support.
Python is the de facto language for AI. Yet to actually get started with AI, you either (1) install a bloated external framework and a pile of dependencies, or (2) wrestle with an incredibly confusing API client - simply getting an LLM's reply means writing `client.chat.completions.create(**kwargs).choices[0].message.content`.
Zero-configuration functions that work consistently across providers
First-class streaming & tool-calling support
Strong Pydantic types throughout - you know exactly what you're getting
Strict structured-output generation and streaming via Pydantic models
Provider-agnostic embeddings with built-in batching & retry logic
Tiny dependency footprint - no bloated external frameworks
Install via uv (Python package manager):

```shell
uv add ai-sdk-python
```

Or with pip:

```shell
pip install ai-sdk-python
```

That's it - no extra build steps or config files.
Get started in just a few lines of code.
```python
from ai_sdk import generate_text, openai

model = openai("gpt-4o-mini")
res = generate_text(model=model, prompt="Tell me a haiku about Python")
print(res.text)
```
```python
import asyncio

from ai_sdk import stream_text, openai

async def main():
    model = openai("gpt-4o-mini")
    stream_res = stream_text(model=model, prompt="Write a short story")
    async for chunk in stream_res.text_stream:
        print(chunk, end="", flush=True)

asyncio.run(main())
```
```python
from ai_sdk import generate_object, openai
from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int

model = openai("gpt-4o-mini")
res = generate_object(
    model=model,
    schema=Person,
    prompt="Create a person named Alice, age 30",
)
print(res.object)  # Person(name='Alice', age=30)
```
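Because the schema is an ordinary Pydantic model, validation follows standard Pydantic semantics - data that doesn't match the model is rejected rather than silently coerced into a wrong shape. A quick illustration of the guarantee (pure Pydantic, no SDK calls):

```python
from pydantic import BaseModel, ValidationError

class Person(BaseModel):
    name: str
    age: int

# A matching payload parses into a fully typed object
alice = Person.model_validate({"name": "Alice", "age": 30})
print(alice)  # name='Alice' age=30

# A payload that violates the schema raises ValidationError
try:
    Person.model_validate({"name": "Alice", "age": "thirty"})
except ValidationError as e:
    print("rejected:", e.error_count(), "error(s)")
```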
```python
from ai_sdk import embed_many, cosine_similarity, openai

model = openai.embedding("text-embedding-3-small")
texts = ["The cat sat on the mat.", "A dog was lying on the rug."]
result = embed_many(model=model, values=texts)
similarity = cosine_similarity(result.embeddings[0], result.embeddings[1])
print(f"Similarity: {similarity:.3f}")
```
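Cosine similarity is simply the dot product of two vectors divided by the product of their lengths, giving a score from -1 (opposite) to 1 (same direction). A dependency-free sketch of what a helper like `cosine_similarity` computes (for intuition; use the SDK function in real code):

```python
import math

def cosine_sim(a: list[float], b: list[float]) -> float:
    # dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_sim([1.0, 0.0], [1.0, 0.0]))  # 1.0 (same direction)
print(cosine_sim([1.0, 0.0], [0.0, 1.0]))  # 0.0 (orthogonal)
```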
```python
from ai_sdk import tool, generate_text, openai
from pydantic import BaseModel, Field

# Using Pydantic models (recommended)
class AddParams(BaseModel):
    a: float = Field(description="First number")
    b: float = Field(description="Second number")

@tool(name="add", description="Add two numbers.", parameters=AddParams)
def add(a: float, b: float) -> float:
    return a + b

model = openai("gpt-4o-mini")
res = generate_text(
    model=model,
    prompt="What is 21 + 21?",
    tools=[add],
)
print(res.text)  # "The result is 42."
```
generate_text - Synchronous text generation with rich metadata
stream_text - Asynchronous streaming with real-time callbacks
generate_object - Structured output with Pydantic validation
stream_object - Streaming structured output with partial updates
embed - Single-value embedding helper
embed_many - Batch embedding with automatic batching
```python
from ai_sdk import embed_many, cosine_similarity, openai

model = openai.embedding("text-embedding-3-small")

# Knowledge base
documents = [
    "Python is a programming language.",
    "Machine learning involves training models on data.",
    "Databases store and retrieve information.",
]

# Search query
query = "How do I learn to code?"

# Embed everything in one batch
all_texts = [query] + documents
result = embed_many(model=model, values=all_texts)
query_embedding = result.embeddings[0]
doc_embeddings = result.embeddings[1:]

# Find the most similar document
similarities = []
for i, doc_embedding in enumerate(doc_embeddings):
    sim = cosine_similarity(query_embedding, doc_embedding)
    similarities.append((sim, documents[i]))

# Get the top result
top_result = max(similarities, key=lambda x: x[0])
print(f"Most relevant: {top_result[1]}")
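For larger document sets, the scoring loop above can be vectorized: normalize all embeddings once, and a single matrix-vector product scores every document at once. A NumPy sketch with stand-in vectors (the hard-coded matrix is illustrative; real code would feed in `result.embeddings`):

```python
import numpy as np

documents = ["doc A", "doc B", "doc C"]
# Stand-in embeddings for illustration only
doc_vecs = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])
query_vec = np.array([0.1, 0.9, 0.1])  # points mostly toward doc B

# Normalize rows, then one matrix-vector product yields all cosine scores
doc_unit = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
query_unit = query_vec / np.linalg.norm(query_vec)
sims = doc_unit @ query_unit
print(documents[int(np.argmax(sims))])  # doc B
```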
```python
from ai_sdk import tool, generate_text, openai

def get_weather(city: str) -> str:
    """Get current weather for a city."""
    weather_data = {
        "New York": "72°F, Sunny",
        "London": "55°F, Rainy",
        "Tokyo": "68°F, Cloudy",
    }
    return weather_data.get(city, "Weather data not available")

weather_tool = tool(
    name="get_weather",
    description="Get current weather information for a city.",
    parameters={
        "type": "object",
        "properties": {
            "city": {
                "type": "string",
                "description": "The city name to get weather for"
            }
        },
        "required": ["city"]
    },
    execute=get_weather,
)

model = openai("gpt-4o-mini")
res = generate_text(
    model=model,
    prompt="What's the weather like in New York?",
    tools=[weather_tool],
)
print(res.text)
```
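Conceptually, tool calling is a dispatch loop: the model emits a tool call (a name plus JSON-encoded arguments), the matching `execute` function runs locally, and its result is sent back to the model for the final answer. A simplified, provider-free sketch of the dispatch step (function and field names here are illustrative, not the SDK's internals):

```python
import json

def dispatch_tool_call(tools: dict, call: dict) -> str:
    # `call` mimics a provider tool-call payload: a name and JSON arguments
    fn = tools[call["name"]]
    args = json.loads(call["arguments"])
    return str(fn(**args))

def get_weather(city: str) -> str:
    return {"New York": "72°F, Sunny"}.get(city, "Weather data not available")

tools = {"get_weather": get_weather}
call = {"name": "get_weather", "arguments": json.dumps({"city": "New York"})}
print(dispatch_tool_call(tools, call))  # 72°F, Sunny
```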
The SDK is provider-agnostic. Currently supported: