We're shipping 1.4.0! This release brings structured output that actually works, direct access to provider-specific parameters, and Rails generators that produce idiomatic Rails code.
🎯 Structured Output with JSON Schemas
The days of wrestling LLMs into returning valid JSON are over. We've added with_schema, which makes structured output as simple as defining what you want:
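A minimal sketch of the flow — the schema DSL and require path come from the separate RubyLLM::Schema gem, and the PersonSchema class with its fields is purely illustrative:

```ruby
require "ruby_llm"
require "ruby_llm/schema" # the RubyLLM::Schema gem, installed separately

# Illustrative schema: describe the shape you want back.
class PersonSchema < RubyLLM::Schema
  string :name
  integer :age
end

chat = RubyLLM.chat
response = chat.with_schema(PersonSchema).ask("Generate a fictional person")
response.content # => { "name" => "Ada", "age" => 36 }
```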
No more prompt engineering gymnastics. Just schemas and results. Use the RubyLLM::Schema gem for the cleanest DSL, or provide raw JSON schemas if you prefer.
🛠️ Direct Provider Access with with_params
Need to use that one provider-specific parameter? with_params gives you direct access:
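For example — the parameter shown here is an OpenAI-specific option used purely as an illustration; with_params merges whatever keys you pass into the underlying API payload:

```ruby
# response_format is provider-specific; with_params passes it straight
# through to the API payload without RubyLLM needing to know about it.
chat = RubyLLM.chat(model: "gpt-4o")
              .with_params(response_format: { type: "json_object" })

chat.ask("Return a JSON object describing the Ruby release cadence")
```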
No more workarounds. Direct access to any parameter your provider supports.
🚄 Rails Generator That Produces Idiomatic Rails Code
From rails new to chatting with LLMs in under 5 minutes:
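Something along these lines — the generator name below is how we expect the install generator to be invoked; check rails generate --help in your app for the exact task name:

```bash
# Assumed invocation of the new install generator, then run the
# migrations it creates.
rails generate ruby_llm:install
rails db:migrate
```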
This creates:
- Migrations with proper Rails conventions
- Models with acts_as_chat, acts_as_message, and acts_as_tool_call
- A readable initializer with sensible defaults
- Zero boilerplate, maximum convention
Your Chat model works exactly as you'd expect:
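A sketch of what that looks like — the Chat class and model_id column follow the generator's defaults, so adjust to your own schema:

```ruby
# acts_as_chat gives the generated Chat model a conversational interface
# that persists messages and tool calls as the conversation progresses.
chat = Chat.create!(model_id: "gpt-4o")

chat.ask("What's the best way to cache API responses in Rails?")
chat.messages.order(:created_at).pluck(:role, :content)
```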
🔍 Tool Call Transparency
New on_tool_call callback lets you observe and log tool usage:
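A small sketch — the Weather tool class is hypothetical, and the fields read off the tool call (name, arguments) are assumptions based on typical tool-call payloads:

```ruby
# Weather is a stand-in for any RubyLLM::Tool you've defined.
chat = RubyLLM.chat(model: "gpt-4o").with_tool(Weather)

# Fires for each tool invocation — useful for logging, metrics, or auditing.
chat.on_tool_call do |tool_call|
  puts "Tool call: #{tool_call.name} #{tool_call.arguments.inspect}"
end

chat.ask("What's the weather in Berlin right now?")
```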
🔌 Raw Response Access
Access the underlying Faraday response for debugging or advanced use cases:
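Something like this — assuming the raw Faraday response is exposed on the returned message via a raw reader, per the merged PR:

```ruby
response = RubyLLM.chat(model: "gpt-4o").ask("Hello!")

# The underlying Faraday::Response: headers, status, rate-limit info, etc.
faraday = response.raw
faraday.status                   # => 200
faraday.headers["x-request-id"]  # handy when filing provider support tickets
```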
🏭 GPUStack Support
Run models on your own hardware with GPUStack:
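A configuration sketch — the setting names below (gpustack_api_base, gpustack_api_key) are assumptions modelled on the gem's other provider settings, so check the configuration docs for the exact keys:

```ruby
RubyLLM.configure do |config|
  # Assumed setting names for the GPUStack provider.
  config.gpustack_api_base = "http://localhost/v1"
  config.gpustack_api_key  = ENV["GPUSTACK_API_KEY"]
end

chat = RubyLLM.chat(model: "qwen2.5", provider: :gpustack)
chat.ask("Hello from my own hardware!")
```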
🐛 Important Bug Fixes
- Multiple tool calls in a single Anthropic response are now handled properly (previously only the first tool call was processed)
- Anthropic system prompts fixed to use plain text instead of JSON serialization
- Message ordering in streaming responses is rock solid
- Embeddings now return a consistent format whether you embed a single string or an array of strings
- URL attachments work properly without argument errors
- Streaming errors handled correctly in both Faraday V1 and V2
- JRuby officially supported and tested
🎁 Enhanced Rails Integration
- Message ordering guidance to prevent race conditions
- Provider-specific configuration examples
- Custom model name support with acts_as_ helpers
- Improved generator output
Context isolation works seamlessly without global config pollution:
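For instance, per-tenant credentials can live in an isolated context instead of the global configuration — a sketch using RubyLLM.context, where the tenant object and the specific config key are illustrative:

```ruby
# Each context carries its own configuration, so per-tenant credentials
# never leak into the settings defined via RubyLLM.configure.
tenant_context = RubyLLM.context do |config|
  config.openai_api_key = tenant.openai_api_key # `tenant` is your own object
end

chat = tenant_context.chat(model: "gpt-4o")
chat.ask("Summarize this tenant's latest report")
```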
📚 Quality of Life Improvements
- Removed 60MB of test fixture data
- OpenAI base URL configuration in bin/console
- Better error messages for invalid models
- Enhanced Ollama documentation
- More code examples throughout
Installation
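Add or update the gem in your Gemfile and run bundle install (the version constraint shown is just an example):

```ruby
# Gemfile
gem "ruby_llm", "~> 1.4"
```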
Full backward compatibility is maintained: your existing code keeps working, and the new features are there when you need them.
Merged PRs
- Add OpenAI base URL config to bin/console by @infinityrobot in #283
- Reject models from Parsera that does not have :provider or :id by @K4sku in #271
- Fix embedding return format inconsistency for single-string arrays by @finbarr in #267
- Fix compatibility issue with URL attachments wrong number of arguments by @DustinFisher in #250
- Add JRuby to CI test job by @headius in #255
- Add provider specifying example to rails guide by @tpaulshippy in #233
- More details for configuring Ollama by @jslag in #252
- Remove 60 MB of the letter 'a' from spec/fixtures/vcr_cassettes by @compumike in #287
- docs: add guide for using custom model names with acts_as helpers by @matheuscumpian in #171
- Add RubyLLM::Chat#with_params to add custom parameters to the underlying API payload by @compumike in #265
- Support gpustack by @graysonchen in #142
- Update CONTRIBUTING.md by @graysonchen in #289
- Fix handling of multiple tool calls in single LLM response by @finbarr in #241
- Rails Generator for RubyLLM Models by @kieranklaassen in #75
- Anthropic: Fix system prompt (use plain text instead of serialized JSON) by @MichaelHoste in #302
- Provide access to raw response object from Faraday by @tpaulshippy in #304
- Add Chat#on_tool_call callback by @bryan-ash in #299
- Added proper handling of streaming error responses across both Faraday V1 and V2 by @dansingerman in #273
- Add message ordering guidance to Rails docs by @crmne in #288
New Contributors
- @infinityrobot made their first contribution in #283
- @K4sku made their first contribution in #271
- @finbarr made their first contribution in #267
- @DustinFisher made their first contribution in #250
- @headius made their first contribution in #255
- @jslag made their first contribution in #252
- @compumike made their first contribution in #287
- @matheuscumpian made their first contribution in #171
- @graysonchen made their first contribution in #142
- @MichaelHoste made their first contribution in #302
- @bryan-ash made their first contribution in #299
- @dansingerman made their first contribution in #273
Full Changelog: 1.3.1...1.4.0
