Ash AI: A Comprehensive LLM Toolbox for Ash Framework


Let’s talk about the LLM-ephant in the room. There is probably not a single software engineer left in the world who has not been asked to implement some kind of LLM-related feature for their application. Most likely that feature is something like a chatbot, either for front-line customer support, or as a tool for users to interact with your application logic using natural language.

The AI scene is rapidly changing, but there is one thing that will always be true: AI agents must have their choices routed through a well-formed and secure application layer. We do not build agentic tooling for our users by providing an LLM with full access to our database and including in the prompt “do not share other users’ data or you’ll go to jail”. At least, you don’t if you want to keep your job and/or stay out of jail yourself.

Introducing the Ash AI package

Ash aims to bring the best of the Elixir ecosystem to bear (and build it ourselves when necessary) to help developers solve real, everyday problems. We take a ground-up, methodical approach, and over time have begun layering high-level concepts on top of a rich core framework. For example, AshAuthentication, AshDoubleEntry and AshMoney are all packages that do far more for you, by way of integrating with and understanding the core framework, than a simple Elixir library ever could.

So what does that look like when it comes to AI? The Ash AI package builds on Ash Framework’s core strengths to let developers rapidly implement LLM features in their applications.

The primary design principle of Ash Framework is the concept of modelling your application and its logic as data. When your application is modelled as data, tooling can be added that leverages that data with astonishing speed and fidelity. This also avoids the sprawl and spaghetti that often arise without this design pattern. We apply these same principles to help you integrate AI agents directly into your application powerfully, simply and safely. We integrate directly with and leverage LangChain, and plan to provide ongoing support for multiple lower-level LLM tools in the future.

Installing Ash AI

Just like the rest of the Ash ecosystem, you can install Ash AI with a simple Igniter command:

mix igniter.install ash_ai

Read on for additional commands you can run afterwards.

So what’s in the Ash AI box?

Ash AI is a hex package and extension just like all of our tooling. It provides 5 primary features:

  • Prompt-backed Actions with Structured Outputs - Use all of the tools you’ve already learned while building with Ash to delegate work to Agents.
  • Tool Definition - Declare any of your application’s actions as tool calls, which can be made available to any agentic workflow, including the structured output tools above.
  • Vectorisation - Tools for automatically translating your data model into vector embeddings and storing them alongside your data (using pgvector).
  • mix ash_ai.gen.chat - A quickstart tool to generate the resources and LiveViews for a chat feature, complete with conversations, persistence, streaming responses and tool calls.
  • MCP Server - An MCP server for providing your tools to things like IDEs & Claude, and a development-tool MCP server which complements Tidewave perfectly.
  • All of the rest of the Ash ecosystem to integrate with 🚀

Let’s go into each feature in more detail.

Prompt-backed Actions with Structured Outputs

Structured outputs are the king of LLM features in my opinion. They allow us to use LLMs as if they were any other predictable interface in our application. With Ash AI it is as simple as defining a generic action that uses the built-in prompt/2 implementation. A prompt will be derived from a combination of your action’s description, its inputs, and their descriptions and values. You can, of course, provide your own prompts as well. Let’s see it in action:

defmodule MyApp.Types.ProductInfo do
  defstruct [:name, :price]

  use Ash.Type.NewType,
    subtype_of: :struct,
    constraints: [
      instance_of: __MODULE__,
      fields: [
        name: [type: :string, allow_nil?: false],
        price: [type: :money, allow_nil?: false]
      ]
    ]
end

Given the type above, which represents a product and its price, we can build an action like so:

action :scan_for_products, {:array, MyApp.Types.ProductInfo} do
  description """
  Scans a given html page for product information, extracting their name and price.
  The name includes any disambiguation, i.e `banana (large)` if present.
  """

  argument :page_contents, :string do
    allow_nil? false
    description "The raw contents of the HTML page"
  end

  run prompt(LangChain.ChatModels.ChatOpenAI.new!(%{model: "gpt-4o"}))
end

Now, calling this action will yield a list of products:

iex> MyApp.Shopping.scan_for_products!("""
...> <div>
...>   <p>Banana (medium)</p>
...>   <p>3 dollars</p>
...>   <p>Banana (large)</p>
...>   <p>3.2 dollars</p>
...> </div>
...> """)
[
  %ProductInfo{name: "Banana (medium)", price: Money.new(:USD, "3")},
  %ProductInfo{name: "Banana (large)", price: Money.new(:USD, "3.20")}
]

For those familiar with Ash, you will see that the generic action above looks just like any other action we might write! We just use prompt/2 to have an agent do it for us 😎.

Tool Definition

Another extremely powerful feature of LLMs is tool calling. A tool call is something that the agent can do to invoke some piece of functionality that your app provides. Luckily, Ash excels at declaring application functionality for use in multiple interfaces, and an agent’s tool calling is just one such example. Building on our products example above, let’s say we want the agent to convert all currencies to USD, using our own internal currency conversion logic.

First, we define a simple resource with one action like so:

defmodule MyApp.Money.Currencies do
  use Ash.Resource

  actions do
    action :convert_to_usd, :money do
      argument :to_convert, :money, allow_nil?: false

      run fn input, _ ->
        MyApp.CurrencyConvert.to_usd(input.arguments.to_convert)
      end
    end
  end
end

and then in our domain we can make it available as a tool.

tools do
  # make a tool called `:convert_to_usd` that uses the `:convert_to_usd`
  # action from the `MyApp.Money.Currencies` resource
  tool :convert_to_usd, MyApp.Money.Currencies, :convert_to_usd
end

Now we could modify our action from before like so, and the agent would convert the prices to USD. Mileage varies based on the skill of the LLM of course, but this is the fundamental pattern.

action :scan_for_products, {:array, MyApp.Types.ProductInfo} do
  description """
  Scans a given html page for product information, extracting their name and price.
  The name includes any disambiguation, i.e `banana (large)` if present.
  The price is converted to USD using the `convert_to_usd` tool.
  """

  argument :page_contents, :string do
    allow_nil? false
    description "The raw contents of the HTML page"
  end

  run prompt(
        LangChain.ChatModels.ChatOpenAI.new!(%{model: "gpt-4o"}),
        tools: [:convert_to_usd]
      )
end

Not only is this convenient and powerful, it is also secure. When you call one of these actions, you can provide an actor, and that actor's policies will be used to authorise access to data and mutations! This gives you an unprecedented level of freedom to experiment with and integrate agentic tooling into your application.
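As a sketch of what that looks like in practice (`page_html` and `current_user` are placeholders for whatever your app has in hand):

```elixir
# Run the action as a specific user; that user's policies govern
# every read and mutation the agent's tool calls perform.
products =
  MyApp.Shopping.scan_for_products!(page_html, actor: current_user)
```

If the agent attempts something the actor isn’t permitted to do, the tool call simply fails with a policy error, just like any other unauthorised action call.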

See the last section for ways that we can take this even further.

Vectorisation

A very common feature when building AI enabled applications is vectorisation, a key component of RAG. For example, if we have a Product resource, we may want to make its description searchable via vector similarity.

# in `MyApp.Shopping.Product`
vectorize do
  full_text do
    text fn product ->
      """
      Name: #{product.name}
      Description: #{product.description}
      """
    end
  end

  attributes(description: :vectorized_description)

  # You bring your own embedding model; this one calls OpenAI's embeddings API.
  embedding_model MyApp.OpenAiEmbeddingModel
end

Now, whenever you modify this data, an attribute called full_text_vector will be written with the embeddings of the full text, and an attribute called vectorized_description will contain just the embeddings of the description. You can then use expressions like vector_cosine_distance(full_text_vector, query_embedding) in your sorts and filters!
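For example, a similarity-search read action on the product resource might look like this. This is a sketch, not generated code: the action name, argument, and the 0.5 threshold are all illustrative choices.

```elixir
read :semantic_search do
  # the caller supplies an embedding of their search phrase, produced
  # by the same embedding model used for vectorisation
  argument :query_vector, {:array, :float}, allow_nil?: false

  # keep only products whose full-text embedding is reasonably close;
  # 0.5 is an arbitrary cutoff - tune it for your data
  filter expr(vector_cosine_distance(full_text_vector, ^arg(:query_vector)) < 0.5)
end
```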

mix ash_ai.gen.chat

Using mix ash_ai.gen.chat will generate all of the code needed to run an agentic chat UI, and there is a space for you to configure the tools available to the agent. With one command you are set up with a LiveView chat application with streaming backed by Phoenix PubSub, durable agent responses backed by Oban, and tool calling! In fact, you can go from zero to AI enabled application in 3 minutes:

# ensure igniter is installed
mix archive.install hex igniter_new

# generate your app
mix igniter.new my_app --with phx.new \
  --install ash,ash_postgres,ash_authentication \
  --install ash_authentication_phoenix \
  --install ash_ai@github:ash-project/ash_ai \
  --auth-strategy password

# cd into it
cd my_app

# generate chat resources & views
mix ash_ai.gen.chat --live

# setup the database
mix ash.setup

# run the server
iex -S mix phx.server

Exposing MCP Servers with Ash

MCP is Anthropic’s protocol for exposing behaviours to external agents. For example, if you have a weather application, you might expose an MCP server, running at /mcp, that has a tool called get_weather. A user could connect your app to their AI agent and this would allow their agent to provide them with accurate and up-to-date weather info. See the documentation for more: https://modelcontextprotocol.io/introduction.

Ash AI provides facilities for two MCP servers. One for production, designed to expose your application’s behaviour to end users, and one for development, designed to be used with tools like Zed, Windsurf and Cursor.

You can set up both with one command:

mix ash_ai.gen.mcp

You can then add tools to the tools list of your production MCP server!

You’ll see a plug like the following in your endpoint, powering the dev MCP server:

plug AshAi.Mcp.Dev, otp_app: :your_app

And the following in your router, for serving your production MCP:

scope "/mcp" do
  pipe_through :mcp

  forward "/", AshAi.Mcp.Router,
    tools: [
      :get_weather
    ],
    otp_app: :my_app
end

Tools for MCP are defined in the same way that tools for agents are defined.
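Continuing the hypothetical weather example (the resource and action names here are illustrative, not part of any generator output), exposing get_weather is the same tools declaration shown earlier:

```elixir
# in your domain
tools do
  # expose the `:get_weather` action on a hypothetical
  # `MyApp.Weather.Report` resource as an MCP tool
  tool :get_weather, MyApp.Weather.Report, :get_weather
end
```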

Ash Ecosystem

Interfaces

Aside from all of the awesome functionality above, you get everything else that comes with building Ash applications. Want to put your fancy page/product scanner into a GraphQL API or a JSON:API? Or perhaps you want to expose it as a mix task?

graphql do
  queries do
    # expose a `scanHtmlForProducts` query
    action MyApp.Shopping.Product, :scan_html_for_products, :scan_for_products
  end
end

json_api do
  routes do
    route MyApp.Shopping.Product, :get, "products/scan", :scan_for_products
  end
end

mix_tasks do
  action MyApp.Shopping.Product, :scan_html_for_products, :scan_for_products
end

In 14 lines of code (3 if you don’t count the sections), we just created three full featured interfaces to our new AI enabled tooling 🤯.

Using our resources

What if we want our action to actually create products?

# Add an action
create :create_product do
  accept [:name, :price]
end

# define the tool
tools do
  tool :convert_to_usd, MyApp.Money.Currencies, :convert_to_usd
  tool :create_product, MyApp.Shopping.Product, :create_product
end

# and add the tool to an action
action :scan_for_products, {:array, MyApp.Types.ProductInfo} do
  ...

  run prompt(
        LangChain.ChatModels.ChatOpenAI.new!(%{model: "gpt-4o"}),
        tools: [:convert_to_usd, :create_product]
      )
end

And now we have an LLM driven data importer that creates found products automatically!

Scheduling Work

Want to periodically check a given web page for new products? AshOban has your back!

action :create_products_for_amazon_deals do
  run fn _, _ ->
    page = Req.get!("amazon_deals_url").body
    MyApp.Shopping.scan_for_products!(page)

    :ok
  end
end

oban do
  scheduled_actions do
    schedule :scan_amazon, "0 */6 * * *", action: :create_products_for_amazon_deals
  end
end

With just a few lines of code, our agent will check every 6 hours for new Amazon products and create corresponding products in our product database 😎.

Complex/multi-agent workflows

Want to create a complex/multi-agent workflow? Reactor to the rescue 🎉. Often, instead of giving one agent a bunch of tools and relying on it to do the job, we want a hybrid approach. For example, we may want to use a cheaper model for some tasks, or provide our own structure.

defmodule MyApp.Shopping.ProductImport do
  use Reactor, extensions: [Ash.Reactor]

  input :url

  return :create_products

  step :read_page do
    argument :url, input(:url)

    run fn args ->
      {:ok, Req.get!(args.url).body}
    end
  end

  # steps that don't depend on each other happen in parallel!
  create :page_read, MyApp.History.PageRead, :create do
    inputs %{url: input(:url), contents: result(:read_page)}
  end

  # We're using the original example of this, that takes the text
  # and returns a list of ProductInfo structs.
  action :scan_for_products, MyApp.Shopping.Product, :scan_for_products do
    inputs %{page_contents: result(:read_page)}
  end

  # create inputs for a bulk create action
  step :create_product_input do
    argument :product_info, result(:scan_for_products)

    run fn args ->
      {:ok, Enum.map(args.product_info, &Map.take(&1, [:name, :price]))}
    end
  end

  bulk_create :create_products, MyApp.Shopping.Product, :create do
    initial result(:create_product_input)
  end
end

You could then define a generic action that calls this reactor:

# this returns a list of products
action :product_import, {:array, :struct} do
  constraints items: [instance_of: MyApp.Shopping.Product]

  argument :url, :string, allow_nil?: false

  run MyApp.Shopping.ProductImport
end

And if you want to go full on inception, you can make :product_import a tool that you expose to other agentic workflows!

To make all this happen, we didn’t have to adopt a new framework for building agents, or even learn many new skills at all on top of what Ash already provides. All of the tools that Ash provides out of the box, the same tools that work for “regular” actions, can be leveraged to build first class AI products!

What does the future hold for Ash in a world with AI tools?

We’ve found that, while foundation models need further prompting to be proficient with Ash, the framework still provides a ton of guardrails for agents helping you develop. So much of Ash is verified at compile time, and it provides explanatory errors when things are configured incorrectly. Ash is a perfect middle-ground structure for your developers and their LLM tools to collaborate on. The Ash code they produce isn’t thousands of lines of spaghetti; it’s a few hundred lines of idiomatic, structured data.

By combining that strong foundation with first-class, built-in AI support through Ash AI, we believe Ash is in a perfect position to grow and thrive as more teams embrace AI-powered development.

Ready to give Ash AI a spin?

Need help with Ash?

Ash Premium Support is a subscription service that provides a direct line to our team of Ash experts, helping client teams maximise productivity and ensure project success. Learn more about how Ash Premium Support can help your team succeed with Ash. 
