Batch Inference, Type Systems, and Why Cortex AISQL Got Me Excited


04 Jun, 2025

Cortex AISQL is the Snowflake announcement that got me most excited this year.

Why? Because it marks a shift in how we think about integrating LLMs into actual data systems. Not as magic boxes, but as functions with structure, semantics, and type systems.

Here are a few reflections:

  1. Prompt operators as primitives

You can do almost anything with a raw prompt, but if five well-defined operators can cover 80% of use cases, that’s a signal. It means we’re starting to distill real structure from the noise, and to productize LLM behavior in a way that’s more reproducible and composable.
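To make this concrete, here is a rough sketch of what querying with such operators looks like. The operator names (`AI_FILTER`, `AI_CLASSIFY`, `PROMPT`) follow the AISQL announcement, but the exact signatures and the table here are illustrative, not confirmed syntax:

```sql
-- Illustrative sketch; operator signatures are assumptions, not exact AISQL syntax.
SELECT
    ticket_id,
    -- classify free text into a fixed label set
    AI_CLASSIFY(body, ['billing', 'bug', 'feature request']) AS category
FROM support_tickets
-- keep only rows where the model answers "yes" to a natural-language predicate
WHERE AI_FILTER(PROMPT('Is this ticket written in English? {0}', body));
```

The point is that filtering and classification become declarative relational operators the planner can reason about, rather than opaque prompt strings glued on at the application layer.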

  2. Batch inference gets first-class support

AISQL is optimized for high-throughput, high-latency workloads, the opposite of today’s infra, which prioritizes low-latency, online use cases (like chat). That shift matters. It acknowledges that many teams want to process data at scale, not just serve interactive prompts.

  3. Structured data isn’t going away

The real value shows up when you combine structured data with unstructured content like PDFs, audio, and images. AISQL lets you bring them together in the same query plan. That’s powerful.
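As a sketch of what "the same query plan" could mean, imagine joining a structured customers table against the contents of contract files. Function names and the schema below are my own placeholders, not confirmed AISQL syntax:

```sql
-- Placeholder sketch: TO_FILE / AI_COMPLETE usage here is an assumption.
SELECT
    c.customer_id,
    c.plan_tier,                              -- ordinary structured column
    AI_COMPLETE(
        PROMPT('Summarize the termination clause in this contract: {0}',
               TO_FILE(c.contract_path))      -- unstructured PDF as a typed value
    ) AS termination_summary
FROM customers AS c
WHERE c.plan_tier = 'enterprise';
```

Structured predicates (the `plan_tier` filter) prune rows before any inference runs, which is exactly the kind of optimization you only get when both kinds of data live in one plan.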

  4. New workloads need new types

The introduction of a File type might seem small, but it’s not. When the way we compute changes, our type systems need to evolve too. This is how you make multi-modal processing ergonomic.

That said, I have some open questions. Especially around composability.

SQL has always struggled here. That’s part of why dbt succeeded: it brought structure and reuse to what were previously brittle, opaque queries.

With AISQL, composability becomes even more critical:

  • Inference is expensive. If I have 5 CTEs each invoking an LLM operator, I don’t want to re-run them all just to test changes in the 6th.
  • Inference is non-deterministic. If output shifts every time I run the query, how do I isolate the effect of prompt changes from upstream variation?

These are just early thoughts. I’m excited to see how Snowflake and others tackle these problems and I have some ideas of my own I’ll be sharing soon.

Would love to hear how others are thinking about LLM inference inside analytical workflows.

#AI tooling #Cortex AISQL #LLMs #Snowflake #batch inference #inference pipelines #prompt engineering #structured data #type systems #unstructured data
