Robust and modular LLM prompting using types, templates, constraints and an optimizing runtime.
```lmql
@lmql.query
def meaning_of_life():
    '''lmql
    "Q: What is the answer to life, the \
     universe and everything?"

    "A: [ANSWER]" where \
        len(ANSWER) < 120 and STOPS_AT(ANSWER, ".")

    print("LLM returned", ANSWER)

    "The answer is [NUM: int]"

    return NUM
    '''

meaning_of_life()
```

Created by the SRI Lab @ ETH Zurich and contributors.
LMQL now supports nested queries, enabling modularized local instructions and re-use of prompt components.
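As a sketch of what this enables (the `[VAR: subquery]` call syntax follows the nested-queries documentation; the exact constraints and return values are illustrative), a small helper query can encapsulate a local instruction and be re-used for every generated variable:

```lmql
# helper query carrying a local instruction that is only
# visible while the nested query runs
@lmql.query
def dateformat():
    '''lmql
    "(respond in DD/MM/YYYY) [ANSWER]" where STOPS_AT(ANSWER, "\n")
    return ANSWER.strip()
    '''

# outer query re-using the helper wherever a date is generated
"Q: When was Obama born? [ANSWER: dateformat]\n"
"Q: When was Bruno Mars born? [ANSWER: dateformat]\n"
"Q: When was Dua Lipa born? [ANSWER: dateformat]\n"
"Out of these, who was born last? [LAST]"
```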
Model Output

```promptdown
Q: When was Obama born?
ANSWER 04/08/1961
Q: When was Bruno Mars born?
ANSWER 08/10/1985
Q: When was Dua Lipa born?
ANSWER 22/08/1995
Out of these, who was born last?
LAST Dua Lipa
```

LMQL automatically makes your LLM code portable across several backends. You can switch between them with a single line of code.
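For example (a minimal sketch; the model identifiers are placeholders, and the `model=` keyword and `lmql.model(...)` reference follow the LMQL model documentation), the backend is selected in a single line, the decorator's model argument:

```lmql
import lmql

# switch backends by changing only the model argument below
@lmql.query(model="openai/gpt-3.5-turbo")      # hosted OpenAI backend
# @lmql.query(model=lmql.model("local:gpt2"))  # or a locally-run Transformers model instead
def capital():
    '''lmql
    "Q: What is the capital of France?\n"
    "A: [ANSWER]" where STOPS_AT(ANSWER, ".")
    return ANSWER
    '''
```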
Prompt construction and generation are implemented via expressive Python control flow and string interpolation.
LMQL

```lmql
"My packing list for the trip:"
for i in range(4):
    "- [THING] \n" where THING in \
        ["Volleyball", "Sunscreen", "Bathing Suit"]
```

Model Output
```promptdown
My packing list for the trip:
- THING Volleyball
- THING Bathing Suit
- THING Sunscreen
- THING Volleyball
```
