Intentflow is an open-source UX flow framework that lets product teams ship contextual onboarding, nudges, and promotions in minutes—not days.
- Declarative – author complex interaction flows in simple YAML.
- Stateful – flags track user milestones across pages and sessions.
- Pluggable – integrate analytics (PostHog), LLMs (ChatGPT, Claude, Gemini) or your own logic to decide which component to show.
- Framework-agnostic – React-first, with Vue/Svelte adapters on the roadmap.
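To make the stateful flags concrete, here is a minimal sketch; `setFlag`, `hasFlag`, and the `intentflow` import path are assumed names, not Intentflow's documented API:

```ts
// Sketch only: these helper names and the import path are assumptions.
import { setFlag, hasFlag } from "intentflow";

// Record a milestone; flags persist across pages and sessions.
setFlag("viewed_pricing");

// Later, anywhere in the app, gate a flow on accumulated flags.
if (hasFlag("viewed_pricing") && !hasFlag("upgraded")) {
  // a flow could decide to render an upgrade modal here
}
```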
Use it to:

| Goal | Trigger | Component |
| --- | --- | --- |
| Increase trial → paid conversions | User viewed pricing but didn't upgrade | Modal |
| Collect qualified leads | User hit a feature paywall | Banner |
| Guide power users to hidden gems | User hovers a rarely-used feature toggle | Tooltip |
| Drive webinar sign-ups | Docs readers spending >3 min on tutorials | Modal |
Load and evaluate this flow:
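A minimal sketch, assuming a hypothetical `Intentflow` client that accepts the YAML definition inline; the module name, flow schema, and method names are illustrative rather than the library's documented API:

```ts
// Sketch only: the client, method names, and flow schema are assumptions.
import { Intentflow } from "intentflow";

const pricingFlow = `
id: pricing_upgrade
trigger:
  flags:
    all: [viewed_pricing]
    none: [upgraded]
component:
  type: modal
  props:
    title: Ready to upgrade?
    cta: Start your trial
`;

const intentflow = new Intentflow();
const flow = intentflow.loadFlow(pricingFlow); // parse the YAML definition
const decision = await flow.evaluate();        // apply flags/rules (or an LLM evaluator)
if (decision.show) {
  decision.render();                           // mount the modal for this session
}
```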
Every time a flag is set, Intentflow automatically fires an `intentflow_flag_set` event to PostHog. You can subscribe to additional events in your own code, or use PostHog dashboards to correlate UX experiments with conversions.
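For example, you might forward flag events to your own analytics with extra context. The `intentflow.on()` hook and its payload are assumptions; `posthog.init` and `posthog.capture` are the standard posthog-js calls:

```ts
// `intentflow` is the client from the sketch above; `on()` is a hypothetical hook.
import posthog from "posthog-js";

posthog.init("<your-project-api-key>", { api_host: "https://us.i.posthog.com" });

intentflow.on("flag_set", ({ flag, sessionId }) => {
  // Mirror the built-in intentflow_flag_set event with your own properties.
  posthog.capture("intentflow_flag_set_custom", { flag, sessionId });
});
```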
Intentflow ships with a lightweight Evaluator that can call ChatGPT (OpenAI), Gemini (Google), or Claude (Anthropic) to decide which components to render.
Swap providers with a single arg:
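Something along these lines, assuming a hypothetical `createEvaluator` factory; the factory, its `decide()` method, and the provider strings are assumed names rather than the documented API:

```ts
// Sketch only: `createEvaluator`, `decide`, and the provider strings are assumptions.
import { createEvaluator } from "intentflow";

const evaluator = createEvaluator({ provider: "openai" });       // ChatGPT
// const evaluator = createEvaluator({ provider: "gemini" });    // Gemini
// const evaluator = createEvaluator({ provider: "anthropic" }); // Claude

const choice = await evaluator.decide({
  flags: ["viewed_pricing"],
  candidates: ["modal", "tooltip", "banner"],
});
```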
Intentflow pairs seamlessly with Mixpeek to bring intelligent UX flow optimization powered by multimodal retrieval, classification, and clustering.
- Semantic Flow Matching – Retrieve the most relevant YAML-defined flow based on user session summaries, not just static rules: "User hovered CTA, watched demo, abandoned signup" → `flow: schedule_meeting` (see the sketch after this list).
- Component Effectiveness Clustering – Mixpeek clusters historical sessions (flags, outcomes, component usage) to surface which combinations of tooltips, banners, or modals worked best for each intent.
- Multimodal Context Classification – Classify sessions not just from flags or clicks, but from video/audio inputs, transcript data, screenshots, etc. E.g., "User watched a feature walkthrough video but skipped the pricing page."
- Searchable UX Memory – Store all sessions as structured documents and search them: `await mixpeek.search("Sessions where modals failed but tooltips worked");`
- Cookieless Personalization – Use vector embeddings and session semantics to adapt the UI, with no ID tracking required.
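As a sketch of the semantic flow matching step above, built on the same `mixpeek.search` call; the client setup, result shape, and flow-id metadata are assumptions:

```ts
// Sketch only: `mixpeek` is an initialized Mixpeek client, and the result
// shape / flow-id metadata are assumptions.
const results = await mixpeek.search(
  "User hovered CTA, watched demo, abandoned signup"
);

// Assume each indexed session carries the id of the YAML-defined flow that
// worked best for similar sessions, so the top hit can be fed back to Intentflow.
const flowId = results[0]?.metadata?.flow_id ?? "schedule_meeting";
```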
Mixpeek transforms your UX flows into a searchable, improvable, intent-driven system—powered by real multimodal session intelligence.
- Live session inspector / devtools overlay – debug flows and flags in-browser
- LLM prompt builder playground – test & refine evaluator logic in-browser
- Vue/Svelte adapters – expand beyond React
- Analytics adapters (RudderStack, Amplitude) – drop-in integrations
- Mixpeek-powered flow search & clustering UI – see which flows are performing best
- Component variant testing – show alternate tooltips/modals for same step
PRs & issues welcome! 🎉