Hi HN,
I started building VexFS yesterday — a kernel-native filesystem that stores vector embeddings alongside files, supports semantic search (brute-force now, HNSW later), and exposes everything through a minimal IOCTL + mmap interface.
Think of it as:
A semantic memory layer for local AI agents
RAG without a vector DB
Vector search as an OS primitive
It’s early. It barely works. But it boots.
Why? Because if memory isn't snapshottable, it's not memory. And maybe, just maybe, agents deserve a /mnt/mem they can mount natively.
When I asked Gemini what it thought of the idea, it said:
“An OS-level semantic context layer like this could enable more powerful, context-aware, and efficient AI systems.”
Not sure if it's a brilliant idea or a kernel panic waiting to happen. Either way, I’d love your feedback (and flames).