"Locally-runnable, CPU-not-just-GPU-permitting LLMs for chatbot dev" probably means Llama, right? Or are there other contenders on par and worth (co)considering?
I'm late to the game (no regrets), having had simply zero inspiration for any kind of chatbot dev until now. So I wanna skip all the already-outdated (or hallucinated) tuts a simple web search would probably yield (especially given today's web search quality...).
The toy I'll tinker on will be FOSS and fully free, no startup stuff, so I'm not looking for anything bleeding-edge. It'll be something like SocialAI, but more limited in various ways (stuff I don't care for), yet offline-capable and with way more user-side tweakability of the nature/inclinations/worldview/opinions/conditioning of (the majority of) the bots. That's the angle I wanna experiment with: "build your own sufficiently-plausible filter bubble of interest for entertainment value, and watch it unfold by itself over time." So it's for amusement, but it should be interesting. A rough sketch of what I mean by "tweakability" follows below.
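
To make that concrete (a made-up sketch, plain Python, no particular LLM API assumed): every bot would just be a small user-editable config that gets rendered into a system prompt, and the "filter bubble" is just the list of those configs.

```python
# Hypothetical sketch of the "tweakability" angle: each bot is a user-editable
# config rendered into a system prompt. Names and fields are made up.
from dataclasses import dataclass, field


@dataclass
class BotPersona:
    name: str
    worldview: str                      # the bot's overall outlook
    inclinations: list[str] = field(default_factory=list)
    pet_opinions: list[str] = field(default_factory=list)

    def to_system_prompt(self) -> str:
        """Render the persona into a system prompt for a local LLM."""
        lines = [
            f"You are {self.name}, a poster in a small online community.",
            f"Your worldview: {self.worldview}",
            "Your inclinations: " + "; ".join(self.inclinations),
            "Opinions you bring up when remotely relevant: " + "; ".join(self.pet_opinions),
            "Stay in character. Reply in one or two short sentences.",
        ]
        return "\n".join(lines)


# The "filter bubble" is then just a list of these configs, editable by the user.
bubble = [
    BotPersona(
        name="Ada",
        worldview="techno-optimist, everything is fixable with enough tooling",
        inclinations=["quotes benchmarks", "derails into side projects"],
        pet_opinions=["plain text beats every app"],
    ),
    BotPersona(
        name="Brutus",
        worldview="deeply nostalgic, the old web was better",
        inclinations=["one-liners", "mild sarcasm"],
        pet_opinions=["RSS never needed replacing"],
    ),
]

for bot in bubble:
    print(f"--- {bot.name} ---\n{bot.to_system_prompt()}\n")
```
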
So yeah, thanks for sharing any blogs/videos/other tuts that take a coder from (near) zero to a CustomTargetedChatbot(s)HelloWorld, are recent/current, and ideally build specifically on an offline/local LLM setup. I only know these things as prompt textareas so far, not as configurable, hackable API surfaces...
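
In case it helps pin down the level I'm starting from, this is the sort of hello-world I'm hoping a tut gets me to (a sketch under assumptions: an Ollama server running locally on its default port, some model already pulled; the model name is a placeholder):

```python
# Hypothetical hello-world: a multi-turn chat loop against a locally running
# Ollama server (http://localhost:11434), assuming a model has already been
# pulled, e.g. `ollama pull llama3.2`. The model name is a placeholder.
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "llama3.2"

history = [
    {"role": "system", "content": "You are a terse, friendly, local-first chatbot."},
]

while True:
    user_text = input("you> ").strip()
    if not user_text or user_text.lower() in {"quit", "exit"}:
        break
    history.append({"role": "user", "content": user_text})

    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "messages": history, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    answer = resp.json()["message"]["content"]

    history.append({"role": "assistant", "content": answer})
    print(f"bot> {answer}")
```
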
On the off chance there's a pre-LLM-era or non-LLM alternative (FOSS lib or SDK in any lang/stack) that you'd recommend, i.e. better than Eliza but non/pre-LLM, I'd be interested too =)
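
For a sense of the "better than Eliza, still no LLM" bar I mean, here's a tiny rule-engine sketch (an assumption-laden example using the python-aiml package; the rules themselves are made up, and real AIML sets ship with hundreds of categories):

```python
# Hypothetical "no LLM at all" sketch using the python-aiml package
# (pip install python-aiml). The rules below are made up for illustration.
import os
import tempfile

import aiml

AIML_RULES = """<aiml version="1.0.1">
  <category>
    <pattern>HELLO</pattern>
    <template>Hey. What corner of the bubble are you from?</template>
  </category>
  <category>
    <pattern>*</pattern>
    <template>Go on, tell me more about that.</template>
  </category>
</aiml>"""

# python-aiml loads rules from files, so write the snippet to a temp .aiml file.
with tempfile.NamedTemporaryFile("w", suffix=".aiml", delete=False) as f:
    f.write(AIML_RULES)
    rules_path = f.name

kernel = aiml.Kernel()
kernel.learn(rules_path)

print(kernel.respond("hello"))          # matches the HELLO pattern
print(kernel.respond("local models?"))  # falls through to the * pattern
os.remove(rules_path)
```
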


