pseudoc lets you compile pseudocode into blazingly fast (🔥) native executables using AI 🚀
pseudoc works by sending your pseudocode to a large language model and asking it to translate it into a Go program. An embedded Go compiler then compiles the result down to a self-contained native executable!
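If you are curious what that pipeline looks like in code, here is a minimal sketch of the same idea. It is not pseudoc's actual implementation: it shells out to the regular `go` toolchain instead of using an embedded compiler, and the prompt wording, the `/chat/completions` path suffix, and the error handling are all assumptions.

```go
// Sketch of a translate-then-compile pipeline: pseudocode -> LLM -> Go -> binary.
// Expects the pseudocode file as its only argument.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
	"os/exec"
	"path/filepath"
)

func main() {
	pseudocode, err := os.ReadFile(os.Args[1])
	if err != nil {
		panic(err)
	}

	// Ask an OpenAI Chat Completions compatible endpoint to translate the
	// pseudocode into Go. Prompt wording and URL path are assumptions.
	body, _ := json.Marshal(map[string]any{
		"model": os.Getenv("OPENAI_MODEL"),
		"messages": []map[string]string{
			{"role": "system", "content": "Translate the user's pseudocode into a complete Go program. Reply with Go source code only."},
			{"role": "user", "content": string(pseudocode)},
		},
	})
	req, _ := http.NewRequest("POST", os.Getenv("OPENAI_URL")+"/chat/completions", bytes.NewReader(body))
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENAI_API_KEY"))
	req.Header.Set("Content-Type", "application/json")
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Pull the generated Go source out of the response. A real tool would
	// also strip any markdown fences the model wraps around the code.
	var out struct {
		Choices []struct {
			Message struct {
				Content string `json:"content"`
			} `json:"message"`
		} `json:"choices"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}

	// Write the generated source to disk and compile it with `go build`.
	// (Assumes the generated program only uses the standard library.)
	dir, _ := os.MkdirTemp("", "pseudoc-*")
	src := filepath.Join(dir, "main.go")
	os.WriteFile(src, []byte(out.Choices[0].Message.Content), 0o644)
	build := exec.Command("go", "build", "-o", "program", src)
	build.Stderr = os.Stderr
	if err := build.Run(); err != nil {
		panic(err)
	}
	fmt.Println("built ./program")
}
```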
pseudoc combines two technologies that should probably not be combined: AI & Compilers. This allows for many innovative features not seen in any other compiler:
Since LLMs are unpredictable, you get completely different results every time 🔥
To compile your program, you need internet access! Not to mention you also get to pay API credits every time you compile anything.
Note: This feature can be disabled by self-hosting your own LLM.
Since any code can be considered pseudocode, pseudoc is technically a compiler for every language in the world!
For example, pseudoc makes Python programs over 40x faster! 🔥👇
You can get precompiled executables of pseudoc from the Releases. Alternatively, compile it yourself:
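Assuming a standard Go module layout with the main package at the repository root (the actual build steps may differ; check the repository for authoritative instructions), something like this should work:

```sh
# run from the repository root; assumes a recent Go toolchain is installed
go build -o pseudoc .
```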
The environment variable OPENAI_URL must point to any OpenAI Chat Completions API compatible endpoint, and OPENAI_API_KEY must hold a valid API key for it. The OPENAI_MODEL variable must also be set to the LLM you want to use; select a model that supports reasoning for best accuracy. You can use a .env file (see .env.example for a template).
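An illustrative .env might look like the following. The values are placeholders, and whether OPENAI_URL expects the base URL or the full endpoint path is an assumption here; .env.example is the authoritative template.

```sh
# placeholder values only; see .env.example for the real template
OPENAI_URL=https://api.openai.com/v1
OPENAI_API_KEY=sk-...
OPENAI_MODEL=o4-mini
```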
Compile a program:
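Something like the following, where hello.txt contains your pseudocode (the file name is illustrative and the exact invocation may differ):

```sh
pseudoc hello.txt
```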
Output the intermediate representation (the generated Go source) instead of an executable:
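The flag name below is a guess; consult the tool's help output for the actual option:

```sh
# emits the generated Go source instead of building it (flag name is illustrative)
pseudoc --emit-ir hello.txt
```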
Good luck and Godspeed.