Run any LLM locally on your Mac in less than 2 mins



Yes, it's that simple

I am just surprised that it is so simple. Plus it's so elegant, I want to stand on my rooftop and shout. Anyway, here are the steps. By the way, you only need one minute if you don't care about a fancy chat interface.

Step 1:

Visit https://ollama.com/ and click Download to install the Ollama app.

Step 2:

Click on Models on that page, pick any model you want, and run a command like this in your terminal:

ollama run gemma3:4b
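The same pattern works for any model in the library; the weights are downloaded the first time you run the command. A couple of examples (the model tags below are from the Ollama library at the time of writing, so pick whatever fits your Mac's RAM), plus a command to see what you have downloaded:

ollama run llama3.2:3b
ollama run qwen2.5:7b
ollama list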

Step 3:

Congrats! Your local LLM is now up and running. You can start talking to it in the terminal itself.
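If you'd rather call the model from a script than the interactive prompt, Ollama also exposes a local HTTP API (on port 11434 by default). Here is a minimal sketch, assuming the gemma3:4b model from Step 2 is already downloaded:

curl http://localhost:11434/api/generate -d '{"model": "gemma3:4b", "prompt": "Why is the sky blue?", "stream": false}'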

Step 4:

Visit https://github.com/open-webui/open-webui to get the chat UI.

Run these two simple commands to install and start it:

pip install open-webui
open-webui serve
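Once the server is up, open it in your browser. By default it listens on port 8080, so the address should be http://localhost:8080 (check the terminal output for the exact URL). If pip complains about your Python version, installing inside a virtual environment with a supported Python is the usual fix.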

Step 5:

It will automatically connect to Ollama, and you can start chatting!
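If the UI does not pick up your models on its own, the usual fix is to point Open WebUI at Ollama's local API explicitly before starting it. A small sketch, assuming Ollama is on its default port:

export OLLAMA_BASE_URL=http://localhost:11434
open-webui serve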

