What is currently the best LLM for consumer-grade hardware? Is it phi-4?

Posted by VladVladikoff

I have a 5060 Ti with 16 GB of VRAM. I'm looking for a model that can hold basic conversations; no physics or advanced math required. Ideally something that runs reasonably fast, near real time.
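For context, here is a rough back-of-the-envelope sketch of what fits in 16 GB. The parameter counts and bits-per-weight figures are illustrative assumptions (phi-4 is roughly 14B parameters; the quantization overheads are approximate), not measurements:

```python
# Rough sketch: estimate VRAM needed for a quantized model's weights,
# plus a flat allowance for KV cache and runtime overhead, to sanity-check
# what fits on a 16 GB card. Figures below are assumptions for illustration.

def weight_vram_gb(params_billion: float, bits_per_weight: float,
                   overhead_gb: float = 1.5) -> float:
    """Weights only, plus a fixed allowance for KV cache / runtime overhead."""
    weights_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb + overhead_gb

# Hypothetical candidates in the 8B-14B range (phi-4 is ~14B parameters).
# Bits-per-weight values include a small allowance for quantization metadata.
for name, params, bits in [
    ("~14B model, 4-bit quant (phi-4 class)", 14, 4.5),
    ("~14B model, 8-bit quant", 14, 8.5),
    ("~8B model, 4-bit quant", 8, 4.5),
]:
    need = weight_vram_gb(params, bits)
    verdict = "fits" if need <= 16 else "does NOT fit"
    print(f"{name}: ~{need:.1f} GB -> {verdict} in 16 GB VRAM")
```

Under these assumptions, a 4-bit quant of a ~14B model lands around 9-10 GB and should leave headroom for context, while an 8-bit quant of the same model is already tight on 16 GB.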
