
Ask HN: What is the current (Apr. 2024) gold standard of running an LLM locally? | Hacker News
There are many options and opinions about. What is currently the recommended approach for running an LLM locally (e.g., on my 3090 with 24 GB of VRAM)? Are the options 'idiot-proof' yet?
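
To make "idiot-proof" concrete, this is roughly the level of effort I'm hoping for: a minimal sketch using llama-cpp-python with full GPU offload, assuming I've already downloaded some quantized GGUF model file (the path below is just a placeholder).

    # Hypothetical example: run a local quantized model on the 3090 via llama-cpp-python.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./models/some-8b-instruct.Q4_K_M.gguf",  # placeholder GGUF file
        n_gpu_layers=-1,  # offload all layers to the GPU
        n_ctx=4096,       # context window size
    )

    out = llm("Q: Name the planets in the solar system. A:", max_tokens=128)
    print(out["choices"][0]["text"])

If the answer is "just install X and point it at a model file" along these lines, that's the kind of recommendation I'm after.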