
news.ycombinator.com

Ask HN: What is the current (Apr. 2024) gold standard of running an LLM locally?

There are many options and opinions out there; what is currently the recommended approach for running an LLM locally (e.g., on my 3090 with 24 GB of VRAM)? Are the options ‘idiot-proof’ yet?
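For context, a minimal sketch of one route commonly cited in threads like this at the time: llama.cpp via its llama-cpp-python bindings, running a GGUF-quantized 7B model fully offloaded to a 24 GB GPU. The model filename and prompt below are placeholders, not anything named in the post.

# pip install llama-cpp-python (built with CUDA support for GPU offload)
from llama_cpp import Llama

# Placeholder path: any GGUF-quantized model small enough to fit in 24 GB of VRAM.
llm = Llama(
    model_path="./models/example-7b-instruct.Q4_K_M.gguf",
    n_gpu_layers=-1,  # offload all layers to the GPU
    n_ctx=4096,       # context window size
)

# OpenAI-style chat call; returns a dict containing the generated message.
result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain GGUF quantization in two sentences."}],
    max_tokens=128,
)
print(result["choices"][0]["message"]["content"])

The same model file also works with GUI front ends and with server tools such as Ollama, which is the usual suggestion when an ‘idiot-proof’ setup is the priority.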
