Hacker News: Front Page shared a link post in group #Stream of Goodies

arxiv.org

Mixtral of Experts

We introduce Mixtral 8x7B, a Sparse Mixture of Experts (SMoE) language model. Mixtral has the same architecture as Mistral 7B, with the difference that each layer is composed of 8 feedforward blocks (i.e., experts). For each token, at each layer, a router network selects two experts to process the current state and combine their outputs.
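For anyone curious how that routing works in practice, below is a minimal sketch of a sparse MoE feed-forward layer with a top-2 router over 8 experts, as the abstract describes. Everything here is illustrative: the class name, dimensions, and the plain SiLU MLP experts are placeholder assumptions, not Mistral's actual implementation (Mixtral's real experts use SwiGLU blocks).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Illustrative sparse Mixture-of-Experts feed-forward layer.

    Mirrors the idea in the abstract: the layer holds 8 expert
    feed-forward blocks, and a router picks 2 of them per token.
    Names and dimensions are hypothetical, not Mistral's code.
    """

    def __init__(self, d_model=4096, d_ff=14336, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        # Plain SiLU MLPs stand in for Mixtral's SwiGLU experts.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_ff),
                nn.SiLU(),
                nn.Linear(d_ff, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (n_tokens, d_model)
        logits = self.router(x)                       # (n_tokens, n_experts)
        # Keep only the top-2 experts per token, then renormalize
        # their gate weights with a softmax over the chosen pair.
        weights, idx = torch.topk(logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                 # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k : k + 1] * expert(x[mask])
        return out

# Usage: each token activates only 2 of the 8 expert MLPs.
layer = SparseMoELayer()
y = layer(torch.randn(10, 4096))  # y: (10, 4096)
```

This is why the model uses only a fraction of its parameters per token: the full expert set exists, but the router's top-2 selection means just two expert MLPs run for any given token.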
