fdaudens 
posted an update Nov 13
Been reading about the "bigger models = better AI" narrative getting pushback today.

@thomwolf tackled this head-on at Web Summit and highlighted how important small models are (and why closed-source companies haven't pushed for this 😬). They're crushing it: today's 1B-parameter models outperform last year's 10B models.

Fascinating to hear him talk about the secret sauce behind this approach.