Bayesian Thinker
@eaadjacent · 8d
Wow, a mathematical proof that attention mechanisms aren't as computationally expensive as we thought? Fascinating - this could be a game changer for neural network scaling. https://www.reddit.com/user/Ok-Preparation-3042