A hedge fund manager's weekend hobby that wiped $589B off Nvidia's market cap
Some people have hobbies.
Liang Wenfeng rethinks transformer architecture on New Year's fucking Eve.
x' = Ax + B · layer(Cx)
A ∈ DS^{n×n} (the doubly stochastic manifold)
Sinkhorn-Knopp projection → stable training at scale
Manifold-Constrained Hyper-Connections with Doubly Stochastic Matrices
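The core move fits in a few lines: alternate row and column normalization (Sinkhorn-Knopp) to keep the residual-mixing matrix A doubly stochastic, so no stream's signal gets amplified or drained. Below is a minimal sketch of that idea, not the paper's actual parameterization; every name (`sinkhorn`, `hyper_connection`, the stream count, the toy identity layer) is made up for illustration.

```python
# Sketch, assuming the standard Sinkhorn-Knopp iteration; B and C are left
# as plain dense matrices since the update x' = A x + B · layer(C x) only
# constrains A to the doubly stochastic manifold.
import numpy as np

def sinkhorn(logits: np.ndarray, n_iters: int = 20) -> np.ndarray:
    """Approximately project exp(logits) onto DS^{n×n} by alternately
    normalizing rows and columns (Sinkhorn-Knopp)."""
    A = np.exp(logits)                      # strictly positive entries
    for _ in range(n_iters):
        A /= A.sum(axis=1, keepdims=True)   # rows sum to 1
        A /= A.sum(axis=0, keepdims=True)   # columns sum to 1
    return A

def hyper_connection(x, layer, logits_A, B, C):
    """x: (n_streams, d) stacked residual streams.
    A mixes the streams; keeping it doubly stochastic means mixing
    neither amplifies nor loses total signal across streams."""
    A = sinkhorn(logits_A)                  # A ∈ DS^{n×n}, approximately
    return A @ x + B @ layer(C @ x)

# Toy usage: 4 residual streams of width 8, identity stand-in for the layer.
rng = np.random.default_rng(0)
n, d = 4, 8
x = rng.normal(size=(n, d))
out = hyper_connection(x, lambda h: h,
                       rng.normal(size=(n, n)),
                       rng.normal(size=(n, n)),
                       rng.normal(size=(n, n)))
print(out.shape)  # (4, 8)
```

Note the projection is approximate: a finite number of Sinkhorn iterations gets the row and column sums close to 1 rather than exact, which is the usual trade made to keep the step differentiable and cheap at scale.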
It's free. It's open-source. It won gold medals. It dropped a paper on New Year's Eve. US labs are currently googling 'how to un-spend $500 billion.'
Use It While OpenAI Copes →
They published this on New Year's Eve. While you were watching the ball drop, Liang was dropping doubly stochastic manifold constraints.