PrismML Exits Stealth With First Commercially Viable 1-Bit Large Language Models, Fitting an 8B-Parameter Model in 1.15 GB
A Caltech spinout backed by Khosla Ventures has released a family of fully 1-bit language models under Apache 2.0 that compress an 8-billion-parameter model to 1.15 GB while matching full-precision rivals on standard benchmarks.
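The 1.15 GB figure is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, assuming "1-bit" means one bit of storage per weight and a 16-bit (fp16) baseline for comparison; the gap between a pure 1-bit footprint and the reported size would come from overhead such as scaling factors or higher-precision embeddings, which the announcement does not detail:

```python
# Back-of-envelope check of the reported model size.
# Assumptions (not from the announcement): 1 bit per weight,
# fp16 (2 bytes/weight) baseline, decimal GB (1e9 bytes).

params = 8e9                       # 8-billion-parameter model

pure_1bit_gb = params * 1 / 8 / 1e9        # pure 1-bit weights
fp16_gb = params * 2 / 1e9                 # full-precision fp16 baseline
reported_gb = 1.15                          # size claimed in the release
implied_bits = reported_gb * 1e9 * 8 / params  # effective bits per weight

print(f"Pure 1-bit weights:   {pure_1bit_gb:.2f} GB")   # 1.00 GB
print(f"fp16 baseline:        {fp16_gb:.2f} GB")        # 16.00 GB
print(f"Implied bits/weight:  {implied_bits:.2f}")      # 1.15
print(f"Compression vs fp16:  {fp16_gb / reported_gb:.1f}x")
```

At the reported size, the model averages about 1.15 bits per weight, roughly a 14x reduction from an fp16 checkpoint.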