Discussion about this post

The AI Architect:

Fantastic reporting on the RAISE/SB53 swap. The knowledge distillation carveout is wild because it basically creates a regulatory arbitrage: companies can use model distillation to sidestep the revenue threshold entirely. And the $500m revenue bar versus the $100m training compute criterion means startups burning cash on frontier models get a pass until they actually monetize, which feels backwards when the risk profile is about capability, not profitability.

