Reflection, a startup founded just last year by two former Google DeepMind researchers, has raised $2 billion at an $8 billion valuation, a roughly fifteen-fold leap from the $545 million valuation it held just seven months ago. The company originally focused on autonomous coding agents but is now positioning itself as an open-source alternative to closed frontier labs like OpenAI and Anthropic, and as a Western equivalent to Chinese AI firms like DeepSeek.
The startup was launched in March 2024 by Misha Laskin, who led reward modeling for DeepMind’s Gemini project, and Ioannis Antonoglou, who co-created AlphaGo. Their background in developing these advanced AI systems is central to their pitch, which argues that the right AI talent can build frontier models outside of established tech giants.
Along with its new funding round, Reflection announced it has recruited a team of top talent from DeepMind and OpenAI and built an advanced AI training stack that it promises will be open for all. The company says it has identified a scalable commercial model that aligns with its open intelligence strategy.
Reflection’s team currently numbers about sixty people, mostly AI researchers and engineers working across infrastructure, data, training, and algorithm development. Laskin, the company’s CEO, said Reflection has secured a compute cluster and hopes to release a frontier language model next year, trained on tens of trillions of tokens.
The company stated it has built something once thought possible only inside the world’s top labs: a large-scale language model and reinforcement learning platform capable of training massive Mixture-of-Experts models at frontier scale. After applying this approach to autonomous coding, the company is now bringing these methods to general agentic reasoning.
Mixture-of-Experts is an architecture behind many frontier large language models: each input is routed to a small subset of specialized "expert" subnetworks, so a very large model can be run at a fraction of the compute cost of activating all of its parameters. Previously, only large, closed AI labs were capable of training these systems at scale. The company sees Chinese firms like DeepSeek, which had a breakthrough in training these models openly, as a wake-up call. Laskin expressed concern that if no action is taken, the global standard of intelligence will be built by someone else, not by America.
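For readers unfamiliar with the idea, the routing trick at the heart of Mixture-of-Experts can be sketched in a few lines. This is a toy illustration only, not Reflection's implementation: the expert count, dimensions, and simple linear "experts" are invented for clarity, and real frontier MoE models embed learned routers inside transformer layers.

```python
# Toy Mixture-of-Experts routing sketch (illustrative; all sizes are arbitrary).
import numpy as np

rng = np.random.default_rng(0)

n_experts, d_model, top_k = 4, 8, 2

# Each "expert" here is just a small linear layer; real experts are full MLP blocks.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
gate = rng.standard_normal((d_model, n_experts)) * 0.1  # learned router in practice


def moe_forward(x):
    """Route a token vector to its top-k experts and mix their outputs."""
    logits = x @ gate
    top = np.argsort(logits)[-top_k:]        # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over only the chosen experts
    # Only the selected experts actually run -- this sparsity is why MoE models
    # can grow very large without a proportional increase in compute per token.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))


token = rng.standard_normal(d_model)
out = moe_forward(token)
print(out.shape)
```

The key point is the gap between total and active parameters: here only 2 of 4 experts run per token, and frontier MoE models push that ratio much further.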
Laskin added that this situation puts the United States and its allies at a disadvantage because enterprises and sovereign states often avoid using Chinese models due to potential legal repercussions. He framed the choice as living at a competitive disadvantage or rising to the occasion.
American technologists have largely celebrated Reflection’s new mission. David Sacks, the White House AI and Crypto Czar, stated it is great to see more American open source AI models, noting a meaningful segment of the global market will prefer the cost, customizability, and control that open source offers. Clem Delangue, co-founder and CEO of Hugging Face, called the funding great news for American open-source AI, while noting the challenge will be to show a high velocity of sharing open AI models and datasets.
Reflection’s definition of being open seems to center on access rather than development, similar to strategies from Meta with Llama or Mistral. Laskin said Reflection would release model weights for public use while largely keeping datasets and full training pipelines proprietary. He stated that the model weights are the most impactful element because anyone can use and tinker with them, whereas the infrastructure stack can only be used by a select handful of companies.
This balance also underpins Reflection’s business model. Researchers will be able to use the models freely, but revenue will come from large enterprises building products on top of Reflection’s models and from governments developing sovereign AI systems. Laskin explained that large enterprises want an open model that they own, can run on their own infrastructure, can control the costs of, and can customize for various workloads.
Reflection has not yet released its first model, which will be largely text-based with multimodal capabilities planned for the future. The company will use the funds from this latest round to acquire the compute resources needed to train the new models, with the first model aimed for release early next year.
Investors in Reflection’s latest round include Nvidia, Disruptive, DST, 1789, B Capital, Lightspeed, GIC, Eric Yuan, Eric Schmidt, Citi, Sequoia, CRV, and others.

