Physical Intelligence, Stripe veteran Lachy Groom’s latest bet, is building Silicon Valley’s buzziest robot brains

From the street, the only indication of Physical Intelligence’s headquarters in San Francisco is a pi symbol painted a slightly different color than the rest of the door. Inside, the space is a giant concrete box softened by a haphazard sprawl of long blonde-wood tables. Some are clearly meant for lunch, dotted with Girl Scout cookie boxes, jars of Vegemite, and small wire baskets stuffed with condiments. The rest tell a different story entirely. They are laden with monitors, spare robotics parts, tangles of black wire, and fully assembled robotic arms in various states of attempting to master the mundane.

During my visit, one arm is folding a pair of black pants, or trying to. It’s not going well. Another is attempting to turn a shirt inside out with a determination that suggests it will eventually succeed, just not today. A third seems to have found its calling, quickly peeling a zucchini before depositing the shavings into a separate container. The shavings, at least, are going well.

“Think of it like ChatGPT, but for robots,” Sergey Levine tells me, gesturing toward the motorized ballet. Levine, an associate professor at UC Berkeley and one of the company’s co-founders, has the amiable, bespectacled demeanor of someone accustomed to explaining complex concepts. What I’m watching, he explains, is the testing phase of a continuous loop. Data gets collected on robot stations here and at other locations, and that data trains general-purpose robotic foundation models. When researchers train a new model, it returns to stations like these for evaluation. The pants-folder is someone’s experiment. So is the shirt-turner. The zucchini-peeler might be testing whether the model can generalize across different vegetables.

The company also operates test kitchens using off-the-shelf hardware to expose the robots to different environments. There’s a sophisticated espresso machine nearby, and I assume it’s for the staff until Levine clarifies that no, it’s there for the robots to learn. Any foamed lattes are data, not a perk for the engineers.

The hardware itself is deliberately unglamorous. These arms sell for about $3,500, and that’s with what Levine describes as an enormous markup from the vendor. If manufactured in-house, the material cost would drop below $1,000. A few years ago, a roboticist would have been shocked these things could do anything at all. But that’s the point: good intelligence compensates for bad hardware.

As Levine excuses himself, I’m approached by Lachy Groom, moving with purpose. At 31, Groom still has the fresh-faced quality of Silicon Valley’s boy wonder, a designation he earned early by selling his first company nine months after starting it at age 13 in his native Australia.

Groom found what he was looking for when he started following the academic work of Levine and Chelsea Finn, a former Berkeley PhD student of Levine’s who now runs her own lab at Stanford. Their names kept appearing in everything interesting happening in robotics. When he heard rumors they might be starting something, he tracked down Karol Hausman, a Google DeepMind researcher also involved. “It was just one of those meetings where you walk out and it’s like, This is it,” Groom says.

He never intended to become a full-time investor, even after leaving Stripe, where he was an early employee. He spent roughly five years as an angel investor, making early bets on companies while searching for the right company to start or join himself. “I was looking for five years for the company to go start post-Stripe,” he says. “Good ideas at a good time with a good team is extremely rare.”

The two-year-old company has now raised over $1 billion. When I ask about its runway, he’s quick to clarify it doesn’t actually burn that much, with most spending going toward compute. A moment later, he acknowledges that under the right terms, he’d raise more. “There’s no limit to how much money we can really put to work,” he says. “There’s always more compute you can throw at the problem.”

What makes this arrangement unusual is what Groom doesn’t give his backers: a timeline for turning Physical Intelligence into a money-making endeavor. “I don’t give investors answers on commercialization,” he says of backers that include Khosla Ventures, Sequoia Capital, and Thrive Capital, which have valued the company at $5.6 billion. “That’s sort of a weird thing, that people tolerate that.” But tolerate it they do.

So what’s the strategy? Co-founder Quan Vuong, who came from Google DeepMind, explains it revolves around cross-embodiment learning and diverse data sources. If someone builds a new hardware platform tomorrow, they can transfer all the knowledge the model already has. “The marginal cost of onboarding autonomy to a new robot platform, whatever that platform might be, it’s just a lot lower,” he says.

The company is already working with a small number of companies in different verticals to test whether their systems are good enough for real-world automation. Vuong claims that in some cases, they already are.

Physical Intelligence isn’t alone in chasing this vision. The race to build general-purpose robotic intelligence is heating up. Pittsburgh-based Skild AI, founded in 2023, recently raised $1.4 billion at a $14 billion valuation and is taking a notably different approach. While Physical Intelligence remains focused on pure research, Skild AI has already deployed its commercial system, saying it generated $30 million in revenue in just a few months last year.

Skild has even taken public shots at competitors, arguing that most robotics foundation models lack true physical common sense because they rely too heavily on internet-scale pretraining rather than physics-based simulation and real robotics data.

It’s a sharp philosophical divide. Skild AI bets that commercial deployment creates a data flywheel that improves the model. Physical Intelligence bets that resisting the pull of near-term commercialization will enable it to produce superior general intelligence. Which bet is right will take years to resolve.

In the meantime, Physical Intelligence operates with what Groom describes as unusual clarity. “It’s such a pure company. A researcher has a need, we go and collect data to support that need, and then we do it. It’s not externally driven.” The company had a five- to ten-year roadmap of what the team thought would be possible. By month eighteen, they’d blown through it.

The company has about eighty employees and plans to grow, though Groom hopes to do so as slowly as possible. What’s most challenging, he says, is hardware. “Hardware is just really hard. Everything we do is so much harder than a software company.” Hardware breaks, arrives slowly, and safety considerations complicate everything.

As Groom springs up to rush to his next commitment, I’m left watching the robots continue their practice. The pants are still not quite folded. The shirt remains stubbornly right-side-out. The zucchini shavings are piling up nicely.

There are obvious questions about whether anyone actually wants a robot in their kitchen peeling vegetables, about safety, and about whether all of the time and money being invested here solves big enough problems. Meanwhile, outsiders question the company’s progress, whether its vision is achievable, and if betting on general intelligence rather than specific applications makes sense.

If Groom has any doubts, he doesn’t show them. He’s working with people who’ve spent decades on this problem and who believe the timing is finally right, which is all he needs to know.

Besides, Silicon Valley has been backing people like Groom and giving them a lot of rope since the beginning of the industry, knowing there’s a good chance that even without a clear path to commercialization or a timeline, they’ll figure it out. It doesn’t always work out. But when it does, it tends to justify a lot of the times it didn’t.