The latest wave of AI excitement has brought us an unexpected mascot: a lobster. A personal AI assistant called Moltbot went viral within weeks of its launch. It will keep its crustacean theme despite having had to change its name from Clawdbot after a legal challenge from Anthropic. But before you jump on the bandwagon, here is what you need to know.
According to its tagline, Moltbot is the “AI that actually does things.” This includes managing your calendar, sending messages through your favorite apps, or checking you in for flights. This promise has drawn thousands of users willing to tackle the technical setup required, even though it started as a scrappy personal project built by one developer for his own use.
That developer is Peter Steinberger, an Austrian founder known online as @steipete. After stepping away from his previous project, PSPDFKit, Steinberger felt empty and barely touched his computer for three years. But he eventually found his spark again, which led to Moltbot.
While Moltbot is now much more than a solo project, the publicly available version still derives from a tool he built to help manage his digital life and explore human-AI collaboration. For Steinberger, this meant diving deeper into the momentum around AI that had reignited his builder spark. A self-confessed enthusiast of Anthropic’s AI, Claude, he initially named his project after it. He revealed that Anthropic subsequently forced him to change the branding for copyright reasons. But the project’s “lobster soul” remains unchanged.
To its early adopters, Moltbot represents the vanguard of how helpful AI assistants could be. Those excited by using AI to generate websites and apps are keen to have a personal AI assistant perform tasks. And just like Steinberger, they are eager to tinker with it.
This explains how Moltbot amassed more than 44,200 stars on GitHub so quickly. So much viral attention has been paid to Moltbot that it has even moved markets. Cloudflare’s stock surged 14% in premarket trading as social media buzz around the AI agent re-sparked investor enthusiasm for Cloudflare’s infrastructure, which developers use to run Moltbot locally.
Still, it is a long way from breaking out of early adopter territory, and maybe that is for the best. Installing Moltbot requires being tech savvy, and that includes awareness of the inherent security risks that come with it.
On one hand, Moltbot is built with safety in mind. It is open source, meaning anyone can inspect its code for vulnerabilities, and it runs on your computer or server, not in the cloud. But on the other hand, its very premise is inherently risky. As entrepreneur and investor Rahul Sood pointed out, an AI that “actually does things” means it can execute arbitrary commands on your computer.
What keeps Sood concerned is the risk of “prompt injection through content.” This is where a malicious person could send a message that leads Moltbot to take unintended actions on your computer without your knowledge. That risk can be mitigated partly by careful set-up. Since Moltbot supports various AI models, users can make choices based on their resistance to these attacks. But the only way to fully prevent it is to run Moltbot in a silo.
This may be obvious to experienced developers tinkering with a weeks-old project, but some have become more vocal in warning users attracted by the hype. Things could turn ugly fast if they approach it as carelessly as they might use a standard chatbot.
Steinberger himself was served a reminder that malicious actors exist when he messed up the renaming of his project. He complained that crypto scammers snatched a related GitHub username and created fake cryptocurrency projects in his name. He warned followers that any project listing him as a coin owner is a scam. He later posted that the GitHub issue had been fixed, but cautioned that the legitimate account is @moltbot, not any of the scam variations.
This does not necessarily mean you should stay away from Moltbot if you are curious to test it. But if you have never heard of a VPS, a virtual private server, you may want to wait. That is where you may want to run Moltbot for now, not on the laptop with your sensitive credentials.
Right now, running Moltbot safely means running it on a separate computer with throwaway accounts, which defeats the purpose of having a useful AI assistant. Fixing that security-versus-utility trade-off may require solutions beyond Steinberger’s control.
Still, by building a tool to solve his own problem, Steinberger showed the developer community what AI agents could actually accomplish. He demonstrated how autonomous AI might finally become genuinely useful rather than just impressive.

