Everything you need to know about viral personal AI assistant Clawdbot (now Moltbot)

The latest wave of AI excitement has brought us an unexpected mascot: a lobster. Clawbonea personal AI assistant, went viral within weeks of launch and will keep the crustacean theme despite having to change the name to Moltbot after a legal challenge from Anthropic. But before you jump on the bandwagon, here’s what you need to know.
According to its tagline, Moltbot (formerly Clawdbot) is the “AI that actually does things” – whether it’s managing your calendar, sending messages through your favorite apps, or checking in for flights. This promise has driven thousands of users to tackle the required technical setup, even though it started as a messy personal project built by one developer for his own use.
That man is Peter Steinberger, an Austrian developer and founder who is known online as @steipete and actively blogs about his work. After abandoning his previous project, PSPDFkit, Steinberger felt empty and barely touched his computer for three years, he explains on his blog. But he eventually found his spark again – which led to Moltbot.
Although Moltbot is now much more than a solo project, the publicly available version still comes from it Claw“Peter’s crust assistant,” now called Molty, a tool he built to help him “manage his digital life” and “explore what human-AI collaboration could be.”
For Steinberger, this meant diving deeper into the momentum around AI that had reignited his builder spark. A self-confessed “Claudoholic”he initially named his project after Claude, Anthropic’s flagship. He then revealed on X that Anthropic forced him to change the branding for copyright reasons. TechCrunch has contacted Anthropic for comment. But the “lobster soul” of the project remains unchanged.
For early adopters, Moltbot represents the cutting edge of how helpful AI assistants can be. Those who were already excited about the prospect of using AI to quickly generate websites and apps are even more eager to have their personal AI assistant perform tasks for them. And just like Steinberger, they are eager to tinker with it.
This explains how Moltbot collected more than 44,200 stars on GitHub so fast. So much viral attention has been paid to Moltbot that it has even moved markets. Cloudflare stock increased by 14% in premarket trading on Tuesday, as social media buzzed around the AI agent, it reignited investor enthusiasm for Cloudflare’s infrastructure, which developers use to run Moltbot locally on their devices.
WAN event
San Francisco
|
October 13-15, 2026
Still, breaking into early adopter territory is still a long way off, and perhaps that’s for the best. Installing Moltbot requires technical knowledge, and that includes being aware of the inherent security risks that come with it.
On the one hand, Moltbot is built with security in mind: it’s open source, meaning anyone can inspect the code for vulnerabilities, and it runs on your computer or server, not in the cloud. But on the other hand, its premise is inherently risky. As an entrepreneur and investor Rahul Sood pointed to X“’actually doing things’ means ‘being able to execute arbitrary commands on your computer.’”
What keeps Sood awake at night is “rapid content injection” – where a malicious actor can send you a WhatsApp message that could lead to Moltbot taking unintended actions on your computer without your intervention or knowledge.
That risk can be partly limited by careful preparation. Because Moltbot supports several AI models, users may want to make configuration choices based on their resistance to these types of attacks. But the only way to completely prevent this is to run Moltbot in a silo.
This may seem obvious to experienced developers tinkering with a week-old project, but some of them have become louder in warning users attracted by the hype: things can get ugly quickly if they approach it as carelessly as ChatGPT.
Steinberger himself was reminded that evil actors exist when he “messed up” the renaming of his project. He complained on X that “crypto scammers” grabbed his GitHub username and created fake cryptocurrency projects in his name, and he warned his followers that “any project that lists [him] as a coin owner is a SCAM.” He then posted that it was GitHub problem had been resolved but warned that the legitimate X account is @moltbot, “not one of the 20 scams of it.”
This doesn’t necessarily mean you should stay away from Moltbot at this stage if you’re curious about testing it. But if you’ve never heard of a VPS – a virtual private server, which is essentially a remote computer that you rent to run software – you might want to wait your turn. (You might want to run Moltbot on that for now. “Not the laptop with your SSH keys, API credentials, and password manager,” Sood warned.)
Currently, running Moltbot safely means running it on a separate computer with disposable accounts, which defeats the purpose of having a useful AI assistant. And resolving that trade-off between safety and utility may require solutions beyond Steinberger’s control.
Still, by building a tool to solve his own problem, Steinberger showed the developer community what AI agents could actually achieve and how autonomous AI could eventually become truly useful rather than just impressive.




