No Creds Notes #12
Clawdbot and Helix 02 (Lobsters and Terminator)
👋Hi!
Welcome back to another No Creds Notes! On deck this week we’ve got a new agentic AI that makes any other agentic AI you’ve used look like a preschooler, plus more updates on robots. I’m excited, so let’s jump into it.
New here? Uncredentialed is a twice-a-week newsletter focusing on tech, strategy, and the future, with essays on Tuesdays and roundups on Thursdays. If that interests you, subscribe today!
Your New Coworker is a Lobster
Alright, let’s talk about Clawdbot. Those of my subscribers active in startup/VC Twitter are probably sick of hearing about it since its viral moment a few days ago, but some of the use cases genuinely look like the future, so I’ve got to talk about it.
Clawdbot (now known as Moltbot, presumably because its name was too similar to Claude) goes well beyond the typical ChatGPT wrapper AI agent that we’ve all seen. It is a fully functional digital operator that remembers everything you tell it.
While you sleep, it can autonomously create and edit your code, read and respond to emails, edit your writing and videos, negotiate purchases, and more, all without you ever directly prompting it to do those tasks.
I really liked the framing in the video I linked: Clawdbot represents a shift from thinking about how to use AI as a tool to thinking about how you would enable a human employee to go make your business better on their own. The way we use AI today tends to be limited by the possibilities we can think of. We see our website has low traffic and ask AI to help us improve X, Y, or Z. Clawdbot helps you address the “unknown unknowns” by taking in as much info as it can about you and independently finding areas where it can make your life better.
The setup most people seem to be running is a Mac Mini running locally for privacy, with different models handling different tasks and with users handing over as many passwords as they feel comfortable sharing so Clawd can work autonomously.
On that note, Clawd seems to be a bit of a Rorschach test for the tech world. Some are going all-in, buying Mac Minis or even high-end Mac Studios or other computers to integrate their full life with Clawdbot. Others, even those generally enthusiastic about AI’s potential, are hesitant, worried about giving it too much access or about the threat of prompt injection.
Regardless of where you fall on that Rorschach test, I’d highly encourage you to watch the video I linked. It’s 15ish minutes if you watch it on 2x speed and paints a pretty vivid and exciting picture of where the future is headed.
Robots that can walk and chew gum
Anyone remember back in elementary school when, inevitably, some teacher or too-smart-for-their-own-good student would proudly inform the class that people can’t multitask, that it’s scientifically impossible? And then those of us with ADHD would respond with “watch me.”
Well, Figure just said “watch me.”
Their launch of Helix 02 unveils humanoid robots that can do two things at once. Until now, when a robot reached for something, it had to stop walking, stabilize, think about life for the longest second ever, and then grab whatever it was reaching for.
Helix 02 solves this by controlling its entire body through one unified neural network, directly from visual input. No more stitching together discrete behaviors; it just moves like a person does.
Figure trained the foundational movement system (System 0) on over 1,000 hours of human motion capture data, replacing over 100,000 lines of hand-coded C++. The robot learned to move and balance by watching us instead of studying physics equations, just like we learn.
Watch the video I linked closely. We’ve seen plenty of dishwasher unloading videos. So many that sometimes I feel like the Bart meme when writing another No Creds Notes section on robots.
But I just couldn’t help myself. Yes, Helix 02 did a good job unloading the dishwasher, but it’s the emergent behaviors it showed while unloading that really got me excited.
Using its hip to close the drawer? Nobody programmed that.
Steadying the door with its foot when its hands were full? Nobody programmed that.
Setting a bowl down to open the cabinet because its hands were full? Nobody programmed that.
I mean, come on. Seeing moves like that emerge in a robot is both awesome and maybe even bordering on the uncanny valley. Within a couple of years, I expect robots will be augmenting significant pieces of labor.
Like I said in my meme, 2026 is the year of the robot and we’re just getting started.
No Creds Reading List
Noah Smith argued for letting Chinese cars into the US
Guillermo Flor shared on building startups without suffering
Susan Montgomery discussed how AI is allowing bad companies to survive longer
Startup Synergy told us all why LinkedIn sucks (but could be great)
If you enjoyed this week’s No Creds Notes, consider sharing it with a friend! And if this is your first time, it’s not too late to subscribe!