Clawdbot is an open-source personal AI assistant you can install on your own hardware. It connects language models from providers such as Anthropic and OpenAI to real-world tools: messaging apps, file systems, shells, browsers, and smart-home devices.
What’s interesting isn’t that Clawdbot chats. The project ships a concrete architecture for local-first agents, plus a typed workflow engine named Lobster that turns model calls into deterministic pipelines.
Architecture: Gateway, nodes, channels, and skills
The Gateway is at the heart of Clawdbot. The Gateway process exposes a WebSocket control plane on ws://127.0.0.1:18789, plus a local HTTP interface for the webchat and the control GUI.
The Gateway receives your messages from channels such as WhatsApp, Telegram, Signal, iMessage, Discord, and Slack. It decides which agent receives each message, which tools to call, and even which provider model to use, then sends the response back over the same channel.
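To make the routing decision concrete, here is a minimal sketch in TypeScript. All of the type names, the `route` function, and the policy itself are invented for illustration; Clawdbot's actual routing is configuration-driven and richer than this.

```typescript
// Hypothetical sketch of a Gateway routing policy. None of these names
// come from Clawdbot itself.
type Channel = "whatsapp" | "telegram" | "signal" | "imessage" | "discord" | "slack";

interface InboundMessage {
  channel: Channel;
  sender: string;
  text: string;
}

interface Route {
  agent: string; // which agent handles the message
  model: string; // which provider model to call
}

// A routing policy maps an inbound message to an agent and a model.
function route(msg: InboundMessage): Route {
  // Example policy: build/deploy questions go to a coding agent on a
  // stronger model; everything else goes to the default assistant.
  if (/\b(build|deploy|git|test)\b/i.test(msg.text)) {
    return { agent: "coder", model: "anthropic/claude" };
  }
  return { agent: "assistant", model: "openai/gpt" };
}
```

The point is that the policy lives in the Gateway, not in the model: the same message always routes the same way.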
The runtime breaks down into four core concepts.
- Gateway: routing, model calls, tool invocation, sessions, presence, and scheduling.
- Nodes: Clawdbot processes that give agents local access to resources such as the file system, browser automation, microphones, cameras, or platform-specific APIs on macOS, Windows, Linux, iOS, and Android.
- Channels: the integrated chat systems, including WhatsApp, Telegram, Discord, Slack, Signal, Microsoft Teams, Matrix, Zalo, and others, each configured as a channel backend that connects to the Gateway.
- Skills and plug-ins: standard descriptions of the tools an agent may call, written in the SKILL.md format. ClawdHub is the distributor of this format.
The Gateway can run on an inexpensive virtual machine or a spare home computer; the heavy compute is left to the remote model APIs.
Skills and the SKILL.md standard
Clawdbot uses an open skill format called SKILL.md. A skill definition is a Markdown file with a short frontmatter header followed by a procedure. A deployment skill, for example, might check git status, run the tests, and deploy only on success.
---
name: deploy-production
description: Deploy the current branch to production. Use only after tests pass.
disable-model-invocation: true
---
1. Check `git status` to ensure a clean working directory.
2. Run `npm test`
3. If tests pass, run `npm run deploy`
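The header above is plain YAML-style frontmatter, so reading it takes only a few lines. The sketch below is an illustrative parser, not Clawdbot's actual loader; the field names follow the example skill.

```typescript
// Illustrative SKILL.md header parser (not Clawdbot's real implementation).
interface SkillMeta {
  name?: string;
  description?: string;
  "disable-model-invocation"?: boolean;
  body: string; // the Markdown procedure after the frontmatter
}

function parseSkill(source: string): SkillMeta {
  const match = source.match(/^---\n([\s\S]*?)\n---\n?([\s\S]*)$/);
  if (!match) return { body: source };
  const meta: SkillMeta = { body: match[2].trim() };
  for (const line of match[1].split("\n")) {
    const idx = line.indexOf(":");
    if (idx === -1) continue;
    const key = line.slice(0, idx).trim();
    const value = line.slice(idx + 1).trim();
    if (key === "name") meta.name = value;
    else if (key === "description") meta.description = value;
    else if (key === "disable-model-invocation") {
      meta["disable-model-invocation"] = value === "true";
    }
  }
  return meta;
}
```

A flag like `disable-model-invocation: true` would tell the runtime that only a human, never the model on its own, may trigger the skill.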
The Gateway reads these definitions and exposes them to agents as tools with explicit capabilities and safety restrictions. ClawdHub publishes skills that can be composed into larger workflows.
This means operational runbooks are no longer wiki pages: they become machine-executable skills that remain auditable as plain text.
Lobster: typed workflows for agents
Clawdbot's workflows run on Lobster, a typed workflow shell that lets Clawdbot execute multi-step tool sequences, with approval gates, as a single deterministic operation. Lobster moves orchestration out of the model, which would otherwise call many tools one by one, and into a domain-specific runtime.
- Pipelines are defined in JSON/YAML or in a compact string-like format.
- Steps exchange typed JSON, not free-form text.
- Runtime policies enforce timeouts and output limits.
- A step with side effects can pause the workflow, which can be resumed later with a resumeToken.
Here is a simple workflow for triaging an inbox:
name: inbox-triage
steps:
  - id: collect
    command: inbox-list --json
  - id: categorize
    command: inbox-categorize --json
    stdin: $collect.stdout
  - id: approve
    command: inbox-apply --approve
    stdin: $categorize.stdout
    approval: required
  - id: execute
    command: inbox-apply --execute
    stdin: $categorize.stdout
    condition: $approve.approved
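The semantics of the workflow above, typed outputs, `$step.stdout` references, an approval gate that pauses with a resume token, and a conditional final step, can be sketched as a tiny interpreter. This is a toy model of the described behavior, not Lobster's actual implementation.

```typescript
// Toy interpreter for the pipeline shape above (illustrative only).
interface Step {
  id: string;
  command: string;
  stdin?: string;       // e.g. "$collect.stdout"
  approval?: "required";
  condition?: string;   // e.g. "$approve.approved"
}

type Exec = (command: string, stdin: string) => { stdout: string };
type Outputs = Record<string, { stdout: string; approved?: boolean }>;

interface RunResult {
  status: "done" | "paused";
  resumeToken?: string; // opaque token to resume after human approval
  outputs: Outputs;
}

function run(steps: Step[], exec: Exec, startAt = 0,
             outputs: Outputs = {}, approvals = new Set<string>()): RunResult {
  for (let i = startAt; i < steps.length; i++) {
    const step = steps[i];
    // Skip steps whose condition (e.g. "$approve.approved") is not met.
    if (step.condition &&
        !outputs[step.condition.slice(1).split(".")[0]]?.approved) {
      continue;
    }
    // Resolve "$step.stdout" references against earlier typed outputs.
    const stdin = step.stdin
      ? outputs[step.stdin.slice(1).split(".")[0]].stdout
      : "";
    if (step.approval === "required" && !approvals.has(step.id)) {
      // Pause here; the caller resumes with the token once a human approves.
      return { status: "paused", resumeToken: String(i), outputs };
    }
    outputs[step.id] = step.approval === "required"
      ? { ...exec(step.command, stdin), approved: true }
      : exec(step.command, stdin);
  }
  return { status: "done", outputs };
}
```

Resuming looks like `run(steps, exec, Number(token), result.outputs, new Set(["approve"]))`: the interpreter restarts at the paused step with the approval recorded, and the conditional `execute` step only runs because `$approve.approved` is now true.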
Clawdbot treats this file as a skill. Instead of calling many tools to clean your inbox, the model makes one call that runs the Lobster pipeline. The model decides when to invoke it; the pipeline itself is deterministic and auditable.
Clawdbot uses Lobster as the driving engine for its local workflows. Media coverage describes it as an open-source agent that redefines AI personalization by combining local-first workflows with proactive behaviors.
Local-first proactive behavior
Clawdbot's ability to behave like a real operator, not just a chat box, is a big part of why it has become popular and visible on X and in developer communities.
Common patterns are:
- Briefings that summarize your calendar, important tasks, and mail.
- Weekly summaries of shipped work.
- Monitors that watch the weather and message you on your favorite channel.
- Automated file and repository management, triggered locally by natural language.
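A briefing like the first pattern is ultimately just a string the Gateway pushes to your channel on a schedule. The sketch below invents the data shapes and the `composeBriefing` function for illustration; Clawdbot's real briefings are produced by agents and skills.

```typescript
// Illustrative briefing composer (invented for this example).
interface CalendarEvent { time: string; title: string; }
interface Task { title: string; due?: string; }

function composeBriefing(events: CalendarEvent[], tasks: Task[],
                         unreadMail: number): string {
  const lines = [
    `Good morning. You have ${events.length} events and ${unreadMail} unread emails.`,
  ];
  for (const e of events) lines.push(`- ${e.time} ${e.title}`);
  if (tasks.length > 0) {
    lines.push("Top tasks:");
    for (const t of tasks) {
      lines.push(`- ${t.title}${t.due ? ` (due ${t.due})` : ""}`);
    }
  }
  return lines.join("\n");
}
```

A scheduler entry in the Gateway would call something like this each morning and deliver the result over WhatsApp or Telegram.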
All of this is controlled by the routing policy and tools on your own machine or server. Model calls still go out to providers such as Anthropic, OpenAI, Google, and xAI, or to local backends, but the assistant's brain, memory, and integrations stay under your control.
Installation and development workflow
The project ships a one-line installer that fetches a script from clawd.bot and bootstraps Node, the Gateway, and the core components. For more control, install with npm, or clone the TypeScript repository and install with pnpm.
The typical steps are:
curl -fsSL https://clawd.bot/install.sh | bash
# or
npm i -g clawdbot
clawdbot onboard
Onboarding connects a channel such as Telegram or WhatsApp, selects a model provider, and enables skills. From there you can write your own SKILL.md skills, build Lobster workflows, and expose them via chat, webchat, or the macOS companion app.
Here are some examples

