MIT MAS.664 · Spring 2026

Universal agentic hardware control.

curl | bash your way into MCP tools for whatever's plugged in. A coding agent reads markdown specs, discovers your hardware, and stands up a live MCP server — on a Mac, a Pi, or any Linux box. Same specs, different code, every time.

curl -fsSL https://raw.githubusercontent.com/qsimeon/octopus-hw/main/install.sh | bash

Requires OPENROUTER_API_KEY exported in your shell — the agent uses it to drive the pipeline. export OPENROUTER_API_KEY=sk-or-... before you run the install.

How it works

Five stages. Two privileged steps. One loop.

The orchestrator drives a coding agent through each stage. The agent writes its output to _generated/ and the pipeline loops until every stage passes; on retry, the agent can even edit the spec that governs it. A sketch of the loop follows the stage list.

General pipeline
01  /  PROBE
What's plugged in?
Enumerates USB, Bluetooth, I2C, and mDNS devices into a JSON manifest.
02  /  IDENTIFY
What can it do?
Resolves vendor/product IDs to concrete capabilities with confidence scores.
03  /  INTERFACE
Make MCP tools
Generates one MCP-compliant tool schema per capability.
04  /  SERVE
Write a server
Emits a self-contained FastMCP server.py with real hardware I/O.
05  /  DEPLOY
Make it live
Installs deps, starts the server over HTTP, writes client configs.
Privileged framework steps
·  /  PERCEIVE
Camera selection & visual state
Picks the camera MCP tool and maintains a 2-frame Markov state with a rolling summary.
·  /  ARM
Robotic arm verify & heal
Validates servo tools against the physical SO-ARM101 and patches the generated server on failure.
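
To make that loop concrete, here is a minimal Python sketch. The helper bodies, the specs/ directory layout, and the retry count are illustrative assumptions, not the shipped orchestrator; only the stage names and the _generated/ output directory come from the pipeline above.

from pathlib import Path

STAGES = ["probe", "identify", "interface", "serve", "deploy"]

def run_agent(spec_text: str, out_dir: Path) -> None:
    """Hand the spec to the coding agent; it writes artifacts to out_dir. (Stub.)"""

def validate(stage: str) -> bool:
    """Stage-specific pass/fail checks on the generated artifacts. (Stub.)"""
    return True

def run_pipeline(max_retries: int = 3) -> None:
    out = Path("_generated")
    for stage in STAGES:
        spec = Path(f"specs/{stage}.md")          # Read: one markdown spec per stage
        for _ in range(max_retries):              # Loop
            run_agent(spec.read_text(), out)      # Agent: writes to _generated/
            if validate(stage):                   # Pass: move to the next stage
                break
            # On retry the agent may first rewrite the spec that governs it.
        else:
            raise SystemExit(f"{stage} failed")   # Halt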

Features

The coding agent is the software.

Nothing is hard-coded per device. The spec is the infrastructure and the model is the runtime. Change the hardware, change the brain, change the platform — the protocol stays the same.


Model-agnostic

GLM-5.1, Kimi K2, Claude Sonnet, Gemini Flash, Claude-via-Bedrock. One line in octopus.toml.


Hardware-agnostic

If lsusb, udevadm, or i2cdetect can see it, the agent can wire it up. No device registry.


Prompts as protocol

Markdown specs replace platform-specific code. Same probe.md ships on Mac, Linux, and Windows WSL — tested on all three.

Self-healing daemon

The agent that built the server also watches and repairs it. Watch → Heal, forever (a sketch follows these cards).

Self-perception

The camera is a discovered MCP tool. The system literally watches itself act on the world.

MCP-compatible

Plug into Claude Desktop, OpenClaw, Claude Code, or the Join39 bridge. One URL, many clients.
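
As a sketch of how the self-healing and self-perception cards compose, here is a hypothetical daemon tick in Python. Every name below (capture_frame, summarize, check_health, heal, the 5-second cadence) is an illustrative stand-in, not the shipped daemon API; only the two-frame Markov state and the Watch → Heal cycle come from the cards above.

import time
from collections import deque

def capture_frame() -> bytes:
    return b""        # stub: would call the discovered camera MCP tool

def summarize(frames, prev: str) -> str:
    return prev       # stub: the agent folds both frames into a rolling summary

def check_health() -> bool:
    return True       # stub: probe the generated server's endpoints

def heal(context: str) -> None:
    pass              # stub: the agent patches the server.py it wrote

frames = deque(maxlen=2)   # Markov visual state: previous + current frame
summary = ""

while True:                # Watch → Heal, forever
    frames.append(capture_frame())
    summary = summarize(frames, summary)
    if not check_health():
        heal(summary)
    time.sleep(5)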

Demo

Install on your Mac in one command.

Full uncut recording of curl | bash on a Mac. The coding agent probes the machine, writes an MCP server for whatever it finds (FaceTime camera, Wi-Fi, Bluetooth, system info), deploys it on localhost, and connects to Claude Desktop. This clip shows the install flow and the MCP Inspector; the end-to-end arm + camera control demo on the Pi rig is a separate recording (use the buttons in the Try-it section to drive the live rig yourself).

Pi + SO-ARM101 + USB webcam — closed loop, no faith required.

Continuous webcam recording on the Pi while my laptop drives the arm over MCP. Six commands, all via the same tool layer the landing page calls. The arm stands up, rotates, opens and closes the gripper, and parks itself home — same specs, same pipeline, zero human-written hardware code.

Individual frames from an earlier pose-by-pose run are below in the proof strip. All raw PNGs are committed in docs/demo-frames/arm-motion-proof/.

1. Home. arm_home() → joint 1 raw ≈ 2048 (center base).
2. Right. arm_set_joint_angle(1, 160) → base rotates; the raw value moves away from 2048.
3. Left. arm_set_joint_angle(1, 80) → raw ≈ 1638, confirming the ~400-unit delta. Physical motion verified.
The rig. 6-DoF SO-ARM101 (Feetech STS3215 servos, 1 Mbaud serial), USB webcam on a motorized vertical post mounted on a curved rail, Raspberry Pi 4. The camera moves around the arm for multi-angle self-perception; a single static frame is just the start.
Closed-loop eye. The camera looks directly at the gripper. Every arm command triggers a frame capture; the Markov visual state holds the current and previous frames so the daemon can verify its own actions.
Reach. Same arm, different pose. All 6 joints are exposed as MCP tools (arm_set_joint_angle, arm_home, arm_set_all_joints, plus diagnostics). Zero human-written hardware code.
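
The same motions are scriptable over the REST bridge described in lane 01 below. A minimal Python equivalent, assuming the requests library is installed; the endpoint, payload shape, and arm_home come from the curl examples, while the joint-move parameter names are a guess, so list the tools first to confirm the real schema.

import requests

BASE = "https://mellow-miracle-production-b572.up.railway.app"

def invoke(tool: str, parameters: dict | None = None) -> dict:
    """POST one tool call through the Railway bridge and return its JSON reply."""
    r = requests.post(f"{BASE}/tools/invoke",
                      json={"action": "invoke", "tool_name": tool,
                            "parameters": parameters or {}})
    r.raise_for_status()
    return r.json()

print(invoke("arm_home"))  # rest pose, as in frame 1 above
# Joint-move parameter names are assumed, not confirmed; check {"action":"list"} first:
# invoke("arm_set_joint_angle", {"joint": 1, "angle": 160})

# every arm command auto-captures a frame; fetch the latest as raw PNG bytes
with open("frame.png", "wb") as f:
    f.write(requests.get(f"{BASE}/camera/latest.png").content)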

Try it yourself

Control our hardware from anywhere.

No Pi, no arm, no setup — talk to a live SO-ARM101 over the internet. Three ways depending on the client you're building with. Same server, same tools.

01  /  CURL / SHELL

REST bridge via Join39 / Railway

Simplest — a plain HTTPS POST. Ideal for scripts, notebooks, or any language. No MCP client needed. Click the buttons below to hit our actual Pi hardware right now.

(click any button — response streams in live from the Pi. Every arm command auto-captures a fresh camera frame below, so you see the result.)
live capture from the Pi's webcam
curl equivalents
# list available tools
curl -s -X POST https://mellow-miracle-production-b572.up.railway.app/tools/invoke \
  -H 'Content-Type: application/json' \
  -d '{"action":"list"}'

# move arm to rest pose (gravity-neutral fold, low idle torque on shoulder_lift)
curl -s -X POST https://mellow-miracle-production-b572.up.railway.app/tools/invoke \
  -H 'Content-Type: application/json' \
  -d '{"action":"invoke","tool_name":"arm_home","parameters":{}}'

# capture a frame — raw PNG bytes (bypasses Join39's 2000-char text cap)
curl -s https://mellow-miracle-production-b572.up.railway.app/camera/latest.png \
  --output frame.png
open frame.png

# or as JSON (untruncated) via invoke_raw
curl -s -X POST https://mellow-miracle-production-b572.up.railway.app/tools/invoke_raw \
  -H 'Content-Type: application/json' \
  -d '{"tool_name":"brio_100_capture_image","parameters":{}}'
02  /  MCP CLIENT

Claude Desktop, Inspector, any MCP client

Connect directly over the MCP streamable-HTTP transport. You get the full tool list with rich schemas and your agent calls them natively.

Live tunnel URL (auto-updates from the Pi); substitute it for LIVE_URL in the configs below.

(click a button to copy a ready-to-paste config)
what you're pasting
// Claude Desktop → Settings → Connectors → Add custom
{ "name": "octopus-pi", "url": "LIVE_URL/mcp" }

// OpenClaw / any streamable-http MCP config
{ "transport": "streamable-http", "url": "LIVE_URL/mcp" }

// Local MCP Inspector (requires Node)
npx @modelcontextprotocol/inspector LIVE_URL/mcp
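
The same connection is scriptable from Python. A minimal sketch, assuming the official MCP Python SDK (pip install mcp) and its streamable HTTP client; paste the live tunnel URL in place of LIVE_URL.

import asyncio
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # LIVE_URL is the rotating tunnel URL shown above
    async with streamablehttp_client("LIVE_URL/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()        # full tool list, rich schemas
            print([t.name for t in tools.tools])
            await session.call_tool("arm_home", {})   # native MCP tool call

asyncio.run(main())
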
03  /  JOIN39 APP STORE

Agents on join39.org

If you're building an agent on Join39, add octopus-hardware to its app list. Your LLM calls the tool during conversation; Join39 forwards the call through our Railway bridge to the Pi.

App: octopus-hardware

Full walkthrough: join39-deep-dive.md

Note: the Cloudflare tunnel URL rotates every time the Pi restarts the tunnel. The Railway bridge URL is stable — prefer lane 01 for classmate demos.

Philosophy

Eight principles, four that matter most.

Every decision traces back to these. The rest are in philosophy.md.

01  /  I = A(S, P)

Prompts as infrastructure

Infrastructure equals Agent acting on Specs for a Platform. The markdown files are the product; the Python is coordination.

02  /  RALPH

Read, Agent, Loop, Pass/Halt

Every stage, every daemon tick, every heal action is the same four-step loop. No other control structures.

03  /  Self-reference

The system looks at itself

Failed stages can edit their own spec. The daemon reads its own log. The camera watches the arm the agent just moved.

04  /  Model = conditional

No vendor-ID switch statements

Tempted to write if device == 0x1a86: ...? Strengthen the spec instead. The model is the conditional.
