openclaw-embodied-os
v0.1.0
A unified operating system for controlling embodied intelligent robots - the control hub bridging AI agents and the physical world
Embodied-OS
The Control Hub for Embodied Intelligence - Bridging AI Agents and the Physical World
Embodied-OS is a unified operating system that enables AI agents to seamlessly control physical robots and interact with the real world. It provides a standardized interface for perception, reasoning, and action execution across diverse robotic platforms.
Core Vision
Transform how AI interacts with physical reality by providing:
- Unified Control Interface: Single API for controlling any robot
- AI-Native Design: Built for LLM and agent-based control
- Real-time Perception: Multi-modal sensor fusion (vision, audio, touch)
- Safe Execution: Built-in safety constraints and human oversight
- Plug-and-Play: Easy integration with existing robotic systems
Architecture
┌─────────────────────────────────────────────────────────────┐
│ AI Agent Layer │
│ (Claude, GPT, Custom Agents) │
└────────────────────┬────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────┐
│ Embodied-OS Core │
│ ┌──────────────┐ ┌──────────────┐ ┌─────────────────┐ │
│ │ Natural │ │ Task │ │ Safety │ │
│ │ Language │ │ Planner │ │ Validator │ │
│ │ Interface │ │ │ │ │ │
│ └──────────────┘ └──────────────┘ └─────────────────┘ │
│ ┌──────────────┐ ┌──────────────┐ ┌─────────────────┐ │
│ │ Perception │ │ Action │ │ State │ │
│ │ Module │ │ Executor │ │ Manager │ │
│ │ │ │ │ │ │ │
│ └──────────────┘ └──────────────┘ └─────────────────┘ │
└────────────────────┬────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────┐
│ Robot Abstraction Layer (RAL) │
│ Unified interface for all robot types and platforms │
└────────────────────┬────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────┐
│ Physical Robots │
│ 🦾 Manipulators 🚗 Mobile Robots 🦿 Humanoids 🚁 Drones │
└─────────────────────────────────────────────────────────────┘
Quick Start
Installation
# Clone the repository
git clone https://github.com/ZhenRobotics/openclaw-embodied-os.git
cd openclaw-embodied-os
# Install dependencies
pip install -e .
# Or install from PyPI
pip install openclaw-embodied-os
# Configure API keys (for AI agents)
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
Basic Usage
from embodied_os import EmbodiedOS, RobotConfig
# Initialize the OS
os = EmbodiedOS()
# Connect to a robot
robot = os.connect_robot(
    platform="universal_robot",
    model="UR5e",
    endpoint="192.168.1.100"
)
# Natural language control
os.execute("Pick up the red cube and place it in the box")
# Or programmatic control
result = robot.actions.grasp(
    object_id="red_cube",
    approach_vector=[0, 0, -1]
)
Agent Integration
from embodied_os import AgentInterface
# Create an agent interface
agent = AgentInterface(
    model="claude-sonnet-4",
    robot=robot
)
# The agent can now control the robot through natural language
agent.chat("Look around the room and tell me what you see")
agent.execute("Navigate to the kitchen and open the fridge")
Core Components
1. Robot Abstraction Layer (RAL)
Provides a unified interface across different robot platforms:
- Motion Control: Move, rotate, grasp, release
- Perception: Camera, LiDAR, IMU, force sensors
- State Management: Joint positions, velocities, forces
- Safety: Collision detection, emergency stop, workspace limits
Supported Platforms:
- Universal Robots (UR3e, UR5e, UR10e)
- Franka Emika Panda
- Boston Dynamics Spot
- Custom robots via plugin system
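The README does not document the plugin API itself, but a custom-robot plugin system of this kind typically follows an adapter-registry pattern. The sketch below is illustrative only: `RobotAdapter`, `register_adapter`, and the method names are hypothetical stand-ins, not the actual Embodied-OS interface.

```python
from abc import ABC, abstractmethod

# Hypothetical adapter base; the real RAL interface may differ.
class RobotAdapter(ABC):
    @abstractmethod
    def move_to(self, x: float, y: float, z: float) -> bool: ...

    @abstractmethod
    def get_joint_positions(self) -> list:
        ...

# Registry mapping platform names to adapter classes.
ADAPTER_REGISTRY = {}

def register_adapter(platform: str):
    """Decorator that registers an adapter class under a platform name."""
    def wrap(cls):
        ADAPTER_REGISTRY[platform] = cls
        return cls
    return wrap

@register_adapter("my_custom_arm")
class MyCustomArm(RobotAdapter):
    def __init__(self):
        self._joints = [0.0] * 6

    def move_to(self, x, y, z):
        # A real adapter would run IK and talk to the robot controller here.
        self._joints = [x, y, z, 0.0, 0.0, 0.0]
        return True

    def get_joint_positions(self):
        return list(self._joints)

# The OS core would resolve platform="my_custom_arm" through the registry:
robot = ADAPTER_REGISTRY["my_custom_arm"]()
robot.move_to(0.4, 0.1, 0.3)
print(robot.get_joint_positions()[:3])  # → [0.4, 0.1, 0.3]
```

The registry keeps the core platform-agnostic: supporting a new robot means shipping one adapter class, not modifying the OS.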
2. Perception System
Multi-modal sensor processing:
# Visual perception
objects = robot.perception.detect_objects()
depth_map = robot.perception.get_depth_map()
# Audio perception
audio_stream = robot.perception.listen()
transcription = robot.perception.transcribe(audio_stream)
# Tactile sensing
contact_force = robot.perception.get_contact_force()
3. Action Executor
High-level action primitives:
# Navigation
robot.actions.navigate_to(x=2.0, y=1.5, theta=0)
# Manipulation
robot.actions.pick(object="cup")
robot.actions.place(location="table", position=[0.5, 0.3, 0])
# Interaction
robot.actions.press_button(target="elevator_button")
robot.actions.open_door(handle_position=[1.0, 0.5, 1.0])
4. Task Planner
AI-powered task decomposition:
# High-level task
task = "Prepare coffee for the user"
# Automatic decomposition
plan = robot.planner.create_plan(task)
# Returns:
# 1. Navigate to kitchen
# 2. Locate coffee machine
# 3. Pick up cup
# 4. Place cup under dispenser
# 5. Press brew button
# 6. Wait for brewing
# 7. Pick up cup
# 8. Navigate to user
# 9. Hand over cup
# Execute with monitoring
robot.planner.execute(plan, monitor=True)
5. Safety System
Multi-layer safety guarantees:
# Define safety constraints
robot.safety.set_workspace_bounds(
    x_min=0, x_max=2.0,
    y_min=-1.0, y_max=1.0,
    z_min=0, z_max=1.5
)
# Collision avoidance
robot.safety.enable_collision_avoidance(
    objects=["table", "wall", "human"]
)
# Force limits
robot.safety.set_max_force(newtons=50)
# Emergency stop
robot.safety.set_emergency_stop_callback(on_emergency)
Use Cases
1. Warehouse Automation
warehouse_robot = os.connect_robot(platform="mobile_manipulator")
# Agent-driven inventory management
agent.execute("""
Go to aisle 5, shelf B.
Pick up all items marked with red tags.
Transport them to the packing station.
Report the quantity and item IDs.
""")
2. Elderly Care Assistant
care_robot = os.connect_robot(platform="service_robot")
# Proactive assistance
agent.monitor_and_assist("""
Watch for the person calling for help.
If they ask for water, bring them a glass.
If they drop something, pick it up.
If they seem distressed, alert the caregiver.
""")
3. Research Lab Assistant
lab_robot = os.connect_robot(platform="dual_arm_robot")
# Complex manipulation
agent.execute("""
Set up the chemistry experiment:
1. Measure 50ml of solution A into beaker
2. Heat to 60 degrees Celsius
3. Add 2 drops of catalyst
4. Stir for 2 minutes
5. Transfer to test tube
""")
Features
Agent-First Design
- Natural Language Control: Speak to robots like you speak to humans
- Multi-Agent Coordination: Multiple agents controlling multiple robots
- Context Awareness: Robots understand their environment and task context
- Learning from Feedback: Improve performance based on corrections
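The coordination API is not spelled out in this README, but conceptually multi-agent control reduces to a dispatcher fanning tasks out to per-robot agents. The sketch below uses a mock agent class (hypothetical, standing in for `AgentInterface`) and a naive round-robin policy to show the shape of the idea.

```python
import itertools

class MockAgent:
    """Stand-in for AgentInterface; records tasks instead of executing them."""
    def __init__(self, name: str):
        self.name = name
        self.log = []

    def execute(self, task: str) -> str:
        self.log.append(task)
        return f"{self.name}: done ({task})"

def dispatch(tasks, agents):
    """Round-robin tasks across agents. A real coordinator would also
    weigh robot capabilities, location, and current load."""
    cycle = itertools.cycle(agents)
    return [next(cycle).execute(t) for t in tasks]

agents = [MockAgent("arm-1"), MockAgent("amr-2")]
results = dispatch(
    ["pick red cube", "deliver to packing", "restock shelf B"],
    agents,
)
# arm-1 handles tasks 1 and 3; amr-2 handles task 2
```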
Cross-Platform Support
- Hardware Agnostic: Works with any robot via adapters
- ROS/ROS2 Integration: Seamless integration with ROS ecosystem
- Simulation Support: Test in Gazebo, Isaac Sim, MuJoCo
- Cloud and Edge: Deploy on cloud servers or embedded devices
Developer Friendly
- Python SDK: Intuitive API for rapid development
- TypeScript SDK: Web-based control interfaces
- REST API: HTTP endpoints for any language
- WebSocket: Real-time bidirectional communication
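The exact REST routes are not documented in this README; as a sketch of how an HTTP client in any language might talk to the server, the helper below builds a request for a robot action. The base URL and the `/robots/{id}/actions` route are assumptions, not documented endpoints.

```python
import json

API_BASE = "http://localhost:8080/api/v1"  # hypothetical; check your deployment

def build_action_request(robot_id: str, action: str, **params):
    """Build the URL and JSON body for a robot action call.
    The route shape here is an assumption, not a documented endpoint."""
    url = f"{API_BASE}/robots/{robot_id}/actions"
    body = json.dumps({"action": action, "params": params})
    return url, body

url, body = build_action_request("ur5e-01", "navigate_to", x=2.0, y=1.5, theta=0)
# POST `body` to `url` with any HTTP client, e.g.:
# requests.post(url, data=body, headers={"Content-Type": "application/json"})
```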
Production Ready
- Monitoring Dashboard: Real-time robot status and metrics
- Logging System: Comprehensive action and event logs
- Error Recovery: Automatic retry and fallback mechanisms
- Fleet Management: Control and monitor multiple robots
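The built-in recovery policy is configured inside Embodied-OS itself; as a minimal sketch of what "automatic retry and fallback" means in practice, here is a generic retry-with-backoff helper (illustrative only, not part of the actual API).

```python
import time

def with_retries(fn, attempts=3, base_delay=0.1, fallback=None):
    """Call fn, retrying with exponential backoff; run fallback if all
    attempts fail. Illustrative sketch, not the shipped recovery policy."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                if fallback is not None:
                    return fallback()
                raise
            time.sleep(base_delay * (2 ** i))

# Simulate a grasp that fails twice before succeeding.
calls = {"n": 0}
def flaky_grasp():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("grasp slipped")
    return "grasped"

print(with_retries(flaky_grasp))  # → grasped (after 2 failed attempts)
```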
Installation
Prerequisites
- Python 3.9+
- (Optional) ROS2 Humble or later
- (Optional) CUDA for vision processing
From PyPI
pip install openclaw-embodied-os
From Source
git clone https://github.com/ZhenRobotics/openclaw-embodied-os.git
cd openclaw-embodied-os
pip install -e .
Hardware Setup
Connect your robot to the network and configure the endpoint:
# For Universal Robots
embodied-os connect --platform ur --model ur5e --ip 192.168.1.100
# For custom robots
embodied-os connect --platform custom --config robot_config.yaml
Configuration
Create a configuration file config.yaml:
robot:
  platform: universal_robot
  model: UR5e
  endpoint: 192.168.1.100

perception:
  cameras:
    - name: head_camera
      type: realsense_d435
      resolution: [1280, 720]
      fps: 30
  lidar:
    enabled: true
    type: velodyne_vlp16

safety:
  workspace:
    x: [0, 2.0]
    y: [-1.0, 1.0]
    z: [0, 1.5]
  max_velocity: 0.5  # m/s
  max_force: 50  # N
  collision_check: true

agent:
  model: claude-sonnet-4
  api_key: ${ANTHROPIC_API_KEY}
  temperature: 0.7
  max_retries: 3
API Reference
Core Classes
EmbodiedOS
Main interface to the system.
os = EmbodiedOS(config_path="config.yaml")
robot = os.connect_robot(platform, model, endpoint)
os.disconnect()
Robot
Represents a connected robot.
robot.actions.move_to(x, y, z)
robot.perception.get_image()
robot.state.get_joint_positions()
robot.safety.emergency_stop()
AgentInterface
AI agent control interface.
agent = AgentInterface(model="claude-sonnet-4", robot=robot)
agent.execute(task_description)
agent.chat(message)
Examples
See the examples/ directory for complete working examples:
- examples/basic_control.py - Basic robot control
- examples/agent_navigation.py - Agent-driven navigation
- examples/multi_robot.py - Coordinate multiple robots
- examples/vision_manipulation.py - Vision-guided manipulation
- examples/human_interaction.py - Human-robot interaction
Development
Running Tests
pytest tests/
Building Documentation
cd docs
make html
Contributing
We welcome contributions! Please see CONTRIBUTING.md for guidelines.
Roadmap
Phase 1: Core Platform (Current)
- [x] Robot abstraction layer
- [x] Basic perception system
- [x] Action executor
- [x] Safety system
- [ ] Agent interface
Phase 2: Advanced Features (Q2 2026)
- [ ] Multi-robot coordination
- [ ] Advanced vision processing
- [ ] Learning from demonstration
- [ ] Cloud deployment
Phase 3: Ecosystem (Q3 2026)
- [ ] Skill marketplace
- [ ] Community plugins
- [ ] Simulation environments
- [ ] Mobile app control
Community
- GitHub: https://github.com/ZhenRobotics/openclaw-embodied-os
- Discord: https://discord.gg/embodied-os
- Documentation: https://docs.embodied-os.ai
- Blog: https://blog.embodied-os.ai
License
MIT License - see LICENSE file for details.
Citation
If you use Embodied-OS in your research, please cite:
@software{embodied_os_2026,
  title = {Embodied-OS: A Unified Operating System for Embodied Intelligence},
  author = {ZhenRobotics Team},
  year = {2026},
  url = {https://github.com/ZhenRobotics/openclaw-embodied-os}
}
Acknowledgments
Built with inspiration from:
- ROS (Robot Operating System)
- OpenAI Robotics
- Boston Dynamics AI Institute
- The open-source robotics community
Embodied-OS - Making robots as easy to control as talking to a friend.
