Bring Your Own Hardware

This tutorial will explain how to integrate your own robot design into the LeRobot ecosystem and have it access all of our tools (data collection, control pipelines, policy training and inference).

To that end, we provide the Robot base class in LeRobot, which specifies a standard interface for physical robot integration. Let’s see how to implement it.

Prerequisites

  • Your own robot which exposes a communication interface (e.g. serial, CAN, TCP)
  • A way to read sensor data and send motor commands programmatically, e.g. manufacturer’s SDK or API, or your own protocol implementation.
  • LeRobot installed in your environment. Follow our Installation Guide.

Choose your motors

If you’re using Feetech or Dynamixel motors, LeRobot provides built-in bus interfaces: FeetechMotorsBus and DynamixelMotorsBus.

Please refer to the MotorsBus abstract class to learn about its API. For a good example of how it can be used, have a look at our own SO101 follower implementation.

Use these if compatible. Otherwise, you’ll need to find or write a Python interface (not covered in this tutorial):

  • Find an existing SDK in Python (or use bindings to C/C++)
  • Or implement a basic communication wrapper (e.g., via pyserial, socket, or CANopen)

You’re not alone—many community contributions use custom boards or firmware!
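To give an idea of what such a wrapper can look like, here is a minimal sketch built on pyserial. The plain-text GETPOS/SETPOS protocol and the default baudrate are hypothetical placeholders; replace them with whatever your controller board actually speaks.

import serial


class SerialMotorInterface:
    """Minimal wrapper around a hypothetical serial motor controller."""

    def __init__(self, port: str, baudrate: int = 115200, timeout: float = 0.1):
        self.serial = serial.Serial(port, baudrate, timeout=timeout)

    def read_position(self, motor_id: int) -> float:
        # Hypothetical ASCII protocol: send "GETPOS <id>", receive "<position>"
        self.serial.write(f"GETPOS {motor_id}\n".encode())
        return float(self.serial.readline().decode().strip())

    def write_position(self, motor_id: int, position: float) -> None:
        # Hypothetical ASCII protocol: send "SETPOS <id> <position>"
        self.serial.write(f"SETPOS {motor_id} {position}\n".encode())

    def close(self) -> None:
        self.serial.close()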

For Feetech and Dynamixel, we currently support these servos:

  • Feetech:
    • STS & SMS series (protocol 0): sts3215, sts3250, sm8512bl
    • SCS series (protocol 1): scs0009
  • Dynamixel (protocol 2.0 only): xl330-m077, xl330-m288, xl430-w250, xm430-w350, xm540-w270, xc430-w150

If you are using Feetech or Dynamixel servos that are not in this list, you can add them to the Feetech table or Dynamixel table. Depending on the model, this will require you to add some model-specific information. In most cases though, there shouldn’t be much to add.

In the next sections, we’ll use a FeetechMotorsBus as the motors interface for the examples. Replace it and adapt to your motors if necessary.

Step 1: Subclass the Robot Interface

You’ll first need to specify a config class and a string identifier (name) for your robot. Any setting you’d like to be able to change easily (e.g. port/address, baudrate) should go in the config class.

Here, we’ll add the port name and one camera by default for our robot:

from dataclasses import dataclass, field

from lerobot.common.cameras import CameraConfig
from lerobot.common.cameras.opencv import OpenCVCameraConfig
from lerobot.common.robots import RobotConfig


@RobotConfig.register_subclass("my_cool_robot")
@dataclass
class MyCoolRobotConfig(RobotConfig):
    port: str
    cameras: dict[str, CameraConfig] = field(
        default_factory=lambda: {
            "cam_1": OpenCVCameraConfig(
                index_or_path=2,
                fps=30,
                width=480,
                height=640,
            ),
        }
    )

Have a look at our Cameras tutorial to understand how to detect and add your camera.

Next, we’ll create our actual robot class which inherits from Robot. This abstract class defines a contract you must follow for your robot to be usable with the rest of the LeRobot tools.

Here we’ll create a simple 5-DoF robot with one camera. It could be a simple arm, but notice that the Robot abstract class does not assume anything about your robot’s form factor. You can let your imagination run wild when designing new robots!

from lerobot.common.cameras import make_cameras_from_configs
from lerobot.common.motors import Motor, MotorNormMode
from lerobot.common.motors.feetech import FeetechMotorsBus
from lerobot.common.robots import Robot

class MyCoolRobot(Robot):
    config_class = MyCoolRobotConfig
    name = "my_cool_robot"

    def __init__(self, config: MyCoolRobotConfig):
        super().__init__(config)
        self.bus = FeetechMotorsBus(
            port=self.config.port,
            motors={
                "joint_1": Motor(1, "sts3250", MotorNormMode.RANGE_M100_100),
                "joint_2": Motor(2, "sts3215", MotorNormMode.RANGE_M100_100),
                "joint_3": Motor(3, "sts3215", MotorNormMode.RANGE_M100_100),
                "joint_4": Motor(4, "sts3215", MotorNormMode.RANGE_M100_100),
                "joint_5": Motor(5, "sts3215", MotorNormMode.RANGE_M100_100),
            },
            calibration=self.calibration,
        )
        self.cameras = make_cameras_from_configs(config.cameras)
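
At this point you can already instantiate the robot from its config; the port path below is just a placeholder for your own setup:

config = MyCoolRobotConfig(port="/dev/ttyUSB0")  # replace with your actual port
robot = MyCoolRobot(config)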

Step 2: Define Observation and Action Features

These two properties define the interface contract between your robot and tools that consume it (such as data collection or learning pipelines).

Note that these properties must work even if the robot is not yet connected, so avoid relying on runtime hardware state to define them.

observation_features

This property should return a dictionary describing the structure of sensor outputs from your robot. The keys match what get_observation() returns, and the values describe either the shape (for arrays/images) or the type (for simple values).

Example for our 5-DoF arm with one camera:

@property
def _motors_ft(self) -> dict[str, type]:
    return {
        "joint_1.pos": float,
        "joint_2.pos": float,
        "joint_3.pos": float,
        "joint_4.pos": float,
        "joint_5.pos": float,
    }

@property
def _cameras_ft(self) -> dict[str, tuple]:
    return {
        cam: (self.cameras[cam].height, self.cameras[cam].width, 3) for cam in self.cameras
    }

@property
def observation_features(self) -> dict:
    return {**self._motors_ft, **self._cameras_ft}

In this case, observations consist of a simple dict storing each motor’s position and a camera image.
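
Concretely, with the default config above, observation_features evaluates to something like the following (the camera entry is height, width, channels):

{
    "joint_1.pos": float,
    "joint_2.pos": float,
    "joint_3.pos": float,
    "joint_4.pos": float,
    "joint_5.pos": float,
    "cam_1": (640, 480, 3),
}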

action_features

This property describes the commands your robot expects via send_action(). Again, keys must match the expected input format, and values define the shape/type of each command.

Here, we simply reuse the joints’ proprioceptive features (self._motors_ft) from observation_features: the action sent will simply be the goal position for each motor.

@property
def action_features(self) -> dict:
    return self._motors_ft

Step 3: Handle Connection and Disconnection

These methods should handle opening and closing communication with your hardware (e.g. serial ports, CAN interfaces, USB devices, cameras).

is_connected

This property should simply reflect that communication with the robot’s hardware is established. When this property is True, it should be possible to read and write to the hardware using get_observation() and send_action().

@property
def is_connected(self) -> bool:
    return self.bus.is_connected and all(cam.is_connected for cam in self.cameras.values())

connect()

This method should establish communication with the hardware. Moreover, if your robot needs calibration and is not yet calibrated, it should start a calibration procedure by default. Any robot-specific configuration should also be applied here (via configure()).

def connect(self, calibrate: bool = True) -> None:
    self.bus.connect()
    if not self.is_calibrated and calibrate:
        self.calibrate()

    for cam in self.cameras.values():
        cam.connect()

    self.configure()

disconnect()

This method should gracefully terminate communication with the hardware: free any related resources (threads or processes), close ports, etc.

Here, we already handle this in our MotorsBus and Camera classes so we just need to call their own disconnect() methods:

def disconnect(self) -> None:
    self.bus.disconnect()
    for cam in self.cameras.values():
        cam.disconnect()

Step 4: Support Calibration and Configuration

LeRobot supports saving and loading calibration data automatically. This is useful for joint offsets, zero positions, or sensor alignment.

Note that depending on your hardware, this may not apply. If that’s the case, you can simply leave these methods as no-ops:

@property
def is_calibrated(self) -> bool:
    return True

def calibrate(self) -> None:
    pass

is_calibrated

This should reflect whether your robot has the required calibration loaded.

@property
def is_calibrated(self) -> bool:
    return self.bus.is_calibrated

calibrate()

The goal of the calibration is twofold:

  • Know the physical range of motion of each motor in order to only send commands within this range.
  • Normalize raw motor positions to sensible continuous values (e.g. percentages, degrees) instead of arbitrary discrete values that depend on the specific motor used and won’t replicate elsewhere.

It should implement the logic for calibration (if relevant) and update the self.calibration dictionary. If you are using Feetech or Dynamixel motors, our bus interfaces already include methods to help with this.
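
# Note: the snippet below assumes these imports:
#   from lerobot.common.motors import MotorCalibration
#   from lerobot.common.motors.feetech import OperatingMode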

def calibrate(self) -> None:
    self.bus.disable_torque()
    for motor in self.bus.motors:
        self.bus.write("Operating_Mode", motor, OperatingMode.POSITION.value)

    input(f"Move {self} to the middle of its range of motion and press ENTER....")
    homing_offsets = self.bus.set_half_turn_homings()

    print(
        "Move all joints sequentially through their entire ranges "
        "of motion.\nRecording positions. Press ENTER to stop..."
    )
    range_mins, range_maxes = self.bus.record_ranges_of_motion()

    self.calibration = {}
    for motor, m in self.bus.motors.items():
        self.calibration[motor] = MotorCalibration(
            id=m.id,
            drive_mode=0,
            homing_offset=homing_offsets[motor],
            range_min=range_mins[motor],
            range_max=range_maxes[motor],
        )

    self.bus.write_calibration(self.calibration)
    self._save_calibration()
    print("Calibration saved to", self.calibration_fpath)

configure()

Use this to set up any configuration for your hardware (servos control modes, controller gains, etc.). This should usually be run at connection time and be idempotent.

def configure(self) -> None:
    with self.bus.torque_disabled():
        self.bus.configure_motors()
        for motor in self.bus.motors:
            self.bus.write("Operating_Mode", motor, OperatingMode.POSITION.value)
            self.bus.write("P_Coefficient", motor, 16)
            self.bus.write("I_Coefficient", motor, 0)
            self.bus.write("D_Coefficient", motor, 32)

Step 5: Implement Sensors Reading and Action Sending

These are the most important runtime functions: the core I/O loop.

get_observation()

Returns a dictionary of sensor values from the robot. These typically include motor states, camera frames, various sensors, etc. In the LeRobot framework, these observations are what will be fed to a policy in order to predict the actions to take. The dictionary keys and structure must match observation_features.

def get_observation(self) -> dict[str, Any]:
    if not self.is_connected:
        raise ConnectionError(f"{self} is not connected.")

    # Read arm position
    obs_dict = self.bus.sync_read("Present_Position")
    obs_dict = {f"{motor}.pos": val for motor, val in obs_dict.items()}

    # Capture images from cameras
    for cam_key, cam in self.cameras.items():
        obs_dict[cam_key] = cam.async_read()

    return obs_dict

send_action()

Takes a dictionary that matches action_features, and sends it to your hardware. You can add safety limits (clipping, smoothing) and return what was actually sent.

For simplicity, we won’t modify the actions in this example.

def send_action(self, action: dict[str, Any]) -> dict[str, Any]:
    goal_pos = {key.removesuffix(".pos"): val for key, val in action.items()}

    # Send goal position to the arm
    self.bus.sync_write("Goal_Position", goal_pos)

    return action
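
If you do want to enforce safety limits, one simple option is to clip the goal positions before writing them. Here is a sketch of that variant, assuming the motors use the normalized RANGE_M100_100 mode from our config (values between -100 and 100):

def send_action(self, action: dict[str, Any]) -> dict[str, Any]:
    goal_pos = {key.removesuffix(".pos"): val for key, val in action.items()}

    # Clip each goal to the normalized range used by the motors
    # (RANGE_M100_100 in this example, i.e. [-100, 100]).
    goal_pos = {motor: max(-100.0, min(100.0, val)) for motor, val in goal_pos.items()}

    self.bus.sync_write("Goal_Position", goal_pos)

    # Return what was actually sent, in the action_features format
    return {f"{motor}.pos": val for motor, val in goal_pos.items()}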

Adding a Teleoperator

For implementing teleoperation devices, we also provide a Teleoperator base class. This class is very similar to the Robot base class and likewise doesn’t assume anything about form factor.

The main differences are in the I/O functions: a teleoperator produces actions via get_action and can receive feedback actions via send_feedback. Feedback could be anything controllable on the teleoperation device that helps the person operating it understand the consequences of the actions sent: think motion/force feedback on a leader arm, or vibrations on a gamepad controller, for example. To implement a teleoperator, you can follow this same tutorial and adapt it for these two methods.
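
To give a sense of the structure, here is a rough outline of a teleoperator class. The import path assumes the same lerobot.common.* layout used above, the class name is a placeholder, and everything except the two I/O methods follows the same pattern as the Robot implementation:

from typing import Any

from lerobot.common.teleoperators import Teleoperator


class MyCoolLeaderArm(Teleoperator):
    # config_class, name, connect(), calibrate(), configure(), disconnect(), etc.
    # follow the same pattern as the Robot implementation above.

    def get_action(self) -> dict[str, Any]:
        # Read the device state (e.g. leader-arm joint positions) and return it
        # in the same format as the robot's action_features.
        ...

    def send_feedback(self, feedback: dict[str, Any]) -> None:
        # Optionally actuate the device (e.g. force feedback, vibration).
        ...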

Wrapping Up

Once your robot class is complete, you can leverage the LeRobot ecosystem:

  • Control your robot with the available teleoperators or directly integrate your own teleoperation device
  • Record training data and visualize it
  • Integrate it into RL or imitation learning pipelines

Don’t hesitate to reach out to the community for help on our Discord 🤗
