chore(docs): prioritize use of entry points in docs + fix nightly badge (#1692)

* chore(docs): fix typo in nightly badge

* chore(docs): prioritize the use of entrypoints for consistency
Author: Steven Palma
Date: 2025-08-07 14:25:44 +02:00
Committed by: GitHub
Parent: c66cd40176
Commit: ce3b9f627e
31 changed files with 105 additions and 107 deletions
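
The change applies one mechanical renaming across the docs: `python -m lerobot.<module>` invocations become the `lerobot-<command>` console-script entry points that ship with the package. A minimal sketch of the pattern, reusing a command and port value that appear in the hunks below:

```bash
# Old form used in the docs before this change:
python -m lerobot.calibrate \
    --robot.type=so101_follower \
    --robot.port=/dev/tty.usbmodem58760431551

# Equivalent entry-point form preferred after this change:
lerobot-calibrate \
    --robot.type=so101_follower \
    --robot.port=/dev/tty.usbmodem58760431551
```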

View File

@@ -9,7 +9,7 @@ To instantiate a camera, you need a camera identifier. This identifier might cha
To find the camera indices of the cameras plugged into your system, run the following script:
```bash
-python -m lerobot.find_cameras opencv # or realsense for Intel Realsense cameras
+lerobot-find-cameras opencv # or realsense for Intel Realsense cameras
```
The output will look something like this if you have two cameras connected:

View File

@@ -412,7 +412,7 @@ Example configuration for training the [reward classifier](https://huggingface.c
To train the classifier, use the `train.py` script with your configuration:
```bash
-python -m lerobot.scripts.train --config_path path/to/reward_classifier_train_config.json
+lerobot-train --config_path path/to/reward_classifier_train_config.json
```
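
If a classifier training run is interrupted, the same entry point can resume it by pointing `--config_path` at the train config saved inside the checkpoint, mirroring the resume pattern shown later in these docs; the output directory below is illustrative:

```bash
# Resume from the last saved checkpoint (illustrative output path)
lerobot-train \
    --config_path=outputs/train/reward_classifier/checkpoints/last/pretrained_model/train_config.json \
    --resume=true
```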
**Deploying and Testing the Model**
@@ -458,7 +458,7 @@ The reward classifier will automatically provide rewards based on the visual inp
3. **Train the classifier**:
```bash
-python -m lerobot.scripts.train --config_path src/lerobot/configs/reward_classifier_train_config.json
+lerobot-train --config_path src/lerobot/configs/reward_classifier_train_config.json
```
4. **Test the classifier**:

View File

@@ -19,7 +19,7 @@ pip install -e ".[hopejr]"
Before starting calibration and operation, you need to identify the USB ports for each HopeJR component. Run this script to find the USB ports for the arm, hand, glove, and exoskeleton:
```bash
-python -m lerobot.find_port
+lerobot-find-port
```
This will display the available USB ports and their associated devices. Make note of the port paths (e.g., `/dev/tty.usbmodem58760433331`, `/dev/tty.usbmodem11301`) as you'll need to specify them in the `--robot.port` and `--teleop.port` parameters when recording data, replaying episodes, or running teleoperation scripts.
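
For example, once a port has been identified it is passed straight to the matching `--robot.port` or `--teleop.port` flag; this sketch simply reuses the hand-calibration values shown in the next section:

```bash
# Port found above goes to --robot.port (values from the calibration section below)
lerobot-calibrate \
    --robot.type=hope_jr_hand \
    --robot.port=/dev/tty.usbmodem58760432281 \
    --robot.id=blue
```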
@@ -31,7 +31,7 @@ Before performing teleoperation, HopeJR's limbs need to be calibrated. Calibrati
### 1.1 Calibrate Robot Hand
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--robot.type=hope_jr_hand \
--robot.port=/dev/tty.usbmodem58760432281 \
--robot.id=blue \
@@ -81,7 +81,7 @@ Once you have set the appropriate boundaries for all joints, click "Save" to sav
### 1.2 Calibrate Teleoperator Glove
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--teleop.type=homunculus_glove \
--teleop.port=/dev/tty.usbmodem11201 \
--teleop.id=red \
@@ -120,7 +120,7 @@ Once calibration is complete, the system will save the calibration to `/Users/yo
### 1.3 Calibrate Robot Arm
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--robot.type=hope_jr_arm \
--robot.port=/dev/tty.usbserial-1110 \
--robot.id=white
@@ -146,7 +146,7 @@ Use the calibration interface to set the range boundaries for each joint. Move e
### 1.4 Calibrate Teleoperator Exoskeleton
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--teleop.type=homunculus_arm \
--teleop.port=/dev/tty.usbmodem11201 \
--teleop.id=black
@@ -178,7 +178,7 @@ Due to global variable conflicts in the Feetech middleware, teleoperation for ar
### Hand
```bash
-python -m lerobot.teleoperate \
+lerobot-teleoperate \
--robot.type=hope_jr_hand \
--robot.port=/dev/tty.usbmodem58760432281 \
--robot.id=blue \
@@ -194,7 +194,7 @@ python -m lerobot.teleoperate \
### Arm
```bash
-python -m lerobot.teleoperate \
+lerobot-teleoperate \
--robot.type=hope_jr_arm \
--robot.port=/dev/tty.usbserial-1110 \
--robot.id=white \
@@ -214,7 +214,7 @@ Record, Replay and Train with Hope-JR is still experimental.
This step records the dataset, which can be seen as an example [here](https://huggingface.co/datasets/nepyope/hand_record_test_with_video_data/settings).
```bash
-python -m lerobot.record \
+lerobot-record \
--robot.type=hope_jr_hand \
--robot.port=/dev/tty.usbmodem58760432281 \
--robot.id=right \
@@ -236,7 +236,7 @@ python -m lerobot.record \
### Replay
```bash
-python -m lerobot.replay \
+lerobot-replay \
--robot.type=hope_jr_hand \
--robot.port=/dev/tty.usbmodem58760432281 \
--robot.id=right \
@@ -248,7 +248,7 @@ python -m lerobot.replay \
### Train
```bash
-python -m lerobot.scripts.train \
+lerobot-train \
--dataset.repo_id=nepyope/hand_record_test_with_video_data \
--policy.type=act \
--output_dir=outputs/train/hopejr_hand \
@@ -263,7 +263,7 @@ python -m lerobot.scripts.train \
This training run can be viewed as an example [here](https://wandb.ai/tino/lerobot/runs/rp0k8zvw?nw=nwusertino).
```bash
-python -m lerobot.record \
+lerobot-record \
--robot.type=hope_jr_hand \
--robot.port=/dev/tty.usbmodem58760432281 \
--robot.id=right \

View File

@@ -45,7 +45,7 @@ Note that the `id` associated with a robot is used to store the calibration file
<hfoptions id="teleoperate_so101">
<hfoption id="Command">
```bash
-python -m lerobot.teleoperate \
+lerobot-teleoperate \
--robot.type=so101_follower \
--robot.port=/dev/tty.usbmodem58760431541 \
--robot.id=my_awesome_follower_arm \
@@ -101,7 +101,7 @@ With `rerun`, you can teleoperate again while simultaneously visualizing the cam
<hfoptions id="teleoperate_koch_camera">
<hfoption id="Command">
```bash
-python -m lerobot.teleoperate \
+lerobot-teleoperate \
--robot.type=koch_follower \
--robot.port=/dev/tty.usbmodem58760431541 \
--robot.id=my_awesome_follower_arm \
@@ -174,7 +174,7 @@ Now you can record a dataset. To record 5 episodes and upload your dataset to th
<hfoptions id="record">
<hfoption id="Command">
```bash
-python -m lerobot.record \
+lerobot-record \
--robot.type=so101_follower \
--robot.port=/dev/tty.usbmodem585A0076841 \
--robot.id=my_awesome_follower_arm \
@@ -376,7 +376,7 @@ You can replay the first episode on your robot with either the command below or
<hfoptions id="replay">
<hfoption id="Command">
```bash
-python -m lerobot.replay \
+lerobot-replay \
--robot.type=so101_follower \
--robot.port=/dev/tty.usbmodem58760431541 \
--robot.id=my_awesome_follower_arm \
@@ -428,10 +428,10 @@ Your robot should replicate movements similar to those you recorded. For example
## Train a policy
-To train a policy to control your robot, use the [`python -m lerobot.scripts.train`](https://github.com/huggingface/lerobot/blob/main/src/lerobot/scripts/train.py) script. A few arguments are required. Here is an example command:
+To train a policy to control your robot, use the [`lerobot-train`](https://github.com/huggingface/lerobot/blob/main/src/lerobot/scripts/train.py) script. A few arguments are required. Here is an example command:
```bash
-python -m lerobot.scripts.train \
+lerobot-train \
--dataset.repo_id=${HF_USER}/so101_test \
--policy.type=act \
--output_dir=outputs/train/act_so101_test \
@@ -453,7 +453,7 @@ Training should take several hours. You will find checkpoints in `outputs/train/
To resume training from a checkpoint, below is an example command to resume from `last` checkpoint of the `act_so101_test` policy:
```bash
-python -m lerobot.scripts.train \
+lerobot-train \
--config_path=outputs/train/act_so101_test/checkpoints/last/pretrained_model/train_config.json \
--resume=true
```
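
The same pattern works for any saved checkpoint, not just `last`. Assuming checkpoints are written to step-numbered directories alongside `last`, a specific one can be resumed like this (the step number is illustrative):

```bash
lerobot-train \
    --config_path=outputs/train/act_so101_test/checkpoints/020000/pretrained_model/train_config.json \
    --resume=true
```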
@@ -490,7 +490,7 @@ You can use the `record` script from [`lerobot/record.py`](https://github.com/hu
<hfoptions id="eval">
<hfoption id="Command">
```bash
-python -m lerobot.record \
+lerobot-record \
--robot.type=so100_follower \
--robot.port=/dev/ttyACM1 \
--robot.cameras="{ up: {type: opencv, index_or_path: /dev/video10, width: 640, height: 480, fps: 30}, side: {type: intelrealsense, serial_number_or_name: 233522074606, width: 640, height: 480, fps: 30}}" \

View File

@@ -96,10 +96,10 @@ If you uploaded your dataset to the hub you can [visualize your dataset online](
## Train a policy
-To train a policy to control your robot, use the [`python -m lerobot.scripts.train`](https://github.com/huggingface/lerobot/blob/main/src/lerobot/scripts/train.py) script. A few arguments are required. Here is an example command:
+To train a policy to control your robot, use the [`lerobot-train`](https://github.com/huggingface/lerobot/blob/main/src/lerobot/scripts/train.py) script. A few arguments are required. Here is an example command:
```bash
-python -m lerobot.scripts.train \
+lerobot-train \
--dataset.repo_id=${HF_USER}/il_gym \
--policy.type=act \
--output_dir=outputs/train/il_sim_test \

View File

@@ -31,7 +31,7 @@ pip install -e ".[dynamixel]"
To find the port for each bus servo adapter, run this script:
```bash
-python -m lerobot.find_port
+lerobot-find-port
```
<hfoptions id="example">
@@ -98,7 +98,7 @@ For a visual reference on how to set the motor ids please refer to [this video](
<hfoption id="Command">
```bash
-python -m lerobot.setup_motors \
+lerobot-setup-motors \
--robot.type=koch_follower \
--robot.port=/dev/tty.usbmodem575E0031751 # <- paste here the port found at previous step
```
@@ -174,7 +174,7 @@ Do the same steps for the leader arm but modify the command or script accordingl
<hfoption id="Command">
```bash
-python -m lerobot.setup_motors \
+lerobot-setup-motors \
--teleop.type=koch_leader \
--teleop.port=/dev/tty.usbmodem575E0031751 \ # <- paste here the port found at previous step
```
@@ -211,7 +211,7 @@ Run the following command or API example to calibrate the follower arm:
<hfoption id="Command">
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--robot.type=koch_follower \
--robot.port=/dev/tty.usbmodem58760431551 \ # <- The port of your robot
--robot.id=my_awesome_follower_arm # <- Give the robot a unique name
@@ -249,7 +249,7 @@ Do the same steps to calibrate the leader arm, run the following command or API
<hfoption id="Command">
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--teleop.type=koch_leader \
--teleop.port=/dev/tty.usbmodem58760431551 \ # <- The port of your robot
--teleop.id=my_awesome_leader_arm # <- Give the robot a unique name

View File

@@ -60,7 +60,7 @@ First, we will assemble the two SO100/SO101 arms. One to attach to the mobile ba
To find the port for each bus servo adapter, run this script:
```bash
-python -m lerobot.find_port
+lerobot-find-port
```
<hfoptions id="example">
@@ -116,7 +116,7 @@ The instructions for configuring the motors can be found in the SO101 [docs](./s
You can run this command to setup motors for LeKiwi. It will first setup the motors for arm (id 6..1) and then setup motors for wheels (9,8,7)
```bash
-python -m lerobot.setup_motors \
+lerobot-setup-motors \
--robot.type=lekiwi \
--robot.port=/dev/tty.usbmodem58760431551 # <- paste here the port found at previous step
```
@@ -174,7 +174,7 @@ The calibration process is very important because it allows a neural network tra
Make sure the arm is connected to the Raspberry Pi and run this script or API example (on the Raspberry Pi via SSH) to launch calibration of the follower arm:
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--robot.type=lekiwi \
--robot.id=my_awesome_kiwi # <- Give the robot a unique name
```
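
Putting the SSH note and the command together, a typical session might look like this (the user and hostname are illustrative; the calibrate flags are the ones above):

```bash
ssh pi@lekiwi.local   # illustrative user/hostname for the Raspberry Pi
lerobot-calibrate \
    --robot.type=lekiwi \
    --robot.id=my_awesome_kiwi
```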
@@ -193,7 +193,7 @@ Then, to calibrate the leader arm (which is attached to the laptop/pc). Run the
<hfoption id="Command">
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--teleop.type=so100_leader \
--teleop.port=/dev/tty.usbmodem58760431551 \ # <- The port of your robot
--teleop.id=my_awesome_leader_arm # <- Give the robot a unique name

View File

@@ -54,7 +54,7 @@ If you don't have a gpu device, you can train using our notebook on [![Google Co
Pass your dataset to the training script using `--dataset.repo_id`. If you want to test your installation, run the following command where we use one of the datasets we collected for the [SmolVLA Paper](https://huggingface.co/papers/2506.01844).
```bash
-cd lerobot && python -m lerobot.scripts.train \
+cd lerobot && lerobot-train \
--policy.path=lerobot/smolvla_base \
--dataset.repo_id=${HF_USER}/mydataset \
--batch_size=64 \
@@ -73,7 +73,7 @@ cd lerobot && python -m lerobot.scripts.train \
Fine-tuning is an art. For a complete overview of the options for finetuning, run
```bash
-python -m lerobot.scripts.train --help
+lerobot-train --help
```
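
As a concrete example, any option surfaced by `--help` can be overridden directly on the command line; this sketch just combines flags that already appear elsewhere on this page (the values are illustrative, and `HF_USER` is assumed to hold your Hub username):

```bash
# Override a few common options (illustrative values)
lerobot-train \
    --policy.path=lerobot/smolvla_base \
    --dataset.repo_id=${HF_USER}/mydataset \
    --batch_size=32 \
    --output_dir=outputs/train/smolvla_finetune
```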
<p align="center">
@@ -97,7 +97,7 @@ Similarly for when recording an episode, it is recommended that you are logged i
Once you are logged in, you can run inference in your setup by doing:
```bash
-python -m lerobot.record \
+lerobot-record \
--robot.type=so101_follower \
--robot.port=/dev/ttyACM0 \ # <- Use your port
--robot.id=my_blue_follower_arm \ # <- Use your robot id

View File

@@ -26,7 +26,7 @@ Unlike the SO-101, the motor connectors are not easily accessible once the arm i
To find the port for each bus servo adapter, run this script:
```bash
-python -m lerobot.find_port
+lerobot-find-port
```
<hfoptions id="example">
@@ -93,7 +93,7 @@ For a visual reference on how to set the motor ids please refer to [this video](
<hfoption id="Command">
```bash
-python -m lerobot.setup_motors \
+lerobot-setup-motors \
--robot.type=so100_follower \
--robot.port=/dev/tty.usbmodem585A0076841 # <- paste here the port found at previous step
```
@@ -168,7 +168,7 @@ Do the same steps for the leader arm.
<hfoptions id="setup_motors">
<hfoption id="Command">
```bash
-python -m lerobot.setup_motors \
+lerobot-setup-motors \
--teleop.type=so100_leader \
--teleop.port=/dev/tty.usbmodem575E0031751 # <- paste here the port found at previous step
```
@@ -568,7 +568,7 @@ Run the following command or API example to calibrate the follower arm:
<hfoption id="Command">
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--robot.type=so100_follower \
--robot.port=/dev/tty.usbmodem58760431551 \ # <- The port of your robot
--robot.id=my_awesome_follower_arm # <- Give the robot a unique name
@@ -606,7 +606,7 @@ Do the same steps to calibrate the leader arm, run the following command or API
<hfoption id="Command">
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--teleop.type=so100_leader \
--teleop.port=/dev/tty.usbmodem58760431551 \ # <- The port of your robot
--teleop.id=my_awesome_leader_arm # <- Give the robot a unique name

View File

@@ -162,7 +162,7 @@ It is advisable to install one 3-pin cable in the motor after placing them befor
To find the port for each bus servo adapter, connect MotorBus to your computer via USB and power. Run the following script and disconnect the MotorBus when prompted:
```bash
-python -m lerobot.find_port
+lerobot-find-port
```
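
Since the follower and leader adapters each get their own port, it can help to stash the two reported paths in shell variables and reuse them in the commands that follow (a sketch; the example paths are the ones used later on this page):

```bash
FOLLOWER_PORT=/dev/tty.usbmodem585A0076841   # reported when the follower adapter is unplugged
LEADER_PORT=/dev/tty.usbmodem575E0031751     # reported when the leader adapter is unplugged
lerobot-setup-motors --robot.type=so101_follower --robot.port=${FOLLOWER_PORT}
lerobot-setup-motors --teleop.type=so101_leader --teleop.port=${LEADER_PORT}
```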
<hfoptions id="example">
@@ -240,7 +240,7 @@ Connect the usb cable from your computer and the power supply to the follower ar
<hfoption id="Command">
```bash
-python -m lerobot.setup_motors \
+lerobot-setup-motors \
--robot.type=so101_follower \
--robot.port=/dev/tty.usbmodem585A0076841 # <- paste here the port found at previous step
```
@@ -316,7 +316,7 @@ Do the same steps for the leader arm.
<hfoption id="Command">
```bash
-python -m lerobot.setup_motors \
+lerobot-setup-motors \
--teleop.type=so101_leader \
--teleop.port=/dev/tty.usbmodem575E0031751 # <- paste here the port found at previous step
```
@@ -353,7 +353,7 @@ Run the following command or API example to calibrate the follower arm:
<hfoption id="Command">
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--robot.type=so101_follower \
--robot.port=/dev/tty.usbmodem58760431551 \ # <- The port of your robot
--robot.id=my_awesome_follower_arm # <- Give the robot a unique name
@@ -402,7 +402,7 @@ Do the same steps to calibrate the leader arm, run the following command or API
<hfoption id="Command">
```bash
-python -m lerobot.calibrate \
+lerobot-calibrate \
--teleop.type=so101_leader \
--teleop.port=/dev/tty.usbmodem58760431551 \ # <- The port of your robot
--teleop.id=my_awesome_leader_arm # <- Give the robot a unique name