Initial commit

This commit is contained in:
Ury Zhilinsky
2025-02-03 21:43:26 -08:00
commit 231a1cf7ca
121 changed files with 16349 additions and 0 deletions

docs/docker.md Normal file

@@ -0,0 +1,7 @@
### Docker Setup
All of the examples in this repo include instructions for running them both natively and with Docker. Although not required, the Docker option is recommended: it simplifies software installation, produces a more stable environment, and, for examples that depend on ROS, lets you avoid installing ROS and cluttering your machine.
Docker installation instructions are [here](https://docs.docker.com/engine/install/). If you are using a GPU, you must also install the [NVIDIA container toolkit](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html). If your host machine is Ubuntu 22.04, you can use the convenience scripts `scripts/docker/install_docker_ubuntu22.sh` and `scripts/docker/install_nvidia_container_toolkit.sh`.
During the first run of any example, Docker will build the images. Go grab a coffee while this happens. Subsequent runs will be faster since the images are cached.
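For example, on an Ubuntu 22.04 host with an NVIDIA GPU, the setup could look like the sketch below. The final `nvidia-smi` check is just one common way to confirm the container toolkit is working; it is not part of this repo's scripts.
```bash
# Install Docker and the NVIDIA container toolkit using the convenience scripts.
scripts/docker/install_docker_ubuntu22.sh
scripts/docker/install_nvidia_container_toolkit.sh

# Optional sanity check: a container should be able to see the GPU.
docker run --rm --gpus all ubuntu nvidia-smi
```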

docs/remote_inference.md Normal file

@@ -0,0 +1,42 @@
# Running openpi models remotely
We provide utilities for running openpi models remotely. This is useful for running inference on more powerful GPUs off-robot, and it also helps keep the robot and policy environments separate (e.g., avoiding dependency hell with robot software).
## Starting a remote policy server
To start a remote policy server, you can simply run the following command:
```bash
uv run scripts/serve_policy.py --env=[DROID | ALOHA | LIBERO]
```
The `env` argument specifies which $\pi_0$ checkpoint should be loaded. Under the hood, this script executes a command like the following, which you can also run directly to serve checkpoints you trained yourself (shown here for the DROID environment):
```bash
uv run scripts/serve_policy.py policy:checkpoint --policy.config=pi0_fast_droid --policy.dir=s3://openpi-assets/checkpoints/pi0_fast_droid
```
This starts a policy server that serves the policy specified by the `config` and `dir` arguments, listening on the specified port (default: 8000).
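For instance, to serve a checkpoint you trained yourself, you would point `--policy.dir` at your own checkpoint directory instead of the hosted one. The path below is purely illustrative:
```bash
# Hypothetical local checkpoint directory; substitute the output of your own training run.
uv run scripts/serve_policy.py policy:checkpoint \
  --policy.config=pi0_fast_droid \
  --policy.dir=./checkpoints/pi0_fast_droid/my_experiment/29999
```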
## Querying the remote policy server from your robot code
We provide a client utility with minimal dependencies that you can easily embed into any robot codebase.
First, install the `openpi-client` package in your robot environment:
```bash
cd $OPENPI_ROOT/packages/openpi-client
pip install -e .
```
Then, you can use the client to query the remote policy server from your robot code. Here's an example of how to do this:
```python
from openpi_client import websocket_client_policy
policy_client = websocket_client_policy.WebsocketClientPolicy(host="10.32.255.0", port=8000)
action_chunk = policy_client.infer(example)["actions"]
```
Here, the `host` and `port` arguments specify the IP address and port of the remote policy server; you can pass these as command-line arguments to your robot code or hard-code them. The `example` is a dictionary of observations and the prompt, following the input specification of the policy you are serving. We have concrete examples of how to construct this dictionary for different environments in the [simple client example](examples/simple_client/main.py).
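As a rough sketch of what that dictionary might look like: the keys and array shapes below are purely illustrative, and the real ones are defined by the input specification of the policy you are serving (see the [simple client example](examples/simple_client/main.py)).
```python
import numpy as np

from openpi_client import websocket_client_policy

# Illustrative observation dictionary; key names, image names, and shapes depend on the policy being served.
example = {
    "observation/image": np.zeros((224, 224, 3), dtype=np.uint8),  # placeholder camera frame
    "observation/state": np.zeros((8,), dtype=np.float32),         # placeholder proprioceptive state
    "prompt": "pick up the red block",                             # natural-language task prompt
}

policy_client = websocket_client_policy.WebsocketClientPolicy(host="10.32.255.0", port=8000)
action_chunk = policy_client.infer(example)["actions"]  # chunk of future actions to execute on the robot
```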