
Commit 5d99c44

aliberts authored and JIy3AHKO committed

Add typos checks (huggingface#770)

1 parent 75cdceb commit 5d99c44


47 files changed: +114 / -82 lines changed

.github/workflows/quality.yml

Lines changed: 12 additions & 0 deletions

@@ -42,3 +42,15 @@ jobs:
 
       - name: Ruff format
         run: ruff format --diff
+
+  typos:
+    name: Typos
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout Repository
+        uses: actions/checkout@v4
+        with:
+          persist-credentials: false
+
+      - name: typos-action
+        uses: crate-ci/[email protected]
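The job above runs crate-ci's `typos` checker over the whole repository on each push. When the checker flags a false positive, `typos` reads an optional `_typos.toml` at the repository root; this commit does not add one, but a minimal hypothetical sketch (the words and globs below are purely illustrative) would look like:

```toml
# Hypothetical _typos.toml -- not part of this commit.

[default.extend-words]
# Map a flagged word to itself to declare it valid.
ser = "ser"

[files]
# Skip paths that should not be spell-checked.
extend-exclude = ["media/*", "*.ipynb"]
```

Running `typos --force-exclude` locally reproduces the CI check.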

.pre-commit-config.yaml

Lines changed: 5 additions & 0 deletions

@@ -13,6 +13,11 @@ repos:
       - id: check-toml
       - id: end-of-file-fixer
       - id: trailing-whitespace
+  - repo: https://github.com/crate-ci/typos
+    rev: v1.29.10
+    hooks:
+      - id: typos
+        args: [--force-exclude]
   - repo: https://github.com/asottile/pyupgrade
     rev: v3.19.1
    hooks:

CONTRIBUTING.md

Lines changed: 1 addition & 1 deletion

@@ -228,7 +228,7 @@ Follow these steps to start contributing:
 git commit
 ```
 
-Note, if you already commited some changes that have a wrong formatting, you can use:
+Note, if you already committed some changes that have a wrong formatting, you can use:
 ```bash
 pre-commit run --all-files
 ```

benchmarks/video/README.md

Lines changed: 1 addition & 1 deletion

@@ -114,7 +114,7 @@ We tried to measure the most impactful parameters for both encoding and decoding
 
 Additional encoding parameters exist that are not included in this benchmark. In particular:
 - `-preset` which allows for selecting encoding presets. This represents a collection of options that will provide a certain encoding speed to compression ratio. By leaving this parameter unspecified, it is considered to be `medium` for libx264 and libx265 and `8` for libsvtav1.
-- `-tune` which allows to optimize the encoding for certains aspects (e.g. film quality, fast decoding, etc.).
+- `-tune` which allows to optimize the encoding for certain aspects (e.g. film quality, fast decoding, etc.).
 
 See the documentation mentioned above for more detailed info on these settings and for a more comprehensive list of other parameters.
 

examples/11_use_lekiwi.md

Lines changed: 2 additions & 2 deletions

@@ -185,7 +185,7 @@ sudo chmod 666 /dev/ttyACM1
 
 #### d. Update config file
 
-IMPORTANTLY: Now that you have your ports of leader and follower arm and ip adress of the mobile-so100, update the **ip** in Network configuration, **port** in leader_arms and **port** in lekiwi. In the [`LeKiwiRobotConfig`](../lerobot/common/robot_devices/robots/configs.py) file. Where you will find something like:
+IMPORTANTLY: Now that you have your ports of leader and follower arm and ip address of the mobile-so100, update the **ip** in Network configuration, **port** in leader_arms and **port** in lekiwi. In the [`LeKiwiRobotConfig`](../lerobot/common/robot_devices/robots/configs.py) file. Where you will find something like:
 ```python
 @RobotConfig.register_subclass("lekiwi")
 @dataclass
@@ -324,7 +324,7 @@ You should see on your laptop something like this: ```[INFO] Connected to remote
 | F | Decrease speed |
 
 > [!TIP]
-> If you use a different keyboard you can change the keys for each commmand in the [`LeKiwiRobotConfig`](../lerobot/common/robot_devices/robots/configs.py).
+> If you use a different keyboard you can change the keys for each command in the [`LeKiwiRobotConfig`](../lerobot/common/robot_devices/robots/configs.py).
 
 ## Troubleshoot communication
 

examples/11_use_moss.md

Lines changed: 1 addition & 1 deletion

@@ -2,7 +2,7 @@ This tutorial explains how to use [Moss v1](https://github.com/jess-moss/moss-ro
 
 ## Source the parts
 
-Follow this [README](https://github.com/jess-moss/moss-robot-arms). It contains the bill of materials, with link to source the parts, as well as the instructions to 3D print the parts, and advices if it's your first time printing or if you don't own a 3D printer already.
+Follow this [README](https://github.com/jess-moss/moss-robot-arms). It contains the bill of materials with link to source the parts, as well as the instructions to 3D print the parts and advice if it's your first time printing or if you don't own a 3D printer already.
 
 **Important**: Before assembling, you will first need to configure your motors. To this end, we provide a nice script, so let's first install LeRobot. After configuration, we will also guide you through assembly.
 
examples/7_get_started_with_real_robot.md

Lines changed: 5 additions & 5 deletions

@@ -398,7 +398,7 @@ And here are the corresponding positions for the leader arm:
 
 You can watch a [video tutorial of the calibration procedure](https://youtu.be/8drnU9uRY24) for more details.
 
-During calibration, we count the number of full 360-degree rotations your motors have made since they were first used. That's why we ask yo to move to this arbitrary "zero" position. We don't actually "set" the zero position, so you don't need to be accurate. After calculating these "offsets" to shift the motor values around 0, we need to assess the rotation direction of each motor, which might differ. That's why we ask you to rotate all motors to roughly 90 degrees, to mesure if the values changed negatively or positively.
+During calibration, we count the number of full 360-degree rotations your motors have made since they were first used. That's why we ask yo to move to this arbitrary "zero" position. We don't actually "set" the zero position, so you don't need to be accurate. After calculating these "offsets" to shift the motor values around 0, we need to assess the rotation direction of each motor, which might differ. That's why we ask you to rotate all motors to roughly 90 degrees, to measure if the values changed negatively or positively.
 
 Finally, the rest position ensures that the follower and leader arms are roughly aligned after calibration, preventing sudden movements that could damage the motors when starting teleoperation.
 
@@ -663,7 +663,7 @@ camera.disconnect()
 
 **Instantiate your robot with cameras**
 
-Additionaly, you can set up your robot to work with your cameras.
+Additionally, you can set up your robot to work with your cameras.
 
 Modify the following Python code with the appropriate camera names and configurations:
 ```python
@@ -825,8 +825,8 @@ It contains:
 - `dtRlead: 5.06 (197.5hz)` which is the delta time of reading the present position of the leader arm.
 - `dtWfoll: 0.25 (3963.7hz)` which is the delta time of writing the goal position on the follower arm ; writing is asynchronous so it takes less time than reading.
 - `dtRfoll: 6.22 (160.7hz)` which is the delta time of reading the present position on the follower arm.
-- `dtRlaptop:32.57 (30.7hz) ` which is the delta time of capturing an image from the laptop camera in the thread running asynchrously.
-- `dtRphone:33.84 (29.5hz)` which is the delta time of capturing an image from the phone camera in the thread running asynchrously.
+- `dtRlaptop:32.57 (30.7hz) ` which is the delta time of capturing an image from the laptop camera in the thread running asynchronously.
+- `dtRphone:33.84 (29.5hz)` which is the delta time of capturing an image from the phone camera in the thread running asynchronously.
 
 Troubleshooting:
 - On Linux, if you encounter a hanging issue when using cameras, uninstall opencv and re-install it with conda:
@@ -846,7 +846,7 @@ At the end of data recording, your dataset will be uploaded on your Hugging Face
 echo https://huggingface.co/datasets/${HF_USER}/koch_test
 ```
 
-### b. Advices for recording dataset
+### b. Advice for recording dataset
 
 Once you're comfortable with data recording, it's time to create a larger dataset for training. A good starting task is grasping an object at different locations and placing it in a bin. We suggest recording at least 50 episodes, with 10 episodes per location. Keep the cameras fixed and maintain consistent grasping behavior throughout the recordings.
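The `dtR…`/`dtW…` log entries quoted in the hunk above pair a per-step delta time in milliseconds with the equivalent loop frequency in hertz. A minimal sketch of that relationship (the helper name is ours, not LeRobot's):

```python
def freq_hz(dt_ms: float) -> float:
    """Convert a per-step delta time in milliseconds to a frequency in hertz."""
    return 1000.0 / dt_ms

# A 32.57 ms camera read corresponds to roughly 30.7 Hz,
# matching the `dtRlaptop:32.57 (30.7hz)` line quoted above.
print(round(freq_hz(32.57), 1))
```

The logged frequencies are computed from higher-precision timings, so small rounding differences (e.g. `0.25 ms` shown as `3963.7hz` rather than an exact 4000 Hz) are expected.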

examples/8_use_stretch.md

Lines changed: 1 addition & 1 deletion

@@ -98,7 +98,7 @@ python lerobot/scripts/control_robot.py \
 ```
 This is equivalent to running `stretch_robot_home.py`
 
-> **Note:** If you run any of the LeRobot scripts below and Stretch is not poperly homed, it will automatically home/calibrate first.
+> **Note:** If you run any of the LeRobot scripts below and Stretch is not properly homed, it will automatically home/calibrate first.
 
 **Teleoperate**
 Before trying teleoperation, you need activate the gamepad controller by pressing the middle button. For more info, see Stretch's [doc](https://docs.hello-robot.com/0.3/getting_started/hello_robot/#gamepad-teleoperation).

examples/9_use_aloha.md

Lines changed: 2 additions & 2 deletions

@@ -172,10 +172,10 @@ python lerobot/scripts/control_robot.py \
 As you can see, it's almost the same command as previously used to record your training dataset. Two things changed:
 1. There is an additional `--control.policy.path` argument which indicates the path to your policy checkpoint with (e.g. `outputs/train/eval_act_aloha_test/checkpoints/last/pretrained_model`). You can also use the model repository if you uploaded a model checkpoint to the hub (e.g. `${HF_USER}/act_aloha_test`).
 2. The name of dataset begins by `eval` to reflect that you are running inference (e.g. `${HF_USER}/eval_act_aloha_test`).
-3. We use `--control.num_image_writer_processes=1` instead of the default value (`0`). On our computer, using a dedicated process to write images from the 4 cameras on disk allows to reach constent 30 fps during inference. Feel free to explore different values for `--control.num_image_writer_processes`.
+3. We use `--control.num_image_writer_processes=1` instead of the default value (`0`). On our computer, using a dedicated process to write images from the 4 cameras on disk allows to reach constant 30 fps during inference. Feel free to explore different values for `--control.num_image_writer_processes`.
 
 ## More
 
-Follow this [previous tutorial](https://github.com/huggingface/lerobot/blob/main/examples/7_get_started_with_real_robot.md#4-train-a-policy-on-your-data) for a more in-depth explaination.
+Follow this [previous tutorial](https://github.com/huggingface/lerobot/blob/main/examples/7_get_started_with_real_robot.md#4-train-a-policy-on-your-data) for a more in-depth explanation.
 
 If you have any question or need help, please reach out on Discord in the channel `#aloha-arm`.

lerobot/common/datasets/compute_stats.py

Lines changed: 1 addition & 1 deletion

@@ -92,7 +92,7 @@ def compute_episode_stats(episode_data: dict[str, list[str] | np.ndarray], featu
         axes_to_reduce = (0, 2, 3)  # keep channel dim
         keepdims = True
     else:
-        ep_ft_array = data  # data is alreay a np.ndarray
+        ep_ft_array = data  # data is already a np.ndarray
         axes_to_reduce = 0  # compute stats over the first axis
         keepdims = data.ndim == 1  # keep as np.array
 
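The hunk above only fixes a comment, but the surrounding logic is worth illustrating: image features are reduced over the batch and spatial axes `(0, 2, 3)` so that per-channel statistics survive, while other features are reduced over the first axis only. A standalone numpy sketch of that pattern (array names and shapes are ours, simplified from the real `compute_episode_stats`):

```python
import numpy as np

# A fake episode: 10 frames of 3-channel 4x5 images, and 10 six-dim states.
images = np.random.rand(10, 3, 4, 5)
states = np.random.rand(10, 6)

# Image features: reduce over batch + spatial axes, keep the channel dim.
img_mean = images.mean(axis=(0, 2, 3), keepdims=True)  # shape (1, 3, 1, 1)

# Non-image features: reduce over the first axis; keepdims only for 1-D data.
state_mean = states.mean(axis=0, keepdims=states.ndim == 1)  # shape (6,)

print(img_mean.shape, state_mean.shape)
```

Keeping `keepdims=True` for images means the resulting stats broadcast directly against `(N, C, H, W)` batches during normalization.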
