Commit 1303a0e

Fix typos and incorrect param order in Javadoc (#1740)
1 parent ab41d2d commit 1303a0e

29 files changed (+46 -46 lines)

docs/source/docs/advanced-installation/sw_install/romi.md

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@ SSH into the Raspberry Pi (using Windows command line, or a tool like [Putty](ht
 :::{attention}
 The version of WPILibPi for the Romi is 2023.2.1, which is not compatible with the current version of PhotonVision. **If you are using WPILibPi 2023.2.1 on your Romi, you must install PhotonVision v2023.4.2 or earlier!**
 
-To install a compatible version of PhotonVision, enter these commands in the SSH terminal connected to the Raspberry Pi. This will download and run the install script, which will intall PhotonVision on your Raspberry Pi and configure it to run at startup.
+To install a compatible version of PhotonVision, enter these commands in the SSH terminal connected to the Raspberry Pi. This will download and run the install script, which will install PhotonVision on your Raspberry Pi and configure it to run at startup.
 
 ```bash
 $ wget https://git.io/JJrEP -O install.sh

docs/source/docs/contributing/design-descriptions/camera-matching.md

Lines changed: 5 additions & 5 deletions
@@ -50,7 +50,7 @@ When a new camera (ie, one we can't match by-path to a deserialized CameraConfig
 
 ## Startup:
 
-- GIVEN An emtpy set of deserialized Camera Configurations
+- GIVEN An empty set of deserialized Camera Configurations
 <br>WHEN PhotonVision starts
 <br>THEN no VisionModules will be started
 
@@ -72,12 +72,12 @@ When a new camera (ie, one we can't match by-path to a deserialized CameraConfig
 
 ## Camera (re)enumeration:
 
-- GIVEN a NEW USB CAMERA is avaliable for enumeration
+- GIVEN a NEW USB CAMERA is available for enumeration
 <br>WHEN a USB camera is discovered by VisionSourceManager
 <br>AND the USB camera's VIDEO DEVICE PATH is not in the set of DESERIALIZED CAMERA CONFIGURATIONS
 <br>THEN a UNIQUE NAME will be assigned to the camera info
 
-- GIVEN a NEW USB CAMERA is avaliable for enumeration
+- GIVEN a NEW USB CAMERA is available for enumeration
 <br>WHEN a USB camera is discovered by VisionSourceManager
 <br>AND the USB camera's VIDEO DEVICE PATH is in the set of DESERIALIZED CAMERA CONFIGURATIONS
 <br>THEN a UNIQUE NAME equal to the matching DESERIALIZED CAMERA CONFIGURATION will be assigned to the camera info
@@ -86,13 +86,13 @@ When a new camera (ie, one we can't match by-path to a deserialized CameraConfig
 ## Creating from a new camera
 
 - Given: A UNIQUE NAME from a NEW USB CAMERA
-<br>WHEN I request a new VisionModule is created for this NEW USB CAMREA
+<br>WHEN I request a new VisionModule is created for this NEW USB CAMERA
 <br>AND the camera has a VALID USB PATH
 <br>AND the camera's VALID USB PATH is not in use by any CURRENTLY ACTIVE CAMERAS
 <br>THEN a NEW VisionModule will be started for the NEW USB CAMERA using the VALID USB PATH
 
 - Given: A UNIQUE NAME from a NEW USB CAMERA
-<br>WHEN I request a new VisionModule is created for this NEW USB CAMREA
+<br>WHEN I request a new VisionModule is created for this NEW USB CAMERA
 <br>AND the camera does not have a VALID USB PATH
 <br>AND the camera's VIDEO DEVICE PATH is not in use by any CURRENTLY ACTIVE CAMERAS
 <br>THEN a NEW VisionModule will be started for the NEW USB CAMERA using the VIDEO DEVICE PATH
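The (re)enumeration scenarios above boil down to a lookup by video device path. The following is an illustrative sketch of that rule only, not the real VisionSourceManager API; the function and container names are hypothetical:

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical sketch of the (re)enumeration rule: a discovered camera whose
// VIDEO DEVICE PATH matches a deserialized configuration inherits that
// configuration's UNIQUE NAME; otherwise it is assigned a freshly generated one.
std::string AssignUniqueName(
    const std::string& videoDevicePath,
    const std::map<std::string, std::string>& savedNameByPath,  // path -> unique name
    const std::string& freshUniqueName) {
  auto it = savedNameByPath.find(videoDevicePath);
  return it != savedNameByPath.end() ? it->second : freshUniqueName;
}
```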

docs/source/docs/contributing/design-descriptions/e2e-latency.md

Lines changed: 5 additions & 5 deletions
@@ -3,7 +3,7 @@
 
 ## A primer on time
 
-Expecially starting around 2022 with AprilTags making localization easier, providing a way to know when a camera image was captured at became more important for localization.
+Especially starting around 2022 with AprilTags making localization easier, providing a way to know when a camera image was captured at became more important for localization.
 Since the [creation of USBFrameProvider](https://github.com/PhotonVision/photonvision/commit/f92bf670ded52b59a00352a4a49c277f01bae305), we used the time [provided by CSCore](https://github.wpilib.org/allwpilib/docs/release/java/edu/wpi/first/cscore/CvSink.html#grabFrame(org.opencv.core.Mat)) to tell when a camera image was captured at, but just keeping track of "CSCore told us frame N was captured 104.21s after the Raspberry Pi turned on" isn't very helpful. We can decompose this into asking:
 
 - At what time was a particular image captured at, in the coprocessor's timebase?
@@ -29,13 +29,13 @@ WPILib's CSCore is a platform-agnostic wrapper around Windows, Linux, and MacOS
 
 Prior to https://github.com/wpilibsuite/allwpilib/pull/7609, CSCore used the [time it dequeued the buffer at](https://github.com/wpilibsuite/allwpilib/blob/17a03514bad6de195639634b3d57d5ac411d601e/cscore/src/main/native/linux/UsbCameraImpl.cpp#L559) as the image capture time. But this doesn't account for exposure time or latency introduced by the camera + USB stack + Linux itself.
 
-V4L does expose (with some [very heavy caviets](https://github.com/torvalds/linux/blob/fc033cf25e612e840e545f8d5ad2edd6ba613ed5/drivers/media/usb/uvc/uvc_video.c#L600) for some troublesome cameras) its best guess at the time an image was captured at via [buffer flags](https://www.kernel.org/doc/html/v4.9/media/uapi/v4l/buffer.html#buffer-flags). In my testing, all my cameras were able to provide timestamps with both these flags set:
+V4L does expose (with some [very heavy caveats](https://github.com/torvalds/linux/blob/fc033cf25e612e840e545f8d5ad2edd6ba613ed5/drivers/media/usb/uvc/uvc_video.c#L600) for some troublesome cameras) its best guess at the time an image was captured at via [buffer flags](https://www.kernel.org/doc/html/v4.9/media/uapi/v4l/buffer.html#buffer-flags). In my testing, all my cameras were able to provide timestamps with both these flags set:
 - `V4L2_BUF_FLAG_TIMESTAMP_MONOTONIC`: The buffer timestamp has been taken from the CLOCK_MONOTONIC clock [...] accessible via `clock_gettime()`.
 - `V4L2_BUF_FLAG_TSTAMP_SRC_SOE`: Start Of Exposure. The buffer timestamp has been taken when the exposure of the frame has begun.
 
 I'm sure that we'll find a camera that doesn't play nice, because we can't have nice things :). But until then, using this timestamp gets us a free accuracy bump.
 
-Other things to note: This gets us an estimate at when the camera *started* collecting photons. The camera's sensor will remain collecitng light for up to the total integration time, plus readout time for rolling shutter cameras.
+Other things to note: This gets us an estimate at when the camera *started* collecting photons. The camera's sensor will remain collecting light for up to the total integration time, plus readout time for rolling shutter cameras.
 
 ## Latency Testing
 
@@ -105,7 +105,7 @@ public class Robot extends TimedRobot {
 ```
 </details>
 
-I've decreased camera exposure as much as possible (so we know with reasonable confidence that the image was collected right at the start of the exposure time reported by V4L), but we only get back new images at 60fps. So we don't know when between frame N and N+1 the LED turned on - just that somtime between now and 1/60th of a second a go, the LED turned on.
+I've decreased camera exposure as much as possible (so we know with reasonable confidence that the image was collected right at the start of the exposure time reported by V4L), but we only get back new images at 60fps. So we don't know when between frame N and N+1 the LED turned on - just that sometime between now and 1/60th of a second a go, the LED turned on.
 
 The test coprocessor was an Orange Pi 5 running a PhotonVision 2025 (Ubuntu 24.04 based) image, with an ArduCam OV9782 at 1280x800, 60fps, MJPG running a reflective pipeline.
 
@@ -133,4 +133,4 @@ With the camera capturing at 60fps, the time between successive frames is only ~
 
 ### Future Work
 
-This test also makes no effort to isolate error from time syncronization from error introduced by frame time measurement - we're just interested in overall error. Future work could investigate the latency contribution
+This test also makes no effort to isolate error from time synchronization from error introduced by frame time measurement - we're just interested in overall error. Future work could investigate the latency contribution
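The buffer-flag check from the V4L section of this file can be sketched as follows. The flag and mask values are copied from the Linux V4L2 UAPI header (`videodev2.h`); the helper function itself is illustrative, not PhotonVision or CSCore code:

```cpp
#include <cassert>
#include <cstdint>

// Flag and mask values copied from the Linux V4L2 UAPI header (videodev2.h).
constexpr uint32_t kTimestampMask      = 0x0000e000;  // V4L2_BUF_FLAG_TIMESTAMP_MASK
constexpr uint32_t kTimestampMonotonic = 0x00002000;  // V4L2_BUF_FLAG_TIMESTAMP_MONOTONIC
constexpr uint32_t kTstampSrcMask      = 0x00070000;  // V4L2_BUF_FLAG_TSTAMP_SRC_MASK
constexpr uint32_t kTstampSrcSoe       = 0x00010000;  // V4L2_BUF_FLAG_TSTAMP_SRC_SOE

// True when a dequeued buffer's timestamp is a start-of-exposure time taken
// from CLOCK_MONOTONIC, i.e. directly comparable to clock_gettime() readings.
bool HasUsableCaptureTimestamp(uint32_t bufferFlags) {
  bool monotonic = (bufferFlags & kTimestampMask) == kTimestampMonotonic;
  bool startOfExposure = (bufferFlags & kTstampSrcMask) == kTstampSrcSoe;
  return monotonic && startOfExposure;
}
```

Both conditions are masked comparisons rather than single-bit tests because each field is a multi-valued enum packed into the flags word (e.g. `TSTAMP_SRC_EOF` is the all-zero value of its field).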

docs/source/docs/contributing/design-descriptions/time-sync.md

Lines changed: 2 additions & 2 deletions
@@ -76,7 +76,7 @@ Communication between server and clients shall occur over the User Datagram Prot
 
 ## Message Format
 
-The message format forgoes CRCs (as these are provided by the Ethernet physical layer) or packet delimination (as our packetsa are assumed be under the network MTU). **TSP Ping** and **TSP Pong** messages shall be encoded in a manor compatible with a WPILib packed struct with respect to byte alignment and endienness.
+The message format forgoes CRCs (as these are provided by the Ethernet physical layer) or packet delineation (as our packets are assumed be under the network MTU). **TSP Ping** and **TSP Pong** messages shall be encoded in a manor compatible with a WPILib packed struct with respect to byte alignment and endianness.
 
 ### TSP Ping
 
@@ -98,7 +98,7 @@ The message format forgoes CRCs (as these are provided by the Ethernet physical
 
 ## Optional Protocol Extensions
 
-Clients may publish statistics to NetworkTables. If they do, they shall publish to a key that is globally unique per participant in the Time Synronization network. If a client implements this, it shall provide the following publishers:
+Clients may publish statistics to NetworkTables. If they do, they shall publish to a key that is globally unique per participant in the Time Synchronization network. If a client implements this, it shall provide the following publishers:
 
 | Key | Type | Notes |
 | ------ | ------ | ---- |
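The packed-struct encoding requirement in the Message Format section can be illustrated with a packed C++ struct. The field names and widths below are hypothetical, chosen only to show what "packed" buys on the wire; the actual TSP Ping layout is defined by the spec itself, not this sketch:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical TSP Ping layout, for illustration only. "Packed" means the
// compiler inserts no padding, so the wire size equals the sum of the field
// sizes; multi-byte fields are additionally sent with a fixed endianness.
struct __attribute__((packed)) TspPing {
  uint8_t version;       // protocol version
  uint8_t message_id;    // distinguishes Ping from Pong
  uint64_t client_time;  // client-timebase timestamp
};

static_assert(sizeof(TspPing) == 1 + 1 + 8, "packed struct must have no padding");
```

Without the packed attribute, a typical compiler would pad this struct to 16 bytes to align `client_time`, and the two sides of the wire could disagree on the layout.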

docs/source/docs/description.md

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@ Using PhotonVision allows the user to calibrate for their specific camera, which
 
 ### Low Latency, High FPS Processing
 
-PhotonVision exposes specalized hardware on select coprocessors to maximize processing speed. This allows for lower-latency detection of targets to ensure you aren't losing out on any performance.
+PhotonVision exposes specialized hardware on select coprocessors to maximize processing speed. This allows for lower-latency detection of targets to ensure you aren't losing out on any performance.
 
 ### Fully Open Source and Active Developer Community

docs/source/docs/examples/aimingatatarget.md

Lines changed: 2 additions & 2 deletions
@@ -5,8 +5,8 @@ The following example is from the PhotonLib example repository ([Java](https://g
 ## Knowledge and Equipment Needed
 
 - A Robot
-- A camera mounted rigidly to the robot's frame, cenetered and pointed forward.
-- A coprocessor running PhotonVision with an AprilTag or Aurco 2D Pipeline.
+- A camera mounted rigidly to the robot's frame, centered and pointed forward.
+- A coprocessor running PhotonVision with an AprilTag or Aruco 2D Pipeline.
 - [A printout of AprilTag 7](https://firstfrc.blob.core.windows.net/frc2025/FieldAssets/Apriltag_Images_and_User_Guide.pdf), mounted on a rigid and flat surface.
 
 ## Code

docs/source/docs/programming/photonlib/using-target-data.md

Lines changed: 1 addition & 1 deletion
@@ -106,7 +106,7 @@ You can get a [translation](https://docs.wpilib.org/en/latest/docs/software/adva
 .. code-block:: C++
 
   // Calculate a translation from the camera to the target.
-  frc::Translation2d translation = photonlib::PhotonUtils::EstimateCameraToTargetTranslationn(
+  frc::Translation2d translation = photonlib::PhotonUtils::EstimateCameraToTargetTranslation(
       distance, frc::Rotation2d(units::degree_t(-target.GetYaw())));
 
 .. code-block:: Python
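For intuition, the translation being computed in the snippet above is a polar-to-Cartesian conversion of the measured range and yaw. A minimal sketch under that assumption, without the frc/photonlib types (the struct and function names here are illustrative, not the library's):

```cpp
#include <cassert>
#include <cmath>

// Plain-types sketch of estimating a camera-to-target translation from a
// range and a yaw angle: x is the forward distance, y the lateral offset.
struct SimpleTranslation2d {
  double x;
  double y;
};

SimpleTranslation2d EstimateTranslation(double distanceMeters, double yawRadians) {
  return {distanceMeters * std::cos(yawRadians),
          distanceMeters * std::sin(yawRadians)};
}
```

A target 2 m away at zero yaw is straight ahead, so the translation is (2, 0); a nonzero yaw rotates that point around the camera.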

docs/source/docs/quick-start/arducam-cameras.md

Lines changed: 1 addition & 1 deletion
@@ -17,6 +17,6 @@ Arducam cameras are supported for setups with multiple devices. This is possible
 3. **Save Settings**: Ensure that you save the settings after selecting the appropriate camera model for each device.
 
 ```{image} images/setArducamModel.png
-:alt: The camera model can be selected from the Arudcam model selector in the cameras tab
+:alt: The camera model can be selected from the Arducam model selector in the cameras tab
 :align: center
 ```

photon-core/src/main/java/org/photonvision/common/configuration/DatabaseSchema.java

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@
 
 /**
  * Add migrations by adding the SQL commands for each migration sequentially to this array. DO NOT
- * edit or delete existing SQL commands. That will lead to producing an icompatible database.
+ * edit or delete existing SQL commands. That will lead to producing an incompatible database.
  *
  * <p>You can use multiple SQL statements in one migration step as long as you separate them with a
  * semicolon (;).

photon-core/src/main/java/org/photonvision/common/hardware/OsImageVersion.java

Lines changed: 1 addition & 1 deletion
@@ -28,7 +28,7 @@
  * Our blessed images inject the current version via this build workflow:
  * https://github.com/PhotonVision/photon-image-modifier/blob/2e5ddb6b599df0be921c12c8dbe7b939ecd7f615/.github/workflows/main.yml#L67
  *
- * <p>This class provides a convienent abstraction around this
+ * <p>This class provides a convenient abstraction around this
  */
 public class OsImageVersion {
     private static final Logger logger = new Logger(OsImageVersion.class, LogGroup.General);
