diff --git a/docs/hardware/VIVE/focus3_xre.mdx b/docs/hardware/VIVE/focus3_xre.mdx index 17a9600c..70d3a5db 100644 --- a/docs/hardware/VIVE/focus3_xre.mdx +++ b/docs/hardware/VIVE/focus3_xre.mdx @@ -1,4 +1,5 @@ import ReactPlayer from 'react-player' +import {TroubleShootTable, CustomLink, TextColor, EditUrl} from '@site/src/components/Utils.tsx' # Vive Focus 3 / Vive XR Elite @@ -6,14 +7,15 @@ import ReactPlayer from 'react-player' The Vive Focus 3 and XR Elite are standalone VR headsets powered by the Qualcomm XR2, similar to the Quest 2 and Quest Pro headsets from Meta. Although by default having neither eye nor face tracking, the Focus 3 has 2 add-on modules that can be installed to add eye and face tracking capabilities, -and the XRE has a single, combo module that can be installed to add both eye and face tracking capabilities. -There is currently no way for VRCFT to simultaneously extract data from the headset and send eye and face tracking to the Vive standalone version of VRChat, so the following solutions are only for PCVR. +and the XRE has a single combo module that can be installed to add both eye and face tracking capabilities. +While the XRE can send an (extremely) limited set of face tracking parameters to the Vive standalone version of VRChat, this function is unrelated to VRCFaceTracking, and questions/issues regarding this headset feature should be directed to Vive Support. +The following instructions are specifically for **PCVR and VRCFT**. Since September 2023, the Vive PCVR VR streamer programs (Vive Business Streaming and Vive Streamer Hub) have had the ability to control VRCFT avatars in VRChat on their own (by copying the VRCFT program's functions). While users can choose to forgo using VRCFT, we would recommend using VRCFT over the Vive Streamer's built-in OSC function. -As of now, Vive's implementation of VRCFT's functionality is buggy and slow, with some VRCFT avatars completely not working. 
-We will be unable to provide support to users who experience issues with the VRCFT clone in Vive's streamer software. -Note that both VRCFT and the Vive Streamer OSC output cannot be used at the same time. +Even now, Vive's implementation of VRCFT's functionality is buggy, slow, and handles some parameters (notably MouthClosed) completely incorrectly. + +The VRCFT server will be unable to provide support to users who experience issues with the VRCFT clone in Vive's streamer software. ## Setup @@ -23,11 +25,11 @@ There are two PCVR Streaming methods that supports the eye and face tracking fea 2. ALXR The **Vive Streamer** will be more straightforward to set up and use and is recommended for most users. -ALXR on Vive standalone headsets may have VR streaming issues currently (January 2024). +ALXR on Vive standalone headsets will require some user tweaking, and the ALXR remote module doesn't handle eye-openness and gaze correctly yet (August 2024). ### Preliminary Setup -1. Install the eye and/or face tracking modules to the headset. Both modules should come with their own hardware quick-start guides in the box and should generally simply involve a single USB-C port in a nearby location on the headset. +1. Install the eye and/or face tracking modules to the headset. Both modules should come with their own hardware quick-start guides in the box, and installation should generally just involve connecting the module to a single USB-C port on the headset.
Digital Quick Start Guides For Focus 3 Modules @@ -49,22 +51,37 @@ ALXR on Vive standalone headsets may have VR streaming issues currently (January
2. Make sure that you agree to the privacy notices for eye and face tracking after installation, follow the instructions for eye tracking calibration, and have the eye and face tracking options enabled in the headset Input settings. -3. Install VIVE Console onto your computer. We need this for the latest version (**1.3.6.8+**) of [SRanipal](./sranipal.mdx#installing-via-vive-console) + - If you do not see eye/face tracking Input options in your headset settings, try re-seating the connector(s). + + ### Vive Streamer Setup
Vive Streamer Setup -1. Install VIVE Business Streaming or VIVE Streaming Hub onto your computer. They are functionally identical. Traditionally one would use VBS for the Focus 3 and the Streaming Hub for the XR Elite. -2. Update the streaming app on the Focus 3 or XR Elite by plugging the headset into the computer then clicking the Update button in the VIVE Streaming application for "Headset software version". - - Focus 3: you will need to unplug the eye tracking module - - XR Elite: you can use the USB-C port on the top of the battery or the dangling USB-C port if using the XRE without the battery +:::info +As of August 2, 2024, you should opt into the BETA version of the Vive Streaming Hub. The live version at the time of writing has various issues related to eye/face tracking. Make sure to check for updates and update both Vive Hub and the Vive Streamer app on the headset to the beta versions after enabling the Beta toggle. + +
+Vive Streamer Beta option +
+::: + +1. Install VIVE Business Streaming or VIVE (Streaming) Hub onto your computer. They are functionally identical, but the typical cases are VBS for the Focus 3 and Vive Hub for the XR Elite. You can use them interchangably. +2. Update the streaming app to the latest version on the Focus 3 or XR Elite by plugging the headset into the computer then clicking the Update button in the VIVE Streaming application for "Headset software version" (Settings ➜ About ➜ Vive Streaming). + - Focus 3: you will need to unplug the eye tracking module to use the USB-C port on the side of the headset + - XR Elite: you can use the USB-C port on the top of the battery cradle or the dangling USB-C port if using the XRE without the battery
-3. Disable the OSC output from the Vive Streamer. The OSC settings may be accesible from the Streamer application itself in a future update and should be disabled there if the settings exist. + + +3. Make sure that the "Eye and facial tracking data" toggle under "Stream avatar data to VRChat via OSC" is **enabled** in the Vive Hub or VBS application (Settings ➜ Vive Streaming ➜ Input). + +
+ Vive Streaming Eye and Face tracking toggle
-:::warning -If you do not disable the Streamer's output, it can interfere with VRCFT's ability to bind to the port to get messages from VRC, or it will double-send messages to VRC causing a "stuttering" effect. -::: +4. Download and install the **[Vive Streaming Face Tracking Module](https://github.com/ViveSoftware/ViveStreamingFaceTrackingModule)** from Vive. + - Download the latest module .zip from the Releases section found at the right side of the Github page + - Use the "Install Module from Zip" button in the VRCFT Module Registry page + +
-4. Proceed to [Modules](#modules) for the module to use with Vive Streamer. + +
+ Vive Streamer with SRanipal Module Setup + + SRanipal was the original ET/FT method that was available for the Focus 3 headset, and still works for both the Focus 3 and XR Elite. + It offers no obvious improvement over the Vive Streaming Face Tracking module, involves more setup and software, and like all Vive implementations, has its own quirks. + However, it is still better than the built-in output from the Vive Hub software itself... + + 0. Follow the ["Vive Streamer Setup" instructions](#vive-streamer-setup) up until installing the Vive Streaming Face Tracking Module. + 1. Install VIVE Console onto your computer. We need this for the latest version (**1.3.6.8+**) of [SRanipal](./sranipal.mdx#installing-via-vive-console). + - The easiest way is to search for "VIVE Console" in Steam store, and install it through Steam. + - Run Vive Console once to let it complete whatever it needs to install + - You can *completely ignore* Vive Console afterwards, you only need the install for SRanipal, not Vive Console itself + 2. **Disable** the OSC output from the Vive Streamer by unchecking "Eye and facial tracking data" under "Stream avatar data to VRChat via OSC" in the Input tab of the VIVE Streaming Settings of the Vive Hub application. + - Alternatively, you can do this manually by opening `C:\ProgramData\HTC\ViveSoftware\ViveRR\RRServer\serverSetting.setting` and setting the `VOF` key to "**false**". + +
+ Vive Streaming Eye and Face tracking toggle +
+ + 3. Install the **SRanipalTrackingModule** from the VRCFaceTracking module repository. This should open a UAC prompt asking for permission to start the SRanipal runtime (sr_runtime). Make sure to allow it to run. + + :::warning + If you do not disable the Streamer's output, it can double-send messages to VRC in tandem with VRCFT, causing a "stuttering" effect. + :::
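The manual `serverSetting.setting` edit from step 2 can also be scripted. A minimal sketch, assuming the file is JSON-formatted (the format is not documented here, so inspect your own copy and keep a backup first; `disable_vof` is a hypothetical helper name):

```python
import json

def disable_vof(settings_text: str) -> str:
    """Return the settings text with the streamer's OSC output key (VOF) set to "false".

    Assumes the .setting file contains JSON -- verify against your own copy first.
    """
    settings = json.loads(settings_text)
    settings["VOF"] = "false"  # the instructions above specify the value "false"
    return json.dumps(settings, indent=2)
```

Write the result back to the `serverSetting.setting` path given above (with the streamer closed); the Vive Hub toggle described in step 2 achieves the same thing.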
+ ### ALXR Setup :::warning -ALXR on Vive standalone headsets may have VR streaming issues currently (January 2024). +The ALXR remote module currently doesn't handle eye-openness and gaze correctly for the XR Elite or Focus 3. :::
ALXR Setup 1. Download and install the latest ALXR client *and server* from the [ALXR-nightly](https://github.com/korejan/ALXR-nightly/releases) repository. -If this is your first time using ALXR, follow the [Usage guide](https://github.com/korejan/ALVR/wiki/ALXR-Client#usage) and [Android-specific client install instructions](https://github.com/korejan/ALVR/wiki/ALXR-Client#android-all-flavors---questpicogenericetc) +If this is your first time using ALXR, follow the [Usage guide](https://github.com/korejan/ALVR/wiki/ALXR-Client#usage) and [Android-specific client install instructions](https://github.com/korejan/ALVR/wiki/ALXR-Client#android-all-flavors---questpicogenericetc) 2. Install the **ALXR Remote** module from the VRCFaceTracking module repository. 3. Open the `ALXRModuleConfig.json` found in the installed module directory. - You may need to navigate to `C:\Users\[username]\AppData\Local\Packages\96ba052f-0948-44d8-86c4-a0212e4ae047_d7rcq4vxghz0r\LocalCache\Roaming\VRCFaceTracking\` to find the module directory and config json. - [Learn more about the ALXR module configuration options](https://github.com/korejan/VRCFT-ALXR-Modules#module-settings) -4. In `ALXRModuleConfig.json`, in the "RemoteConfig" section set "ClientIpAddress" to the headset IP, this can be found in the ALVR server dashboard. +4. In `ALXRModuleConfig.json`, in the "RemoteConfig" section set "ClientIpAddress" to the headset IP. This can be found in the ALVR server dashboard. - You will need to restart VRCFT to reinitialize the ALXR Remote Module with the updated configuration. -5. Proceed to [Modules](#modules) for the module to use with ALXR.
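The step-4 edit amounts to something like the following fragment (only the field named above is shown; the IP is a placeholder for your headset's address from the ALVR dashboard, and the other available options are described in the module settings link above):

```json
{
  "RemoteConfig": {
    "ClientIpAddress": "192.168.1.50"
  }
}
```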
## Modules -There are 2 modules that can be used with the Vive Focus 3 or XR Elite, one for each possible PCVR streaming method. -Both modules are readily available for installation via the VRCFaceTracking built-in module registry. -[Learn how to install modules from the module registry](../../intro/getting-started.mdx#installing-the-vrcfacetracking-module). +There are 3 modules that can be used with the Vive Focus 3 or XR Elite, 2 for Vive Streaming and 1 for ALXR. -- If you are using a Vive Streamer (Vive Business Streaming / Vive Streamer Hub), you should install the **SRanipalTrackingModule**. +- If you are using a Vive Streamer (Vive Business Streaming / Vive Hub), you can use the **Vive Streaming Face Tracking Module** or the **SRanipalTrackingModule**. - If you are using ALXR, you should install the **ALXR Remote Module**. +Make sure to follow the setup instructions above for your chosen module. +The SRanipal and ALXR Remote modules are readily available for installation from the VRCFT module registry. +[Learn how to install modules from the module registry](../../intro/getting-started.mdx#installing-the-vrcfacetracking-module). +The [Vive Streaming Face Tracking Module](https://github.com/ViveSoftware/ViveStreamingFaceTrackingModule) is not part of the VRCFT module registry and must be installed manually. + Interested in the source code? Check out the [SRanipalTrackingModule source repository](https://github.com/VRCFaceTracking/SRanipalTrackingModule) and the [ALXR Remote module](https://github.com/korejan/VRCFT-ALXR-Modules) repos. +The Vive Streaming module is closed source, so its source code is not publicly available. + + +## Troubleshooting + +
+ My avatar's lower lip is up too high/clipping even when I have a neutral facial expression IRL + + + Follow the instructions on this page to use VRCFT instead. + The mouth clipping issue is only caused by the Vive Streaming Hub's direct output to VRChat, so if you see it, either you did not set up VRCFT and your chosen module correctly, or you did not set up VRCFT at all. + +
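As a general diagnostic, if you are unsure whether OSC messages are reaching VRChat at all, you can hand-craft a test message. The sketch below encodes a single-float OSC message using only the standard library and sends it to VRChat's default receive port (UDP 9000); `JawOpen` is only an example parameter name, so substitute one your avatar actually exposes:

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode an OSC message carrying one float: padded address, ",f" typetag, big-endian float32."""
    def pad(b: bytes) -> bytes:
        # Null-terminate and pad to a 4-byte boundary, as the OSC 1.0 spec requires.
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# VRChat receives avatar OSC input on UDP port 9000 by default.
# "JawOpen" is only an illustrative parameter name -- use one your avatar exposes.
packet = osc_message("/avatar/parameters/JawOpen", 0.5)
socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 9000))
```

If the avatar reacts, OSC routing works and the problem is upstream in the module or streamer; if nothing happens, check that OSC is enabled in VRChat's radial menu options.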
\ No newline at end of file diff --git a/docs/hardware/img/vive/rr/streamer_beta.png b/docs/hardware/img/vive/rr/streamer_beta.png new file mode 100644 index 00000000..e914ef43 Binary files /dev/null and b/docs/hardware/img/vive/rr/streamer_beta.png differ diff --git a/docs/hardware/img/vive/rr/streamer_osc_toggle.png b/docs/hardware/img/vive/rr/streamer_osc_toggle.png new file mode 100644 index 00000000..ca0d3725 Binary files /dev/null and b/docs/hardware/img/vive/rr/streamer_osc_toggle.png differ diff --git a/docs/hardware/img/vive/rr/streamer_update.png b/docs/hardware/img/vive/rr/streamer_update.png index 7be59603..51db2232 100644 Binary files a/docs/hardware/img/vive/rr/streamer_update.png and b/docs/hardware/img/vive/rr/streamer_update.png differ diff --git a/docs/hardware/interface-compatibilities.mdx b/docs/hardware/interface-compatibilities.mdx index 44252d3d..5bd86cdc 100644 --- a/docs/hardware/interface-compatibilities.mdx +++ b/docs/hardware/interface-compatibilities.mdx @@ -47,16 +47,17 @@ range of motions/expressions supported by the interface. 
'EyeTrackVR', 'VIVE Focus 3 (Eye Tracker)', 'VIVE Focus 3 (Facial Tracker)', - 'HP Reverb G2 Omnicept' + 'HP Reverb G2 Omnicept', + 'VIVE XR Elite (Full Facial Tracker)' ]} omitHeaders={['Tracking Feature']} rows={[ - ['Category', 'HMD', 'Accessory', 'HMD', 'Standalone HMD', 'Standalone HMD', 'Accessory', 'HMD', 'Software/Mobile', 'Software', 'Software/DIY Hardware', 'Accessory', 'Accessory', 'HMD'], - ['General Face Tracking Capability', 'Eye', 'Lower Face', 'Eye', 'Full', 'Full', 'Eye', 'Eye', 'Full', 'Lower Face', 'Eye', 'Eye', 'Lower Face', 'Eye'], - ['Gaze', '✔', '~', '✔', '✔', '✔', '✔', '✔', 'Eye Expression', '~', '✔', '✔', '~', '✔'], - ['Gaze Convergence', '✔', '~', '✔', '❌', '❌', '❌', '❌', 'N/A', '~', '✔', '✔', '~', '✔'], - ['Eye Openness', 'Granular', '~', 'Granular', 'Granular', 'Granular', '2 Steps', 'Granular', 'Granular', '~', 'Granular', 'Granular', '~', 'Binary'], - ['Pupil Dilation', '✔', '~', '✔', '❌', '❌', '❌', '❌', '❌', '~', '❌', '❌', '~', '✔'], + ['Category', 'HMD', 'Accessory', 'HMD', 'Standalone HMD', 'Standalone HMD', 'Accessory', 'HMD', 'Software/Mobile', 'Software', 'Software/DIY Hardware', 'Accessory', 'Accessory', 'HMD', 'Accessory'], + ['General Face Tracking Capability', 'Eye', 'Lower Face', 'Eye', 'Full', 'Full', 'Eye', 'Eye', 'Full', 'Lower Face', 'Eye', 'Eye', 'Lower Face', 'Eye', 'Full'], + ['Gaze', '✔', '~', '✔', '✔', '✔', '✔', '✔', 'Eye Expression', '~', '✔', '✔', '~', '✔', '✔'], + ['Gaze Convergence', '✔', '~', '✔', '❌', '❌', '❌', '❌', 'N/A', '~', '✔', '✔', '~', '✔', '✔'], + ['Eye Openness', 'Granular', '~', 'Granular', 'Granular', 'Granular', '2 Steps', 'Granular', 'Granular', '~', 'Granular', 'Granular', '~', 'Binary', 'Granular'], + ['Pupil Dilation', '✔', '~', '✔', '❌', '❌', '❌', '❌', '❌', '~', '❌', '❌', '~', '✔', '❌'], [ 'Upper Face Expression Support', <>Widen
Squeeze
Brow(Emulated), @@ -69,13 +70,14 @@ range of motions/expressions supported by the interface. <>Widen
Squint
Brow, '❌', '❌', - <>Widen
Squeeze
Brow(Emulated), - `~`, - '~' + <>Widen(broken)
Squeeze(broken), + '~', + '~', + <>Widen(broken)
Squeeze(broken) ], [ 'Upper Face Expressibility', - '5/10', + '6/10', '~', 'N/A', '9/10', @@ -85,9 +87,10 @@ range of motions/expressions supported by the interface. '9/10', '❌', '❌', - '5/10', + '3/10', + '~', '~', - '~' + '3/10' ], [ 'Upper Face Tracking Quality', @@ -101,9 +104,10 @@ range of motions/expressions supported by the interface. '8/10', '❌', '❌', - '7/10', + '4/10', + '~', '~', - '~' + '4/10' ], [ 'Lower Face Expression Support', @@ -118,8 +122,9 @@ range of motions/expressions supported by the interface. <>Jaw
Lip
Mouth
Cheek
Nose, '~', '~', - <>Jaw
Lip
Mouth
Cheek
Nose, - '~' + <>Jaw
Lip
Mouth
Cheek, + '~', + <>Jaw
Lip
Mouth
Cheek, ], [ 'Lower Face Expressibility', @@ -135,7 +140,8 @@ range of motions/expressions supported by the interface. '~', '~', '7/10', - '~' + '~', + '7/10' ], [ 'Face Tracking Quality', @@ -151,8 +157,9 @@ range of motions/expressions supported by the interface. '~', '~', '7/10', - '~' + '~', + '7/10' ], - ['Tongue Expression Support', '~', 'Tongue Out & Directions', '~', 'Tongue Out', 'Tongue Out', '~', '~', 'Tongue Out', 'All Tongue Expressions', '~', '~', 'Tongue Out & Directions', '~'], + ['Tongue Expression Support', '~', 'Tongue Out & Directions', '~', 'Tongue Out', 'Tongue Out', '~', '~', 'Tongue Out', 'All Tongue Expressions', '~', '~', 'Tongue Out & Directions', '~', 'Tongue Out & Directions'], ]} />