
Commit 25411f2: formatting
1 parent 44e1092

2 files changed: 26 additions, 18 deletions


README.md

Lines changed: 24 additions & 16 deletions
```diff
@@ -2,9 +2,9 @@
 
 EyeNav is a modular web interaction framework. It fuses real-time eye-tracking (with Tobii-Pro SDK) and on-device natural-language processing (using Vosk) within a Chrome extension and Python backend to deliver:
 
-* **Accessible input** for users with motor impairments
-* **Hands-free browser control** for developers and general users
-* **Automated test generation** via record-and-replay (Gherkin + WebdriverIO)
+* Accessible input for users with motor impairments
+* Hands-free browser control for developers and general users
+* Automated test generation via record-and-replay (Gherkin + WebdriverIO)
 
 By orchestrating gaze-driven pointer control, voice-command parsing, and concurrent logging threads, EyeNav enables both interactive accessibility and behavior-driven development in web environments.
 
```
```diff
@@ -37,11 +37,16 @@ By orchestrating gaze-driven pointer control, voice-command parsing, and concurr
 
 EyeNav implements the following core features:
 
-* **Gaze-Driven Pointer Control**: Maps eye gaze to cursor movements using the Tobii Pro Nano and the `tobii-research` SDK.
-* **NLP Command Parsing**: Transcribes and interprets voice commands (click, input, scroll, navigate) with Vosk running locally.
-* **Record-and-Replay Test Generation**: Logs interactions in Gherkin syntax and replays them via Kraken & WebdriverIO.
-* **Modularity**: Enable or disable any of the three subsystems (gaze, voice, test logger) independently.
-* **Internationalization**: Supports English and Spanish out of the box; additional languages via voice-model download and locale translation.
+* Gaze-Driven Pointer Control
+  > Maps eye gaze to cursor movements using the Tobii Pro Nano and the `tobii-research` SDK.
+* NLP Command Parsing
+  > Transcribes and interprets voice commands (click, input, scroll, navigate) with Vosk running locally.
+* Record-and-Replay Test Generation
+  > Logs interactions in Gherkin syntax and replays them via Kraken & WebdriverIO.
+* Modularity
+  > Enable or disable any of the three subsystems (gaze, voice, test logger) independently.
+* Internationalization
+  > Supports English and Spanish out of the box; additional languages can be added via voice-model download and locale translation.
 
 ---
 
```
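The gaze-driven pointer control in the feature list needs two steps the README implies but does not show: scaling normalized gaze coordinates to pixels and smoothing sensor jitter. A minimal sketch, assuming the `tobii-research` callback delivers gaze points in normalized display coordinates (0.0 to 1.0); the `GazeSmoother` class and its parameters are illustrative, not EyeNav's actual implementation.

```python
# Hedged sketch: exponential moving average over normalized gaze samples,
# scaled to screen pixels. Names and defaults are hypothetical.

class GazeSmoother:
    """Smooths normalized gaze samples and maps them to pixel coordinates."""

    def __init__(self, screen_w: int, screen_h: int, alpha: float = 0.3):
        self.screen_w = screen_w
        self.screen_h = screen_h
        self.alpha = alpha          # higher alpha = snappier but noisier cursor
        self._x = self._y = None    # smoothed normalized position

    def update(self, gx: float, gy: float) -> tuple[int, int]:
        """Fold one normalized gaze sample in; return pixel coordinates."""
        if self._x is None:
            self._x, self._y = gx, gy          # first sample seeds the filter
        else:
            self._x += self.alpha * (gx - self._x)
            self._y += self.alpha * (gy - self._y)
        return (round(self._x * self.screen_w), round(self._y * self.screen_h))


smoother = GazeSmoother(screen_w=1920, screen_h=1080)
smoother.update(0.5, 0.5)           # seeds at screen center: (960, 540)
x, y = smoother.update(0.6, 0.5)    # later samples are blended in
```

The filter trades a little latency for stability, which matters when raw gaze data jumps several pixels between samples.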
```diff
@@ -120,13 +125,12 @@
 
 ![Components Diagram](assets/imgs/components-diagram.jpg)
 
-* **Frontend (Chrome Extension)**: UI panel + event listener + WebSocket client
-* **Backend Service**: HTTP API + thread dispatcher spawning:
-
+* Frontend (Chrome Extension): UI panel + event listener + WebSocket client
+* Backend Service: HTTP API + thread dispatcher spawning:
   * Eye-Tracking Thread (Tobii SDK)
   * Voice Thread (Vosk)
   * Logging Thread (Gherkin generator)
-* **Test Runner**: Kraken/WebdriverIO integration for replay
+* Test Runner: Kraken/WebdriverIO integration for replay
 
 ## Context
 
```
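The backend's thread dispatcher described in the component list can be sketched with the standard library: one worker per subsystem (gaze, voice, logging), each feeding a shared event queue. The subsystem workers here are stubs; EyeNav's real workers wrap the Tobii SDK, Vosk, and the Gherkin logger, and `run_dispatcher` is a hypothetical name for illustration.

```python
# Hedged sketch of a thread dispatcher: spawn one thread per enabled
# subsystem and collect their events from a shared queue.
import queue
import threading

def run_dispatcher(subsystems: dict) -> list:
    """Spawn one thread per enabled subsystem; return the events they emit."""
    events = queue.Queue()
    threads = []
    for name, worker in subsystems.items():
        t = threading.Thread(target=worker, args=(name, events), daemon=True)
        t.start()
        threads.append(t)
    for t in threads:
        t.join()                    # stubs finish quickly; real workers loop
    return [events.get_nowait() for _ in range(events.qsize())]

def stub_worker(name: str, events: queue.Queue) -> None:
    # A real worker would loop on its device or model here.
    events.put(f"{name}:started")

# Modularity: any subsystem can be omitted from the dict independently.
started = run_dispatcher({"gaze": stub_worker,
                          "voice": stub_worker,
                          "logger": stub_worker})
```

Passing the subsystem dict in makes the README's modularity claim concrete: disabling voice is just leaving its entry out.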
```diff
@@ -136,9 +140,13 @@ EyeNav implements the following core features:
 
 # Use Cases
 
-1. **Accessible Browsing**: Hands-free navigation for users with disabilities.
-2. **Automated Testing (A-TDD)**: Generate and replay acceptance tests for regression.
-3. **Accessibility Evaluation**: Collect interaction data for consultants and researchers.
-4. **Intelligent Agents**: [TBD] Enable bots to navigate and test web UIs via gaze & speech.
+1. Accessible Browsing
+   > Hands-free navigation for users with disabilities.
+2. Automated Testing (A-TDD)
+   > Generate and replay acceptance tests for regression.
+3. Accessibility Evaluation
+   > Collect interaction data for consultants and researchers.
+4. Intelligent Agents
+   > [TBD] Enable bots to navigate and test web UIs via gaze & speech.
 
```
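The record-and-replay logger behind the automated-testing use case turns interaction events into Gherkin steps. A minimal sketch of that translation step: the event fields and step wording below are assumptions for illustration, while EyeNav's real logger targets Kraken/WebdriverIO step definitions.

```python
# Hedged sketch: render a recorded interaction trace as a Gherkin scenario.
# Event schema ({"type", "target", "value"}) is hypothetical.

def to_gherkin(events: list) -> str:
    """Render recorded events as the steps of one Gherkin scenario."""
    templates = {
        "navigate": 'Given I navigate to "{target}"',
        "click": 'When I click the element "{target}"',
        "input": 'When I enter "{value}" into "{target}"',
        "scroll": "When I scroll {target}",
    }
    steps = [templates[e["type"]].format(**e) for e in events]
    return "Scenario: recorded session\n" + "\n".join(f"  {s}" for s in steps)

scenario = to_gherkin([
    {"type": "navigate", "target": "https://example.com"},
    {"type": "click", "target": "#login"},
    {"type": "input", "target": "#user", "value": "ada"},
])
```

Because the output is plain Gherkin, the same trace can be replayed by any step-definition library that binds these phrasings.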
_layouts/default.html

Lines changed: 2 additions & 2 deletions
```diff
@@ -28,9 +28,9 @@ <h1 class="title-logo" align="center" style="margin-bottom: 10px;">
 <p style="margin-bottom: 7px; text-align: center;">{{ site.description | default: site.github.project_tagline }}</p>
 
 {% if site.show_downloads %}
-<div style="text-align: center; padding: 20px 0;">
+<div style="text-align: center; padding: 10px 0;">
 <a href="{{ site.github.repository_url }}"
-   style="display: inline-block; padding: 10px 16px; background-color: #24292e; color: white; text-decoration: none; border-radius: 6px; font-weight: bold;">
+   style="display: inline-block; padding: 8px 16px; background-color: #24292e; color: white; text-decoration: none; border-radius: 6px; font-weight: bold;">
 View on GitHub
 </a>
 </div>
```
