Commit 32f3f01 (parent 7f505ed): Add README for coffee_expressions_test_ui

# Coffee Expressions Test UI

A ROS2 package providing a Qt-based graphical user interface for testing and debugging coffee robot expressions. This tool allows developers and operators to manually trigger different emotional expressions and monitor the robot's affective state system.

## Features

- **Expression Testing**: Select from predefined expressions (Happy, Angry, Loving, Sad, Surprised)
- **Trigger Source Control**: Specify the source of expression triggers (vision, audio, event, mock)
- **Real-time Adjustment**: Live sliders for intensity and confidence levels (0-100%)
- **Gaze Control**: Interactive gaze target positioning with visual preview
- **Publishing Modes**: Manual publishing or real-time automatic publishing
- **Visual Feedback**: Live preview widget showing gaze target position
- **Status Monitoring**: Real-time publishing status and timestamps

## Requirements

- ROS2 (Jazzy or compatible)
- Python 3
- python_qt_binding (included with ROS2)
- coffee_expressions_msgs (interface package)

## Installation

1. Ensure the package is in your ROS2 workspace:

   ```bash
   cd /path/to/your/workspace/src
   ls
   # coffee_expressions_test_ui/ should appear in the listing
   ```

2. Build the package:

   ```bash
   cd /path/to/your/workspace
   # Build the interface package first
   colcon build --packages-select coffee_expressions_msgs
   # Then build the test UI
   colcon build --packages-select coffee_expressions_test_ui
   ```

3. Source the workspace:

   ```bash
   source /path/to/your/workspace/install/setup.bash
   ```

## Usage

### Launch the Test UI

```bash
ros2 run coffee_expressions_test_ui expressions_test_ui
```

### Interface Controls

#### Expression Selection
- **Expression Dropdown**: Choose from available emotional expressions
  - Happy
  - Angry
  - Loving
  - Sad
  - Surprised

#### Trigger Configuration
- **Trigger Source**: Radio buttons to specify the source of the expression
  - `vision`: Triggered by the computer vision system
  - `audio`: Triggered by audio processing
  - `event`: Triggered by system events
  - `mock`: Manual testing trigger (default)

#### Intensity & Confidence
- **Intensity Slider**: Control expression intensity (0-100%)
- **Confidence Slider**: Set confidence level of the expression (0-100%)

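The sliders operate in percent, while the published message stores intensity and confidence as 0.0-1.0 floats. A minimal sketch of the percent-to-float conversion the UI must perform somewhere; the function name is illustrative, not taken from the package source:

```python
def slider_to_field(percent: int) -> float:
    """Map a 0-100 UI slider value to the 0.0-1.0 float published in AffectiveState."""
    if not 0 <= percent <= 100:
        raise ValueError("slider values must be in the range 0-100")
    return percent / 100.0
```

For example, an intensity slider at 75% would be published as 0.75.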
#### Gaze Control
- **Gaze X/Y Sliders**: Set gaze target coordinates (-100 to +100)
- **Gaze Preview**: Visual widget showing current gaze target position
- **Interactive Preview**: Click and drag in the preview to set gaze target

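Click-and-drag support means the preview has to translate a pixel position inside the widget into the -100 to +100 gaze range. A minimal sketch of such a mapping, assuming a hypothetical widget size; the function name and the clamping behavior are illustrative, not taken from the package:

```python
def pixel_to_gaze(px, py, width, height):
    """Map a click inside a width x height preview widget to gaze coords in [-100, 100]."""
    gx = (px / (width - 1)) * 200.0 - 100.0
    gy = (py / (height - 1)) * 200.0 - 100.0
    # Clamp so that drags leaving the widget still yield a valid gaze target.
    def clamp(v):
        return max(-100.0, min(100.0, v))
    return (clamp(gx), clamp(gy))
```

With this convention the widget center maps to (0, 0) and the corners to the extremes of the range; the actual coordinate convention should match your robot's expectations (see Troubleshooting).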
#### Publishing Options
- **Is Idle Checkbox**: Mark the robot as idle/active
- **Real-time Publishing**: Automatically publish changes as you adjust controls
- **Publish Button**: Manually publish current settings when real-time is disabled

### ROS2 Integration

#### Published Topics

- **`/robot/affective_state`** (`coffee_expressions_msgs/AffectiveState`)
  - Complete affective state information including expression, intensity, confidence, gaze target, and trigger source

#### Message Structure

The published `AffectiveState` message contains:

```
string expression                # Selected expression name
float32 intensity                # Expression intensity (0.0-1.0)
float32 confidence               # Confidence level (0.0-1.0)
string trigger_source            # Source of the trigger
geometry_msgs/Point gaze_target  # Gaze target coordinates
bool is_idle                     # Robot idle state
```

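For a quick check without the UI, a message with these fields can also be published from the command line. A sketch using the standard `ros2 topic pub` CLI; the field values are illustrative, and the type path assumes the interface is generated as `coffee_expressions_msgs/msg/AffectiveState`:

```bash
ros2 topic pub --once /robot/affective_state coffee_expressions_msgs/msg/AffectiveState \
  "{expression: 'Happy', intensity: 0.8, confidence: 0.9, trigger_source: 'mock', gaze_target: {x: 0.0, y: 0.0, z: 0.0}, is_idle: false}"
```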
### Testing Workflows

#### Expression Development
1. Launch the test UI
2. Select "Real-time Publishing" for immediate feedback
3. Choose an expression and adjust intensity/confidence
4. Monitor the robot's response in real-time

#### Gaze System Testing
1. Use the gaze sliders or click in the preview widget
2. Observe robot head/eye movement responses
3. Test different gaze positions and combinations

#### Integration Testing
1. Set trigger source to match your system component
2. Test different expressions with varying intensities
3. Verify message reception in your expression processing nodes

## Development

### Adding New Expressions

To add new expressions, modify the expression list in `expressions_test_ui.py`:

```python
self.expression_combo.addItems(['Happy', 'Angry', 'Loving', 'Sad', 'Surprised', 'NewExpression'])
```

### Customizing Gaze Range

Adjust the slider ranges in the gaze control section:

```python
slider.setRange(-100, 100)  # Modify range as needed
```

## Troubleshooting

### UI Not Starting
- Ensure `python_qt_binding` is available (comes with ROS2)
- Check that all dependencies are built and sourced
- Verify a display is available for GUI applications

### Messages Not Publishing
- Check that the topic name matches your expression processing nodes
- Verify the `coffee_expressions_msgs` package is built and sourced
- Monitor with `ros2 topic echo /robot/affective_state`

### Gaze Preview Not Responding
- Ensure mouse tracking is enabled in the widget
- Check that the coordinate system matches your robot's expectations

## License

Apache License 2.0
