User Story
Who
Healthcare OEM/ODM engineers, solution architects, and clinical workflow stakeholders evaluating Intel Core Ultra for end-to-end NICU safety and care scenarios. They care less about model internals and more about whether the system reflects a sensible clinical workflow.
What
The developer wants a visual workflow progression panel in the NICU GUI that:
- Shows the main NICU care protocol steps as a sequence.
- Highlights the current step.
- Marks previous steps as completed.
- Updates automatically based on what the system “sees” (via action recognition and related models).
This gives them a clear, at-a-glance understanding of how the AI system interprets NICU activities.
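The panel behavior above can be sketched as a small state tracker. This is an illustrative sketch only, not the sample app's actual implementation; the step labels and method names are hypothetical placeholders:

```python
from dataclasses import dataclass, field
from enum import Enum


class StepState(Enum):
    COMPLETED = "completed"
    ACTIVE = "active"
    PENDING = "pending"


# Hypothetical protocol steps; real labels would come from the
# clinical workflow definition shipped with the sample app.
PROTOCOL_STEPS = [
    "Prepare Warmer",
    "Caregiver at Bedside",
    "Safety Doors Latched",
    "Monitoring in Progress",
]


@dataclass
class WorkflowTracker:
    """Tracks progression through an ordered protocol; only moves forward."""
    steps: list = field(default_factory=lambda: list(PROTOCOL_STEPS))
    current: int = 0

    def observe(self, detected_step: str) -> None:
        """Advance when action recognition reports a step ahead of the current one."""
        if detected_step in self.steps:
            idx = self.steps.index(detected_step)
            if idx > self.current:
                self.current = idx

    def state_of(self, index: int) -> StepState:
        """Classify a step for rendering: completed, active, or pending."""
        if index < self.current:
            return StepState.COMPLETED
        if index == self.current:
            return StepState.ACTIVE
        return StepState.PENDING

    def reset(self) -> None:
        """Restart the progression, e.g. when the sample video replays."""
        self.current = 0
```

The GUI would poll or subscribe to this tracker and restyle each step label according to `state_of`, which gives the completed/active/future distinction called out in the acceptance criteria.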
Why
They need to validate that the AI workloads on Intel Core Ultra are not just detecting objects or actions in isolation, but contributing to a meaningful clinical workflow. This is critical for discussing safety, compliance, and value with clinical and business stakeholders. If they cannot see a clear, intuitive workflow view, the demo loses impact.
How
The developer will:
- Use the one-command launcher to start the NICU sample app.
- Use the provided sample warmer video or a live camera capturing a NICU-like interaction.
- Watch the workflow panel as the scenario progresses (caregiver approaches, interacts with warmer, secures latches, etc.).
They should see the protocol steps advance in a way that matches the visual changes in the scene, without any manual interaction or domain knowledge about the underlying models.
User Flow(s) and Acceptance Criteria
- Developer shall clone the Healthcare AI Suite repo (2026.1 tag) on a fresh Ubuntu 24.04 system with Intel Core Ultra (ARL or PTL).
- Developer shall follow the NICU README to install dependencies and launch the NICU Warmer sample app using the provided launcher script.
- On launch, the GUI shall display a workflow progression panel (e.g., in a sidebar or top/bottom strip) that shows the current NICU care protocol step(s).
- The workflow panel shall contain clear, human-readable step labels (e.g., “Prepare Warmer”, “Caregiver at Bedside”, “Safety Doors Latched”, “Monitoring in Progress”) that are understandable to non-technical users.
- As the demo runs (sample video or camera), the workflow panel shall update automatically based on action recognition and related model outputs (no manual step selection).
- When the scene changes (e.g., caregiver appears, doors are latched, baby is present), the workflow panel shall advance or update to reflect the new state within ~1–2 seconds.
- The workflow panel shall show at least:
- The current active step
- A visual indication of completed steps
- Steps not yet reached (future steps)
- Developer shall be able to restart the demo (e.g., by replaying the sample video or resetting state) and see the workflow progression repeat from the beginning.
- KEI: The end-to-end flow, from repo clone on a fresh Ubuntu system to seeing the workflow panel update in response to scene changes, shall take ≤ 60 minutes, including installation and reading the README.
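Meeting the "~1–2 seconds" criterion without flicker suggests smoothing per-frame model outputs before advancing the panel. The sketch below is one possible approach under assumed conditions (around 15 FPS inference; class and parameter names are illustrative, not the sample app's API):

```python
from collections import deque


class DebouncedStepDetector:
    """Smooths per-frame action-recognition labels before they drive the panel.

    A step is reported only after `required` consecutive frames agree, so a
    brief misclassification cannot advance the workflow. At ~15 FPS,
    required=15 corresponds to roughly one second of stable evidence,
    keeping updates within the ~1-2 second acceptance window.
    """

    def __init__(self, required: int = 15):
        self.required = required
        self.recent = deque(maxlen=required)

    def update(self, frame_label: str):
        """Feed one per-frame label; return the step once it is stable, else None."""
        self.recent.append(frame_label)
        if len(self.recent) == self.required and len(set(self.recent)) == 1:
            return frame_label
        return None
```

A caller would feed each inference result to `update` and pass any non-None return value to the workflow tracker; tuning `required` trades update latency against robustness to per-frame noise.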
Assumptions
- Intel Core Ultra (ARL/PTL) system is running Ubuntu 24.04 LTS and can already run the NICU models.
- NICU action recognition and other models are pre-trained to infer workflow stages; this issue focuses on visualizing that workflow in the GUI.
- The NICU README and launcher script already bring up the GUI without manual backend steps.
- A sample video or scripted demo scenario is available that naturally exercises multiple workflow steps.