
add renderer test for student performance subject comparison#312

Open
AviraL0013 wants to merge 6 commits into animint:master from AviraL0013:test-renderer-student-subject

Conversation

@AviraL0013

Renderer Test: Student Performance Subject Comparison

This PR adds a renderer test based on a subject comparison plot from
my Student Performance Analytics Dashboard built with animint2.

Visualization

The test renders a subject comparison bar chart showing mean grades
per subject (Mathematics, Physics, Chemistry, Biology, English, History)
with individual student grade points overlaid. A semester selector
allows filtering the displayed data by semester.
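For reference, a visualization along these lines could be sketched with animint2 as below. This is a hedged illustration, not the PR's actual dashboard code: the data frame, column names, and aesthetics are invented stand-ins.

```r
library(animint2)

# Toy data standing in for the dashboard's real grades
# (object and column names here are assumptions).
set.seed(1)
grades <- data.frame(
  subject = rep(c("Mathematics", "Physics", "Chemistry",
                  "Biology", "English", "History"), each = 10),
  semester = rep(c("Sem1", "Sem2"), 30),
  grade = round(runif(60, 40, 100)))
means <- aggregate(grade ~ subject + semester, grades, mean)

viz <- animint(
  subjects = ggplot() +
    geom_bar(aes(x = subject, y = grade), data = means,
             stat = "identity",
             clickSelects = "subject",    # click a bar to select a subject
             showSelected = "semester") + # semester selector filters bars
    geom_point(aes(x = subject, y = grade), data = grades,
               showSelected = "semester") +
    ggtitle("Subject Performance Comparison"))
```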

What the test verifies

  • 6 bar rect elements are rendered, one per subject
  • Point circles are rendered for individual student grades
  • A selector widget is present in the page
  • The semester selector has exactly 2 options (Sem1 and Sem2)
  • The plot title "Subject Performance Comparison" is rendered in SVG
  • All 6 subject names appear as x-axis tick labels
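In testthat terms, the checks above could look roughly like the following. This is a sketch, not the exact test file: `viz` stands for the subject-comparison animint list built in the test, and the CSS classes and XPath expressions are assumptions that may differ from the real rendered markup.

```r
library(testthat)
library(XML)  # getNodeSet() for XPath queries on parsed HTML

info <- animint2HTML(viz)  # render viz; parsed HTML is in info$html

test_that("one bar rect is rendered per subject", {
  bars <- getNodeSet(info$html, '//svg//rect[contains(@class, "geom")]')
  expect_equal(length(bars), 6L)
})

test_that("semester selector has exactly 2 options", {
  opts <- getNodeSet(info$html, '//option')
  expect_equal(length(opts), 2L)
})

test_that("plot title appears once in the SVG", {
  titles <- getNodeSet(
    info$html,
    '//text[contains(text(), "Subject Performance Comparison")]')
  expect_equal(length(titles), 1L)
})
```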

How to run

library(animint2)
library(XML)  # getNodeSet() for querying the rendered HTML
setwd("tests/testthat")
source("helper-functions.R")  # test helpers, e.g. clickID()
source("helper-HTML.R")       # HTML rendering helpers
tests_init()  # start the remote-controlled browser
testthat::test_file("test-renderer-grade-trends.R")

Result

PASS 11 | FAIL 0 | WARN 0 | SKIP 0

Screencast

demo.1.mp4

@tdhock Please review :)

Collaborator

@tdhock tdhock left a comment


your code should show interaction, clickID()

in your video please try line by line execution so we can see the web browser before and after clickID()

titles <- getNodeSet(
  info$html,
  '//text[contains(text(), "Subject Performance Comparison")]'
)
expect_true(length(titles) > 0)
Collaborator

you should change expect_true to expect_ something more specific

@AviraL0013
Author

AviraL0013 commented Mar 10, 2026

Hi @tdhock,

Thank you for the review! I have addressed both points:

  1. Added clickID("Physics") in test 7 to verify that clicking a subject bar updates the selected subject and that point circles are still rendered correctly in the browser.

  2. Changed expect_true(length(titles) > 0) to expect_equal(length(titles), 1L) in test 5 for a more specific assertion.
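The interaction check described in point 1 could be sketched as follows. `clickID()` and `getHTML()` are helpers from the animint2 test suite; the element ID and XPath here are assumptions, not the PR's exact code.

```r
test_that("clicking the Physics bar keeps circles rendered", {
  clickID("Physics")   # simulate a click in the remote-controlled browser
  Sys.sleep(1)         # allow the page to re-render
  html <- getHTML()    # re-parse the updated page
  circles <- getNodeSet(html, '//svg//circle')
  expect_gt(length(circles), 0L)
})
```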

All 13 tests pass locally:
PASS 13 | FAIL 0 | WARN 0 | SKIP 0

Screencast will be added shortly.

@tdhock
Collaborator

tdhock commented Mar 10, 2026

thanks, but the video does not work for me in the comment.
why didn’t you upload to vimeo as was requested in the hard test question?

@AviraL0013
Author

So sorry about that! There was a voice glitch in the video.
I am re-recording and will upload the correct Vimeo link shortly.

@AviraL0013
Author

Hi @tdhock,

Please find the screencast here:
https://vimeo.com/1172330910

The video shows:

  • install.packages("chromote")
  • tests_init() to start the remote controlled browser
  • animint2HTML() to render the subject comparison visualization
  • Line by line execution with browser visible throughout
  • clickID() interaction showing the Physics label opacity change from 0.5 to 1 before and after the click
  • All 13 tests passing with FAIL 0
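The opacity change shown in the screencast could be asserted along these lines. This is a hedged sketch: `getStyleValue()` is a helper from the animint2 test suite, and the XPath for the Physics label is an assumption.

```r
xpath <- '//text[contains(text(), "Physics")]'
before <- getStyleValue(getHTML(), xpath, "opacity")
clickID("Physics")
Sys.sleep(1)  # allow the page to re-render
after <- getStyleValue(getHTML(), xpath, "opacity")
expect_equal(as.numeric(before), 0.5)  # unselected label is half-opaque
expect_equal(as.numeric(after), 1)     # selected label is fully opaque
```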

Thank you for your patience!

