
Conversation

DaltheCow
Collaborator

This PR implements the actual application logic, including the UI/charts and the API setup that retrieves data from the window object.
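For illustration only, here is a minimal TypeScript sketch of how data injected onto the window object might be read by the UI. The `__GUIDELLM_DATA__` key and the `BenchmarkReport` shape are assumptions for the example, not the PR's actual API.

```ts
// Hypothetical shape of the injected report data; field names are assumptions.
interface BenchmarkReport {
  model: string;
  requestsPerSecond: number[];
  latenciesMs: number[];
}

// Augment the global Window type so TypeScript knows about the injected key.
declare global {
  interface Window {
    __GUIDELLM_DATA__?: BenchmarkReport;
  }
}

// Read the data that the generated report embeds on the window object,
// returning undefined when running outside a generated report (e.g. SSR).
export function getReportData(): BenchmarkReport | undefined {
  if (typeof window === "undefined") {
    return undefined;
  }
  return window.__GUIDELLM_DATA__;
}
```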

@DaltheCow DaltheCow added the UI Front-end workstream label May 19, 2025

📦 Build Artifacts Available
The build artifacts (.whl and .tar.gz) have been successfully generated and are available for download: https://github.com/neuralmagic/guidellm/actions/runs/15118091856/artifacts/3153328557.
They will be retained for up to 30 days.

📦 Build Artifacts Available
The build artifacts (.whl and .tar.gz) have been successfully generated and are available for download: https://github.com/neuralmagic/guidellm/actions/runs/15118171388/artifacts/3153359262.
They will be retained for up to 30 days.

📦 Build Artifacts Available
The build artifacts (.whl and .tar.gz) have been successfully generated and are available for download: https://github.com/neuralmagic/guidellm/actions/runs/15120301691/artifacts/3154098422.
They will be retained for up to 30 days.

@DaltheCow DaltheCow changed the base branch from implement-base-ui-app to implement-app-tooling May 19, 2025 19:28

📦 Build Artifacts Available
The build artifacts (.whl and .tar.gz) have been successfully generated and are available for download: https://github.com/neuralmagic/guidellm/actions/runs/15123202017/artifacts/3155104868.
They will be retained for up to 30 days.


1. Use the Hosted Build (Recommended for Most Users)

After running a benchmark with GuideLLM, a report.html file will be generated (by default at guidellm_report/report.html). This file references the latest stable version of the UI hosted at:
Collaborator Author

This is based on a WIP effort I have. I don't know if this is where report.html will end up, but that is what I'm working with for now.

```
npx serve out
```

This will start a local server (e.g., at http://localhost:3000). Then, in your GuideLLM config or CLI flags, point to this local server as the asset base for report generation.
Collaborator Author

This is not something I've set up yet. Currently, the asset bases in this file are configured via environment: one each for prod, staging, dev, and local. But what I have in the README implies you can set it to something other than the preconfigured options. Maybe it'd be better to hardcode local to localhost:3000 and hold off on adding a user-configurable option. Not sure.
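As a rough sketch of the environment-based approach described above (not the code in this PR), asset base resolution could look like the following; the `GUIDELLM_UI_ENV` variable name and all non-local URLs are placeholders, and local is hardcoded to localhost:3000 as suggested.

```ts
// Per-environment asset bases; names and non-local URLs are assumptions.
type Env = "prod" | "staging" | "dev" | "local";

const ASSET_BASES: Record<Env, string> = {
  prod: "https://example.com/guidellm-ui",          // placeholder
  staging: "https://staging.example.com/guidellm-ui", // placeholder
  dev: "https://dev.example.com/guidellm-ui",          // placeholder
  local: "http://localhost:3000",                      // matches `npx serve out`
};

// Resolve the asset base from an environment variable, defaulting to local.
export function resolveAssetBase(
  env: string = process.env.GUIDELLM_UI_ENV ?? "local"
): string {
  if (env in ASSET_BASES) {
    return ASSET_BASES[env as Env];
  }
  return ASSET_BASES.local; // fall back to the local dev server
}
```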


### 🚧 Future Possibilities

We're evaluating options for hosting dev/staging/prod builds on GitHub Pages. For now, production builds will be published at:
Collaborator Author

I think we should go with GitHub Pages. I just left the other ideas in there as food for thought, since GitHub Pages is being used for the docs right now, but I think these things can coexist.

@DaltheCow DaltheCow closed this May 19, 2025
@DaltheCow DaltheCow deleted the implmement-guidellm-ui branch May 19, 2025 21:21