
Conversation

@crisbeto (Member)

Adds the `web-codegen-scorer run` script that allows users to run an evaluated app in their browser. It spins up a server using the local LLM output and the existing environment config.
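A hypothetical invocation, with flag names inferred from the options the script defines (`environment` and `prompt`) and placeholder values:

```sh
# Hypothetical usage; flag names inferred from the script's options, values are placeholders.
web-codegen-scorer run --environment=<path to environment config> --prompt=<prompt ID>
```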

@crisbeto requested a review from devversion on September 19, 2025 at 12:58
@crisbeto force-pushed the run-script branch 2 times, most recently from 9db6609 to d494e7a on September 19, 2025 at 13:02

async function resolveConfig(options: Options) {
  if (!options.environment) {
    throw new UserFacingError(

Member:

We aren't using Yargs demandOption here because we want this helpful error, right?

@crisbeto (Member, Author):

Yep, the Yargs error isn't super readable.
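For illustration, a minimal sketch of the pattern being discussed: validate the flag by hand instead of using yargs' `demandOption`, so the failure surfaces as a readable `UserFacingError`. The message text and the body of the function below are assumptions, not the PR's actual code.

```ts
// Sketch only; the message and the rest of the function are illustrative.
async function resolveConfig(options: Options) {
  if (!options.environment) {
    // Manual check instead of yargs' demandOption, so we control the wording.
    throw new UserFacingError(
      'No environment was specified. Pass one via the `--environment` flag.',
    );
  }
  // ...resolve and return the config for the requested environment.
}
```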

.option('prompt', {
  type: 'string',
  default: '',
  description: 'Prompt within the environment that should be run',

Member:

I think the description is a bit ambiguous/confusing. Is this a path to a prompt, or a basename? Maybe it should be a path?

@crisbeto (Member, Author):

It's actually the ID within the llm-output/<environment.id>. I'll update the message.
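For illustration, the option's description could be reworded along these lines to make clear that the value is a prompt ID under the environment's `llm-output` directory. This is a fragment of the option definition, and the exact wording is an assumption, not necessarily what landed.

```ts
// Illustrative wording only.
.option('prompt', {
  type: 'string',
  default: '',
  description:
    'ID of a prompt within `llm-output/<environment ID>` whose generated app should be served',
})
```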

console.error(
  chalk.red('An error occurred during the assessment process:')
);
console.error(chalk.red(error));

Member:

should we print the stack in those cases? (if available)

@crisbeto (Member, Author):

For the UserFacingError we don't print the stack trace, because it can be noisy. It's meant for more readable errors that we produce (e.g. the environment path is wrong).

Member:

Right, but we could print if it's not the user facing error?

@crisbeto (Member, Author):

Ah yeah in that case we should just throw. I copied this over from the eval-cli 😅
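A minimal sketch of the handling agreed on here: print a readable message (no stack trace) for `UserFacingError`, and rethrow anything else so the full stack is surfaced. The `runApp` entry point and the exit code are placeholders for illustration.

```ts
try {
  await runApp(options); // placeholder for the script's main entry point
} catch (error) {
  if (error instanceof UserFacingError) {
    // Expected, readable errors (e.g. a wrong environment path): message only.
    console.error(chalk.red(error.message));
    process.exit(1);
  }
  // Unexpected errors: rethrow so Node prints the full stack trace.
  throw error;
}
```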

@crisbeto merged commit d4ae0a6 into angular:main on Sep 19, 2025
3 checks passed