
Conversation

@Rolland-He
Collaborator

Fix the llama.cpp test cases that did not account for the new system prompt parameter after both PRs were merged.

@Rolland-He Rolland-He requested review from david-yz-liu and wkukka1 and removed request for david-yz-liu June 16, 2025 17:11
Collaborator

@wkukka1 wkukka1 left a comment


@Rolland-He Looks good!

One suggestion: add a test that uses system prompts so you can verify that the prompt is being built correctly.

@Rolland-He
Collaborator Author

Rolland-He commented Jun 16, 2025

> @Rolland-He Looks good!
>
> One suggestion: add a test that uses system prompts so you can verify that the prompt is being built correctly.

Thank you for the suggestion, @wkukka1! I believe it's already covered in test_system_prompt.py; I will double-check that everything is.

EDIT: I have double-checked, and it is indeed covered.
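
For reference, here is a minimal sketch of the kind of check being discussed; the `build_prompt` helper below is a hypothetical stand-in for illustration, not this repository's actual API (the real coverage lives in test_system_prompt.py):

```python
# Minimal sketch, assuming a hypothetical build_prompt helper; the actual
# project code and tests may structure prompt construction differently.


def build_prompt(system_prompt: str, user_prompt: str) -> str:
    """Hypothetical helper: prepend the system prompt when one is given."""
    if system_prompt:
        return f"{system_prompt}\n\n{user_prompt}"
    return user_prompt


def test_prompt_includes_system_prompt():
    # When a system prompt is supplied, it should lead the built prompt.
    prompt = build_prompt("You are a grading assistant.", "Evaluate this submission.")
    assert prompt.startswith("You are a grading assistant.")
    assert prompt.endswith("Evaluate this submission.")


def test_prompt_without_system_prompt_is_unchanged():
    # With no system prompt, the user prompt should pass through untouched.
    prompt = build_prompt("", "Evaluate this submission.")
    assert prompt == "Evaluate this submission."
```

Running `pytest` on a file like this exercises both the with- and without-system-prompt paths.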

@wkukka1
Collaborator

wkukka1 commented Jun 16, 2025

> @Rolland-He Looks good!
> One suggestion: add a test that uses system prompts so you can verify that the prompt is being built correctly.
>
> Thank you for the suggestion, @wkukka1! I believe it's already covered in test_system_prompt.py; I will double-check that everything is.

Alright, never mind then.

@Rolland-He Rolland-He requested a review from david-yz-liu June 16, 2025 17:59
@david-yz-liu david-yz-liu merged commit d257326 into MarkUsProject:main Jun 17, 2025
2 checks passed