feat: allow change prompts via django admin #77
Conversation
Thanks for the pull request, @Henrrypg! This repository is currently maintained by . Once you've gone through the following steps, feel free to tag them in a comment and let them know that your changes are ready for engineering review.

🔘 Get product approval: If you haven't already, check this list to see if your contribution needs to go through the product review process.

🔘 Provide context: To help your reviewers and other members of the community understand the purpose and larger context of your changes, feel free to add as much of the following information to the PR description as you can:

🔘 Get a green build: If one or more checks are failing, continue working on your changes until this is no longer the case and your build turns green.

Where can I find more information? If you'd like to get more details on all aspects of the review process for open source pull requests (OSPRs), check out the following resources:

When can I expect my changes to be merged? Our goal is to get community contributions seen and reviewed as efficiently as possible. However, the amount of time that it takes to review and merge a PR can vary significantly based on factors such as:

💡 As a result it may take up to several weeks or months to complete a review and merge your PR.
Codecov Report

✅ All modified and coverable lines are covered by tests.

```
@@ Coverage Diff @@
##             main      #77      +/-   ##
==========================================
+ Coverage   89.80%   90.09%   +0.28%
==========================================
  Files          46       46
  Lines        4041     4129      +88
  Branches      264      266       +2
==========================================
+ Hits         3629     3720      +91
+ Misses        321      320       -1
+ Partials       91       89       -2
```
Force-pushed cde23d9 to 760b11d
felipemontoya left a comment:
Simple and effective.

Combining this with the flexibility of json5 is fantastic.

The one thing I would like is to add a new base template for this. I would call it `base.box_custom_prompt`:
```json5
{
    "orchestrator_class": "DirectLLMResponse",
    "processor_config": {
        "LLMProcessor": {
            "stream": true,
            "prompt": "\
You are an educational assistant embedded in an online course. \
\
Rephrase the given content using different wording while preserving the original meaning, intent, and factual accuracy. \
\
Guidelines: \
- Do not add new information. \
- Do not remove important details. \
- Do not simplify unless explicitly asked. \
- Keep the length roughly similar to the original. \
- Use clear, natural language suitable for the course audience. \
- Avoid repeating sentence structures or phrases from the original text. \
\
Output only the rephrased content. \
",
        },
    },
}
```

(Note that the trailing commas and the backslash-newline string continuations here are valid JSON5, though plain JSON would reject them.)

Force-pushed 41164f4 to df95048
Thank you @Henrrypg for taking charge of the QA for this PR.
This pull request introduces support for custom system prompts in the AI extensions processors, allowing for greater flexibility when configuring prompt behavior. The main changes ensure that a custom prompt, if provided in the configuration, is used in place of the default system role in relevant API calls.
Custom Prompt Handling:

- Added a `custom_prompt` attribute to the processor class in `litellm_base_processor.py`, initialized from the configuration if present.

Prompt Usage in API Calls:

- Updated `_build_response_api_params` in `llm_processor.py` to use `self.custom_prompt` as the system prompt if it exists, otherwise defaulting to the original `system_role`.
- Updated `_call_completion_wrapper` in `llm_processor.py` to use `self.custom_prompt` as the system prompt if it exists, otherwise defaulting to the original `system_role`.

A rough sketch of this fallback logic appears below.
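The snippet below is a minimal illustration of that behavior, not the actual implementation: the class names, `DEFAULT_SYSTEM_ROLE`, and the config shape are assumptions; only `custom_prompt`, `_build_response_api_params`, and `system_role` come from the PR description.

```python
# Hypothetical sketch of the custom prompt fallback described above.

DEFAULT_SYSTEM_ROLE = "You are a helpful educational assistant."  # assumed default


class LiteLLMBaseProcessor:  # assumed class name
    def __init__(self, config: dict):
        self.config = config
        # Initialize the custom prompt from the configuration if present.
        self.custom_prompt = config.get("prompt")


class LLMProcessor(LiteLLMBaseProcessor):
    def _build_response_api_params(self, user_message: str) -> dict:
        # Use the configured prompt if it exists, otherwise fall back to
        # the default system role.
        system_prompt = self.custom_prompt or DEFAULT_SYSTEM_ROLE
        return {
            "messages": [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_message},
            ],
            "stream": self.config.get("stream", False),
        }
```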
How to test

You can set up the prompt in `LLMProcessor` or `EducatorAssistantProcessor` via `AIWorkflowProfile` overrides in the Django admin; a sketch of such an override follows.
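As a purely illustrative example (the exact override fields are not shown in this PR; the keys simply mirror the JSON5 example above), the parsed override config might look like:

```python
# Hypothetical parsed override for an AIWorkflowProfile. Key names other
# than "prompt" and "stream" under "LLMProcessor" follow the JSON5 example
# above and are otherwise assumptions.
override_config = {
    "processor_config": {
        "LLMProcessor": {
            "stream": True,
            "prompt": "Rephrase the given content while preserving its meaning.",
        },
    },
}
```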