feat: summarizer plugin #20
Conversation
Walkthrough

This pull request introduces a new summarization feature through the addition of an OpenAISummarizer plugin that delegates to the existing OpenAIChatCompletionsSolver.
Sequence Diagram(s)

sequenceDiagram
participant Client as Caller
participant Summarizer as OpenAISummarizer
participant Solver as OpenAIChatCompletionsSolver
Client->>Summarizer: get_tldr(document, lang)
Summarizer->>Summarizer: Format prompt using template
Summarizer->>Solver: get_spoken_answer(prompt)
Solver-->>Summarizer: Return summary
Summarizer-->>Client: Return summarized text
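Read together with the review diffs further down, the flow maps onto a class roughly like the sketch below. This is only an orientation aid, not the actual summarizer.py: the prompt wording, constructor signature, and absence of a base class are assumptions; the method names and the get_spoken_answer(prompt, lang) call come from the review comments.

```python
from typing import Dict, Optional

from ovos_solver_openai_persona.engines import OpenAIChatCompletionsSolver


class OpenAISummarizer:
    """Minimal sketch of the summarizer described in the diagram above."""

    # Hypothetical prompt; the real template in summarizer.py is not shown in this review.
    prompt_template = "Write a short TLDR of the following text:\n{content}"

    def __init__(self, config: Optional[Dict] = None, **kwargs):
        # The real constructor also forwards translator/detector/priority/etc.
        # to its parent class (see the indentation nitpick further down).
        self.llm = OpenAIChatCompletionsSolver(config=config, **kwargs)

    def get_tldr(self, document: str, lang: Optional[str] = None) -> str:
        # Format the prompt with the document, then delegate to the chat solver.
        prompt = self.prompt_template.format(content=document)
        return self.llm.get_spoken_answer(prompt, lang)
```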
Actionable comments posted: 3
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- ovos_solver_openai_persona/engines.py (5 hunks)
- ovos_solver_openai_persona/summarizer.py (1 hunks)
- setup.py (2 hunks)
🔥 Files not summarized due to errors (1)
- ovos_solver_openai_persona/engines.py: Error: Disallowed special token found: <|im_end|>
🧰 Additional context used
🪛 Ruff (0.8.2)
ovos_solver_openai_persona/engines.py
15-15: Undefined name LanguageTranslator
(F821)
16-16: Undefined name LanguageDetector
(F821)
89-89: Undefined name LanguageTranslator
(F821)
90-90: Undefined name LanguageDetector
(F821)
ovos_solver_openai_persona/summarizer.py
11-11: Undefined name Optional
(F821)
11-11: Undefined name Dict
(F821)
12-12: Undefined name Optional
(F821)
12-12: Undefined name LanguageTranslator
(F821)
13-13: Undefined name Optional
(F821)
13-13: Undefined name LanguageDetector
(F821)
17-17: Undefined name Optional
(F821)
29-29: Undefined name Optional
(F821)
37-37: Undefined name content
(F821)
setup.py
70-70: Undefined name SUMMARIZER_ENTRY_POINT
(F821)
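Most of the F821 findings above point to missing imports rather than logic errors; the undefined content on summarizer.py line 37 and SUMMARIZER_ENTRY_POINT in setup.py are genuine name bugs, addressed further down. A minimal import block that would clear the missing-name warnings, assuming the language plugin base classes come from ovos_plugin_manager (adjust the path if they live elsewhere):

```python
from typing import Dict, Optional

# Assumed module path for the OVOS language plugin base classes referenced
# in the solver signatures; adjust if they are defined elsewhere.
from ovos_plugin_manager.templates.language import LanguageDetector, LanguageTranslator
```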
🪛 GitHub Actions: Run Build Tests
setup.py
[error] 70-70: NameError: name 'SUMMARIZER_ENTRY_POINT' is not defined. Did you mean: 'SUMMARIZER_PLUGIN_ENTRY_POINT'?
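The CI failure is a plain NameError: the constant is defined as SUMMARIZER_PLUGIN_ENTRY_POINT but referenced as SUMMARIZER_ENTRY_POINT on line 70 of setup.py. A sketch of the corrected reference; the entry-point group, plugin string, and package metadata here are illustrative assumptions, not copied from the real file:

```python
from setuptools import setup

# Illustrative value only; the real constant's contents are not shown in the review.
SUMMARIZER_PLUGIN_ENTRY_POINT = (
    "ovos-summarizer-openai=ovos_solver_openai_persona.summarizer:OpenAISummarizer"
)

setup(
    name="ovos-solver-openai-persona",  # assumed package name
    version="0.0.0",
    entry_points={
        # Reference the constant that is actually defined above;
        # the failing line used the undefined name SUMMARIZER_ENTRY_POINT.
        "opm.summarizer": [SUMMARIZER_PLUGIN_ENTRY_POINT],
    },
)
```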
🔇 Additional comments (1)
ovos_solver_openai_persona/summarizer.py (1)
11-27: LGTM! The constructor properly initializes both the parent class and the LLM instance, with good support for configuration overrides.
Actionable comments posted: 0
🧹 Nitpick comments (2)
ovos_solver_openai_persona/summarizer.py (2)
14-30: Fix indentation in constructor. The indentation in the super() call and LLM initialization is inconsistent with Python's style guide (PEP 8).
Apply this diff to fix the indentation:
-        super().__init__(config=config, translator=translator,
-                         detector=detector, priority=priority,
-                         enable_tx=enable_tx, enable_cache=enable_cache,
-                         internal_lang=internal_lang)
-        self.llm = OpenAIChatCompletionsSolver(config=config, translator=translator,
-                                               detector=detector, priority=priority,
-                                               enable_tx=enable_tx, enable_cache=enable_cache,
-                                               internal_lang=internal_lang)
+        super().__init__(
+            config=config,
+            translator=translator,
+            detector=detector,
+            priority=priority,
+            enable_tx=enable_tx,
+            enable_cache=enable_cache,
+            internal_lang=internal_lang
+        )
+        self.llm = OpenAIChatCompletionsSolver(
+            config=config,
+            translator=translator,
+            detector=detector,
+            priority=priority,
+            enable_tx=enable_tx,
+            enable_cache=enable_cache,
+            internal_lang=internal_lang
+        )
40-41: Add error handling for LLM calls. The LLM call might fail due to API errors or network issues. Consider adding error handling to provide a graceful fallback.
Apply this diff to add error handling:
     def get_tldr(self, document: str, lang: Optional[str] = None) -> str:
         prompt = self.prompt_template.format(content=document)
-        return self.llm.get_spoken_answer(prompt, lang)
+        try:
+            return self.llm.get_spoken_answer(prompt, lang)
+        except Exception as e:
+            # Log the error and return a fallback message or re-raise
+            # depending on your error handling strategy
+            raise RuntimeError(f"Failed to generate summary: {str(e)}") from e
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- ovos_solver_openai_persona/engines.py (3 hunks)
- ovos_solver_openai_persona/summarizer.py (1 hunks)
- setup.py (2 hunks)
🔥 Files not summarized due to errors (1)
- ovos_solver_openai_persona/engines.py: Error: Disallowed special token found: <|im_end|>
🚧 Files skipped from review as they are similar to previous changes (1)
- setup.py
🔇 Additional comments (2)
ovos_solver_openai_persona/summarizer.py (2)
1-13: LGTM! The imports are complete, and the class declaration with the template definition is well-structured. The template provides clear instructions for the summarization task.
32-41: LGTM! The method is well-implemented with proper type hints, clear documentation, and correct usage of parameters. The previous issue with the undefined content variable has been fixed.