
fix: add missing retry dependency to LLM example requirements#359

Open
Chinyemba-ck wants to merge 1 commit into kubeedge:main from Chinyemba-ck:fix/add-retry-dependency

Conversation

@Chinyemba-ck

/kind bug

What this PR does / why we need it:

This PR fixes a missing dependency in the cloud-edge collaborative inference for LLM example. The code imports the retry package in testalgorithms/query-routing/models/api_llm.py but doesn't list it in requirements.txt, causing a ModuleNotFoundError at runtime when users try to run the example.

This prevents new users from successfully running the LLM example when following the documentation.

Which issue(s) this PR fixes:

N/A - This issue was discovered during a complete reproduction of the example from scratch.

Additional context:

  • Root Cause: Line 21 of testalgorithms/query-routing/models/api_llm.py contains from retry import retry, but the retry package is not listed in the example's requirements.txt
  • Impact: The benchmark fails immediately on startup with ModuleNotFoundError before any inference can run
  • Fix: Added retry as a dependency in examples/cloud-edge-collaborative-inference-for-llm/requirements.txt
  • Testing: Verified the fix resolves the import error. The benchmark initializes successfully and proceeds past the module loading stage.
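For context, the `retry` package named in the root cause provides a decorator that re-invokes a failing function. Below is a minimal, stdlib-only sketch of that pattern for readers unfamiliar with it; the decorator here is a hand-rolled stand-in, not the PyPI package itself, and the function name `call_llm_api` is hypothetical, not taken from api_llm.py:

```python
import functools
import time


def retry(exceptions=Exception, tries=3, delay=0.0):
    """Hand-rolled stand-in for the PyPI 'retry' decorator: re-invoke the
    wrapped function up to `tries` times, sleeping `delay` between attempts."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            remaining = tries
            while True:
                try:
                    return func(*args, **kwargs)
                except exceptions:
                    remaining -= 1
                    if remaining <= 0:
                        raise  # out of attempts: surface the last error
                    time.sleep(delay)
        return wrapper
    return decorator


attempts = []


@retry(exceptions=ValueError, tries=3)
def call_llm_api():
    # Hypothetical flaky call: fails twice, then succeeds on the third try.
    attempts.append(1)
    if len(attempts) < 3:
        raise ValueError("transient API error")
    return "ok"


print(call_llm_api())  # prints "ok" after two retried failures
```

This is why the import matters at module load time: the decorator is applied when api_llm.py is imported, so a missing `retry` package fails immediately, before any inference runs.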

Reproduction steps (before fix):

  1. Build Docker image: docker build -t ianvs-experiment-image ./examples/cloud-edge-collaborative-inference-for-llm/
  2. Run container and activate conda environment
  3. Execute: ianvs -f examples/cloud-edge-collaborative-inference-for-llm/benchmarkingjob.yaml
  4. Result: ModuleNotFoundError: No module named 'retry'

After fix: The ModuleNotFoundError is resolved.
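As a general pre-flight check, missing imports like this can be caught before launching a long-running benchmark. A hedged sketch using only the standard library (the helper name `missing_modules` is an illustration, not part of ianvs; one would pass the example's real dependency names, e.g. "retry"):

```python
import importlib.util


def missing_modules(names):
    """Return the subset of module names that cannot be resolved."""
    return [n for n in names if importlib.util.find_spec(n) is None]


# Demo with a stdlib module and a deliberately nonexistent one;
# in practice, check the example's runtime imports such as "retry".
print(missing_modules(["json", "nonexistent_module_xyz"]))
```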

@kubeedge-bot added the kind/bug label (Categorizes issue or PR as related to a bug) on Feb 15, 2026
@kubeedge-bot
Collaborator

Welcome @Chinyemba-ck! It looks like this is your first PR to kubeedge/ianvs 🎉

@gemini-code-assist

Summary of Changes

Hello @Chinyemba-ck, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request resolves a critical bug in the cloud-edge collaborative inference for LLM example by addressing a missing dependency. The absence of the retry package in the requirements.txt file previously led to a ModuleNotFoundError, preventing the example from running. Integrating this dependency ensures the example can now initialize and execute as intended, significantly improving the reliability and user experience for those following the documentation.

Highlights

  • Missing Dependency Fix: The retry package, which is imported by api_llm.py in the cloud-edge LLM example, was not listed in the corresponding requirements.txt file. This omission caused a ModuleNotFoundError when attempting to run the example.
  • Example Functionality Restored: By adding retry to the requirements.txt, the example now initializes successfully and proceeds past the module loading stage, resolving the ModuleNotFoundError and allowing users to run the benchmark as intended.


Changelog
  • examples/cloud-edge-collaborative-inference-for-llm/requirements.txt
    • Added the retry package to the list of required dependencies.
Activity
  • The PR author, Chinyemba-ck, identified a ModuleNotFoundError during a complete reproduction of the example from scratch.
  • The fix was verified to resolve the import error, allowing the benchmark to initialize successfully and proceed past the module loading stage.

@kubeedge-bot
Collaborator

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: Chinyemba-ck
To complete the pull request process, please assign moorezheng after the PR has been reviewed.
You can assign the PR to them by writing /assign @moorezheng in a comment when ready.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files.

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@kubeedge-bot added the size/XS label (Denotes a PR that changes 0-9 lines, ignoring generated files) on Feb 15, 2026
The cloud-edge collaborative inference for LLM example imports
the 'retry' package in api_llm.py but doesn't list it in
requirements.txt, causing ModuleNotFoundError at runtime.

This adds 'retry' to the example's requirements.txt to fix
the missing dependency issue.

Signed-off-by: Chinyemba-ck <chinyembakalenga99@gmail.com>
@Chinyemba-ck force-pushed the fix/add-retry-dependency branch from 09b0168 to 6909e89 on February 15, 2026 at 03:13

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request correctly addresses a ModuleNotFoundError by adding the missing retry dependency. The change is accurate and well-described. My review includes a suggestion to improve the maintainability of the requirements.txt file by sorting the dependencies and ensuring it ends with a newline, which are common best practices.

@Chinyemba-ck
Author

/assign @MooreZheng

