Conversation

@lewistransts
Collaborator

No description provided.

…file inputs and provide detailed processing feedback

This change allows users to process multiple files in a single command, improving usability. It also adds error handling and notifications for successful and failed file processing, ensuring users receive clear feedback on the command's execution.
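The per-file feedback described above might look roughly like this (a hypothetical sketch, not the PR's actual code: processing is stubbed out as a plain file read, and the caller turns the two returned lists into success/failure notifications):

```python
from typing import List, Tuple


def process_files(file_paths: List[str]) -> Tuple[List[str], List[str]]:
    """Process each file in turn, collecting successes and failures.

    Hypothetical sketch: real processing is stubbed as a simple read, and
    the caller is expected to report the two lists back to the user.
    """
    succeeded: List[str] = []
    failed: List[str] = []
    for file_path in file_paths:
        try:
            with open(file_path, "r", encoding="utf-8") as f:
                f.read()
        except OSError:
            # A missing or unreadable file fails on its own, without
            # aborting the remaining files in the batch
            failed.append(file_path)
        else:
            succeeded.append(file_path)
    return succeeded, failed
```

Collecting results per file (rather than raising on the first error) is what lets a single command report clear feedback for every input.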
…y and maintainability

fix(command_processor.py): ensure consistent handling of file content types and improve variable declarations
…e unused notification code to streamline the file command handling process
…e processing and simplify success notification format
combined_parts.append(f"<file path='{file_path}'>\n{content}\n</file>")
combined_content = "\n\n".join(combined_parts)

message_content: Dict[str, Any] = {"role": "user", "content": [{"type": "text", "text": combined_content}]}
Collaborator Author

Changed message_content to this format to be consistent with the other messages.
I ran into an issue where I couldn't receive a response from the LLM when using the previous format {"role": "user", "content": [file_content]}

content_text: str
if isinstance(file_content, dict) and "text" in file_content:
content_text = file_content["text"]
elif isinstance(file_content, str):
Collaborator

Do not process the file here; the LLM side already handles the file message in the agent.format_message function. Only append the file content to the message as it is.
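In other words, the command processor would pass the handler's output through unchanged. A minimal sketch of that idea (the names here are assumptions, not the project's actual API):

```python
from typing import Any, List


def append_file_message(messages: List[Any], file_content: Any) -> None:
    """Append the file handler's output to the message history unchanged.

    No type inspection or re-wrapping happens here: per the review comment,
    formatting is the responsibility of the agent's format_message step.
    """
    messages.append(file_content)
```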

Collaborator Author

Reverted to the previous logic, with these modifications:

  • added a for loop to support multiple files
  • added type hints
  • changed the message history format to {"role": "user", "content": [{"type": "text", "text": "<file file_path='..'> ... </file>"}]}
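Taken together, those modifications might look roughly like this (a sketch under assumptions: file contents are read directly from disk here rather than going through the project's FileHandler):

```python
from typing import Any, Dict, List


def build_file_message(file_paths: List[str]) -> Dict[str, Any]:
    """Combine multiple files into one user message with typed text content.

    Sketch only: reads each file directly instead of via the file handler.
    """
    combined_parts: List[str] = []
    for file_path in file_paths:  # loop added to support multiple files
        with open(file_path, "r", encoding="utf-8") as f:
            content = f.read()
        # Wrap each file's content in a tag that records its path
        combined_parts.append(f'<file file_path="{file_path}">\n{content}\n</file>')
    combined_content = "\n\n".join(combined_parts)
    # History format kept consistent with the other messages
    return {"role": "user", "content": [{"type": "text", "text": combined_content}]}
```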

…oving unnecessary checks and consolidating logic for better readability and maintainability
@lewistransts lewistransts force-pushed the feat/add-multiple-files branch from 9c0ac5d to d1a4187 Compare July 6, 2025 02:59
…s to specify it contains dictionaries and enhance file content formatting

feat(command_processor.py): wrap file content with XML tags and structure it as a dictionary for better message handling
# Process file with the file handling service
if self.message_handler.file_handler is None:
self.message_handler.file_handler = FileHandler()
file_content: Optional[Any] = self.message_handler.file_handler.process_file(file_path)
Collaborator

No need to define a type for file_content here.

Collaborator Author

fixed

if file_content:
# Wrap file content with XML tags including file path
formatted_content: str = f'<file file_path="{file_path}">\n{file_content}\n</file>'
Collaborator

Append the file_content value directly; the agent and file_handler already format the file content.

Collaborator Author

@lewistransts lewistransts Jul 6, 2025

fixed

…emoving unnecessary variable assignments and XML formatting to enhance code clarity and maintainability
@daltonnyx daltonnyx merged commit ab65cf1 into saigontechnology:main Jul 6, 2025
4 checks passed