Commit ed587c2

feat: add logger

1 parent 0f3dd95 commit ed587c2

7 files changed: +50 −39 lines changed


Makefile

Lines changed: 5 additions & 2 deletions

```diff
@@ -1,10 +1,13 @@
-.PHONY: chat logs
+.PHONY: chat dev console
 
 install:
 	uv sync && uv run pre-commit install
 
 chat:
 	uv run chat
 
-logs:
+console:
 	uv run textual console -x SYSTEM -x EVENT -x DEBUG -x INFO
+
+dev:
+	uv run textual run --dev -c chat
```

README.md

Lines changed: 16 additions & 0 deletions

```diff
@@ -27,3 +27,19 @@ Additional MCP servers are configured in `agent-chat-cli.config.yaml` and prompt
 - Typechecking is via [MyPy](https://github.com/python/mypy):
   - `uv run mypy src`
 - Linting and formatting is via [Ruff](https://docs.astral.sh/ruff/)
+
+Textual has an integrated logging console which one can boot separately from the app to receive logs.
+
+In one terminal pane boot the console:
+
+```bash
+make console
+```
+
+> Note: this command intentionally filters out more verbose notifications. See the Makefile to configure.
+
+And then in a second, start the textual dev server:
+
+```bash
+make dev
+```
```

agent-chat-cli.config.yaml

Lines changed: 1 addition & 1 deletion

```diff
@@ -4,7 +4,7 @@
 system_prompt: system.md
 
 # Model to use (e.g., sonnet, opus, haiku)
-model: sonnet
+model: haiku
 
 # Enable streaming responses
 include_partial_messages: true
```
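
The file above is plain YAML, so the model switch can be checked by parsing it directly. A minimal sketch, assuming PyYAML is available (the repo's actual loader lives in `utils/config.py`):

```python
import yaml  # PyYAML; assumed installed, not part of this commit

# Inline copy of the hunk's resulting config for illustration.
raw = """
system_prompt: system.md

# Model to use (e.g., sonnet, opus, haiku)
model: haiku

# Enable streaming responses
include_partial_messages: true
"""

config = yaml.safe_load(raw)
print(config["model"])
```

The comments are discarded by the parser; only the three key/value pairs survive.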

src/agent_chat_cli/app.py

Lines changed: 2 additions & 8 deletions

```diff
@@ -1,26 +1,21 @@
-import logging
 import asyncio
 
 from textual.app import App, ComposeResult
 from textual.containers import VerticalScroll
 from textual.binding import Binding
-from textual.logging import TextualHandler
 
 from agent_chat_cli.components.header import Header
 from agent_chat_cli.components.chat_history import ChatHistory, MessagePosted
 from agent_chat_cli.components.thinking_indicator import ThinkingIndicator
 from agent_chat_cli.components.user_input import UserInput
 from agent_chat_cli.utils import AgentLoop
 from agent_chat_cli.utils.message_bus import MessageBus
+from agent_chat_cli.utils.logger import setup_logging
 
 from dotenv import load_dotenv
 
 load_dotenv()
-
-logging.basicConfig(
-    level="NOTSET",
-    handlers=[TextualHandler()],
-)
+setup_logging()
 
 
 class AgentChatCLIApp(App):
@@ -45,7 +40,6 @@ def compose(self) -> ComposeResult:
         yield UserInput(query=self.agent_loop.query)
 
     async def on_mount(self) -> None:
-        logging.debug("Starting agent loop...")
         asyncio.create_task(self.agent_loop.start())
 
     async def on_message_posted(self, event: MessagePosted) -> None:
```

src/agent_chat_cli/docs/architecture.md

Lines changed: 0 additions & 22 deletions

````diff
@@ -4,28 +4,6 @@
 
 Agent Chat CLI is a Python TUI application built with Textual that provides an interactive chat interface for Claude AI with MCP (Model Context Protocol) server support.
 
-## Directory Structure
-
-```
-src/agent_chat_cli/
-├── app.py                     # Main application entry point
-├── components/                # UI components (Textual widgets)
-│   ├── chat_history.py        # Container for chat messages
-│   ├── messages.py            # Message widgets and UI Message type
-│   ├── user_input.py          # User input component
-│   ├── thinking_indicator.py  # Loading indicator
-│   └── header.py              # App header with config info
-├── utils/                     # Business logic and utilities
-│   ├── agent_loop.py          # Claude SDK conversation loop
-│   ├── message_bus.py         # Routes messages between agent and UI
-│   ├── config.py              # Configuration loading and validation
-│   ├── enums.py               # All enum types
-│   ├── system_prompt.py       # System prompt assembly
-│   ├── format_tool_input.py   # Tool input formatting
-│   └── tool_info.py           # MCP tool name parsing
-└── utils/styles.tcss          # Textual CSS styles
-```
-
 ## Core Components
 
 ### App Layer (`app.py`)
````

src/agent_chat_cli/utils/logger.py

Lines changed: 20 additions & 0 deletions

```diff
@@ -0,0 +1,20 @@
+import json
+import logging
+from typing import Any
+
+from textual.logging import TextualHandler
+
+
+def setup_logging():
+    logging.basicConfig(
+        level="NOTSET",
+        handlers=[TextualHandler()],
+    )
+
+
+def log(message: str):
+    logging.info(message)
+
+
+def log_json(message: Any):
+    logging.info(json.dumps(message, indent=2))
```
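
To illustrate what these helpers do, here is a minimal sketch with an in-memory `StreamHandler` standing in for `TextualHandler`, so the snippet runs outside a Textual app (the `buffer` name and the stand-in handler are ours, not the repo's):

```python
import json
import logging
from io import StringIO
from typing import Any

# Stand-in for textual.logging.TextualHandler: capture log output in memory.
buffer = StringIO()
logging.basicConfig(
    level="NOTSET",  # root logger at NOTSET handles every level
    handlers=[logging.StreamHandler(buffer)],
    force=True,  # replace any handlers configured earlier in the process
)

def log_json(message: Any) -> None:
    # Mirrors logger.log_json: pretty-print any JSON-serializable value at INFO.
    logging.info(json.dumps(message, indent=2))

log_json({"event": "tool_call", "name": "search"})
print(buffer.getvalue())
```

With `TextualHandler` attached instead, the same records are routed to the `textual console` session started by `make console`.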

src/agent_chat_cli/utils/message_bus.py

Lines changed: 6 additions & 6 deletions

```diff
@@ -23,12 +23,6 @@ def __init__(self, app: "App") -> None:
         self.current_agent_message: AgentMessageWidget | None = None
         self.current_response_text = ""
 
-    async def _scroll_to_bottom(self) -> None:
-        """Scroll the container to the bottom after a slight pause."""
-        await asyncio.sleep(0.1)
-        container = self.app.query_one("#container")
-        container.scroll_end(animate=False, immediate=True)
-
     async def handle_agent_message(self, message: AgentMessage) -> None:
         match message.type:
             case AgentMessageType.STREAM_EVENT:
@@ -38,6 +32,12 @@ async def handle_agent_message(self, message: AgentMessage) -> None:
             case AgentMessageType.RESULT:
                 await self._handle_result()
 
+    async def _scroll_to_bottom(self) -> None:
+        """Scroll the container to the bottom after a slight pause."""
+        await asyncio.sleep(0.1)
+        container = self.app.query_one("#container")
+        container.scroll_end(animate=False, immediate=True)
+
     async def _handle_stream_event(self, message: AgentMessage) -> None:
         text_chunk = message.data.get("text", "")
 
```
