Conversation

logan-markewich
Contributor

There was a small bug in the llama-index examples that use state.

Basically, tools that modify state need to save it at the end. Otherwise, a final state snapshot event emitted from the integration code will erase all the modifications made to the state so far.

The fix is just to add `await ctx.store.set("state", state)`. However, newer versions of llama-index workflows have an async context manager that makes it even easier to access and modify state safely. Within the context manager, no other tool can modify state, so it's safe even if the agent makes multiple tool calls at once.

@tylerslaton
Contributor

Ignore the failures ^; it's just an issue with our CI not running for external contributors.

@tylerslaton
Contributor

@logan-markewich for the failure in Check Generated Files, just cd into the dojo folder, run `pnpm generate-files`, and commit the results.

That'll fix it 👍🏻

@tylerslaton
Contributor

Looks like the E2E Suite caught an issue as well - any ideas?

[Llama Index] Traceback (most recent call last):
[Llama Index]   File "/home/runner/work/ag-ui/ag-ui/typescript-sdk/integrations/llamaindex/server-py/.venv/bin/dev", line 4, in <module>
[Llama Index]     from server import main
[Llama Index]   File "/home/runner/work/ag-ui/ag-ui/typescript-sdk/integrations/llamaindex/server-py/server/__init__.py", line 5, in <module>
[Llama Index]     from .routers.agentic_chat import agentic_chat_router
[Llama Index]   File "/home/runner/work/ag-ui/ag-ui/typescript-sdk/integrations/llamaindex/server-py/server/routers/agentic_chat.py", line 1, in <module>
[Llama Index]     from llama_index.llms.openai import OpenAI
[Llama Index]   File "/home/runner/work/ag-ui/ag-ui/typescript-sdk/integrations/llamaindex/server-py/.venv/lib/python3.12/site-packages/llama_index/llms/openai/__init__.py", line 1, in <module>
[Llama Index]     from llama_index.llms.openai.base import AsyncOpenAI, OpenAI, SyncOpenAI, Tokenizer
[Llama Index]   File "/home/runner/work/ag-ui/ag-ui/typescript-sdk/integrations/llamaindex/server-py/.venv/lib/python3.12/site-packages/llama_index/llms/openai/base.py", line 65, in <module>
[Llama Index]     from llama_index.llms.openai.utils import (
[Llama Index]   File "/home/runner/work/ag-ui/ag-ui/typescript-sdk/integrations/llamaindex/server-py/.venv/lib/python3.12/site-packages/llama_index/llms/openai/utils.py", line 24, in <module>
[Llama Index]     from llama_index.core.base.llms.types import (
[Llama Index] ImportError: cannot import name 'ThinkingBlock' from 'llama_index.core.base.llms.types' (/home/runner/work/ag-ui/ag-ui/typescript-sdk/integrations/llamaindex/server-py/.venv/lib/python3.12/site-packages/llama_index/core/base/llms/types.py)
[Llama Index] uv run dev exited with code 1

@logan-markewich
Contributor Author

@tylerslaton I needed to pin the deps harder; should be resolved now. Also ran `pnpm generate-files` and committed the results.
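For context, the `ImportError: cannot import name 'ThinkingBlock'` above is the classic symptom of `llama-index-llms-openai` resolving to a newer release than the installed `llama-index-core`. "Pinning harder" typically means constraining both packages to known-compatible versions, roughly like this (the version numbers below are placeholders, not the ones actually committed in this PR):

```toml
[project]
dependencies = [
    # Pin core and the OpenAI LLM integration to versions that are
    # known to work together (placeholder versions for illustration):
    "llama-index-core>=0.12.0,<0.13.0",
    "llama-index-llms-openai>=0.3.0,<0.4.0",
]
```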

@tylerslaton
Contributor

Merged in #416
