To return Plotly figures from an agent tool to the frontend using Chainlit, you can extend the LlamaIndex `TaskStepOutput` class so that a step can carry either plain text or a figure:

```python
from typing import Union

import plotly.graph_objects as go
from llama_index.core.agent.runner.base import TaskStepOutput


class CustomTaskStepOutput(TaskStepOutput):
    # Allow a step to return either plain text or a Plotly figure
    output: Union[str, go.Figure]
```
Then subclass `AgentRunner` to intercept steps that produce a figure (`validate_step_from_args` is assumed to be available from the same runner module):

```python
from typing import Any, Optional

import plotly.graph_objects as go
from llama_index.core.agent.runner.base import AgentRunner
from llama_index.core.agent.types import TaskStep


class CustomAgentRunner(AgentRunner):
    def run_step(
        self,
        task_id: str,
        input: Optional[str] = None,
        step: Optional[TaskStep] = None,
        **kwargs: Any,
    ) -> CustomTaskStepOutput:
        """Run step, passing Plotly figures through unchanged."""
        step = validate_step_from_args(task_id, input, step, **kwargs)
        if isinstance(step.output, go.Figure):
            # Wrap the figure so downstream code can detect it
            return CustomTaskStepOutput(output=step.output)
        return super().run_step(task_id, input, step, **kwargs)
```
Next, override `finalize_response` on the same subclass (shown separately here for clarity) to serialize the figure to JSON before it is handed back; the `AGENT_CHAT_RESPONSE_TYPE` import path below is an assumption based on recent llama_index versions:

```python
from typing import Optional

import plotly.graph_objects as go
from llama_index.core.agent.runner.base import AgentRunner
from llama_index.core.chat_engine.types import AGENT_CHAT_RESPONSE_TYPE


class CustomAgentRunner(AgentRunner):
    def finalize_response(
        self,
        task_id: str,
        step_output: Optional[CustomTaskStepOutput] = None,
    ) -> AGENT_CHAT_RESPONSE_TYPE:
        """Finalize response, serializing Plotly figures to JSON."""
        if step_output is None:
            step_output = self.state.get_completed_steps(task_id)[-1]
        if not step_output.is_last:
            raise ValueError(
                "finalize_response can only be called on the last step output"
            )
        if isinstance(step_output.output, go.Figure):
            # Return the figure to the frontend as a JSON string
            return step_output.output.to_json()
        return super().finalize_response(task_id, step_output)
```
Finally, modify your Chainlit code to handle the JSON representation of the Plotly figure (this version uses Chainlit's `cl.Plotly` element to render the figure inline, rather than embedding raw HTML in the message content):

```python
import json

import chainlit as cl
import plotly.graph_objects as go
from utils import agent


@cl.on_chat_start
async def on_chat_start():
    cl.user_session.set("agent", agent)
    await cl.Message("Hi!").send()


@cl.on_message
async def on_message(msg: cl.Message):
    agent = cl.user_session.get("agent")
    response = await cl.make_async(agent.chat)(msg.content)
    # Check whether the response is a Plotly figure serialized as JSON
    try:
        fig = go.Figure(json.loads(str(response)))
        await cl.Message(
            content="",
            elements=[cl.Plotly(name="chart", figure=fig, display="inline")],
            author="iFactory Assistant",
        ).send()
    except (json.JSONDecodeError, ValueError):
        # Not figure JSON: fall back to a plain text reply
        await cl.Message(content=str(response), author="iFactory Assistant").send()
```

By following these steps, you can extend the LlamaIndex framework to handle and return Plotly figures, and integrate this functionality with Chainlit to display the figures in your frontend [1].
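Relying on a bare try/except around `json.loads` is slightly fragile, since an ordinary text reply could itself be valid JSON. A small hypothetical helper that checks for the shape of a serialized Plotly figure before attempting to build one:

```python
import json


def looks_like_plotly_json(text: str) -> bool:
    """Return True if `text` parses as JSON with the shape of a Plotly figure."""
    try:
        obj = json.loads(text)
    except (json.JSONDecodeError, TypeError):
        return False
    # A serialized go.Figure is a JSON object with "data" and/or "layout" keys
    return isinstance(obj, dict) and ("data" in obj or "layout" in obj)
```

In the `on_message` handler, this check could guard the figure-construction branch, falling back to a plain text message whenever it returns False.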
Hello!
I'm designing a tool for an agent to use to generate Plotly figures. I'm able to make this work in a Jupyter notebook, but I'm not sure how to get the agent to return just the Plotly figure to my frontend, which I'm building with Chainlit.
Code snippet without Chainlit:
This code snippet runs neatly in a notebook with the figure showing. When I left everything as is with Chainlit, my Plotly diagram opens in a new tab, and the LLM replies "Here is the chart" with a blank image icon. When I returned `fig` instead of `plotted`, there was an error, because the LLM doesn't actually know how to work with the graph object and still tries to reply in natural text.
Code snippet with Chainlit:
I was wondering: is there a way to send just the figure as an output from the agent's tool to the frontend? Otherwise I'll have to create tmp folders to save the images and render them as static files.