
LangChain LCEL output appears unparsed #1767

Description

@jeffbryner

If I instantiate an LLM using LangChain and use it in ui.Chat, the output works as expected.

If I attempt to use LCEL to chain prompts/LLMs together, the output appears unparsed, as a string representation of the AIMessage and system message:

[screenshot of the unparsed chat output]

Here's the code:

import google.auth
from shiny.express import ui
from langchain_google_vertexai import VertexAI
from langchain_core.prompts import ChatPromptTemplate


credentials, PROJECT_ID = google.auth.default()

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You love vanilla ice cream and will recommend it always. ",
        ),
        ("human", "{question}"),
    ]
)
llm = VertexAI(
    model="gemini-1.5-pro",
    temperature=1,
    max_tokens=4096,
    max_retries=5,
    location="us-central1",
    project=PROJECT_ID,
    # safety_settings=safety_settings,  # safety_settings not defined in this snippet
    streaming=True,
)
runnable = prompt | llm

chat = ui.Chat(id="chat")
chat.ui()

@chat.on_user_submit
async def _():
    messages = chat.messages(format="langchain")

    # streaming 
    # response = runnable.astream(messages)
    # await chat.append_message_stream(response)

    # non streaming
    response = await runnable.ainvoke(messages)
    await chat.append_message(response)

Streaming or non-streaming results in the same output issue. If I reference the llm directly instead of the runnable, the output works as expected, as shown in the sketch below. I've attempted to research what could be happening internally in shiny, but have come up blank.
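
For comparison, this is the direct-llm variant that renders correctly (a minimal sketch, using the same llm and chat objects defined above):

@chat.on_user_submit
async def _():
    messages = chat.messages(format="langchain")

    # calling the LLM directly (no prompt | llm chain) renders as expected
    response = await llm.ainvoke(messages)
    await chat.append_message(response)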

I've also attempted to use the string output parser in the chain, but it results in the same issue:

from langchain_core.output_parsers import StrOutputParser

runnable = prompt | llm | StrOutputParser()
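
For completeness, this is how I wired the parser chain into the streaming callback (a minimal sketch, same chat object as above); with StrOutputParser the stream yields plain string chunks, but the rendering problem is unchanged:

@chat.on_user_submit
async def _():
    messages = chat.messages(format="langchain")

    # stream from the parser chain; chunks are plain strings here
    response = runnable.astream(messages)
    await chat.append_message_stream(response)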
