
[BUG] colang content causing streaming issues #546

@shiv248

Description

Hello, I have this sample code below that I am trying to run.

It's based off of the streaming docs, but I'm hitting issues with streaming the output of the LLMRails when using colang content. Do I have something set up incorrectly in my `colang_content`?

pip install --upgrade --force-reinstall nemoguardrails openai==0.27.8

import asyncio
from langchain_community.chat_models import ChatOpenAI
from nemoguardrails import LLMRails, RailsConfig
from nemoguardrails.streaming import StreamingHandler
import logging

logging.basicConfig(level=logging.INFO)

openai_api_key = 'sk-...'  # YOUR_API_TOKEN
llm = ChatOpenAI(model_name='gpt-3.5-turbo-16k',
                 openai_api_key=openai_api_key,
                 temperature=0,
                 streaming=True)

YAML_CONFIG = """
models:
  - type: main
    engine: openai
    model: gpt-3.5-turbo-16k

streaming: True
"""

colang_content = """
define user ask weather
  "how is the weather today?"
  "should I wear a coat?"

define bot answer weather service down
  "Unfortunately, I'm unable to access my weather service API at the moment. This means I can't provide you with the real-time weather information such as weather conditions or forecasts for your area"

define flow weather
  user ask weather
  bot answer weather service down

define flow
  user ...
  bot greeting
"""

async def demo_1():
    """Demo using the streaming of response chunks directly."""
    config = RailsConfig.from_content(yaml_content=YAML_CONFIG, colang_content=colang_content)
    #config = RailsConfig.from_content(yaml_content=YAML_CONFIG) #allows streaming but no colang
    app = LLMRails(config, llm=llm, verbose=False)

    history = [{"role": "user", "content": "tell me a story about Unicorns?"}]

    async for chunk in app.stream_async(messages=history):
        print(chunk, end="")


asyncio.run(demo_1())
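One way to confirm whether chunks are actually arriving incrementally (rather than all at once after the rail finishes) is to record the arrival time of each chunk. Below is a minimal, hypothetical diagnostic in the same spirit; `fake_stream` is a stand-in for `app.stream_async(...)`, not the real API, so I can illustrate the pattern without an API key.

```python
import asyncio
import time

async def fake_stream():
    # Stand-in for app.stream_async(...): yields tokens with a small delay,
    # simulating a model that streams incrementally.
    for token in ["Once ", "upon ", "a ", "time"]:
        await asyncio.sleep(0.05)
        yield token

async def check_streaming():
    """Record the arrival time of each chunk relative to the first."""
    arrivals = []
    start = time.monotonic()
    async for chunk in fake_stream():
        arrivals.append((round(time.monotonic() - start, 2), chunk))
    return arrivals

arrivals = asyncio.run(check_streaming())
for t, chunk in arrivals:
    print(f"{t:>5.2f}s  {chunk!r}")
# If every chunk shares (roughly) the same timestamp, the output was buffered
# and emitted in one burst rather than truly streamed.
```

Swapping `fake_stream()` for the real `app.stream_async(messages=history)` would show whether the rail buffers the whole response before yielding.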

I get the entire result at the end, but it doesn't print each chunk as it gets processed. Is that the expected output?

Could it be that I'm using a different version of the Colang language than LLMRails is expecting?
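On the version question: if I understand the docs correctly, recent NeMo Guardrails releases let you pin the Colang dialect explicitly via a `colang_version` key in the YAML config (this is my assumption about how it would look for this config; the `define user ... / define flow ...` syntax above is Colang 1.0):

```yaml
models:
  - type: main
    engine: openai
    model: gpt-3.5-turbo-16k

streaming: True

# Assumed addition: pin the Colang dialect; "1.0" matches the
# `define user ...` / `define flow ...` syntax used above.
colang_version: "1.0"
```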
Thank you for your time!
