Replies: 2 comments 2 replies
-
  @fguillen Which LLM are you using? You should be able to stream the response back.
            -
  Thanks for the support, @andreibondarev!
-
I am using langchainrb to build a RAG Rails-based app. Thanks for the library! :)
On my roadmap is showing the model's responses as it builds them, chunk by chunk, to offer a more dynamic experience to the user.
What would be the approach to doing this with langchainrb?
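A minimal sketch of the chunk-by-chunk pattern described above. `StreamingAnswer` and the broadcast callback are hypothetical helper names, not part of langchainrb; the commented wiring at the bottom assumes the library's chat method yields streamed chunks to a block (check the langchainrb docs for the exact chunk shape for your provider).

```ruby
# Accumulates streamed chunks and invokes a callback with the text so far.
# In a Rails app the callback would typically broadcast a Turbo Stream
# update so the user sees the answer grow chunk by chunk.
class StreamingAnswer
  attr_reader :text

  def initialize(&on_chunk)
    @text = +""          # mutable buffer for the answer so far
    @on_chunk = on_chunk # e.g. ->(t) { broadcast a Turbo Stream replace }
  end

  # Append one chunk and notify the callback; returns self for chaining.
  def <<(chunk)
    @text << chunk
    @on_chunk&.call(@text)
    self
  end
end

# Hypothetical wiring with a real LLM (assumption: langchainrb's chat
# method yields streamed chunk hashes to a block when one is given):
#
#   llm = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
#   answer = StreamingAnswer.new { |text| broadcast_update(text) }
#   llm.chat(messages: [{ role: "user", content: question }]) do |chunk|
#     answer << chunk.dig("delta", "content").to_s
#   end
```

The key design point is separating chunk accumulation from delivery: the same `StreamingAnswer` object works whether the callback prints to a console, pushes over ActionCable, or broadcasts a Turbo Stream.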