
Remote Agentic AI Backend with LangServe, LangGraph consumed as RemoteRunnable in NextJS with AI/RSC #50

@elvenking

Description

First of all, thank you @developersdigest for this excellent YouTube tutorial and repo.

Please take the following as an idea and a potential feature request.

I feel it is not ideal to mix all the AI logic into the web app. I would like to see a stable example that separates the concerns of a complex AI backend from the consumer NextJS web app.

Imagine the following setup for an Answer Engine:

Backend:

  • Python - as it is the lingua franca of LLMs and home to the better part of the LangChain libraries
  • LangServe (with LCEL streaming) as the API layer
  • LangGraph for complex state & process management and tool invocation
  • Persistence of history (graph state), e.g. in EdgeDB or Zep
  • Rate limiting
  • Semantic Caching

Frontend:

  • NextJS app
  • Streaming AI backend consumed as a LangChain RemoteRunnable
  • Vercel AI SDK with AI/RSC for streaming React components from the AI backend
  • Ability to stream intermediate agentic results / graph state from the remote backend

There have been multiple requests for such a setup, but nobody has come up with a stable solution.

I have filed a similar request in the Vercel AI SDK repo:
vercel/ai#1506
