1.23.0

@github-actions released this 08 Jul 18:02
· 120 commits to main since this release
5ad0c04
Streaming chat cleanup & in-process Docling (#247)

* refactoring/cleanup

* type fixes and import cleanup

* name changes

* wip

* event testing

* wip event queue

* moving poison pill around
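
Several commits above iterate on an in-process queue that carries chat events from the agent worker to the streaming response, terminated by a poison-pill sentinel. A minimal sketch of that pattern, with illustrative names rather than the actual implementation in this repo:

```python
import queue
import threading

POISON_PILL = object()  # sentinel marking the end of the event stream

def produce_events(events, q):
    # Agent thread: push each chat event onto the queue, then the pill.
    for event in events:
        q.put(event)
    q.put(POISON_PILL)

def consume_events(q):
    # Consumer side: drain the queue until the sentinel arrives.
    while True:
        event = q.get()
        if event is POISON_PILL:
            return
        yield event

q = queue.Queue()
t = threading.Thread(target=produce_events,
                     args=(["tool_call", "tool_result", "answer"], q))
t.start()
received = list(consume_events(q))
t.join()
```

The sentinel is compared by identity (`is`), so ordinary event payloads can never be mistaken for it.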

* event wip

* WIP event queue

* WIP event queue

* WIP event queue

* wip on chat events

* work in progress on chat events

* WIP event queue

* drop databases

* wip on OpenAI streaming events

* send additional done after we're really done
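
The commit above implies the stream emits an explicit terminal marker only after all content has been flushed. A hedged sketch of that shape (event names are assumptions, not the actual wire format):

```python
def stream_chat_response(deltas):
    # Yield each response delta as a small event payload, then an
    # explicit terminal "done" event once the stream has truly finished.
    for delta in deltas:
        yield {"event": "chunk", "data": delta}
    yield {"event": "done", "data": None}

events = list(stream_chat_response(["Hel", "lo"]))
```

Sending the completion marker last lets the client distinguish "stream ended cleanly" from "connection dropped mid-response".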

* getting close to streaming events on non-OpenAI agents

* gracefully shut down handler and close loop

* python cleanup

* error handling in the non-OpenAI streaming

* cleanup

* render contents of `<a>` tags and remove chat event queue

* input for date tool

* default input

* fix duplicated timestamp issue

* mypy

* remove OpenAIAgent

* update lock file

* Docling enhancements to parsing support and sampling summary nodes (#248)

* work on docling native parsing

* native parsing to json work, no formatting

* docling + markdown + page numbers

* small cleanup

* use docling for parsing docs

* only use docling readers for pdf and html

* change chunk API to return a list of results

* batch the csv results

* conditionally condense questions
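
"Conditionally condense questions" likely refers to the common RAG step of rewriting a follow-up question against chat history, skipped when there is no history. A sketch under that assumption (function names are illustrative, not this service's API):

```python
def maybe_condense(question, chat_history, condense_fn):
    # A first-turn question is already standalone; only rewrite it
    # against prior turns when history actually exists.
    if not chat_history:
        return question
    return condense_fn(question, chat_history)

# Dummy condense function standing in for an LLM-backed rewriter.
first = maybe_condense("What is Docling?", [], lambda q, h: "condensed")
follow_up = maybe_condense("Who maintains it?", ["What is Docling?"],
                           lambda q, h: "condensed")
```

Skipping the condense step on the first turn avoids one LLM round trip per fresh session.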

* Revert "batch the csv results"

This reverts commit ff3936fc7c2490609137d070052beab889c7a619.

* Revert "change chunk API to return a list of results"

This reverts commit 4ea267f7ebdf19fb442ace03eed53caf5e922d94.

* implement block-sampling for summarization

* add test for block sampling
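
Block sampling for summarization presumably selects a bounded, spread-out subset of a document's blocks so long documents fit a fixed summarization budget. One plausible sketch (the actual strategy in #248 may differ):

```python
def sample_blocks(blocks, max_blocks):
    # Take evenly spaced blocks so the summary covers the beginning,
    # middle, and end of the document, not just the first pages.
    if len(blocks) <= max_blocks:
        return list(blocks)
    step = len(blocks) / max_blocks
    return [blocks[int(i * step)] for i in range(max_blocks)]

sampled = sample_blocks(list(range(100)), 5)
```

Short documents pass through untouched; only documents over the budget are thinned.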

* add some prints for debugging

* update test and doc strings

* handle case where no content in summary

* better status code

* fix mypy

---------

Co-authored-by: jwatson <jkwatson@gmail.com>

* Update release version to dev-testing

* filter out non-final agent response

* fixed typos

* fixed typos wip

* limit error retries for suggested questions and show an error when opening suggested questions

* format agent stream output for non-final responses

* better error messages

* refactor and check for Mistral agent

* fix issue with long responses for session names

* return an empty string instead of throwing an exception when there is no chat history

* throw an error when using a Bedrock model that does not support tool calling

* error handling for non-tool-calling models

* mypy fix and refactoring

* use plain chat when no tools are available, even with tool calling enabled
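
The fallback above can be sketched as: prefer a plain chat completion whenever no tools are registered, even if the model itself supports tool calling. All names here are illustrative stand-ins, not the service's real API:

```python
class FakeModel:
    # Stand-in for an LLM client; real clients differ.
    def chat(self, messages):
        return "plain chat"

    def chat_with_tools(self, tools, messages):
        return "tool-calling chat"

def respond(model, tools, messages):
    # Even with tool calling enabled, fall back to a plain chat
    # completion when no tools are actually registered.
    if not tools:
        return model.chat(messages)
    return model.chat_with_tools(tools, messages)

model = FakeModel()
no_tools = respond(model, [], ["hi"])
with_tools = respond(model, ["search"], ["hi"])
```

This keeps tool-free sessions on the cheaper, more widely supported chat path.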

* remove print and modify error message

* remove print

---------

Co-authored-by: Baasit Sharief <baasitsharief@gmail.com>
Co-authored-by: Elijah Williams <ewilliams@cloudera.com>
Co-authored-by: Michael Liu <mliu@cloudera.com>
Co-authored-by: actions-user <actions@github.com>