
Commit c3f6900

clean up dependencies (#1364)
* clean up dependencies
* add requests
1 parent 93621f7 commit c3f6900


5 files changed: +127 −191 lines changed


.gitignore

Lines changed: 1 addition & 1 deletion
@@ -20,4 +20,4 @@ reflex.db
 venv/
 .env
 build
-*.egg-info
+*.egg-info

blog/2024-04-16-custom-components.md

Lines changed: 0 additions & 73 deletions
@@ -65,79 +65,6 @@ def zoom():
 )
 ```
 
-## High-Level Components
-
-You can also create custom components that don't wrap React, but instead are built out of existing Reflex components. For example, you can define custom navigation bars, modals, or even entire pages as custom components.
-
-In our [intro chatapp tutorial]({getting_started.chatapp_tutorial.path}) we share the code for creating a chat component in Reflex. Having the full code gives you full flexibility in how your chat component works and appears, but sometimes you just want to drop in a chat component and not worry about the details.
-
-With custom components, we now have a [reflex-chat](https://github.com/picklelo/reflex-chat/) package that you can install with `pip` and use in your app.
-
-```bash
-pip install reflex-chat
-```
-
-You can then import the chat component into your app and use it like any other Reflex component. All the styling and logic is encapsulated, you only need to pass in the actual LLM logic.
-
-```python exec
-import reflex as rx
-from openai import OpenAI
-from reflex_chat import chat
-
-try:
-    client = OpenAI()
-except:
-    client = None
-
-# Only define your logic, the chat component handles the rest.
-async def run_llm(chat_state):
-    # Start a new session to answer the question.
-    session = client.chat.completions.create(
-        model="gpt-4o-mini",
-        messages=chat_state.get_messages(),
-        stream=True,
-    )
-
-    # Stream the results, yielding after every word.
-    for item in session:
-        if hasattr(item.choices[0].delta, "content"):
-            answer_text = item.choices[0].delta.content
-            # Ensure answer_text is not None before concatenation
-            chat_state.append_to_response(answer_text)
-            yield
-```
-
-```python demo
-rx.box(
-    chat(process=run_llm),
-    height="500px",
-    width="100%"
-)
-```
-
-```md alert info
-# Just set `process` prop of the chat component to connect to your LLM.
-
-\```python
-async def run_llm(chat_state):
-    # Start a new session to answer the question.
-    session = client.chat.completions.create(
-        model="gpt-4o-mini",
-        messages=chat_state.get_messages(),
-        stream=True,
-    )
-
-    # Stream the results, yielding after every word.
-    for item in session:
-        if hasattr(item.choices[0].delta, "content"):
-            answer_text = item.choices[0].delta.content
-            chat_state.append_to_response(answer_text)
-            yield
-\```
-```
-
-Depending on how much control you want, you can either use a high level component directly or create your own component from scratch.
-
 ### Component State
 
 A new feature we've added in 0.4.6 is the `rx.ComponentState` class. This class allows you to encapsulate state and UI for a component in a single class. In fact, this is what allows the `chat` component to work without having the user having to define a state - the component will dynamically create and handle its own state internally.
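
The surviving context above introduces `rx.ComponentState`, which lets a component carry its own state. As a rough illustration of the documented Reflex pattern (this counter example is not part of the commit): a subclass defines state vars and event handlers plus a `get_component` classmethod, and instances are created with `.create()`.

```python
import reflex as rx


class Counter(rx.ComponentState):
    # Per-instance state: every Counter.create() call gets its own copy.
    count: int = 0

    def increment(self):
        self.count += 1

    @classmethod
    def get_component(cls, **props) -> rx.Component:
        # State and UI live in one class; no separate rx.State subclass needed.
        return rx.button(cls.count, on_click=cls.increment, **props)


def index() -> rx.Component:
    # Two independent counters, each tracking its own count.
    return rx.hstack(Counter.create(), Counter.create())
```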

pcweb/pages/framework/demos/chatbot/chatbot.py

Lines changed: 19 additions & 11 deletions
@@ -107,17 +107,25 @@ def chatbot() -> rx.Component:
     )
 
 
-chatbot_code = """import reflex as rx
-# pip install reflex-chat
-from reflex_chat import chat, api
-
-def index() -> rx.Component:
-    return rx.container(
-        rx.box(
-            chat(process=api.openai()),
-            height="100vh",
+chatbot_code = """
+rx.box(
+    rx.icon_button("trash", on_click=ChatState.clear_chat),
+    rx.box(
+        rx.auto_scroll(
+            rx.foreach(
+                ChatState.chat_history,
+                lambda messages: qa(messages[0], messages[1]),
+            ),
+        ),
+        rx.form(
+            rx.input(
+                placeholder="Ask me anything",
+                name="question",
+            ),
+            rx.icon_button("arrow-up"),
+            on_submit=TutorialState.submit,
+            reset_on_submit=True
         ),
-        size="2",
     )
-
+)
 """

pyproject.toml

Lines changed: 5 additions & 9 deletions
@@ -8,25 +8,21 @@ authors = [
 readme = "README.md"
 requires-python = ">=3.11"
 dependencies = [
-    "email-validator==2.1.1",
+    "email-validator==2.2.0",
     "black==23.10.0",
     "pandas>=1.5.3",
-    "psycopg[binary]==3.2.3",
+    "psycopg[binary]==3.2.9",
     "plotly-express==0.4.1",
     "googletrans-py==4.0.0",
-    "typesense==0.14.0",
-    "openai==1.13.3",
-    "pyyaml>=6.0.2",
+    "openai==1.78.1",
     "flexdown>=0.1.5a12,<0.2",
    "reflex @ git+https://github.com/reflex-dev/reflex@main",
     "mistletoe>=1.2.1",
     "reflex-image-zoom>=0.0.2",
-    "reflex-chat==0.0.2a1",
-    "reflex_type_animation==0.0.1",
     "reflex-ag-grid==0.0.10",
-    "replicate==0.32.1",
+    "replicate==1.0.6",
     "reflex-pyplot==0.1.3",
-    "python-multipart==0.0.20",
+    "requests>=2.32.3",
 ]
 
 [dependency-groups]
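
The commit message also notes "add requests", and the diff pins `requests>=2.32.3` as a direct dependency. A plausible reason (not stated in the commit) is that some module in the site imports it directly rather than relying on a transitive install; a trivial, purely illustrative example of such a use:

```python
import requests

# Any direct `import requests` in the codebase is reason to declare it in
# pyproject.toml instead of depending on it arriving transitively.
resp = requests.get("https://api.github.com/repos/reflex-dev/reflex", timeout=10)
print(resp.status_code)
```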
