Commit ef2863b: fix tests and format
1 parent: 95e2767

5 files changed: +16 additions, -6 deletions

docs/faq.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -29,7 +29,7 @@ It means that the call to the LLM failed. This is usually triggered for one of t
 1. An API key is not present or not passed correctly to the LLM
 1. The LLM API was passed arguments it doesn't expect. Our recommendation is to use the LiteLLM standard, and pass arguments that conform to that standard directly in the guard callable. It's helpful as a debugging step to remove all other arguments or to try and use the same arguments in a LiteLLM client directly.
 1. The LLM API is down or experiencing issues. This is usually temporary, and you can use LiteLLM or the LLM client directly to verify if the API is working as expected.
-1. You passed a custom LLM callable, and it either doesn't conform to the expected signature or it throws an error during execution. Make sure that the custom LLM callable can be called as a function that takes in a single prompt string and returns a string.
+1. You passed a custom LLM callable, and it either doesn't conform to the expected signature or it throws an error during execution. Make sure that the custom LLM callable can be called as a function that takes in messages kwarg and returns a string.

 ## How can I host Guardrails as its own server
```

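The FAQ change above swaps a single-prompt signature for a `messages` kwarg. A minimal sketch of what such a callable could look like (the function name and echo logic are hypothetical; the only requirement the FAQ states is accepting a `messages` kwarg and returning a string):

```python
from typing import Dict, List, Optional


def custom_llm(messages: Optional[List[Dict[str, str]]] = None, **kwargs) -> str:
    """Hypothetical custom LLM callable in the shape the FAQ describes:
    it accepts the chat history as a `messages` kwarg and returns a string."""
    messages = messages or []
    # Echo the most recent user message; a real callable would query an LLM here.
    last_user = next(
        (m["content"] for m in reversed(messages) if m.get("role") == "user"),
        "",
    )
    return f"echo: {last_user}"
```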
guardrails/classes/history/inputs.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -98,7 +98,7 @@ def to_dict(self) -> Dict[str, Any]:
     @classmethod
     def from_interface(cls, i_inputs: IInputs) -> "Inputs":
         deserialized_messages = None
-        if i_inputs.messages:  # type: ignore
+        if hasattr(i_inputs, "messages") and i_inputs.messages:  # type: ignore
             deserialized_messages = []
             for msg in i_inputs.messages:  # type: ignore
                 ser_msg = {**msg}
```

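The `inputs.py` change guards attribute access with `hasattr` so that interface objects lacking a `messages` field no longer raise. A small standalone illustration of the same pattern (class names and fields are hypothetical, not the library's actual types):

```python
class LegacyInputs:
    """Hypothetical older serialized-inputs object with no `messages` field."""
    prompt = "hello"


class CurrentInputs:
    """Hypothetical newer serialized-inputs object that carries `messages`."""
    messages = [{"role": "user", "content": "hello"}]


def deserialize_messages(i_inputs):
    # Mirror the guarded access added in the diff: the attribute may be
    # absent on older payloads, so probe with hasattr before reading it.
    if hasattr(i_inputs, "messages") and i_inputs.messages:
        return [{**msg} for msg in i_inputs.messages]
    return None
```

With the guard, legacy objects fall through to `None` instead of raising `AttributeError`.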
guardrails/run/async_runner.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -296,7 +296,7 @@ async def async_prepare(
         """Prepare by running pre-processing and input validation.

         Returns:
-            The instructions, prompt, and message history.
+            The messages.
         """
         prompt_params = prompt_params or {}
         if api is None:
```

guardrails/run/runner.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -75,7 +75,7 @@ class Runner:
     disable_tracer: Optional[bool] = True

     # QUESTION: Are any of these init args actually necessary for initialization?
-    # ANSWER: _Maybe_ prompt, instructions, and messages for Prompt initialization
+    # ANSWER: _Maybe_ messages for Prompt initialization
     # but even that can happen at execution time.
     # TODO: In versions >=0.6.x, remove this class and just execute a Guard functionally
     def __init__(
```

tests/integration_tests/test_async_streaming.py

Lines changed: 12 additions & 2 deletions

```diff
@@ -165,7 +165,12 @@ async def test_async_streaming_fix_behavior_two_validators(mocker):
         max_tokens=10,
         temperature=0,
         stream=True,
-        prompt=prompt,
+        messages=[
+            {
+                "role": "user",
+                "content": prompt,
+            }
+        ],
     )
     text = ""
     original = ""
@@ -211,7 +216,12 @@ async def test_async_streaming_filter_behavior(mocker):
         max_tokens=10,
         temperature=0,
         stream=True,
-        prompt=prompt,
+        messages=[
+            {
+                "role": "user",
+                "content": prompt,
+            }
+        ],
     )

     validated = ""
```
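Both test hunks replace a bare `prompt` kwarg with a chat-style `messages` list. The conversion they perform by hand can be sketched as a helper (hypothetical, not part of the Guardrails API):

```python
from typing import Dict, List


def prompt_to_messages(prompt: str, role: str = "user") -> List[Dict[str, str]]:
    """Wrap a bare prompt string in the chat-message list shape used above."""
    return [{"role": role, "content": prompt}]
```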
