Replies: 2 comments
-
Thanks for this!
-
Great. Thanks.
On Tue, Oct 29, 2024, at 22:17, Jason Liu ***@***.***> wrote:
Contributing to Instructor: A Developer's Guide
This guide explains how to contribute to the Instructor library,
particularly focusing on adding support for new LLM providers.
General Requirements
Before starting your contribution, ensure you have:
- Python 3.9+
- Understanding of async/await in Python
- Familiarity with Pydantic models
- Knowledge of type hints and generics
- Understanding of your target LLM provider's API
Adding a New Provider
1. Create Client File
Create client_yourprovider.py in the root directory:
from __future__ import annotations

from typing import Any, overload

import instructor
from yourprovider import AsyncYourProviderClient, YourProviderClient

@overload
def from_yourprovider(
    client: YourProviderClient,
    mode: instructor.Mode = instructor.Mode.TOOLS,
    **kwargs: Any,
) -> instructor.Instructor: ...

@overload
def from_yourprovider(
    client: AsyncYourProviderClient,
    mode: instructor.Mode = instructor.Mode.TOOLS,
    **kwargs: Any,
) -> instructor.AsyncInstructor: ...

def from_yourprovider(
    client: YourProviderClient | AsyncYourProviderClient,
    mode: instructor.Mode = instructor.Mode.TOOLS,
    **kwargs: Any,
) -> instructor.Instructor | instructor.AsyncInstructor:
    assert mode in {
        instructor.Mode.TOOLS,
        instructor.Mode.JSON,
    }, "Mode must be one of instructor.Mode.TOOLS or instructor.Mode.JSON"
    assert isinstance(
        client, (YourProviderClient, AsyncYourProviderClient)
    ), "Client must be an instance of YourProviderClient or AsyncYourProviderClient"
    if isinstance(client, YourProviderClient):
        return instructor.Instructor(
            client=client,
            create=instructor.patch(create=client.chat.completions.create, mode=mode),
            provider=instructor.Provider.YOUR_PROVIDER,
            mode=mode,
            **kwargs,
        )
    else:
        return instructor.AsyncInstructor(
            client=client,
            create=instructor.patch(create=client.chat.completions.create, mode=mode),
            provider=instructor.Provider.YOUR_PROVIDER,
            mode=mode,
            **kwargs,
        )
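The core of the client file is the sync/async dispatch: one public factory that returns a different wrapper depending on the concrete client type. The following self-contained sketch illustrates that pattern with hypothetical placeholder classes (`FakeSyncClient`, `FakeAsyncClient`, and the wrappers are all inventions for illustration, not part of Instructor):

```python
from __future__ import annotations

class FakeSyncClient:
    """Hypothetical stand-in for a provider's synchronous client."""

class FakeAsyncClient:
    """Hypothetical stand-in for a provider's asynchronous client."""

class SyncWrapper:
    def __init__(self, client: FakeSyncClient) -> None:
        self.client = client

class AsyncWrapper:
    def __init__(self, client: FakeAsyncClient) -> None:
        self.client = client

def from_fakeprovider(client: FakeSyncClient | FakeAsyncClient) -> SyncWrapper | AsyncWrapper:
    # Dispatch on the concrete client type, mirroring the
    # sync/async split used in from_yourprovider above.
    if isinstance(client, FakeSyncClient):
        return SyncWrapper(client)
    if isinstance(client, FakeAsyncClient):
        return AsyncWrapper(client)
    raise TypeError("client must be FakeSyncClient or FakeAsyncClient")
```

In the real client file the two `@overload` declarations give type checkers the same information at annotation time that the `isinstance` checks enforce at runtime.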
2. Update Mode Enum
Add your provider modes to mode.py:
class Mode(enum.Enum):
    # ... existing modes ...
    YOUR_PROVIDER_TOOLS = "your_provider_tools"
    YOUR_PROVIDER_JSON = "your_provider_json"
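Because `Mode` is a string-valued enum, members can be looked up by value and checked for membership in a supported set, which is how the asserts in the client file work. A minimal standalone sketch (trimmed to just the two hypothetical provider modes):

```python
import enum

class Mode(enum.Enum):
    # Trimmed-down stand-in for instructor's Mode enum; these two
    # members are the hypothetical provider-specific modes.
    YOUR_PROVIDER_TOOLS = "your_provider_tools"
    YOUR_PROVIDER_JSON = "your_provider_json"

# Membership checks of this shape back the assert in from_yourprovider:
supported = {Mode.YOUR_PROVIDER_TOOLS, Mode.YOUR_PROVIDER_JSON}
```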
3. Add Provider Enum
Update utils.py:
class Provider(Enum):
    # ... existing providers ...
    YOUR_PROVIDER = "your_provider"

def get_provider(base_url: str) -> Provider:
    # ... existing providers ...
    elif "yourprovider" in str(base_url):
        return Provider.YOUR_PROVIDER
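The `get_provider` helper is plain substring matching on the base URL. A runnable sketch of the same idea, assuming a hypothetical `UNKNOWN` fallback member (the real function's fallback behavior is elided above):

```python
from enum import Enum

class Provider(Enum):
    OPENAI = "openai"
    YOUR_PROVIDER = "your_provider"
    UNKNOWN = "unknown"  # hypothetical fallback for this sketch

def get_provider(base_url: str) -> Provider:
    # Substring matching on the base URL, as in utils.py.
    if "openai" in str(base_url):
        return Provider.OPENAI
    elif "yourprovider" in str(base_url):
        return Provider.YOUR_PROVIDER
    return Provider.UNKNOWN
```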
4. Implement Response Handlers
Add to process_response.py:
def handle_yourprovider_tools(
    response_model: type[T],
    new_kwargs: dict[str, Any],
) -> tuple[type[T], dict[str, Any]]:
    new_kwargs["tools"] = [
        {
            "type": "function",
            "function": response_model.openai_schema,
        }
    ]
    new_kwargs["tool_choice"] = {
        "type": "function",
        "function": {"name": response_model.openai_schema["name"]},
    }
    return response_model, new_kwargs

def handle_yourprovider_json(
    response_model: type[T],
    new_kwargs: dict[str, Any],
) -> tuple[type[T], dict[str, Any]]:
    new_kwargs["response_format"] = {"type": "json_object"}
    return response_model, new_kwargs
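The tools handler is a pure dict transformation: it injects the model's function schema into `tools` and forces the model to call it via `tool_choice`. The sketch below shows the same transformation on a hand-written schema dict (the `fake_schema` and `build_tools_kwargs` names are illustrative stand-ins, not Instructor APIs):

```python
from typing import Any

# Hand-written stand-in for what response_model.openai_schema might contain.
fake_schema = {
    "name": "UserInfo",
    "description": "Extract a user's name and age.",
    "parameters": {
        "type": "object",
        "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
        "required": ["name", "age"],
    },
}

def build_tools_kwargs(schema: dict[str, Any], new_kwargs: dict[str, Any]) -> dict[str, Any]:
    # Same shape as handle_yourprovider_tools, applied to a plain schema dict.
    new_kwargs["tools"] = [{"type": "function", "function": schema}]
    new_kwargs["tool_choice"] = {
        "type": "function",
        "function": {"name": schema["name"]},
    }
    return new_kwargs
```

Forcing `tool_choice` to the single injected function is what guarantees the response can be parsed back into the Pydantic model.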
5. Implement Retry Logic
Add to reask.py:
def reask_yourprovider_tools(
    kwargs: dict[str, Any],
    response: Any,
    exception: Exception,
):
    kwargs = kwargs.copy()
    reask_msgs = [dump_message(response.choices[0].message)]
    # Echo the validation error back so the retry can correct it.
    reask_msgs.append(
        {
            "role": "user",
            "content": f"Validation failed: {exception}. Please fix the errors and try again.",
        }
    )
    kwargs["messages"].extend(reask_msgs)
    return kwargs
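The reask step boils down to list manipulation: replay the failed assistant turn, then append a user turn describing the validation error. A standalone sketch of that message-building logic (`append_reask_messages` is a hypothetical helper for illustration, and the exact correction prompt wording is an assumption):

```python
from typing import Any

def append_reask_messages(
    kwargs: dict[str, Any],
    assistant_message: dict[str, Any],
    exception: Exception,
) -> dict[str, Any]:
    # Shallow-copy kwargs and the message list so the caller's
    # originals are not mutated, then replay the failed assistant
    # turn followed by a corrective user turn.
    kwargs = kwargs.copy()
    kwargs["messages"] = list(kwargs["messages"])
    kwargs["messages"].append(assistant_message)
    kwargs["messages"].append(
        {
            "role": "user",
            "content": f"Validation failed: {exception}. Please fix the errors and try again.",
        }
    )
    return kwargs
```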
6. Update Package Initialization
Update __init__.py:
if importlib.util.find_spec("yourprovider") is not None:
    from .client_yourprovider import from_yourprovider

    __all__ += ["from_yourprovider"]
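The `find_spec` guard keeps the provider an optional dependency: the import and export only happen when the provider SDK is installed. A small sketch of the guard in isolation (`optional_export` is a name invented for this example):

```python
import importlib.util

def optional_export(module_name: str) -> bool:
    # True when the optional dependency is importable, mirroring
    # the guard in __init__.py. find_spec returns None (rather than
    # raising) when a top-level module is absent.
    return importlib.util.find_spec(module_name) is not None
```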
Testing Guidelines
1. Create test file tests/test_yourprovider.py:
import instructor
import pytest
from pydantic import BaseModel

from instructor import from_yourprovider
from yourprovider import AsyncYourProviderClient, YourProviderClient

class UserModel(BaseModel):  # avoid a Test* name so pytest does not try to collect it
    name: str
    age: int

def test_yourprovider_tools():
    client = YourProviderClient()
    instructor_client = from_yourprovider(
        client,
        mode=instructor.Mode.YOUR_PROVIDER_TOOLS,
    )
    response = instructor_client.chat.completions.create(
        model="your-model",
        response_model=UserModel,
        messages=[{"role": "user", "content": "John is 30"}],
    )
    assert isinstance(response, UserModel)
    assert response.name == "John"
    assert response.age == 30

@pytest.mark.asyncio
async def test_yourprovider_tools_async():
    client = AsyncYourProviderClient()
    instructor_client = from_yourprovider(
        client,
        mode=instructor.Mode.YOUR_PROVIDER_TOOLS,
    )
    response = await instructor_client.chat.completions.create(
        model="your-model",
        response_model=UserModel,
        messages=[{"role": "user", "content": "John is 30"}],
    )
    assert isinstance(response, UserModel)
Required test cases:
- Sync and async clients
- All supported modes
- Error handling and retries
- Edge cases and validation
- Streaming responses (if supported)
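For the error-handling and retry cases, a fake client that fails deterministically and then succeeds lets you assert on both the recovery and the call count without hitting a real API. A standalone sketch of that idea (`FlakyClient` and `create_with_retries` are illustrative stand-ins, not part of Instructor):

```python
class FlakyClient:
    """Hypothetical fake client: fails on the first call, then succeeds."""

    def __init__(self) -> None:
        self.calls = 0

    def create(self, **kwargs):
        self.calls += 1
        if self.calls == 1:
            raise ValueError("invalid tool call")
        return {"name": "John", "age": 30}

def create_with_retries(client, max_retries: int = 3, **kwargs):
    # Retry loop in the spirit of Instructor's retry handling;
    # a real implementation would build reask messages between attempts.
    last_exc = None
    for _ in range(max_retries):
        try:
            return client.create(**kwargs)
        except ValueError as exc:
            last_exc = exc
    raise last_exc
```

Asserting on `client.calls` verifies the retry actually happened rather than the first call silently succeeding.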