into a single MIT-licensed package. It contains 3 major submodules:
- ``tuneapi.utils``: Contains all the utility functions that are used across the Tune AI codebase. This is pretty interesting
- ``tuneapi.endpoints``: Contains all the API endpoints for the Tune Studio

Prompt Injection
----------------

Since ``tuneapi`` is a new and relatively unknown package, most LLMs will not be able to generate code that uses it out of the box. However, you can paste the following code snippet into your prompt so the LLM can generate correct ``tuneapi`` calls for you; a usage sketch follows the snippet.

Prompt
~~~~~~

You are going to use the ``tuneapi`` package to call the LLM endpoint. Here's a brief on how to use it:

.. code-block:: python

    from tuneapi import tt, ta

    # define a thread, which is a collection of system, user and assistant messages
    thread = tt.Thread(
        tt.system(...),     # add an optional system message here
        tt.human(...),      # add a user message here
        tt.assistant(...),  # for an assistant response
    )

    # define a model
    model = ta.Gemini()  # or Openai, Anthropic

    # get the response
    resp: str = model.chat(thread)

    # You can also generate structured responses for better control
    from pydantic import BaseModel, Field
    from typing import List, Optional

    class MathProblem(BaseModel):
        a: int = Field(..., description="First number")
        b: int = Field(..., description="Second number")
        operator: str = Field(..., description="Operator")
        hint: Optional[str] = Field(None, description="Hint for the problem")

    class MathTest(BaseModel):
        title: str = Field(..., description="Title of the test")
        problems: List[MathProblem] = Field(..., description="List of math problems")

    # define a thread which is a collection of messages
    thread = tt.Thread(
        tt.human("Give me 5 problems for KG-1 class"),
        schema=MathTest,
    )

    # get structured output
    resp: MathTest = model.chat(thread)
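
Assuming the structured call above returns a populated ``MathTest`` instance, as the snippet indicates, the result can be consumed like any other pydantic model. A minimal sketch, continuing from the ``resp`` variable above:

.. code-block:: python

    # `resp` is the MathTest instance returned by model.chat(thread) above
    print(resp.title)
    for problem in resp.problems:
        print(f"{problem.a} {problem.operator} {problem.b} = ?")
        if problem.hint:
            print("  hint:", problem.hint)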
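
To actually perform the injection, one option is to send the brief above as the system message and ask the model to write the code you need. The sketch below is illustrative, not part of the ``tuneapi`` API: ``TUNEAPI_BRIEF`` is a hypothetical variable holding the full brief from the Prompt section, and ``ta.Gemini()`` assumes the corresponding API key is configured in your environment.

.. code-block:: python

    from tuneapi import tt, ta

    # hypothetical variable: paste the full brief from the Prompt section here
    TUNEAPI_BRIEF = """You are going to use the ``tuneapi`` package ..."""

    # inject the brief as the system message, then describe the code you want
    thread = tt.Thread(
        tt.system(TUNEAPI_BRIEF),
        tt.human("Write a script that asks the model for 3 trivia questions as structured output."),
    )

    model = ta.Gemini()  # or another model class, as listed in the snippet above
    generated_code: str = model.chat(thread)
    print(generated_code)
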

.. toctree::
   :maxdepth: 2
   :caption: Contents: