2 files changed: +7 / -3 lines

@@ -132,10 +132,12 @@ We can also pass a model instance for the chat model and the embedding model. Fo
     azure_deployment="AZURE_OPENAI_EMBEDDINGS_DEPLOYMENT_NAME",
     openai_api_version="AZURE_OPENAI_API_VERSION",
 )
-
+# Supposing model_tokens are 100K
+model_tokens_count = 100000
 graph_config = {
     "llm": {
-        "model_instance": llm_model_instance
+        "model_instance": llm_model_instance,
+        "model_tokens": model_tokens_count,
     },
     "embeddings": {
         "model_instance": embedder_model_instance
@@ -191,4 +193,4 @@ We can also pass a model instance for the chat model and the embedding model. Fo
     "embeddings": {
         "model_instance": embedder_model_instance
     }
-}
+}
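For context, here is a minimal sketch of how the amended configuration could be used end to end. This is not taken verbatim from the repository: it assumes ScrapeGraphAI's `SmartScraperGraph` together with the `langchain_openai` Azure wrappers, and the deployment names, API version, prompt, and source URL are placeholders.

```python
# Minimal sketch, assuming scrapegraphai and langchain_openai are installed
# and AZURE_OPENAI_API_KEY / AZURE_OPENAI_ENDPOINT are set in the environment.
from langchain_openai import AzureChatOpenAI, AzureOpenAIEmbeddings
from scrapegraphai.graphs import SmartScraperGraph

llm_model_instance = AzureChatOpenAI(
    azure_deployment="AZURE_OPENAI_CHAT_DEPLOYMENT_NAME",   # placeholder
    openai_api_version="AZURE_OPENAI_API_VERSION",          # placeholder
)
embedder_model_instance = AzureOpenAIEmbeddings(
    azure_deployment="AZURE_OPENAI_EMBEDDINGS_DEPLOYMENT_NAME",  # placeholder
    openai_api_version="AZURE_OPENAI_API_VERSION",               # placeholder
)

# Supposing the deployed model's context window is 100K tokens.
model_tokens_count = 100000

graph_config = {
    "llm": {
        "model_instance": llm_model_instance,
        "model_tokens": model_tokens_count,
    },
    "embeddings": {
        "model_instance": embedder_model_instance,
    },
}

# Hypothetical prompt and source, for illustration only.
smart_scraper_graph = SmartScraperGraph(
    prompt="List me all the projects with their descriptions",
    source="https://perinim.github.io/projects/",
    config=graph_config,
)
print(smart_scraper_graph.run())
```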
@@ -11,4 +11,6 @@
 For example, if the user prompt is "What is the capital of France?",
 you should return "capital of France". \n
 If you return something else, you will get a really bad grade. \n
+What you return should be sufficient to get the answer from the internet. \n
+Don't just return a small part of the prompt, unless that is sufficient. \n
 USER PROMPT: {user_prompt}"""
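To illustrate the intent of the new instructions, here is a hypothetical rendering of the amended template. Only the lines visible in the diff are included; the opening lines of the real template are omitted, and the sample question is made up.

```python
# Partial template copied from the diff above; the template's opening lines
# are not shown here.
search_prompt_template = """For example, if the user prompt is "What is the capital of France?",
you should return "capital of France". \n
If you return something else, you will get a really bad grade. \n
What you return should be sufficient to get the answer from the internet. \n
Don't just return a small part of the prompt, unless that is sufficient. \n
USER PROMPT: {user_prompt}"""

# Fill the placeholder with a sample question before sending the prompt to the LLM.
print(search_prompt_template.format(user_prompt="What is the tallest mountain in Japan?"))
```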