### Example Code

```python
from langchain_community.tools import QuerySQLDatabaseTool
from langchain.chains.sql_database.query import create_sql_query_chain
from langchain_community.utilities import SQLDatabase
from langchain_mistralai import ChatMistralAI

llm = ChatMistralAI(
    model_name="mistral-large-latest",
)
db = SQLDatabase.from_uri(
    "postgresql://postgres:[email protected]:5432/Adventureworks",
    schema="person",
)
execute_query = QuerySQLDatabaseTool(db=db)
write_query = create_sql_query_chain(llm, db)
chain = write_query | execute_query
chain.invoke({"question": "How many employees are there"})
```

### Description

This code (taken mostly from https://python.langchain.com/v0.2/docs/tutorials/sql_qa/) doesn't work because the LLM returns the SQL query together with the analysis that led it to generate that query, so the string handed to the execution tool is not a bare SQL statement. I tried

```python
from pydantic import BaseModel, Field

class Query(BaseModel):
    query: str = Field(description="The SQL query")
```

My class gets filled correctly, with only the SQL query, but create_sql_query_chain builds its chain internally like this, ending in a plain string parser:

```python
RunnablePassthrough.assign(**inputs)  # type: ignore[return-value]
| (
    lambda x: {
        k: v
        for k, v in x.items()
        if k not in ("question", "table_names_to_use")
    }
)
| prompt_to_use.partial(top_k=str(k))
| llm.bind(stop=["\nSQLResult:"])
| StrOutputParser()
| _strip
```

How do I solve this?

### System Info

langchain 0.3.25, running with Python 3.13 under Jupyter Notebook on Windows 11
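A minimal sketch of how to observe the failure described above, reusing `write_query` and the question string from the example code:

```python
# Reuses `write_query` from the example code above.
raw_output = write_query.invoke({"question": "How many employees are there"})
print(raw_output)
# As described in the question, the output is not a bare SQL statement: it
# contains the model's analysis followed by the query, which is why piping it
# straight into QuerySQLDatabaseTool fails at the database.
```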
Found answer by defining the llm like this:

```python
from pydantic import BaseModel, Field

class Query(BaseModel):
    query: str = Field(description="The SQL query")

llm = (
    ChatMistralAI(
        model_name="mistral-large-latest",
    ).with_structured_output(Query)
    | (lambda output: output.query)
)
```
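A minimal sketch of how this redefined `llm` slots back into the chain from the question (reusing `db`, `create_sql_query_chain`, and `QuerySQLDatabaseTool` from the example code above):

```python
# `llm` is now a runnable that always emits a bare SQL string:
# with_structured_output(Query) fills the Query model, and the trailing
# lambda extracts its `query` field before it reaches the rest of the chain.
write_query = create_sql_query_chain(llm, db)
execute_query = QuerySQLDatabaseTool(db=db)
chain = write_query | execute_query

chain.invoke({"question": "How many employees are there"})
```

The trailing `(lambda output: output.query)` is what keeps create_sql_query_chain's own postprocessing (the StrOutputParser and strip step shown in the question) working, since it turns the structured Query object back into the plain string those steps expect.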