
Commit 5d9a16f

docs: update mirascope third party integration docs with new v1 release + custom panels (#394)

1 parent b6b6822

3 files changed: +29 -36 lines. Two screenshot images were replaced (-391 KB and -286 KB); the Mirascope integration doc changes are shown in the diff below.

Lines changed: 29 additions & 36 deletions
@@ -1,49 +1,47 @@
-[Mirascope](https://github.com/Mirascope/mirascope) is an intuitive approach to building AI-powered applications using LLMs. Their library integrates with Logfire to make observability and monitoring for LLMs easy and seamless.
+[Mirascope][mirascope-repo] is a developer tool for building with LLMs. Their library focuses on abstractions that aren't obstructions and integrates with Logfire to make observability and monitoring for LLMs easy and seamless.

-You can enable it using their [`@with_logfire`][mirascope-logfire] decorator, which will work with all of the [model providers that they support][mirascope-supported-providers] (e.g. OpenAI, Anthropic, Groq, and more).
+You can enable it using their [`@with_logfire`][mirascope-logfire] decorator, which will work with all of the [model providers that they support][mirascope-supported-providers] (e.g. OpenAI, Anthropic, Gemini, Mistral, Groq, and more).

-```py hl_lines="1 2 5 8"
+```py hl_lines="1 3 5 8"
 import logfire
-from mirascope.logfire import with_logfire
-from mirascope.anthropic import AnthropicCall
+from mirascope.core import anthropic, prompt_template
+from mirascope.integrations.logfire import with_logfire

 logfire.configure()


-@with_logfire
-class BookRecommender(AnthropicCall):
-    prompt_template = "Please recommend some {genre} books"
+@with_logfire()
+@anthropic.call("claude-3-5-sonnet-20240620")
+@prompt_template("Please recommend some {genre} books")
+def recommend_books(genre: str): ...

-    genre: str

-
-recommender = BookRecommender(genre="fantasy")
-response = recommender.call() # this will automatically get logged with logfire
+response = recommend_books("fantasy") # this will automatically get logged with logfire
 print(response.content)
-#> Here are some recommendations for great fantasy book series: ...
+# > Certainly! Here are some popular and well-regarded fantasy books and series: ...
 ```

 This will give you:

-* A span around the `AnthropicCall.call()` that captures items like the prompt template, templating properties and fields, and input/output attributes
+* A span around the `recommend_books` that captures items like the prompt template, templating properties and fields, and input/output attributes
 * Human-readable display of the conversation with the agent
 * Details of the response, including the number of tokens used

 <figure markdown="span">
 ![Logfire Mirascope Anthropic call](../../images/logfire-screenshot-mirascope-anthropic-call.png){ width="500" }
-<figcaption>Mirascope Anthropic Call span and Anthropic span and conversation</figcaption>
+<figcaption>Mirascope Anthropic call span and Anthropic span and conversation</figcaption>
 </figure>

 Since Mirascope is built on top of [Pydantic][pydantic], you can use the [Pydantic plugin][pydantic-plugin] to track additional logs and metrics about model validation, which you can enable using the [`pydantic_plugin`][logfire.configure(pydantic_plugin)] configuration.

 This can be particularly useful when [extracting structured information][mirascope-extracting-structured-information] using LLMs:

-```py hl_lines="3 4 8 17"
+```py hl_lines="3 5 8 17"
 from typing import Literal, Type

 import logfire
-from mirascope.logfire import with_logfire
-from mirascope.openai import OpenAIExtractor
+from mirascope.core import openai, prompt_template
+from mirascope.integrations.logfire import with_logfire
 from pydantic import BaseModel

 logfire.configure(pydantic_plugin=logfire.PydanticPlugin(record="all"))
@@ -55,30 +53,23 @@ class TaskDetails(BaseModel):
     priority: Literal["low", "normal", "high"]


-@with_logfire
-class TaskExtractor(OpenAIExtractor[TaskDetails]):
-    extract_schema: Type[TaskDetails] = TaskDetails
-    prompt_template = """
-    Extract the task details from the following task:
-    {task}
-    """
-
-    task: str
+@with_logfire()
+@openai.call("gpt-4o-mini", response_model=TaskDetails)
+@prompt_template("Extract the details from the following task: {task}")
+def extract_task_details(task: str): ...


 task = "Submit quarterly report by next Friday. Task is high priority."
-task_details = TaskExtractor(
-    task=task
-).extract() # this will be logged automatically with logfire
+task_details = extract_task_details(task) # this will be logged automatically with logfire
 assert isinstance(task_details, TaskDetails)
 print(task_details)
-#> description='Submit quarterly report' due_date='next Friday' priority='high'
+# > description='Submit quarterly report' due_date='next Friday' priority='high'
 ```

 This will give you:

 * Tracking for validation of Pydantic models
-* A span around the `OpenAIExtractor.extract()` that captures items like the prompt template, templating properties and fields, and input/output attributes
+* A span around the `extract_task_details` that captures items like the prompt template, templating properties and fields, and input/output attributes
 * Human-readable display of the conversation with the agent including the function call
 * Details of the response, including the number of tokens used

@@ -87,10 +78,12 @@ This will give you:
 <figcaption>Mirascope OpenAI Extractor span and OpenAI span and function call</figcaption>
 </figure>

-For more information on Mirascope and what you can do with it, check out their [documentation](https://docs.mirascope.io).
+For more information on Mirascope and what you can do with it, check out their [documentation][mirascope-documentation].

-[mirascope-logfire]: https://docs.mirascope.io/latest/integrations/logfire/#how-to-use-logfire-with-mirascope
-[mirascope-supported-providers]: https://docs.mirascope.io/latest/concepts/supported_llm_providers/
-[mirascope-extracting-structured-information]: https://docs.mirascope.io/latest/concepts/extracting_structured_information_using_llms/
+[mirascope-repo]: https://github.com/Mirascope/mirascope
+[mirascope-documentation]: https://mirascope.io/docs
+[mirascope-logfire]: https://mirascope.io/docs/latest/integrations/logfire/
+[mirascope-supported-providers]: https://mirascope.io/docs/latest/learn/calls/#supported-providers
+[mirascope-extracting-structured-information]: https://mirascope.io/docs/latest/learn/response_models/
 [pydantic]: https://docs.pydantic.dev/latest/
 [pydantic-plugin]: https://docs.pydantic.dev/latest/concepts/plugins/
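Since the updated docs note that `@with_logfire` works with every supported provider, the same v1 pattern carries over to the other provider modules unchanged. A minimal sketch using the OpenAI module from the second example; the model name is only illustrative:

```py
import logfire
from mirascope.core import openai, prompt_template
from mirascope.integrations.logfire import with_logfire

logfire.configure()


@with_logfire()
@openai.call("gpt-4o-mini")  # illustrative model name; any supported OpenAI model works
@prompt_template("Please recommend some {genre} books")
def recommend_books(genre: str): ...


response = recommend_books("fantasy")  # logged with Logfire exactly like the Anthropic example
print(response.content)
```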

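Because `response_model=TaskDetails` makes the call return a validated Pydantic instance, the extracted fields can be used directly in application code. A short sketch that assumes the `extract_task_details` function from the diff above and that `TaskDetails` also defines `description` and `due_date` fields, as its printed output suggests:

```py
# Assumes TaskDetails and extract_task_details as defined in the extraction example above.
task_details = extract_task_details("Submit quarterly report by next Friday. Task is high priority.")

# response_model returns a validated TaskDetails instance, so fields are plain attributes.
if task_details.priority == "high":
    print(f"Urgent: {task_details.description} (due {task_details.due_date})")
```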
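The spans described in the bullet lists attach to whatever Logfire span is active when the decorated function runs, so related calls can be grouped under a parent span. A minimal sketch, assuming the `recommend_books` function from the first example:

```py
import logfire

with logfire.span("recommend books for several genres"):
    for genre in ("fantasy", "mystery"):
        # Each decorated call produces a child span under the enclosing span.
        recommend_books(genre)
```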