Commit 95a43d4

Add documentation for the Logfire Query API (#405)
1 parent a404923 commit 95a43d4

File tree

20 files changed: +1607 −12 lines changed

.hyperlint/.vale.ini

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
StylesPath = styles
MinAlertLevel = suggestion
Vocab = hyperlint
SkippedScopes = script, style, pre, figure, code, code-block

[*]
BasedOnStyles = Vale, hyperlint
Lines changed: 15 additions & 0 deletions
@@ -0,0 +1,15 @@
validator
[Pp]ydantic
validators
namespace
Hyperlint
preprocess
tokenization
tokenizer
API
APIs
SDKs
SDK
[Aa]sync
[Ss]ync
[Ll]ogfire
Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
extends: repetition
message: "'%s' is repeated, did you mean to repeat this word?"
level: error
alpha: true
tokens:
  - '[^\s]+'

docs/guides/advanced/creating_write_tokens.md

Lines changed: 2 additions & 2 deletions
@@ -7,8 +7,8 @@ You can create a write token by following these steps:

1. Open the **Logfire** web interface at [logfire.pydantic.dev](https://logfire.pydantic.dev).
2. Select your project from the **Projects** section on the left hand side of the page.
-3. Click on the ⚙️ **Settings** tab on the top right corner of the page.
-4. Select the **{} Write tokens** tab on the left hand menu.
+3. Click on the ⚙️ **Settings** tab in the top right corner of the page.
+4. Select the **{} Write tokens** tab from the left hand menu.
5. Click on the **Create write token** button.

After creating the write token, you'll see a dialog with the token value.

docs/guides/advanced/index.md

Lines changed: 1 addition & 0 deletions
@@ -3,3 +3,4 @@
* **[Testing](testing.md):** Verify your application's logging and span tracking with Logfire's testing utilities, ensuring accurate data capture and observability.
* **[Backfill](backfill.md):** Recover lost data and bulk load historical data into Logfire with the `logfire backfill` command, ensuring data continuity.
* **[Creating Write Tokens](creating_write_tokens.md):** Generate and manage multiple write tokens for different services.
+* **[Using Read Tokens](query_api.md):** Generate and manage read tokens for programmatic querying of your Logfire data.

docs/guides/advanced/query_api.md

Lines changed: 222 additions & 0 deletions
@@ -0,0 +1,222 @@
Logfire provides a web API for programmatically running arbitrary SQL queries against the data in your Logfire projects.
This API can be used to retrieve data for export, analysis, or integration with other tools, allowing you to leverage
your data in a variety of ways.

The API is available at `https://logfire-api.pydantic.dev/v1/query` and requires a **read token** for authentication.
Read tokens can be generated from the Logfire web interface and provide secure access to your data.

The API can return data in various formats, including JSON, Apache Arrow, and CSV, to suit your needs.
See [Additional Configuration](#additional-configuration) for more details about the available response formats.

## How to Create a Read Token

If you've set up Logfire following the [first steps guide](../first_steps/index.md), you can generate read tokens from
the Logfire web interface for use with the Logfire Query API.

To create a read token:

1. Open the **Logfire** web interface at [logfire.pydantic.dev](https://logfire.pydantic.dev).
2. Select your project from the **Projects** section on the left-hand side of the page.
3. Click on the ⚙️ **Settings** tab in the top right corner of the page.
4. Select the **Read tokens** tab from the left-hand menu.
5. Click on the **Create read token** button.

After creating the read token, you'll see a dialog with the token value.
**Copy this value and store it securely; it will not be shown again.**

## Using the Read Clients

While you can [make direct HTTP requests](#making-direct-http-requests) to Logfire's querying API,
we provide Python clients to simplify the process of interacting with the API from Python.

Logfire provides both synchronous and asynchronous clients.
These clients are currently experimental, meaning we might introduce breaking changes in the future.
To use these clients, you can import them from the `experimental` namespace:

```python
from logfire.experimental.query_client import AsyncLogfireQueryClient, LogfireQueryClient
```

!!! note "Additional required dependencies"

    To use the query clients provided in `logfire.experimental.query_client`, you need to install `httpx`.

    If you want to retrieve Arrow-format responses, you will also need to install `pyarrow`.

### Client Usage Examples

The `AsyncLogfireQueryClient` allows for asynchronous interaction with the Logfire API.
If blocking I/O is acceptable and you want to avoid the complexities of asynchronous programming,
you can use the plain `LogfireQueryClient`.

Here's an example of how to use these clients:

=== "Async"

    ```python
    from io import StringIO

    import polars as pl
    from logfire.experimental.query_client import AsyncLogfireQueryClient


    async def main():
        query = """
        SELECT start_timestamp
        FROM records
        LIMIT 1
        """

        async with AsyncLogfireQueryClient(read_token='<your_read_token>') as client:
            # Load data as JSON, in column-oriented format
            json_cols = await client.query_json(sql=query)
            print(json_cols)

            # Load data as JSON, in row-oriented format
            json_rows = await client.query_json_rows(sql=query)
            print(json_rows)

            # Retrieve data in arrow format, and load into a polars DataFrame
            # Note that JSON columns such as `attributes` will be returned as
            # JSON-serialized strings
            df_from_arrow = pl.from_arrow(await client.query_arrow(sql=query))
            print(df_from_arrow)

            # Retrieve data in CSV format, and load into a polars DataFrame
            # Note that JSON columns such as `attributes` will be returned as
            # JSON-serialized strings
            df_from_csv = pl.read_csv(StringIO(await client.query_csv(sql=query)))
            print(df_from_csv)


    if __name__ == '__main__':
        import asyncio

        asyncio.run(main())
    ```

=== "Sync"

    ```python
    from io import StringIO

    import polars as pl
    from logfire.experimental.query_client import LogfireQueryClient


    def main():
        query = """
        SELECT start_timestamp
        FROM records
        LIMIT 1
        """

        with LogfireQueryClient(read_token='<your_read_token>') as client:
            # Load data as JSON, in column-oriented format
            json_cols = client.query_json(sql=query)
            print(json_cols)

            # Load data as JSON, in row-oriented format
            json_rows = client.query_json_rows(sql=query)
            print(json_rows)

            # Retrieve data in arrow format, and load into a polars DataFrame
            # Note that JSON columns such as `attributes` will be returned as
            # JSON-serialized strings
            df_from_arrow = pl.from_arrow(client.query_arrow(sql=query))  # type: ignore
            print(df_from_arrow)

            # Retrieve data in CSV format, and load into a polars DataFrame
            # Note that JSON columns such as `attributes` will be returned as
            # JSON-serialized strings
            df_from_csv = pl.read_csv(StringIO(client.query_csv(sql=query)))
            print(df_from_csv)


    if __name__ == '__main__':
        main()
    ```

## Making Direct HTTP Requests

If you prefer not to use the provided clients, you can make direct HTTP requests to the Logfire API using any HTTP
client library, such as `requests` in Python. Below are the general steps and an example to guide you:

### General Steps to Make a Direct HTTP Request

1. **Set the Endpoint URL**: The base URL for the Logfire API is `https://logfire-api.pydantic.dev`.

2. **Add Authentication**: Include the read token in your request headers to authenticate.
   The header key should be `Authorization`, with the value `Bearer <your_read_token_here>`.

3. **Define the SQL Query**: Write the SQL query you want to execute.

4. **Send the Request**: Use an HTTP GET request to the `/v1/query` endpoint with the SQL query as a query parameter.

**Note:** You can provide additional query parameters to control the behavior of your requests.
You can also use the `Accept` header to specify the desired format for the response data (JSON, Arrow, or CSV).

### Example: Using the Python `requests` Library

```python
import requests

# Define the base URL and your read token
base_url = 'https://logfire-api.pydantic.dev'
read_token = '<your_read_token_here>'

# Set the headers for authentication
headers = {
    'Authorization': f'Bearer {read_token}',
    'Content-Type': 'application/json',
}

# Define your SQL query
query = """
SELECT start_timestamp
FROM records
LIMIT 1
"""

# Prepare the query parameters for the GET request
params = {'sql': query}

# Send the GET request to the Logfire API
response = requests.get(f'{base_url}/v1/query', params=params, headers=headers)

# Check the response status
if response.status_code == 200:
    print("Query Successful!")
    print(response.json())
else:
    print(f"Failed to execute query. Status code: {response.status_code}")
    print(response.text)
```
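
To pick a different response format, the same request only needs a different `Accept` header. The media types below are the ones documented on this page; the dictionary and variable names are purely illustrative, a sketch rather than part of any API:

```python
# The three media types documented for the query API; the value placed in
# the `Accept` header determines the response format.
accept_by_format = {
    'json': 'application/json',                      # the default
    'arrow': 'application/vnd.apache.arrow.stream',  # Apache Arrow stream
    'csv': 'text/csv',
}

# For example, to request CSV instead of the JSON default:
headers = {
    'Authorization': 'Bearer <your_read_token_here>',
    'Accept': accept_by_format['csv'],
}
print(headers['Accept'])  # → text/csv
```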

### Additional Configuration

The Logfire API supports various query parameters and response formats to give you flexibility in how you retrieve your data:

- **Response Format**: Use the `Accept` header to specify the response format. Supported values include:
    - `application/json`: Returns the data in JSON format. By default, this will be column-oriented unless specified otherwise with the `row_oriented` parameter.
    - `application/vnd.apache.arrow.stream`: Returns the data in Apache Arrow format, suitable for high-performance data processing.
    - `text/csv`: Returns the data in CSV format, which is easy to use with many data tools.

    If no `Accept` header is provided, the default response format is JSON.

- **Query Parameters**:
    - **`min_timestamp`**: An optional ISO-format timestamp to filter records with `start_timestamp` greater than this value for the `records` table, or `recorded_timestamp` greater than this value for the `metrics` table. The same filtering can also be done manually within the query itself.
    - **`max_timestamp`**: Similar to `min_timestamp`, but serves as an upper bound for filtering `start_timestamp` in the `records` table or `recorded_timestamp` in the `metrics` table. The same filtering can also be done manually within the query itself.
    - **`limit`**: An optional parameter to limit the number of rows returned by the query. If not specified, **the default limit is 500**. The maximum allowed value is 10,000.
    - **`row_oriented`**: Only affects JSON responses. If set to `true`, the JSON response will be row-oriented; otherwise, it will be column-oriented.

All query parameters are optional and can be used in any combination to tailor the API response to your needs.
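
These parameters combine as ordinary URL query parameters. As a sketch of how a full request URL is assembled (the endpoint and parameter names come from this page; the SQL, timestamp, and limit values are made up for illustration):

```python
from urllib.parse import urlencode

params = {
    'sql': 'SELECT start_timestamp FROM records',
    'min_timestamp': '2024-01-01T00:00:00Z',  # illustrative ISO-format lower bound
    'limit': 10,                              # cap on returned rows (default is 500)
    'row_oriented': 'true',                   # request row-oriented JSON
}

# urlencode percent-encodes each value and joins the pairs with '&'
url = 'https://logfire-api.pydantic.dev/v1/query?' + urlencode(params)
print(url)
```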

### Important Notes

- **Experimental Feature**: The query clients are under the `experimental` namespace, indicating that the API may change in future versions.
- **Environment Configuration**: Remember to securely store your read token in environment variables or a secure vault for production use.

With read tokens, you have the flexibility to integrate Logfire into your workflow, whether using Python scripts, data analysis tools, or other systems.
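
A minimal sketch of the environment-variable approach mentioned above (the variable name `LOGFIRE_READ_TOKEN` is an arbitrary choice for this sketch, not a name the SDK reads automatically):

```python
import os

# Read the token from the environment rather than hard-coding it in source.
# LOGFIRE_READ_TOKEN is a hypothetical variable name chosen for illustration.
read_token = os.environ.get('LOGFIRE_READ_TOKEN', '')
if not read_token:
    print('LOGFIRE_READ_TOKEN is not set; export it before querying Logfire.')
```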

docs/guides/onboarding_checklist/index.md

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ fix bugs, analyze user behavior, and make data-driven decisions.

!!! note

    If you aren't familiar with traces and spans, you might want to review
-    [this section of the First Steps guide](../first_steps/index.md#opentelemetry-concepts).
+    [this section of the First Steps guide](../first_steps/index.md#tracing-with-spans).

#### Logfire Onboarding Checklist

logfire/experimental/__init__.py

Whitespace-only changes.
