The request **Inspector** is available in **Discover** and for all **Dashboards** visualization panels that are built based on a query. The available information can differ based on the request.

1. Open the **Inspector**:

   - If you're in **Discover**, select **Inspect** from the application's toolbar.
   - If you're in **Dashboards**, open the panel menu and select **Inspect**.

1. Open the **View** dropdown, then select **Requests**.

1. Several tabs with different information can appear, depending on the nature of the request:

   :::{tip}
   Some visualizations rely on several requests. From the dropdown, select the request you want to inspect.
   :::

   - **Statistics**: Provides general information and statistics about the request. For example, you can check if the number of hits and query time match your expectations. If not, this may indicate an issue with the request used to build the visualization.
   - **Clusters and shards**: Lists the {{es}} clusters and shards per cluster queried to fetch the data and shows the status of the request on each of them. With the information in this tab, you can check whether the request is executed properly, especially in the case of a cross-cluster search.

     :::{note}
     This tab is not available for {{esql}} queries and Vega visualizations.
     :::

   - **Request**: Provides a full view of the visualization's request, which you can copy or **Open in Console** to refine, if needed.
   - **Response**: Provides a full view of the response returned by the request.
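As an illustration only (not taken from the product), the body shown in the **Request** tab is an {{es}} search request. A minimal hypothetical example of what such a request might look like:

```json
{
  "query": {
    "range": {
      "@timestamp": {
        "gte": "now-15m",
        "lte": "now"
      }
    }
  },
  "size": 500,
  "sort": [{ "@timestamp": "desc" }]
}
```

Copying a request like this into **Console** lets you rerun and tweak it, for example to compare the hit count against what the visualization displays.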
**`explore-analyze/dashboards/using.md`**
#### View the requests that collect the data
**`explore-analyze/discover/discover-get-started.md`**
In this case **Discover** provides the default experience until it detects that you're interacting with a single type of data. For example, when you [](#look-inside-a-document).
### View active context-aware experience

You can check which experience is active for your current Discover session. This can help you confirm whether the type of data you're exploring is properly detected, or whether Discover is using its default experience.

1. Select **Inspect** from Discover's toolbar.

1. Open the **View** dropdown, then select **Profiles**.

The profiles listed show details such as the active solution and data source contexts, which determine Discover's context-aware experiences.

## Load data into Discover [find-the-data-you-want-to-use]

Select the data you want to explore, and then specify the time range in which to view that data.
Note that in ES|QL mode, the **Documents** tab is named **Results**.

: Set the default Kafka topic used for events sent by {{agent}}.

You can set a static topic, for example `elastic-agent`, or you can choose to set a topic dynamically based on an [Elastic Common Schema (ECS)](ecs://reference/index.md) field. Available fields include:

* `data_stream.type`
* `data_stream.dataset`
* `data_stream.namespace`
* `@timestamp`
* `event.dataset`

You can also set a custom field. This is useful if you need to construct a more complex or structured topic name. For example, you can use the `fields.kafka_topic` custom field to set a dynamic topic for each event.

To set a dynamic topic value for outputting {{agent}} data to Kafka, you can add the [`add_fields` processor](/reference/fleet/add_fields-processor.md) to any integration policies on your {{fleet}}-managed {{agents}}.

For example, the following `add_fields` processor creates a dynamic topic value for the `fields.kafka_topic` field by interpolating multiple [data stream fields](ecs://reference/ecs-data_stream.md):
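The processor definition itself is not shown in this excerpt. As a hedged sketch only, such an `add_fields` processor might look like the following; the `%{[field]}` interpolation syntax and the `fields` target are assumptions, not confirmed here:

```yaml
# Hypothetical sketch: adds a custom fields.kafka_topic field whose value
# interpolates the event's data stream fields,
# e.g. logs-nginx.access-production.
- add_fields:
    target: fields
    fields:
      kafka_topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}'
```

With a processor like this in place, pointing the output's topic setting at the custom field routes each event to a topic derived from its data stream.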
1. Depending on the values of the data stream fields, this generates topic names such as `logs-nginx.access-production` or `metrics-system.cpu-staging` as the value of the custom `kafka_topic` field.
**`reference/fleet/kafka-output.md`**

Use these options to set the Kafka topic for each {{agent}} event.

`topic`$$$kafka-topic-setting$$$
: The default Kafka topic used for produced events.

You can set a static topic, for example `elastic-agent`, or you can use a format string to set a topic dynamically based on an [Elastic Common Schema (ECS)](ecs://reference/index.md) field. Available fields include:

* `data_stream.type`
* `data_stream.dataset`
* `data_stream.namespace`
* `@timestamp`
* `event.dataset`

For example:

```yaml
topic: '${data_stream.type}'
```

You can also set a custom field. This is useful if you need to construct a more complex or structured topic name. For example, this configuration uses the `fields.kafka_topic` custom field to set the topic for each event:

```yaml
topic: '${fields.kafka_topic}'
```

To set a dynamic topic value for outputting {{agent}} data to Kafka, you can add the [`add_fields` processor](/reference/fleet/add_fields-processor.md) to the input configuration settings of your standalone {{agent}}.

For example, the following `add_fields` processor creates a dynamic topic value for the `fields.kafka_topic` field by interpolating multiple [data stream fields](ecs://reference/ecs-data_stream.md):
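The processor definition itself is not reproduced in this excerpt. As a hedged sketch only, it might be placed under an input's `processors` in the standalone {{agent}} policy like this; the input `type` and `id` shown are hypothetical, and the `%{[field]}` interpolation syntax is an assumption:

```yaml
# Hypothetical standalone {{agent}} input; only the add_fields processor
# is the point of this sketch. It sets fields.kafka_topic by interpolating
# the event's data stream fields, e.g. metrics-system.cpu-staging.
inputs:
  - type: filestream
    id: my-filestream-input
    processors:
      - add_fields:
          target: fields
          fields:
            kafka_topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}'
```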
1. Depending on the values of the data stream fields, this generates topic names such as `logs-nginx.access-production` or `metrics-system.cpu-staging` as the value of the custom `kafka_topic` field.
Don’t forget that the APM Server is stateless: multiple running instances don’t need to know about each other. This means that with a properly sized {{es}} instance, APM Server scales out linearly.