**File: manage-data/ingest.md** (2 additions, 2 deletions)
@@ -16,9 +16,9 @@ products:
  - id: elasticsearch
---

-# Ingestion
+# Bring your data to Elastic

-Bring your data! Whether you call it *adding*, *indexing*, or *ingesting* data, you have to get the data into {{es}} before you can search it, visualize it, and use it for insights.
+Whether you call it *adding*, *indexing*, or *ingesting* data, you have to get the data into {{es}} before you can search it, visualize it, and use it for insights.

Our ingest tools are flexible, and support a wide range of scenarios. We can help you with everything from popular and straightforward use cases, all the way to advanced use cases that require additional processing in order to modify or reshape your data before it goes to {{es}}.

-Depending on the type of data you want to ingest, you have a number of methods and tools available for use in your ingestion process. The table below provides more information about the available tools. Refer to our [Ingestion](/manage-data/ingest.md) overview for some guidelines to help you select the optimal tool for your use case.
+Depending on the type of data you want to ingest, you have a number of methods and tools available for use in your ingestion process. The table below provides more information about the available tools.
+
+Refer to our [Ingestion](/manage-data/ingest.md) overview for some guidelines to help you select the optimal tool for your use case.

<br>
@@ -49,14 +51,13 @@ Depending on the type of data you want to ingest, you have a number of methods a
| Integrations | Ingest data using a variety of Elastic integrations. |[Elastic Integrations](integration-docs://reference/index.md)|
| File upload | Upload data from a file and inspect it before importing it into {{es}}. |[Upload data files](/manage-data/ingest/upload-data-files.md)|
| APIs | Ingest data through code by using the APIs of one of the language clients or the {{es}} HTTP APIs. |[Document APIs](https://www.elastic.co/docs/api/doc/elasticsearch/group/endpoint-document)|
-| OpenTelemetry | Collect and send your telemetry data to Elastic Observability |[Elastic Distributions of OpenTelemetry](opentelemetry://reference/index.md)|
+| OpenTelemetry | Collect and send your telemetry data to Elastic Observability |[Elastic Distributions of OpenTelemetry](opentelemetry://reference/index.md).|
| Fleet and Elastic Agent | Add monitoring for logs, metrics, and other types of data to a host using Elastic Agent, and centrally manage it using Fleet. |[Fleet and {{agent}} overview](/reference/fleet/index.md) <br> [{{fleet}} and {{agent}} restrictions (Serverless)](/reference/fleet/fleet-agent-serverless-restrictions.md) <br> [{{beats}} and {{agent}} capabilities](/manage-data/ingest/tools.md)|
| {{elastic-defend}} | {{elastic-defend}} provides organizations with prevention, detection, and response capabilities with deep visibility for EPP, EDR, SIEM, and Security Analytics use cases across Windows, macOS, and Linux operating systems running on both traditional endpoints and public cloud environments. |[Configure endpoint protection with {{elastic-defend}}](/solutions/security/configure-elastic-defend.md)|
| {{ls}} | Dynamically unify data from a wide variety of data sources and normalize it into destinations of your choice with {{ls}}. |[Logstash](logstash://reference/index.md)|
| {{beats}} | Use {{beats}} data shippers to send operational data to Elasticsearch directly or through Logstash. |[{{beats}}](beats://reference/index.md)|
| APM | Collect detailed performance information on response time for incoming requests, database queries, calls to caches, external HTTP requests, and more. |[Application performance monitoring (APM)](/solutions/observability/apm/index.md)|
| Application logs | Ingest application logs using Filebeat, {{agent}}, or the APM agent, or reformat application logs into Elastic Common Schema (ECS) logs and then ingest them using Filebeat or {{agent}}. |[Stream application logs](/solutions/observability/logs/stream-application-logs.md) <br> [ECS formatted application logs](/solutions/observability/logs/ecs-formatted-application-logs.md)|
-| Connectors | Use connectors to extract data from an original data source and sync it to an {{es}} index. | [Ingest content with Elastic connectors
+| Connectors | Use connectors to extract data from an original data source and sync it to an {{es}} index. |[Ingest content with Elastic connectors](elasticsearch://reference/search-connectors/index.md) <br> [Connector clients](elasticsearch://reference/search-connectors/index.md)|
| Web crawler | Discover, extract, and index searchable content from websites and knowledge bases using the web crawler. |[Elastic Open Web Crawler](https://github.com/elastic/crawler#readme)|
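As a rough sketch of the APIs row above: the {{es}} `_bulk` endpoint accepts newline-delimited JSON, alternating an action line with a document line for each document. The index name and documents below are hypothetical, and the helper is illustrative rather than part of any client library.

```python
import json

def build_bulk_payload(index, docs):
    """Build an NDJSON body for the Elasticsearch _bulk API:
    one action line followed by one document line per document."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    # _bulk request bodies must end with a trailing newline.
    return "\n".join(lines) + "\n"

# Hypothetical documents; POST this body to <cluster>/_bulk
# with the Content-Type: application/x-ndjson header.
payload = build_bulk_payload("app-logs", [
    {"message": "user logged in", "log.level": "info"},
    {"message": "disk nearly full", "log.level": "warn"},
])
print(payload)
```

The same body works through the language clients' bulk helpers; building it by hand just makes the action/document pairing visible.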
**File: solutions/observability/logs.md** (36 additions, 8 deletions)
@@ -20,18 +20,29 @@ Elastic Observability allows you to deploy and manage logs at a petabyte scale,
* [Run pattern analysis on log data](/solutions/observability/logs/run-pattern-analysis-on-log-data.md): Find patterns in unstructured log messages and make it easier to examine your data.
* [Troubleshoot logs](/troubleshoot/observability/troubleshoot-logs.md): Find solutions for errors you might encounter while onboarding your logs.

## Send logs data to your project [observability-log-monitoring-send-logs-data-to-your-project]

-You can send logs data to your project in different ways depending on your needs:
+You can send logs data to your project in different ways depending on your needs. When choosing between these options, consider the different features and functionalities between them.
+
+Refer to [Ingest tools overview](/manage-data/ingest/tools.md) for more information on which option best fits your situation.
+
+::::{tab-set}
+
+:::{tab-item} {{edot}}
+
+The Elastic Distribution of OpenTelemetry (EDOT) Collector and SDKs provide native OpenTelemetry support for collecting logs, metrics, and traces. This approach is ideal for:

-* {{agent}}
-* {{filebeat}}
+* Native OpenTelemetry: When you want to use OpenTelemetry standards and are already using OpenTelemetry in your environment.
+* Full observability: When you need to collect logs, metrics, and traces from a single collector.
+* Modern applications: When building new applications with OpenTelemetry instrumentation.
+* Standards compliance: When you need to follow OpenTelemetry specifications.

-When choosing between {{agent}} and {{filebeat}}, consider the different features and functionalities between the two options. See [{{beats}} and {{agent}} capabilities](/manage-data/ingest/tools.md) for more information on which option best fits your situation.
+For more information, refer to [Elastic Distribution of OpenTelemetry](opentelemetry://reference/index.md).
+
+:::
+
-### {{agent}} [observability-log-monitoring-agent]
+:::{tab-item} {{agent}}

{{agent}} uses [integrations](https://www.elastic.co/integrations/data-integrations) to ingest logs from Kubernetes, MySQL, and many more data sources. You have the following options when installing and managing an {{agent}}:
@@ -45,7 +56,7 @@ See [install {{fleet}}-managed {{agent}}](/reference/fleet/install-fleet-managed
{{filebeat}} is a lightweight shipper for forwarding and centralizing log data. Installed as a service on your servers, {{filebeat}} monitors the log files or locations that you specify, collects log events, and forwards them to your Observability project for indexing.

* [{{filebeat}} overview](beats://reference/filebeat/index.md): General information on {{filebeat}} and how it works.
* [{{filebeat}} quick start](beats://reference/filebeat/filebeat-installation-configuration.md): Basic installation instructions to get you started.
* [Set up and run {{filebeat}}](beats://reference/filebeat/setting-up-running.md): Information on how to install, set up, and run {{filebeat}}.
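For orientation, a minimal `filebeat.yml` along these lines shows the shape of the configuration the links above describe. The paths, input id, and endpoint are placeholders, not values from this page:

```yaml
filebeat.inputs:
  - type: filestream
    id: app-logs            # unique id for this input (placeholder)
    paths:
      - /var/log/app/*.log  # placeholder path to monitor

output.elasticsearch:
  hosts: ["https://localhost:9200"]  # placeholder endpoint
  # api_key: "id:api_key"            # or username/password credentials
```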
+:::
+
+:::{tab-item} {{ls}}
+
+{{ls}} is a powerful data processing pipeline that can collect, transform, and enrich log data before sending it to Elasticsearch. It's ideal for:
+
+* Complex data processing: When you need to parse, filter, and transform logs before indexing.
+* Multiple data sources: When you need to collect logs from various sources and normalize them.
+* Advanced use cases: When you need data enrichment, aggregation, or routing to multiple destinations.
+* Extending Elastic integrations: When you want to add custom processing to data collected by Elastic Agent or Beats.
+
+For more information, refer to [Logstash](logstash://reference/index.md) and [Using Logstash with Elastic integrations](logstash://reference/using-logstash-with-elastic-integrations.md).
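As an illustration of the collect-transform-send flow described above, a minimal {{ls}} pipeline might look like the following. Every path, pattern, and endpoint here is a placeholder:

```
input {
  file {
    path => "/var/log/app/*.log"   # placeholder path
  }
}

filter {
  # Parse a timestamp, level, and free-text message out of each line.
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]   # placeholder endpoint
  }
}
```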
### Step 2: Install and start the {{agent}} [logs-stream-install-agent]
-After downloading and extracting the installation package, you’re ready to install the {{agent}}. From the agent directory, run the install command that corresponds with your system:
+After downloading and extracting the installation package, you're ready to install the {{agent}}. From the agent directory, run the install command that corresponds with your system:

::::{note}
On macOS, Linux (tar package), and Windows, run the `install` command to install and start {{agent}} as a managed service. The DEB and RPM packages include a service unit for Linux systems with systemd. For these systems, you must enable and start the service.
-Structured logs follow a predefined, repeatable pattern or structure. This structure is applied at write time — preventing the need for parsing at ingest time. The Elastic Common Schema (ECS) defines a common set of fields to use when structuring logs. This structure allows logs to be easily ingested, and provides the ability to correlate, search, and aggregate on individual fields within your logs.
+Structured logs follow a predefined, repeatable pattern or structure. This structure is applied at write time, preventing the need for parsing at ingest time. The Elastic Common Schema (ECS) defines a common set of fields to use when structuring logs. This structure allows logs to be ingested, and provides the ability to correlate, search, and aggregate on individual fields within your logs.
For example, the previous example logs might look like this when structured with ECS-compatible JSON:
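The JSON example itself is not included in this excerpt. For orientation only, an ECS-structured log line generally takes a shape like the following, with every value here being illustrative rather than taken from the page:

```json
{
  "@timestamp": "2025-01-01T12:00:00.000Z",
  "log.level": "warn",
  "message": "Disk usage exceeds 90%",
  "host.name": "web-01",
  "service.name": "my-app"
}
```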
@@ -92,15 +92,40 @@ Log sending is supported in the Java {{apm-agent}}.
Correlate your application logs with trace events to:

-* view the context of a log and the parameters provided by a user
-* view all logs belonging to a particular trace
-* easily move between logs and traces when debugging application issues
+* See the context of a log and the parameters provided by a user
+* See all logs belonging to a particular trace
+* Move between logs and traces when debugging application issues

Learn more about log correlation in the agent-specific ingestion guides:

+::::{tab-set}
+
+:::{tab-item} OpenTelemetry (EDOT)
+
+The {{edot}} (EDOT) provides SDKs for multiple programming languages with built-in support for log correlation: