diff --git a/manage-data/images/cloud-ec-python-logs-content.png b/manage-data/images/cloud-ec-python-logs-content.png index 875bd54d65..f639d5d5f0 100644 Binary files a/manage-data/images/cloud-ec-python-logs-content.png and b/manage-data/images/cloud-ec-python-logs-content.png differ diff --git a/manage-data/images/cloud-ec-python-logs-donut.png b/manage-data/images/cloud-ec-python-logs-donut.png index 2a8b8edd45..658ef728ba 100644 Binary files a/manage-data/images/cloud-ec-python-logs-donut.png and b/manage-data/images/cloud-ec-python-logs-donut.png differ diff --git a/manage-data/images/cloud-ec-python-logs-final-dashboard.png b/manage-data/images/cloud-ec-python-logs-final-dashboard.png index 86c208566c..d1a8019e67 100644 Binary files a/manage-data/images/cloud-ec-python-logs-final-dashboard.png and b/manage-data/images/cloud-ec-python-logs-final-dashboard.png differ diff --git a/manage-data/images/cloud-ec-python-logs-levels.png b/manage-data/images/cloud-ec-python-logs-levels.png index 71258ae857..a5e96a799b 100644 Binary files a/manage-data/images/cloud-ec-python-logs-levels.png and b/manage-data/images/cloud-ec-python-logs-levels.png differ diff --git a/manage-data/ingest/ingesting-data-from-applications/ingest-logs-from-python-application-using-filebeat.md b/manage-data/ingest/ingesting-data-from-applications/ingest-logs-from-python-application-using-filebeat.md index 63734e0250..a64260965a 100644 --- a/manage-data/ingest/ingesting-data-from-applications/ingest-logs-from-python-application-using-filebeat.md +++ b/manage-data/ingest/ingesting-data-from-applications/ingest-logs-from-python-application-using-filebeat.md @@ -11,58 +11,44 @@ products: # Ingest logs from a Python application using Filebeat -This guide demonstrates how to ingest logs from a Python application and deliver them securely into an {{ech}} deployment. You’ll set up Filebeat to monitor a JSON-structured log file that has standard Elastic Common Schema (ECS) formatted fields, and you’ll then view real-time visualizations of the log events in {{kib}} as they occur. While Python is used for this example, this approach to monitoring log output is applicable across many client types. Check the list of [available ECS logging plugins](ecs-logging://reference/intro.md). +In this guide, we show you how to ingest logs from a Python application and deliver them securely into an {{ech}} deployment. You’ll set up Filebeat to monitor a JSON-structured log file with fields formatted according to the Elastic Common Schema (ECS). You’ll then view real-time visualizations of the log events in {{kib}} as they occur. -*Time required: 1 hour* +While we use Python for this example, you can apply the same approach to monitoring log output across many client types. Check the list of [available ECS logging plugins](ecs-logging://reference/intro.md). We also use {{ech}} as the target {{stack}} destination for our logs, but with small modifications, you can adapt the steps in this guide to other deployments such as self-managed {{stack}} and {{ece}}. -## Prerequisites [ec_prerequisites_2] - -To complete these steps you need to have [Python](https://www.python.org/) installed on your system as well as the [Elastic Common Schema (ECS) logger](ecs-logging-python://reference/installation.md) for the Python logging library. 
+In this guide, you will: -To install *ecs-logging-python*, run: - -```sh -python -m pip install ecs-logging -``` +- [Create a Python script with logging](#ec-python-logs-create-script) +- [Prepare your connection and authentication details](#ec-authentication-details) +- [Set up Filebeat](#ec-python-logs-filebeat) +- [Send Python logs to {{es}}](#ec-python-logs-send-ess) +- [Create log visualizations in {{kib}}](#ec-python-logs-view-kibana) +_Time required: 1 hour_ -## Create a deployment [ec_get_elasticsearch_service_3] +## Prerequisites [ec_prerequisites_2] -::::{tab-set} +To complete the steps in this guide, you need to have: -:::{tab-item} Elastic Cloud Hosted -1. [Get a free trial](https://cloud.elastic.co/registration?page=docs&placement=docs-body). -2. Log into [Elastic Cloud](https://cloud.elastic.co?page=docs&placement=docs-body). -3. Select **Create deployment**. -4. Give your deployment a name. You can leave all other settings at their default values. -5. Select **Create deployment** and save your Elastic deployment credentials. You need these credentials later on. -6. When the deployment is ready, click **Continue** and a page of **Setup guides** is displayed. To continue to the deployment homepage click **I’d like to do something else**. +- An {{ech}} deployment with the _Elastic for Observability_ solution view and the superuser credentials provided at deployment creation. For more details, see [Create an {{ech}} deployment](../../../deploy-manage/deploy/elastic-cloud/create-an-elastic-cloud-hosted-deployment.md). +- A [Python](https://www.python.org/) version installed which is compatible with the ECS logging library for Python. For a list of compatible Python versions, check the library's [README](https://github.com/elastic/ecs-logging-python/blob/main/README.md). +- The [ECS logging library for Python](ecs-logging-python://reference/index.md) installed. -Prefer not to subscribe to yet another service? You can also get {{ech}} through [AWS, Azure, and GCP marketplaces](../../../deploy-manage/deploy/elastic-cloud/subscribe-from-marketplace.md). -::: +To install the ECS logging library for Python, run: -:::{tab-item} Elastic Cloud Enterprise -1. Log into the Elastic Cloud Enterprise admin console. -2. Select **Create deployment**. -3. Give your deployment a name. You can leave all other settings at their default values. -4. Select **Create deployment** and save your Elastic deployment credentials. You need these credentials later on. -5. When the deployment is ready, click **Continue** and a page of **Setup guides** is displayed. To continue to the deployment homepage click **I’d like to do something else**. -::: +```sh +python -m pip install ecs-logging +``` +::::{note} +Depending on the Python version you're using, you may need to install the library in a [Python virtual environment](https://docs.python.org/3/library/venv.html). :::: -## Connect securely [ec_connect_securely_2] - -When connecting to {{ech}} or {{ece}}, you can use a Cloud ID to specify the connection details. Find your Cloud ID by going to the {{kib}} main menu and selecting Management > Integrations, and then selecting View deployment details. - -To connect to, stream data to, and issue queries, you need to think about authentication. Two authentication mechanisms are supported, *API key* and *basic authentication*. Here, to get you started quickly, we’ll show you how to use basic authentication, but you can also generate API keys as shown later on. 
API keys are safer and preferred for production environments. - ## Create a Python script with logging [ec-python-logs-create-script] -In this step, you’ll create a Python script that generates logs in JSON format, using Python’s standard logging module. +In this step, you’ll create a Python script that generates logs in JSON format using Python’s standard logging module. -1. In a local directory, create a new file *elvis.py* and save it with these contents: +1. In a local directory, create a new file named `elvis.py`, and save it with these contents: ```python #!/usr/bin/python @@ -112,42 +98,97 @@ In this step, you’ll create a Python script that generates logs in JSON format time.sleep(random2) ``` - This Python script randomly generates one of twelve log messages, continuously, at a random interval of between 1 and 10 seconds. The log messages are written to file `elvis.json`, each with a timestamp, a log level of *info*, *warning*, *error*, or *critical*, and other data. Just to add some variance to the log data, the *info* message *Elvis has left the building* is set to be the most probable log event. + This Python script randomly generates one of twelve log messages, continuously, at a random interval of between 1 and 10 seconds. The log messages are written to an `elvis.json` file, each with a timestamp, a log level of _info_, _warning_, _error_, or _critical_, and other data. To add some variance to the log data, the _info_ message _Elvis has left the building_ is set to be the most probable log event. - For simplicity, there is just one log file and it is written to the local directory where `elvis.py` is located. In a production environment you may have multiple log files, associated with different modules and loggers, and likely stored in `/var/log` or similar. To learn more about configuring logging in Python, check [Logging facility for Python](https://docs.python.org/3/library/logging.html). + For simplicity, there is just one log file (`elvis.json`), and it is written to the local directory where `elvis.py` is located. In a production environment, you may have multiple log files associated with different modules and loggers and likely stored in `/var/log` or similar. To learn more about configuring logging in Python, check [Logging facility for Python](https://docs.python.org/3/library/logging.html). - Having your logs written in a JSON format with ECS fields allows for easy parsing and analysis, and for standardization with other applications. A standard, easily parsible format becomes increasingly important as the volume and type of data captured in your logs expands over time. + Having your logs written in a JSON format with ECS fields allows for easy parsing and analysis, and for standardization with other applications. A standard, easily parsable format becomes increasingly important as the volume and type of data captured in your logs expands over time. - Together with the standard fields included for each log entry is an extra *http.request.body.content* field. This extra field is there just to give you some additional, interesting data to work with, and also to demonstrate how you can add optional fields to your log data. Check the [ECS Field Reference](ecs://reference/ecs-field-reference.md) for the full list of available fields. + Together with the standard fields included for each log entry is an extra `http.request.body.content` field. 
This extra field is there to give you some additional, interesting data to work with, and also to demonstrate how you can add optional fields to your log data. Check the [ECS field reference](ecs://reference/ecs-field-reference.md) for the full list of available fields. -2. Let’s give the Python script a test run. Open a terminal instance in the location where you saved *elvis.py* and run the following: +2. Let’s give the Python script a test run. Open a terminal instance in the location where you saved `elvis.py`, and run the following: ```sh python elvis.py ``` - After the script has run for about 15 seconds, enter *CTRL + C* to stop it. Have a look at the newly generated *elvis.json*. It should contain one or more entries like this one: + After the script has run for about 15 seconds, enter _CTRL + C_ to stop it. Have a look at the newly generated `elvis.json` file. It should contain one or more entries like this one: + + ```json + {"@timestamp":"2025-06-16T02:19:34.687Z","log.level":"info","message":"Elvis has left the building.","ecs":{"version":"1.6.0"},"http":{"request":{"body":{"content":"Elvis has left the building."}}},"log":{"logger":"app","origin":{"file":{"line":39,"name":"elvis.py"},"function":""},"original":"Elvis has left the building."},"process":{"name":"MainProcess","pid":3044,"thread":{"id":4444857792,"name":"MainThread"}}} + ``` + +3. After confirming that `elvis.py` runs as expected, you can delete `elvis.json`. + + +## Prepare your connection and authentication details [ec-authentication-details] + +To connect to your {{ech}} deployment, stream data, and issue queries, you have to specify the connection details using your deployment's Cloud ID, and you have to authenticate using either _basic authentication_ or an _API key_. + +### Cloud ID + +To find the [Cloud ID](/deploy-manage/deploy/elastic-cloud/find-cloud-id.md) of your deployment, go to the {{kib}} main menu, then select **Management** → **Integrations** → **Connection details**. Note that the Cloud ID value is in the format `deployment-name:hash`. Save this value to use it later. + +### Basic authentication + +To authenticate and send data to {{ech}}, you can use the username and password you saved when you created your deployment. We use this method to set up the Filebeat connection in the [Configure Filebeat to access {{ech}}](#ec-configure-access) section. + +### API key + +You can also generate an [API key](/deploy-manage/api-keys.md) through the {{ech}} console, and configure Filebeat to use the new key to connect securely to your deployment. API keys are the preferred method for connecting to production environments. + +To create an API key for Filebeat: + +1. Log in to the [{{ecloud}} Console](https://cloud.elastic.co?page=docs&placement=docs-body), and select your deployment. +2. In the main menu, go to **Developer tools**. +3. Enter the following request: + + ```json + POST /_security/api_key + { + "name": "filebeat-api-key", + "role_descriptors": { + "logstash_read_write": { + "cluster": ["manage_index_templates", "monitor"], + "index": [ + { + "names": ["filebeat-*"], + "privileges": ["create_index", "write", "read", "manage"] + } + ] + } + } + } + ``` + + This request creates an API key with the cluster `monitor` privilege, which gives read-only access for determining the cluster state, and `manage_index_templates` privilege, which allows all operations on index templates. Additional privileges allow `create_index`, `write`, and `manage` operations for the specified index (`filebeat-*`). 
The index `manage` privilege is added to enable index refreshes.

+4. Click **▶** to run the request. The output should be similar to the following:

     ```json
-    {"@timestamp":"2021-06-16T02:19:34.687Z","log.level":"info","message":"Elvis has left the building.","ecs":{"version":"1.6.0"},"http":{"request":{"body":{"content":"Elvis has left the building."}}},"log":{"logger":"app","origin":{"file":{"line":39,"name":"elvis.py"},"function":""},"original":"Elvis has left the building."},"process":{"name":"MainProcess","pid":3044,"thread":{"id":4444857792,"name":"MainThread"}}}
+    {
+      "api_key": "tV1dnfF-GHI59ykgv4N0U3",
+      "id": "2TBR42gBabmINotmvZjv",
+      "name": "filebeat-api-key"
+    }
     ```

-3. After confirming that *elvis.py* runs as expected, you can delete *elvis.json*.

+Learn how to set up the API key in the Filebeat configuration in the [Configure an API key](#ec-configure-api-key) section.
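+Before wiring the key into Filebeat, you can optionally confirm that it works. The following is a minimal sketch, not part of the official setup: it assumes Python from the prerequisites, a placeholder {{es}} endpoint URL (copy yours from the deployment overview page), and the example `id` and `api_key` values from the response above. {{es}} accepts an `Authorization: ApiKey` header containing the Base64 encoding of `id:api_key`:
+
+```python
+import base64
+import json
+from urllib.request import Request, urlopen
+
+# Placeholder values -- substitute the "id" and "api_key" fields returned by
+# the create-API-key request, and your own deployment's Elasticsearch endpoint.
+API_KEY_ID = "2TBR42gBabmINotmvZjv"
+API_KEY_SECRET = "tV1dnfF-GHI59ykgv4N0U3"
+ES_URL = "https://my-deployment.es.us-central1.gcp.cloud.es.io"
+
+# The ApiKey authorization scheme takes base64("id:api_key").
+token = base64.b64encode(f"{API_KEY_ID}:{API_KEY_SECRET}".encode()).decode()
+
+request = Request(ES_URL, headers={"Authorization": f"ApiKey {token}"})
+with urlopen(request) as response:
+    info = json.load(response)
+
+# A successful response means the key authenticated; print the cluster version.
+print(info["version"]["number"])
+```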
## Set up Filebeat [ec-python-logs-filebeat]

-Filebeat offers a straightforward, easy to configure way to monitor your Python log files and port the log data into your deployment.
+Filebeat offers a straightforward, easy-to-configure way to monitor your Python log files, and port the log data into your deployment.

-**Get Filebeat**
+### Get Filebeat

-[Download Filebeat](https://www.elastic.co/downloads/beats/filebeat) and unpack it on the local server from which you want to collect data.
+[Download Filebeat](https://www.elastic.co/downloads/beats/filebeat), then unpack it on the machine where you created the `elvis.py` script.

-**Configure Filebeat to access {{ech}} or {{ece}}**
+### Configure Filebeat to access {{ech}} [ec-configure-access]

-In */filebeat-/* (where ** is the directory where Filebeat is installed and ** is the Filebeat version number), open the *filebeat.yml* configuration file for editing.
+Go to the directory where you unpacked Filebeat, and open the `filebeat.yml` configuration file. In the **Elastic Cloud** section, make the following modifications to set up basic authentication:

-```txt
+```yml
 # =============================== Elastic Cloud ================================

 # These settings simplify using Filebeat with the Elastic Cloud (https://cloud.elastic.co/).

@@ -155,69 +196,101 @@ In */filebeat-/* (where ** is the directory where
 # The cloud.id setting overwrites the `output.elasticsearch.hosts` and
 # `setup.kibana.host` options.
 # You can find the `cloud.id` in the Elastic Cloud web UI.
-cloud.id: my-deployment:long-hash <1>
+cloud.id: deployment-name:hash <1>

 # The cloud.auth setting overwrites the `output.elasticsearch.username` and
 # `output.elasticsearch.password` settings. The format is `:`.
-cloud.auth: elastic:password <2>
+cloud.auth: username:password <2>
 ```

-1. Uncomment the `cloud.id` line and add the deployment’s Cloud ID. You can include or omit the *:* prefix at the beginning of the Cloud ID. Both versions work fine. Find your Cloud ID by going to the {{kib}} main menu and selecting Management > Integrations, and then selecting View deployment details.
-2. Uncomment the `cloud.auth` line and add the username and password for your deployment that you recorded when you created your deployment. The format is *:*, for example *elastic:57ugj782kvkwmSKg8uVe*.
+1. Uncomment the `cloud.id` line, and add the deployment’s Cloud ID as the key's value. Note that the `cloud.id` value is in the format `deployment-name:hash`. Find your Cloud ID by going to the {{kib}} main menu, and selecting **Management** → **Integrations** → **Connection details**.
+2. Uncomment the `cloud.auth` line, and add the username and password for your deployment in the format `username:password`. For example, `cloud.auth: elastic:57ugj782kvkwmSKg8uVe`.

+::::{note}
+As an alternative to configuring the connection using [`cloud.id` and `cloud.auth`](beats://reference/filebeat/configure-cloud-id.md), you can specify the {{es}} URL and authentication details directly in the [{{es}} output](beats://reference/filebeat/elasticsearch-output.md). This is useful when connecting to a different deployment type, such as a self-managed cluster.
+::::

+#### Configure an API key [ec-configure-api-key]

-Filebeat has several ways to collect logs. For this example, you’ll configure log collection manually.
+To authenticate with an _API key_, leave the `cloud.auth` line commented out; Filebeat will use the API key instead of the deployment credentials to authenticate.

-In the *filebeat.inputs* section of *filebeat.yml*, set *enabled:* to *true*, and set *paths:* to the location of your log file or files. In this example, set the same directory where you saved *elvis.py*:
+In the `output.elasticsearch` section of `filebeat.yml`, uncomment the `api_key` line, and add the API key you've created for Filebeat. The format of the value is `id:api_key`, where `id` and `api_key` are the values returned by the [Create API key API](https://www.elastic.co/docs/api/doc/elasticsearch/operation/operation-security-create-api-key).

+Using the example values returned by the `POST` request we used earlier, the configuration for API key authentication would look like this:

-```txt
+```yml
+cloud.id: my-deployment:yTMtd5VzdKEuP2NwPbNsb3VkLtKzLmldJDcyMzUyNjBhZGP7MjQ4OTZiNTIxZTQyOPY2C2NeOGQwJGQ2YWQ4M5FhNjIyYjQ9ODZhYWNjKDdlX2Yz4ELhRYJ7
+#cloud.auth:
+
+output.elasticsearch:
+  ...
+  api_key: "2TBR42gBabmINotmvZjv:tV1dnfF-GHI59ykgv4N0U3"
+```

+### Configure Filebeat inputs

+Filebeat has several ways to collect logs. For this example, you’ll configure log collection manually. In the `filebeat.inputs` section of `filebeat.yml`:

+```yml
 filebeat.inputs:

 # Each - is an input. Most options can be set at the input level, so
 # you can use different inputs for various configurations.
-# Below are the input specific configurations.
+# Below are the input-specific configurations.
+
+# filestream is an input for collecting log messages from files.
+- type: filestream

-- type: log
+  # Unique ID among all inputs, an ID is required.
+  id: my-filestream-id

   # Change to true to enable this input configuration.
-  enabled: true
+  enabled: true <1>

   # Paths that should be crawled and fetched. Glob based paths.
   paths:
-    - /path/to/log/files/*.json
+    - /path/to/log/files/*.json <2>
 ```

+1. Set `enabled: true`.
+2. Set `paths` to the location of your log files. For this example, set `paths` to the directory where you saved `elvis.py`.

-You can specify a wildcard (*\**) character to indicate that all log files in the specified directory should be read. You can also use a wildcard to read logs from multiple directories. For example `/var/log/*/*.log`.
+You can use a wildcard (`*`) character to indicate that all log files in the specified directory should be read. You can also use a wildcard to read logs from multiple directories. For example, `/var/log/*/*.log`.

-**Add the JSON input options**

-Filebeat’s input configuration options include several settings for decoding JSON messages. Log files are decoded line by line, so it’s important that they contain one JSON object per line.
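+If you're not sure which files a glob-based path will pick up, you can preview the matches before starting Filebeat. This is an optional sketch, assuming the placeholder pattern from the example configuration above; Python's `glob` module expands `*` wildcards in broadly the same way as Filebeat's glob-based paths:
+
+```python
+import glob
+
+# Placeholder pattern -- use the same value you set under `paths` in filebeat.yml.
+pattern = "/path/to/log/files/*.json"
+
+# Print every file the wildcard currently matches, in sorted order.
+for path in sorted(glob.glob(pattern)):
+    print(path)
+```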
+### Add the JSON input options

-For this example, Filebeat uses the following four decoding options.
+Filebeat’s `filestream` input configuration includes several options for decoding logs structured as JSON messages. You can set these options in `parsers.ndjson`. Filebeat processes the logs line by line, so it’s important that they contain one JSON object per line.

-```txt
-  json.keys_under_root: true
-  json.overwrite_keys: true
-  json.add_error_key: true
-  json.expand_keys: true
-```
+For this example, set Filebeat to use the `ndjson` parser with the following decoding options:

+```yml
+  parsers:
+    - ndjson:
+      target: ""
+      overwrite_keys: true
+      expand_keys: true
+      add_error_key: true
+      message_key: msg
+```

-To learn more about these settings, check [JSON input configuration options](beats://reference/filebeat/filebeat-input-log.md#filebeat-input-log-config-json) and [Decode JSON fields](beats://reference/filebeat/decode-json-fields.md) in the Filebeat Reference.
+To learn more about these settings, check the [`ndjson` parser's configuration options](beats://reference/filebeat/filebeat-input-filestream.md#filebeat-input-filestream-ndjson) and [Decode JSON fields](beats://reference/filebeat/decode-json-fields.md) in the Filebeat reference.

-Append the four JSON decoding options to the *Filebeat inputs* section of *filebeat.yml*, so that the section now looks like this:
+Append the `parsers.ndjson` block with the decoding options set above to the `filebeat.inputs` section of `filebeat.yml`, so that the section now looks like this:

-```yaml
+```yml
 # ============================== Filebeat inputs ===============================

 filebeat.inputs:

 # Each - is an input. Most options can be set at the input level, so
 # you can use different inputs for various configurations.
-# Below are the input specific configurations.
+# Below are the input-specific configurations.

-- type: log
+# filestream is an input for collecting log messages from files.
+- type: filestream
+
+  # Unique ID among all inputs, an ID is required.
+  id: my-filestream-id

   # Change to true to enable this input configuration.
   enabled: true

@@ -225,204 +298,145 @@ filebeat.inputs:
   # Paths that should be crawled and fetched. Glob based paths.
   paths:
     - /path/to/log/files/*.json
-    json.keys_under_root: true
-    json.overwrite_keys: true
-    json.add_error_key: true
-    json.expand_keys: true
+    #- c:\programdata\elasticsearch\logs\*
+
+  parsers:
+    - ndjson:
+      target: ""
+      overwrite_keys: true
+      expand_keys: true
+      add_error_key: true
+      message_key: msg
 ```

-**Finish setting up Filebeat**
+### Finish setting up Filebeat

-Filebeat comes with predefined assets for parsing, indexing, and visualizing your data. To load these assets, run the following from the Filebeat installation directory:
+Filebeat comes with predefined assets for parsing, indexing, and visualizing your data. To load these assets into {{kib}} on your {{ech}} deployment, run the following from the Filebeat installation directory:

-```txt
+```sh
 ./filebeat setup -e
 ```

 ::::{important}
-Depending on variables including the installation location, environment, and local permissions, you might need to [change the ownership](beats://reference/libbeat/config-file-permissions.md) of filebeat.yml. You can also try running the command as *root*: *sudo ./filebeat setup -e* or you can disable strict permission checks by running the command with the `--strict.perms=false` option.
+Depending on variables including the installation location, environment, and local permissions, you might need to [change the ownership](beats://reference/libbeat/config-file-permissions.md) of `filebeat.yml`. You can also try running the command as `root`: `sudo ./filebeat setup -e`, or you can disable strict permission checks by running the command with the `--strict.perms=false` option.
 ::::

-
-The setup process takes a couple of minutes. If everything goes successfully you should get a confirmation message:
+The setup process takes a couple of minutes. If the setup is successful, you should get a confirmation message:

 ```txt
 Loaded Ingest pipelines
 ```

-The Filebeat data view (formerly *index pattern*) is now available in Elasticsearch. To verify:
+The Filebeat data view (formerly _index pattern_) is now available in {{es}}. To verify:

 ::::{note}
-Beginning with Elastic Stack version 8.0, Kibana *index patterns* have been renamed to *data views*. To learn more, check the Kibana [What’s new in 8.0](https://www.elastic.co/guide/en/kibana/8.0/whats-new.html#index-pattern-rename) page.
+Beginning with Elastic Stack version 8.0, Kibana _index patterns_ have been renamed to _data views_. To learn more, check the {{kib}} [What’s new in 8.0](https://www.elastic.co/guide/en/kibana/8.0/whats-new.html#index-pattern-rename) page.
 ::::

-1. [Login to Kibana](../../../deploy-manage/deploy/elastic-cloud/access-kibana.md).
-2. Open the {{kib}} main menu and select **Management** > **{{kib}}** > **Data views**.
-3. In the search bar, search for *filebeat*. You should get _filebeat-*_ in the search results.
+1. Log in to the [{{ecloud}} Console](https://cloud.elastic.co?page=docs&placement=docs-body), and select your deployment.
+2. In the main menu, select **Management** → **Stack Management** → **Data Views**.
+3. In the search bar, search for _filebeat_. You should see _filebeat-*_ in the search results.

+Filebeat is now set to collect log messages and stream them to your deployment.

-**Optional: Use an API key to authenticate**

-For additional security, instead of using basic authentication you can generate an Elasticsearch API key through the through the {{ech}} or {{ece}} console, and then configure Filebeat to use the new key to connect securely to your deployment.

-1. For {{ech}}, log into [{{ecloud}} Console](https://cloud.elastic.co?page=docs&placement=docs-body), or for {{ece}}, log into the admin console.
-2. Select the deployment name and go to **☰** > **Management** > **Dev Tools**.
-3. Enter the following request:
-
-    ```json
-    POST /_security/api_key
-    {
-      "name": "filebeat-api-key",
-      "role_descriptors": {
-        "logstash_read_write": {
-          "cluster": ["manage_index_templates", "monitor"],
-          "index": [
-            {
-              "names": ["filebeat-*"],
-              "privileges": ["create_index", "write", "read", "manage"]
-            }
-          ]
-        }
-      }
-    }
-    ```
-
-    This creates an API key with the cluster `monitor` privilege which gives read-only access for determining the cluster state, and `manage_index_templates` which allows all operations on index templates. Some additional privileges also allow `create_index`, `write`, and `manage` operations for the specified index. The index `manage` privilege is added to enable index refreshes.
-4. Click **▶**. The output should be similar to the following:

+## Send Python logs to {{es}} [ec-python-logs-send-ess]

+It’s time to send some log data into {{es}}.

+### Launch Filebeat and `elvis.py`

+In a new terminal:
-    ```json
-    {
-      "api_key": "tV1dnfF-GHI59ykgv4N0U3",
-      "id": "2TBR42gBabmINotmvZjv",
-      "name": "filebeat-api-key"
-    }
-    ```
-
-5. Add your API key information to the *Elasticsearch Output* section of `filebeat.yml`, just below *output.elasticsearch:*. Use the format `:`. If your results are as shown in this example, enter `2TBR42gBabmINotmvZjv:tV1dnfF-GHI59ykgv4N0U3`.
-6. Add a pound (`#`) sign to comment out the *cloud.auth: elastic:* line, since Filebeat will use the API key instead of the deployment username and password to authenticate.
-
-    ```txt
-    # =============================== Elastic Cloud ================================
-
-    # These settings simplify using Filebeat with the Elastic Cloud (https://cloud.elastic.co/).
-
-    # The cloud.id setting overwrites the `output.elasticsearch.hosts` and
-    # `setup.kibana.host` options.
-    # You can find the `cloud.id` in the Elastic Cloud web UI.
-    cloud.id: my-deployment:yTMtd5VzdKEuP2NwPbNsb3VkLtKzLmldJDcyMzUyNjBhZGP7MjQ4OTZiNTIxZTQyOPY2C2NeOGQwJGQ2YWQ4M5FhNjIyYjQ9ODZhYWNjKDdlX2Yz4ELhRYJ7
-
-    # The cloud.auth setting overwrites the `output.elasticsearch.username` and
-    # `output.elasticsearch.password` settings. The format is `:`.
-    #cloud.auth: elastic:591KhtuAgTP46by9C4EmhGuk
-
-    # ================================== Outputs ===================================
-
-    # Configure what output to use when sending the data collected by the beat.
-
-    # ---------------------------- Elasticsearch Output ----------------------------
-    output.elasticsearch:
-      # Array of hosts to connect to.
-      api_key: "2TBR42gBabmINotmvZjv:tV1dnfF-GHI59ykgv4N0U3"
-    ```

-## Send the Python logs to Elasticsearch [ec-python-logs-send-ess]

-It’s time to send some log data into E{{es}}!

-**Launch Filebeat and elvis.py**

-Launch Filebeat by running the following from the Filebeat installation directory:

-```txt
-./filebeat -e -c filebeat.yml
-```

-In this command:

-* The *-e* flag sends output to the standard error instead of the configured log output.
-* The *-c* flag specifies the path to the Filebeat config file.

-::::{note}
-Just in case the command doesn’t work as expected, check the [Filebeat quick start](beats://reference/filebeat/filebeat-installation-configuration.md#installation) for the detailed command syntax for your operating system. You can also try running the command as *root*: *sudo ./filebeat -e -c filebeat.yml*.
-::::

+1. Navigate to the directory where you created the `elvis.py` Python script, then run it:

+    ```sh
+    python elvis.py
+    ```

+    Let the script run for a few minutes, then make sure that the `elvis.json` file is generated and is populated with several log entries.

+2. Launch Filebeat by running the following from the Filebeat installation directory:

+    ```sh
+    ./filebeat -c filebeat.yml -e
+    ```

+    In this command:

+    * The `-e` flag sends output to standard error instead of the configured log output.
+    * The `-c` flag specifies the path to the Filebeat config file.

+    ::::{note}
+    If the command doesn't work as expected, check the [Filebeat quick start](beats://reference/filebeat/filebeat-installation-configuration.md#installation) for the detailed command syntax for your operating system. You can also try running the command as `root`: `sudo ./filebeat -c filebeat.yml -e`.
+    ::::

+    Filebeat should now be running and monitoring the contents of the `elvis.json` file.
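+Before you move on, you can spot-check the file Filebeat is reading. This is an optional sketch (assuming `elvis.json` is in the current directory) that pretty-prints the most recent entry so you can confirm the ECS fields look as expected:
+
+```python
+import json
+
+# Read the newline-delimited JSON log file that Filebeat is monitoring.
+with open("elvis.json") as log_file:
+    entries = [line for line in log_file if line.strip()]
+
+# Pretty-print the most recent log entry; ecs-logging writes dotted keys
+# such as "log.level" at the top level of each JSON object.
+latest = json.loads(entries[-1])
+print(json.dumps(latest, indent=2, sort_keys=True))
+print(f"{len(entries)} entries so far; latest level: {latest['log.level']}")
+```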
-:::: - - -Filebeat should now be running and monitoring the contents of *elvis.json*, which actually doesn’t exist yet. So, let’s create it. Open a new terminal instance and run the *elvis.py* Python script: - -```sh -python elvis.py -``` - -Let the script run for a few minutes and maybe brew up a quick coffee or tea ☕ . After that, make sure that the *elvis.json* file is generated as expected and is populated with several log entries. - -**Verify the log entries** - -The next step is to confirm that the log data has successfully found it’s way into your deployment. - -1. [Login to Kibana](../../../deploy-manage/deploy/elastic-cloud/access-kibana.md). -2. Open the {{kib}} main menu and select **Management** > **{{kib}}** > **Data views**. -3. In the search bar, search for *filebeat_. You should get _filebeat-*_ in the search results. -4. Select _filebeat-*_. +1. Log in to the [{{ecloud}} Console](https://cloud.elastic.co?page=docs&placement=docs-body), and select your deployment. +2. In the main menu, select **Management** → **Stack Management** → **Data Views**. +3. In the search bar, search for _filebeat_, then select _filebeat-*_. The filebeat data view shows a list of fields and their details. +## Create log visualizations in {{kib}} [ec-python-logs-view-kibana] -## Create log visualizations in Kibana [ec-python-logs-view-kibana] - -Now it’s time to create visualizations based off of the Python application log data. +Now you can create visualizations based off of the Python application log data: -1. Open the Kibana main menu and select **Dashboard**, then **Create dashboard**. +1. In the main menu, select **Dashboards** → **Create dashboard**. 2. Select **Create visualization**. The [Lens](../../../explore-analyze/visualize/lens.md) visualization editor opens. -3. In the data view dropdown box, select *_filebeat-*_, if it isn’t already selected. -4. In the **Visualization type dropdown**, select **Bar vertical stacked**, if it isn’t already selected. +3. In the **Data view** dropdown box, select _filebeat-*_, if it isn’t already selected. +4. In the menu for setting the visualization type, select **Bar** and **Stacked**, if they aren’t already selected. 5. Check that the [time filter](../../../explore-analyze/query-filter/filtering.md) is set to **Last 15 minutes**. -6. From the **Available fields** list, drag and drop the **@timestamp** field onto the visualization builder. -7. Drag and drop the *log.level* field onto the visualization builder. -8. In the chart settings area, under **Break down by**, select **Top values of log.level** and set **Number of values** to *4*. Since there are four log severity levels, this parameter sets all of them to appear in the chart legend. -9. Select **Refresh**. A stacked bar chart now shows the relative frequency of each of the four log severity levels over time. +6. From the **Available fields** list, drag and drop the `@timestamp` field onto the visualization builder. +7. Drag and drop the `log.level` field onto the visualization builder. +8. In the chart settings area under **Breakdown**, select **Top values of log.level**. +9. Set the **Number of values** field to _4_ to display all of four levels of severity in the chart legend. +10. Select **Refresh**. A stacked bar chart now shows the relative frequency of each of the four log severity levels over time. ![A screen capture of the Kibana "Bar vertical stacked" visualization with several bars. The X axis shows "Count of records" and the Y axis shows "@timestamp per 30 seconds". 
Each bar is divided into the four log severity levels.](/manage-data/images/cloud-ec-python-logs-levels.png "") -10. Select **Save and return** to add this visualization to your dashboard. +11. Select **Save and return** to add this visualization to your dashboard. -Let’s create a second visualization. +Let’s create a second visualization: 1. Select **Create visualization**. -2. Again, make sure that **Visualization type dropdown** is set to **Bar vertical stacked**. -3. From the **Available fields** list, drag and drop the **@timestamp** field onto the visualization builder. -4. Drag and drop the **http.request.body.content** field onto the visualization builder. -5. In the chart settings area, under **Break down by**, select **Top values of http.request.body.content** and set **Number of values** to *12*. Since there are twelve different log messages, this parameter sets all of them to appear in the chart legend. -6. Select **Refresh**. A stacked bar chart now shows the relative frequency of each of the log messages over time. +2. In the menu for setting the visualization type, select **Bar** and **Stacked**, if they aren’t already selected. +3. From the **Available fields** list, drag and drop the `@timestamp` field onto the visualization builder. +4. Drag and drop the `http.request.body.content` field onto the visualization builder. +5. In the chart settings area under **Breakdown**, select **Top values of http.request.body.content**. +6. Set the **Number of values** to _12_ to display all twelve log messages in the chart legend. +7. Select **Refresh**. A stacked bar chart now shows the relative frequency of each of the log messages over time. ![A screen capture of the visualization builder](/manage-data/images/cloud-ec-python-logs-content.png "") -7. Select **Save and return** to add this visualization to your dashboard. +8. Select **Save and return** to add this visualization to your dashboard. -And now for the final visualization. +Now, create one final visualization: 1. Select **Create visualization**. -2. In the **Visualization type dropdown** dropdown, select **Donut**. -3. From the list of available fields, drag and drop the **log.level** field onto the visualization builder. A donut chart appears. +2. In the menu for setting the visualization type, select **Pie**. +3. From the **Available fields** list, drag and drop the `log.level` field onto the visualization builder. A pie chart appears. - ![A screen capture of a donut chart divided into four sections](/manage-data/images/cloud-ec-python-logs-donut.png "") + ![A screen capture of a pie chart divided into four sections](/manage-data/images/cloud-ec-python-logs-donut.png "") 4. Select **Save and return** to add this visualization to your dashboard. 5. Select **Save** and add a title to save your new dashboard. -You now have a Kibana dashboard with three visualizations: a stacked bar chart showing the frequency of each log severity level over time, another stacked bar chart showing the frequency of various message strings over time (from the added *http.request.body.content* parameter), and a donut chart showing the relative frequency of each log severity type. +You now have a {{kib}} dashboard with three visualizations: a stacked bar chart showing the frequency of each log severity level over time, another stacked bar chart showing the frequency of various message strings over time (from the added `http.request.body.content` parameter), and a pie chart showing the relative frequency of each log severity type. 
You can add titles to the visualizations, resize and position them as you like, and then save your changes. -**View log data updates in real time** +### View log data in real time -1. Select **Refresh** on the Kibana dashboard. Since *elvis.py* continues to run and generate log data, your Kibana visualizations update with each refresh. +1. Select **Refresh** on the {{kib}} dashboard. Because `elvis.py` continues to run and generate log data, your {{kib}} visualizations update with each refresh. ![A screen capture of the completed Kibana dashboard](/manage-data/images/cloud-ec-python-logs-final-dashboard.png "") -2. As your final step, remember to stop Filebeat and the Python script. Enter *CTRL + C* in both your Filebeat terminal and in your `elvis.py` terminal. +2. As a final step, remember to stop Filebeat and the Python script. Enter _CTRL + C_ in both your Filebeat terminal and in your `elvis.py` terminal. -You now know how to monitor log files from a Python application, deliver the log event data securely into an {{ech}} or {{ece}} deployment, and then visualize the results in Kibana in real time. Consult the [Filebeat documentation](beats://reference/filebeat/index.md) to learn more about the ingestion and processing options available for your data. You can also explore our [documentation](../../../manage-data/ingest.md) to learn all about all about ingesting data. +You now know how to monitor log files from a Python application, deliver the log event data securely into an {{ech}} deployment, and then visualize the results in {{kib}} in real time. Consult the [Filebeat documentation](beats://reference/filebeat/index.md) to learn more about the Filebeat ingestion and processing options available for your data. You can also explore our [documentation](../../../manage-data/ingest.md) to learn more about ingesting data with other tools.
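+As an optional last check, you can also confirm the ingested data programmatically. The following is a minimal sketch using the official {{es}} Python client (an assumption on our part; install it with `python -m pip install elasticsearch`), with placeholder Cloud ID and API key values:
+
+```python
+from elasticsearch import Elasticsearch
+
+# Placeholder credentials -- substitute your deployment's Cloud ID and the
+# "id" and "api_key" values returned when you created the Filebeat API key.
+client = Elasticsearch(
+    cloud_id="deployment-name:hash",
+    api_key=("2TBR42gBabmINotmvZjv", "tV1dnfF-GHI59ykgv4N0U3"),
+)
+
+# Count the log events that Filebeat has shipped so far.
+response = client.count(index="filebeat-*")
+print(f"{response['count']} log events indexed")
+```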