* Grafana is a visualization and analytics tool. It integrates with data sources such as Prometheus to create interactive dashboards for monitoring and analyzing Kubernetes metrics.
{{% notice Note %}}
The Terraform script executed in the previous step automatically installs Prometheus and Grafana in the EKS cluster. However, if you want more control over the versions installed for both, follow the instructions below.
{{% /notice %}}
## Install Prometheus on your EKS cluster
You can use Helm to install Prometheus on the Kubernetes cluster.
Follow the [Helm documentation](https://helm.sh/docs/intro/install/) to install it on your computer.
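As a sketch, Prometheus can be installed from the community Helm chart. The repository URL below is the chart project's standard location, and the release and namespace names are common defaults rather than values prescribed by this Learning Path:

```console
helm repo add prometheus-community https://prometheus-community.github.io/helm-charts
helm repo update
helm install prometheus prometheus-community/prometheus \
  --namespace prometheus --create-namespace
```

You can then verify the deployment with `kubectl get pods -n prometheus`.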
Log in to the Grafana dashboard using the LoadBalancer IP and click on **Dashboards** in the left navigation pane.
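If you need to look up the LoadBalancer address, one way to find it (assuming Grafana runs in a `grafana` namespace, as the `kubectl get pods -n grafana` check above suggests) is to list the services and read the `EXTERNAL-IP` column:

```console
kubectl get svc -n grafana
```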
Locate a `Kubernetes/Compute Resources/Node (Pods)` dashboard and click on it.
You should see a dashboard like the one below for your Kubernetes cluster:
Create an X (formerly Twitter) [developer account](https://developer.x.com/en/docs/x-api/getting-started/getting-access-to-the-x-api) and download the `bearer token`.
Use the following commands to set the bearer token and fetch the posts:
```console
export BEARER_TOKEN=<BEARER_TOKEN_FROM_X>
python3 scripts/xapi_tweets.py
```
{{% notice Note %}}
If you run into dependency issues, you might need to install the following Python packages:
* `pip3 install requests`
* `pip3 install boto3`
{{% /notice %}}
You can modify the script `xapi_tweets.py` and use your own keywords.
Here is the code, which includes some sample keywords:

```python
query_params = {'query': "(#onArm OR @Arm OR #Arm OR #GenAI) -is:retweet lang:en",
                'tweet.fields': 'lang'}
```
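For illustration, the fetch step can be sketched as a minimal script. This is not the actual `xapi_tweets.py`; the endpoint is the standard X API v2 recent-search URL, while the helper names are hypothetical:

```python
import os

# Standard X API v2 recent-search endpoint
SEARCH_URL = "https://api.twitter.com/2/tweets/search/recent"

# Sample keywords, matching the query shown above
QUERY_PARAMS = {
    "query": "(#onArm OR @Arm OR #Arm OR #GenAI) -is:retweet lang:en",
    "tweet.fields": "lang",
}

def build_headers(bearer_token):
    # X API v2 authenticates requests with an OAuth 2.0 bearer token
    return {"Authorization": f"Bearer {bearer_token}"}

def fetch_recent_posts(params=QUERY_PARAMS):
    import requests  # pip3 install requests
    resp = requests.get(
        SEARCH_URL,
        headers=build_headers(os.environ["BEARER_TOKEN"]),
        params=params,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for post in fetch_recent_posts().get("data", []):
        print(post["id"], post.get("lang"))
```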
Use the following command to send these processed tweets to Elasticsearch:
```console
python3 csv_to_kinesis.py
```
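As a rough sketch of what a script like `csv_to_kinesis.py` might do, assuming the processed tweets sit in a local CSV file (the stream name, region, and helper names here are illustrative assumptions, not taken from the Learning Path):

```python
import csv
import json

def load_rows(csv_path):
    # Read processed tweets from a CSV file into a list of dicts
    with open(csv_path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def send_to_kinesis(rows, stream_name, region="us-east-1"):
    import boto3  # pip3 install boto3
    client = boto3.client("kinesis", region_name=region)
    for row in rows:
        # Each Kinesis record needs a partition key; the tweet id is a natural choice
        client.put_record(
            StreamName=stream_name,
            Data=json.dumps(row).encode("utf-8"),
            PartitionKey=str(row.get("id", "0")),
        )
```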
Navigate to the Kibana dashboard using the following URL and analyze the tweets:
```console
http://<IP_Address_of_ES_and_Kibana>:5601
```
## Environment Clean-up
Following this Learning Path deploys many artifacts in your cloud account. Remember to destroy the resources after you have finished. Use the following command to clean up the resources:
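Assuming the resources were created with the Terraform script from the earlier step, the clean-up is typically a destroy run from the directory containing that configuration:

```console
terraform destroy
```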