> **Note:** This repository was archived by the owner on Dec 30, 2024. It is now read-only.
The solution automates digital asset (text and image) ingestion from Twitter, RSS feeds, and YouTube comments.
The solution provides the following key features:

- **Performs topic modeling to detect dominant topics**: identifies the terms that collectively form a topic from within customer feedback
- **Identifies the sentiment of what customers are saying**: uses contextual semantic search to understand the nature of online discussions
- **Determines if images associated with your brand contain unsafe content**: detects unsafe and negative imagery in content
- **Helps customers identify insights in near real-time**: you can use a visualization dashboard to better understand context, threats, and opportunities almost instantly
This solution deploys an AWS CloudFormation template that supports Twitter, RSS feeds, and YouTube comments as data source options for ingestion, but the solution can be customized to aggregate other social media platforms and internal enterprise systems.
For a detailed solution deployment guide, refer to [Discovering Hot Topics using Machine Learning](https://aws.amazon.com/solutions/implementations/discovering-hot-topics-using-machine-learning).
## On this Page
- [Architecture Overview](#architecture-overview)
- [Deployment](#deployment)
- [Source Code](#source-code)
- [Creating a custom build](#creating-a-custom-build)
## Architecture Overview
After you deploy the solution, use the included Amazon QuickSight dashboard to visualize the results.
[AWS CDK Solutions Constructs](https://aws.amazon.com/solutions/constructs/) make it easier to consistently create well-architected applications. All AWS Solutions Constructs are reviewed by AWS and use best practices established by the AWS Well-Architected Framework. This solution uses the following AWS CDK Constructs:
- aws-events-rule-lambda
- aws-kinesisfirehose-s3
- aws-kinesisstreams-lambda
- aws-lambda-dynamodb
- aws-lambda-s3
- aws-lambda-step-function
- aws-sqs-lambda
## Deployment
The solution is deployed using a CloudFormation template with a lambda-backed custom resource.

## Source Code

```
├── bin [entrypoint of the CDK application]
├── lambda [folder containing source code for the lambda functions]
│   ├── capture_news_feed [lambda function to ingest news feeds]
│   ├── create-partition [lambda function to create glue partitions]
│   ├── firehose_topic_proxy [lambda function to write topic analysis output to Amazon Kinesis Firehose]
│   ├── firehose-text-proxy [lambda function to write text analysis output to Amazon Kinesis Firehose]
│   ├── ingestion-consumer [lambda function that consumes messages from Amazon Kinesis Data Streams]
│   ├── ingestion-custom [lambda function that reads files from an Amazon S3 bucket and pushes data to Amazon Kinesis Data Streams]
│   ├── ingestion-producer [lambda function that makes Twitter API calls and pushes data to Amazon Kinesis Data Streams]
│   ├── ingestion-youtube [lambda function that ingests comments from YouTube videos and pushes data to Amazon Kinesis Data Streams]
│   ├── integration [lambda function that publishes inference outputs to Amazon EventBridge]
│   ├── layers [lambda layer function library for Node and Python layers]
│   │   ├── aws-nodesdk-custom-config
│   ├── ...
│   ├── ingestion [CDK constructs for data ingestion]
│   ├── integration [CDK constructs for Amazon EventBridge]
│   ├── quicksight-custom-resources [CDK construct that invokes custom resources to create Amazon QuickSight resources]
│   ├── s3-event-notification [CDK construct that configures S3 events to be pushed to Amazon EventBridge]
│   ├── storage [CDK constructs that define storage of the inference events]
│   ├── text-analysis-workflow [CDK constructs for text analysis of ingested data]
│   ├── topic-analysis-workflow [CDK constructs for topic visualization of ingested data]
│   └── visualization [CDK constructs to build a relational database model for visualization]
├── discovering-hot-topics.ts
```
## Creating a custom build
### 1. Clone this git repository
### 2. Build the solution for deployment
- To run the unit tests
```
cd <rootDir>/source
chmod +x ./run-all-tests.sh
./run-all-tests.sh
```
- Configure the bucket name of your target Amazon S3 distribution bucket
```
export DIST_OUTPUT_BUCKET=my-bucket-name
export VERSION=my-version
```
- Now build the distributable:
```
cd <rootDir>/deployment
chmod +x ./build-s3-dist.sh
```
- Parameter details
```
$DIST_OUTPUT_BUCKET - This is the global name of the distribution. For the bucket name, the AWS Region is added to the global name (example: 'my-bucket-name-us-east-1') to create a regional bucket. The lambda artifact should be uploaded to the regional buckets for the CloudFormation template to pick it up for deployment.
$CF_TEMPLATE_BUCKET_NAME - The name of the S3 bucket where the CloudFormation templates are stored
$QS_TEMPLATE_ACCOUNT - The account from which the Amazon QuickSight templates should be sourced for Amazon QuickSight Analysis and Dashboard creation
```
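As a quick illustration of the `$DIST_OUTPUT_BUCKET` convention described above, the regional bucket name can be derived like this (the bucket name and region below are placeholder values, not taken from this README):

```shell
# Derive the regional bucket name from the global distribution name:
# the AWS Region is appended to the global name.
DIST_OUTPUT_BUCKET=my-bucket-name   # placeholder global name
AWS_REGION=us-east-1                # placeholder region
REGIONAL_BUCKET="${DIST_OUTPUT_BUCKET}-${AWS_REGION}"
echo "${REGIONAL_BUCKET}"
```

The lambda artifacts are then expected in that regional bucket (here, `my-bucket-name-us-east-1`).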
- When creating and using buckets it is recommended to:
    - Use randomized names or a UUID as part of your bucket naming strategy.
    - Ensure buckets are not public.
    - Verify bucket ownership prior to uploading templates or code artifacts.
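A minimal sketch of the randomized-name recommendation, using `/dev/urandom` rather than `uuidgen` so it works without extra tooling (the bucket prefix is a placeholder):

```shell
# Append a random 8-character hex suffix to a bucket name prefix,
# per the naming recommendation above.
SUFFIX=$(head -c 4 /dev/urandom | od -An -tx1 | tr -d ' \n')
BUCKET_NAME="my-dist-bucket-${SUFFIX}"
echo "${BUCKET_NAME}"
```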
- Deploy the distributable to an Amazon S3 bucket in your account. _Note:_ you must have the AWS Command Line Interface installed.
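A hypothetical sketch of that upload step. The key layout below (the solution-name prefix and the `regional-s3-assets` folder) follows the common AWS Solutions build convention and is an assumption, not something this README specifies:

```shell
# Compose the destination S3 URI for the built artifacts.
DIST_OUTPUT_BUCKET=my-bucket-name   # same value used during the build
VERSION=my-version
AWS_REGION=us-east-1
TARGET="s3://${DIST_OUTPUT_BUCKET}-${AWS_REGION}/discovering-hot-topics-using-machine-learning/${VERSION}/"
echo "Upload target: ${TARGET}"
# The actual copy needs configured AWS CLI credentials (shown, not run):
# aws s3 cp ./regional-s3-assets "${TARGET}" --recursive
```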