Commit d14b34d ("Move project content to separate repo", parent 66c9586)

119 files changed: +43105 / -54 lines

Lines changed: 244 additions & 0 deletions
# Overview - Udagram Image Filtering Microservice

The project application, **Udagram** - an Image Filtering application - allows users to register and log into a web client, and post photos to a feed.

This section introduces the project, followed by instructions on how to set up your local environment and remote dependencies so that you can configure and run the starter project.

## Components

At a high level, the project has two main components:

1. Frontend Web App - an Angular web application built with the Ionic Framework
2. Backend RESTful API - a Node-Express application

## Project Goal

In this project you will:

- Refactor the monolith application into microservices
- Set up each microservice to run in its own Docker container
- Set up a Travis CI pipeline to push images to DockerHub
- Deploy the DockerHub images to a Kubernetes cluster

# Local Prerequisites

You should have the following tools installed on your local machine:
* Git
* Node.js
* PostgreSQL client
* Ionic CLI
* Docker
* AWS CLI
* kubectl

We provide some details and tips on how to set up these prerequisites. In general, we defer to the official installation instructions, as these can change over time.

## Git

Git is used to interface with GitHub.

> Windows users: Once you download and install Git for Windows, you can execute all the bash, ssh, and git commands in the Git Bash terminal. Windows users using the Windows Subsystem for Linux (WSL) can follow all steps as if they were Linux users.

### Instructions

Install [Git](https://git-scm.com/downloads) for your operating system.

## Node.js

### Instructions

Install Node.js using [these instructions](https://nodejs.org/en/download/). We recommend a version between 12.14 and 14.15.
This installer will install Node.js as well as NPM on your system. Node.js is used to run JavaScript-based applications, and NPM is a package manager used to handle dependencies.

### Verify Installation

```bash
# v12.14 or greater, up to v14.15
node -v
```

```bash
# v7.19 or greater
npm -v
```

## PostgreSQL client

Using PostgreSQL involves a server and a client. The server hosts the database, while the client interfaces with it to execute queries. Because we will be creating our server on AWS, we only need to install a client for our local setup.

### Instructions

The easiest way to set this up is with the [PostgreSQL Installer](https://www.postgresql.org/download/). This installer includes a PostgreSQL client in the form of the `psql` command-line utility.
## Ionic CLI

The Ionic Framework is used to make cross-platform applications using JavaScript. It is used to help build and run Udagram.

### Instructions

Use [these instructions](https://ionicframework.com/docs/installation/cli) to install the Ionic CLI with `npm`.

### Verify Installation

```bash
# v6.0 or higher
ionic --version
```

## Docker

Docker is needed to build and run containerized applications.

### Instructions

Follow the instructions for [Docker Desktop](https://docs.docker.com/desktop/#download-and-install) to install Docker.
## AWS CLI

We use the AWS CLI to interface programmatically with AWS.

### Instructions

Follow [these instructions](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html) to set up the AWS CLI.

After it is installed, you will need to configure an AWS access profile locally so that your local environment knows how to access your AWS account:

1. Create an IAM user with admin privileges on the AWS web console. Copy its Access Key.
2. Configure the access profile locally using your Access Key:

```bash
aws configure [--profile nd9990]
```

### Verify Installation

```bash
# aws-cli/2.0.0 or greater
aws --version
```

## kubectl

kubectl is the command-line tool used to interface with Kubernetes. We will be using it to communicate with the EKS cluster that we create in AWS.

### Instructions

Follow the [instructions here](https://kubernetes.io/docs/tasks/tools/#kubectl).
# Project Prerequisites

To run this project, you are expected to have:

1. An S3 bucket
2. A PostgreSQL database

## S3 Bucket

The project uses an AWS S3 bucket to store image files.

### Instructions

1. Navigate to S3 from the AWS console.
2. Create a public S3 bucket with default configurations (e.g., no versioning, encryption disabled).
3. In your newly-created S3 bucket, go to the **Permissions** tab and add a bucket policy to enable access for other AWS services (i.e., Kubernetes).

You can use the <a href="https://awspolicygen.s3.amazonaws.com/policygen.html" target="_blank">policy generator</a> tool to generate such an IAM policy. See the example below (change the bucket name for your case).

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1625306057759",
            "Principal": "*",
            "Action": "s3:*",
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::test-nd9990-dev-wc"
        }
    ]
}
```

> In the AWS S3 console, the CORS configuration must be in JSON format, whereas the AWS CLI accepts either JSON or XML.
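
For reference, a minimal CORS configuration sketch in the JSON format the console expects. The allowed origin here assumes the frontend runs at `http://localhost:8100`; adjust the origins and methods for your own deployment.

```json
[
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["GET", "PUT", "POST", "DELETE"],
        "AllowedOrigins": ["http://localhost:8100"],
        "ExposeHeaders": []
    }
]
```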
> Once the policies above are set and you are no longer testing locally, you can disable public access to your bucket.

## PostgreSQL Database

We will create a PostgreSQL database using AWS RDS. The project uses it to store user metadata.

### Instructions

1. Navigate to RDS from the AWS console.
2. Create a PostgreSQL database with the following configurations:

<center>

|**Field**|**Value**|
|---|---|
|Database Creation Method|Standard create|
|Engine Option|PostgreSQL 12 or greater|
|Templates|Free tier <small>(if no Free tier is available, select a different PostgreSQL version)</small>|
|DB Instance Identifier|Your choice|
|Master Username|Your choice|
|Password|Your choice|
|DB Instance Class|Burstable classes with minimal size|
|VPC and Subnet|Default|
|Public Access|Yes|
|Database Authentication|Password authentication|
|VPC security group|Either choose default or <br>create a new one|
|Availability Zone|No preference|
|Database port|`5432` (default)|

</center>

3. Once the database is created successfully (this will take a few minutes), copy and save the database endpoint, master username, and password to your local machine. These values are required for the application to connect to the database.

4. Edit the security group's inbound rules to allow incoming connections from anywhere (`0.0.0.0/0`). This will allow an application running locally to connect to the database.
> Note: AWS RDS will automatically create a database named `postgres` if none is configured during the creation step. By following the setup instructions provided here, we will be using the default database name.

### Verify Connection

Test the connection from your local PostgreSQL client.
Assuming the endpoint is `mypostgres-database-1.c5szli4s4qq9.us-east-1.rds.amazonaws.com`, you can run:

```bash
psql -h mypostgres-database-1.c5szli4s4qq9.us-east-1.rds.amazonaws.com -U [your-username] postgres
# Provide the database password when prompted
```

If your connection is successful, your terminal should show the `postgres=>` prompt.

You can play around with some `psql` commands found [here](https://www.postgresql.org/docs/13/app-psql.html).

Afterwards, you can enter `\q` to quit.
# Project Configuration

Once the local and remote prerequisites are set up, we will need to configure our application so that it can connect to and use them.

## Fork and Clone the Project

If you have not already done so, you will need to fork and clone the project so that you have your own copy to work with.

```bash
git clone https://github.com/<YOUR_GITHUB_USERNAME>/nd9990-c3-microservices-exercises.git

cd nd9990-c3-microservices-exercises/project/
```

## Configuration Values

The application will need to connect to the AWS PostgreSQL database and S3 bucket that you have created.

We do **not** want to hard-code the configuration details into the application code. The code should not contain sensitive information (i.e., usernames and passwords).

For this reason, we will follow a common pattern and store our credentials in environment variables. We'll explain how to set these values in Mac/Linux environments and in Windows environments, followed by an example.

### Set Environment Variables in Mac/Linux

#### Instructions

1. Use the `set_env.sh` file present in the `project/` directory to configure these values on your local machine. This file has been set up for your convenience to manage your environment.
2. Prevent this file from being tracked by `git` so that your credentials are not stored remotely:

```bash
# Stop git from tracking the set_env.sh file
git rm --cached set_env.sh

# Prevent git from tracking the set_env.sh file
echo "set_env.sh" >> .gitignore
```

3. Run `source set_env.sh` to set your environment variables.

> Note: The method above sets the environment variables temporarily. Every time you open a new terminal, you will have to run `source set_env.sh` to reconfigure your environment variables.
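
If you ever need to recreate `set_env.sh`, it is typically a list of `export` statements covering the same variables shown in the Windows example below. A sketch with placeholder values (replace each value with your own RDS and S3 details):

```shell
# set_env.sh -- sketch with placeholder values; replace with your own
export POSTGRES_USERNAME=postgres
export POSTGRES_PASSWORD=abcd1234
export POSTGRES_HOST=mypostgres-database-1.c5szli4s4qq9.us-east-1.rds.amazonaws.com
export POSTGRES_DB=postgres
export AWS_BUCKET=test-nd9990-dev-wc
export AWS_REGION=us-east-1
export AWS_PROFILE=nd9990
export JWT_SECRET=hello
export URL=http://localhost:8100
```

Because the file only exports variables, `source set_env.sh` affects the current shell session and nothing else.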
#### Verify Configurations

1. Set the configuration values as environment variables:

```bash
source set_env.sh
```

2. Verify that the environment variables were set by testing one of the expected values:

```bash
echo $POSTGRES_USERNAME
```

### Set Environment Variables in Windows

Set all the environment variables shown in the `set_env.sh` file, either using **Advanced System Settings** or the Git Bash/WSL terminal.

Below is an example. Make sure that you replace the values with ones that are applicable to the resources that you created in AWS.

```bash
setx POSTGRES_USERNAME postgres
setx POSTGRES_PASSWORD abcd1234
setx POSTGRES_HOST mypostgres-database-1.c5szli4s4qq9.us-east-1.rds.amazonaws.com
setx POSTGRES_DB postgres
setx AWS_BUCKET test-nd9990-dev-wc
setx AWS_REGION us-east-1
setx AWS_PROFILE nd9990
setx JWT_SECRET hello
setx URL http://localhost:8100
```

# Get Started!

Now that we have our prerequisites set up and configured, the next section gives an overview of how to run the application.

## Project Assessment

To understand how your project will be assessed, see the <a href="https://review.udacity.com/#!/rubrics/2804/view" target="_blank">Project Rubric</a>.
Lines changed: 88 additions & 0 deletions
# Part 3 - Set up Travis continuous integration pipeline

Prior to setting up a multi-container application in Kubernetes, you will need to set up a CI pipeline to build and push your application code as Docker images to DockerHub.

The end result we want is a setup where changes in your GitHub code automatically trigger a build process that generates Docker images.

### Create DockerHub Repositories

Log in to https://hub.docker.com/ and create four public repositories, each corresponding to one of your local Docker images:

* `reverseproxy`
* `udagram-api-user`
* `udagram-api-feed`
* `udagram-frontend`

> Note: The names of the repositories must be exactly the same as the `image name`s specified in the *docker-compose-build.yaml* file.

### Set up Travis CI Pipeline

Use a Travis CI pipeline to build and push images to your DockerHub registry.

1. Create an account on https://travis-ci.com/ (not https://travis-ci.org/). It is recommended that you sign in using your GitHub account.

2. Integrate GitHub with Travis: activate the GitHub repository with which you want to set up the CI pipeline.

3. Set up your DockerHub username and password in the Travis repository's settings, so that they can be used inside the `.travis.yml` file while pushing images to DockerHub.

4. Add a `.travis.yml` configuration file to the project directory locally.

In addition to the mandatory sections, your Travis file should automatically read the Dockerfiles, build images, and push them to DockerHub. For build and push, you can use either `docker-compose` or individual `docker build` commands, as shown below.
```bash
# Assuming the .travis.yml file is in the project directory, and there is a separate sub-directory for each service
# Use either `docker-compose` or individual `docker build` commands
# Build
- docker build -t udagram-api-feed ./udagram-api-feed
# Do similar for the other three images
```

```bash
# Tagging
- docker tag udagram-api-feed sudkul/udagram-api-feed:v1
# Do similar for the other three images
```

```bash
# Push
# Assuming DOCKER_PASSWORD and DOCKER_USERNAME are set in the Travis repository settings
- echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin
- docker push sudkul/udagram-api-feed:v1
# Do similar for the other three images
```

> **Tip**: Use different tags each time you push images to DockerHub.
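
Putting the snippets together, one possible minimal `.travis.yml` sketch. The `sudkul` username and `v1` tag mirror the examples above and are assumptions; substitute your own DockerHub username, repositories, and tags:

```yaml
language: minimal
services:
  - docker

before_script:
  # DOCKER_USERNAME and DOCKER_PASSWORD are assumed to be set in the Travis repository settings
  - echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin

script:
  - docker build -t udagram-api-feed ./udagram-api-feed
  # ...build the other three images similarly
  - docker tag udagram-api-feed "$DOCKER_USERNAME"/udagram-api-feed:v1
  - docker push "$DOCKER_USERNAME"/udagram-api-feed:v1
  # ...tag and push the other three images similarly
```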
5. Trigger your build by pushing your changes to the GitHub repository. All of the steps mentioned in the `.travis.yml` file will be executed on the Travis worker node. It may take up to 15-20 minutes to build and push all four images.

6. Verify that the newly pushed images are available in your DockerHub account.

### Screenshots

So that we can verify that your project's pipeline is set up properly, please include screenshots of the following:

1. DockerHub showing the images that you have pushed
2. Travis CI showing a successful build job

### Troubleshooting

If you are not able to get through the Travis pipeline and still want to push your local images to DockerHub (only for testing purposes), you can use the manual method.

Note that this is only for troubleshooting purposes, such as verifying the deployment to the Kubernetes cluster.

* Log in to Docker from your CLI, and tag the images with the name of your registry (your DockerHub account username).

```bash
# See the list of current images
docker images
# Use the following syntax
# In the remote registry (DockerHub), we can have multiple versions of an image using "tags".
# docker tag <local-image-name:current-tag> <registry-name>/<repository-name>:<new-tag>
docker tag <local-image:tag> <dockerhub-username>/<repository>:<tag>
```

* Push the images to DockerHub.

```bash
docker login --username=<your-username>
# Use the "docker push" command for each image, or
# use "docker-compose -f docker-compose-build.yaml push" if the names in the compose file are the same as the DockerHub repositories.
```