
Commit 54a0cf0

Merge pull request #28 from commjoen/dependabot/docker/wrongsecrets-balancer/node-18-alpine
Bump node from 16-alpine to 18-alpine in /wrongsecrets-balancer
2 parents dd94097 + e688f00 commit 54a0cf0

10 files changed: +1711 -1758 lines changed


.github/workflows/test.yml

Lines changed: 6 additions & 0 deletions

```diff
@@ -6,6 +6,9 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - uses: actions/checkout@v3
+      - uses: actions/setup-node@v3
+        with:
+          node-version: 18
       - name: Install Balancer
         run: |
           cd cleaner
@@ -23,6 +26,9 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - uses: actions/checkout@v3
+      - uses: actions/setup-node@v3
+        with:
+          node-version: 18
      - name: "Install & Build BalancerUI"
        run: |
          cd wrongsecrets-balancer/ui
```
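Both CI jobs now pin Node explicitly via `actions/setup-node@v3` instead of relying on whatever Node version the `ubuntu-latest` runner happens to ship, keeping CI on the same major version as the new `node:18-alpine` base image. A minimal local sanity check, assuming `node` and `npm` are on your PATH:

```sh
# Verify the local toolchain matches the Node 18 pinned in CI
node --version   # expect v18.x
npm --version    # Node 18 bundles npm 8 or newer
```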

aws/README.md

Lines changed: 2 additions & 2 deletions

```diff
@@ -57,7 +57,7 @@ Are you done playing? Please run `terraform destroy` twice to clean up.
 ### Test it
 When you have completed the installation steps, you can do `kubectl port-forward service/wrongsecrets-balancer 3000:3000` and then go to [http://localhost:3000](http://localhost:3000).
 
-Want to know how well your cluster is holding up? Check with
+Want to know how well your cluster is holding up? Check with
 
 ```sh
 kubectl top nodes
@@ -69,7 +69,7 @@ Want to know how well your cluster is holding up? Check with
 When you're done:
 
 1. Kill the port forward.
-2. Run the cleanup script: `cleanup-aws-loadbalancing-and-helm.sh`
+2. Run the cleanup script: `cleanup-aws-autoscaling-and-helm.sh`
 3. Run `terraform destroy` to clean up the infrastructure.
 1. If you've deployed the `shared-state` s3 bucket, also `cd shared-state` and `terraform destroy` there.
 4. Run `unset KUBECONFIG` to unset the KUBECONFIG env var.
```

The second hunk updates the README to the renamed cleanup script; the first hunk's change appears to be whitespace-only, as the visible text is identical.
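With the rename, the README's teardown steps now point at the new script name. A minimal sketch of the full sequence, assuming you run it from the `aws/` directory after killing the port forward:

```sh
# Teardown, following the README steps in order
./cleanup-aws-autoscaling-and-helm.sh   # remove the autoscaler/Helm resources
terraform destroy                       # tear down the infrastructure
unset KUBECONFIG                        # drop the kubeconfig env var
```

If you deployed the `shared-state` s3 bucket, also run `terraform destroy` inside `shared-state/` before unsetting `KUBECONFIG`.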

aws/build-an-deploy-aws.sh

Lines changed: 1 addition & 0 deletions

```diff
@@ -58,6 +58,7 @@ eksctl create iamserviceaccount \
   --region=$AWS_REGION \
   --namespace=kube-system \
   --name=cluster-autoscaler \
+  --role-name=AmazonEKSClusterAutoscalerRole \
   --attach-policy-arn=arn:aws:iam::${ACCOUNT_ID}:policy/AmazonEKSClusterAutoscalerPolicy \
   --override-existing-serviceaccounts \
   --approve
```
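The added `--role-name` flag makes eksctl create the service account's IAM role under a fixed, predictable name rather than an auto-generated one, presumably so the role is easier to locate and clean up. A quick way to verify the role after the script runs, assuming the AWS CLI is configured for the same account:

```sh
# Confirm the explicitly named IAM role exists
aws iam get-role --role-name AmazonEKSClusterAutoscalerRole
```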

aws/cleanup-aws-loadbalancing-and-helm.sh renamed to aws/cleanup-aws-autoscaling-and-helm.sh

Lines changed: 1 addition & 1 deletion

```diff
@@ -44,4 +44,4 @@ eksctl delete iamserviceaccount \
 sleep 5 # Prevents race condition - command below may error out because it's still 'attached'
 
 aws iam delete-policy \
-  --policy-arn arn:aws:iam::${ACCOUNT_ID}:policy/AmazonEKSClusterAutoscalerPolicy
+  --policy-arn arn:aws:iam::${ACCOUNT_ID}:policy/AmazonEKSClusterAutoscalerPolicy
```

The substantive change here is the rename to match the script's autoscaling focus; the one modified line appears to differ only in whitespace.

cleaner/Dockerfile

Lines changed: 3 additions & 3 deletions

```diff
@@ -1,10 +1,10 @@
-FROM node:16-alpine as build
+FROM node:18-alpine as build
 RUN mkdir -p /home/app
 WORKDIR /home/app
 COPY package.json package-lock.json ./
-RUN npm ci --production
+RUN npm ci --omit=dev
 
-FROM node:16-alpine
+FROM node:18-alpine
 RUN addgroup --system --gid 1001 app && adduser app --system --uid 1001 --ingroup app
 WORKDIR /home/app/
 COPY --from=build --chown=app:app /home/app/node_modules/ ./node_modules/
```
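Besides the base-image bump, the install flag changes spelling: `--production` and `--omit=dev` both skip `devDependencies`, but `--omit=dev` is the form preferred by npm 8+, the npm line bundled with Node 18. A minimal sketch of the equivalence, assuming a project with a `package-lock.json`:

```sh
# Two spellings of a production-only clean install; both skip devDependencies
npm ci --production   # legacy flag
npm ci --omit=dev     # preferred spelling on npm 8+
```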
