
Commit e07b226

Merge pull request #197 from RS-PYTHON/helm-debug
Improve S3 docs and CI
2 parents: d274115 + 7ceff1f

3 files changed: +20 additions, −19 deletions

.github/workflows/test-deployment.yml

Lines changed: 9 additions & 8 deletions

@@ -65,12 +65,20 @@ jobs:
       run: |
         kubectl label node cluster.local node-role.kubernetes.io/infra=
       shell: bash
+    - name: Deploy minio
+      run: |
+        # Minio for cloudnative pg - https://github.com/minio/minio/tree/master/helm/minio#installing-the-chart-toy-setup
+        helm repo add minio https://charts.min.io/
+        helm repo update minio
+        helm install minio minio/minio --namespace minio --set mode=standalone --set replicas=1 --set persistence.enabled=false --set resources.requests.memory=512Mi --set rootUser=s3_access_key,rootPassword=s3_secret_key --set buckets[0].name=rs-cluster-psql,buckets[1].name=rs-cluster-velero --create-namespace --wait
+      shell: bash
     - name: Generate inventory
       run: |
         cp -rfp inventory/sample inventory/mycluster
         mv inventory/mycluster/.env.template inventory/mycluster/.env
         mv inventory/mycluster/openrc.sh.template inventory/mycluster/openrc.sh
         sed -i 's!<changeme_with_full_path>/miniforge3/envs/rspy!/usr/share/miniconda/envs/rspy!g' inventory/mycluster/hosts.yaml
+        sed -i 's!https://s3.gra.io.cloud.ovh.net!http://minio.minio.svc.cluster.local:9000!' inventory/mycluster/host_vars/setup/main.yaml
         conda run -n rspy --no-capture-output env PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=1 ansible-playbook registry.yaml -i inventory/mycluster/hosts.yaml -e ci_mode=true
         conda run -n rspy --no-capture-output env PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=1 ansible-playbook generate_inventory.yaml -i inventory/mycluster/hosts.yaml
       shell: bash
@@ -80,17 +88,10 @@ jobs:
       shell: bash
     - name: Deploy the apps (for real)
       run: |
-        sed -i 's!debug: false!debug: true!g' roles/app-installer/tasks/install_app.yaml
+        sed -i 's!debug: false!debug: true!g' roles/app-installer/defaults/main.yaml
         sed -i 's!cinder.csi.openstack.org!k8s.io/minikube-hostpath!g' apps/00-storage-class/sc-retain.yaml
         sed -i 's!instances: 3!instances: 1!g' apps/03-cloudnative-pg/cluster.yaml
         sed -i -e 's!https://iam.{{ platform_domain_name }}!http://keycloak-service.iam.svc.cluster.local:8080!g' -e 's!insecure_oidc_skip_issuer_verification="false"!insecure_oidc_skip_issuer_verification="true"!g' apps/oauth2-proxy/values.yaml
 
-        # Minio for cloudnative pg
-        sed -i -e 's!{{ s3.access_key }}!access_key!' -e 's!{{ s3.secret_key }}!secret_key!' apps/03-cloudnative-pg/secrets.yaml
-        sed -i -e 's!{{ postgresql.bucket }}!bucket!' -e 's!{{ s3.endpoint }}!http://minio.minio.svc.cluster.local:9000!' apps/03-cloudnative-pg/objectstore.yaml
-        helm repo add minio https://charts.min.io/
-        helm repo update minio
-        helm install minio minio/minio --namespace minio --set mode=standalone --set persistence.enabled=false --set resources.requests.memory=1Gi --set rootUser=access_key --set rootPassword=secret_key --set buckets[0].name=bucket --create-namespace --wait
-
 
         conda run -n rspy --no-capture-output env PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=1 ansible-playbook apps.yaml -i inventory/mycluster/hosts.yaml
       shell: bash
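The endpoint substitution the workflow performs can be sanity-checked locally before it runs in CI. A minimal sketch, using a hypothetical throwaway copy of the inventory file (the temp file is an assumption; the sed expression is the one from the workflow):

```shell
# Stand-in for inventory/mycluster/host_vars/setup/main.yaml (hypothetical temp file)
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
s3:
  endpoint: https://s3.gra.io.cloud.ovh.net
  region: gra
EOF

# Same substitution the workflow applies: point S3 at the in-cluster minio service
sed -i 's!https://s3.gra.io.cloud.ovh.net!http://minio.minio.svc.cluster.local:9000!' "$tmp"

grep 'endpoint:' "$tmp"  # the endpoint should now be the minio service URL
```

Because the replacement happens in the generated inventory rather than in each app's values file, every application that references the shared S3 endpoint picks up the minio URL through the normal templating path.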

docs/how-to/Credentials.md

Lines changed: 9 additions & 9 deletions

@@ -40,7 +40,7 @@ graylog:
 
 ## Reuse credentials
 
-Like in the example values, you can reuse crendentials already set up in the inventory files. This functionnality is used in the sample inventory for the S3 keys and endpoints that are often the same accross applications:
+Like in the example values, you can reuse credentials already set up in the inventory files. This functionality is used in the sample inventory for the S3 keys and endpoints that are often the same accross applications:
 
 ```yaml
 # {{ inventory_dir }}/host_vars/setup/main.yaml
@@ -52,17 +52,17 @@ s3:
 ```
 
 ```yaml
-# {{ inventory_dir }}/host_vars/setup/apps/thanos.yaml
-thanos:
+# apps/velero/values.yaml
+velero:
   s3:
-    bucket: THANOS_BUCKET
-    endpoint: "{{ common.s3.endpoint }}"
-    region: "{{ common.s3.region }}"
-    access_key: "{{ common.s3.access_key }}"
-    secret_key: "{{ common.s3.secret_key }}"
+    bucket: VELERO_BUCKET
+    endpoint: "{{ s3.endpoint }}"
+    region: "{{ s3.region }}"
+    access_key: "{{ s3.access_key }}"
+    secret_key: "{{ s3.secret_key }}"
 ```
 
-## Retrieve indivindual credentials from a HashiCorp Vault
+## Retrieve individual credentials from a HashiCorp Vault
 
 You can retrieve credentials from a *HashiCorp Vault* instance using the *hvac* ansible plugin:
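The reuse documented above is ordinary Jinja templating that Ansible resolves against the inventory variables when it renders the app values. As a rough local illustration of that substitution (sed stands in for Ansible's templating engine here; the variable names are hypothetical stand-ins for the shared inventory values):

```shell
# Hypothetical stand-ins for the shared values in the setup inventory
endpoint="https://s3.gra.io.cloud.ovh.net"
region="gra"

# A fragment in the style of the velero values, with Jinja references
values='endpoint: "{{ s3.endpoint }}"
region: "{{ s3.region }}"'

# Ansible resolves the references against the inventory at render time;
# sed emulates that here, for illustration only
rendered=$(printf '%s\n' "$values" \
  | sed -e "s!{{ s3.endpoint }}!$endpoint!" -e "s!{{ s3.region }}!$region!")

printf '%s\n' "$rendered"
```

The practical consequence is that changing the shared `s3` block in one inventory file updates every application that references it, instead of editing each app's values separately.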

inventory/sample/host_vars/setup/main.yaml

Lines changed: 2 additions & 2 deletions

@@ -19,5 +19,5 @@ platform_domain_name: rspy.example.com
 s3:
   endpoint: https://s3.gra.io.cloud.ovh.net
   region: gra
-  access_key: AK
-  secret_key: SK
+  access_key: s3_access_key
+  secret_key: s3_secret_key
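These placeholder credentials match the `rootUser`/`rootPassword` the workflow passes to the minio chart, so the sample inventory works against the toy minio deployment out of the box. A trivial sketch of how the helm flags line up with the inventory values (the variable names are assumptions for illustration):

```shell
# Placeholder credentials from inventory/sample/host_vars/setup/main.yaml
access_key="s3_access_key"
secret_key="s3_secret_key"

# Recreate the credential flags the workflow passes to the minio chart
helm_args="--set rootUser=${access_key},rootPassword=${secret_key}"
echo "$helm_args"
```

Keeping the two in sync is what lets the CI skip any sed rewriting of per-app S3 secrets: the apps inherit the same keys the minio root account was created with.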
