
Commit a4238a9: "added terms and conditions to all forms"

Parent: d75b3b9

6 files changed: 101 additions, 91 deletions


content/english/about/vacancies.md

Lines changed: 6 additions & 3 deletions
```diff
@@ -12,6 +12,7 @@ form1:
     - label: |
         Name
       id: name
+      required: true
       type: text
     - label: |
         Contact details
@@ -49,9 +50,11 @@ form1:
       file_type: '.docx, .pdf'
       type: file
     - label: >
-        <a href="/privacy_policy/" target="_blank">Terms and conditions
-        (link)</a>
-      id: terms
+        Terms and conditions <br> <span style="font-size:12px;
+        color=#777;">Submitted data will only be processed for the purpose
+        described above, kept for the minimum necessary duration, and is
+        securely stored in a protected environment</span>
+      id: terms-conditions
       values:
         - label: Agree
           value: agree
```
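Consolidated, the consent question this commit appends to each form's `questions` list (the same block recurs, with minor variations, in every file touched below) looks as follows. The YAML is reconstructed from the hunks, so the exact surrounding indentation is assumed:

```yaml
# Consent checkbox appended to each form's questions list.
# Caution: "color=#777;" is copied verbatim from the commit; it is not
# valid CSS ("color:#777;" would be), so browsers will drop the color
# declaration and only the 12px font-size will take effect.
- label: >
    Terms and conditions <br> <span style="font-size:12px;
    color=#777;">Submitted data will only be processed for the purpose
    described above, kept for the minimum necessary duration, and is
    securely stored in a protected environment</span>
  id: terms-conditions
  values:
    - label: Agree
      value: agree
      id: agree
  required: true
  type: checkbox
```

Note that the old label linked out to /privacy_policy/; the new copy inlines a short data-handling summary instead, so the explicit link to the full privacy policy is dropped.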

content/english/algoprudence/submit-a-case.md

Lines changed: 32 additions & 42 deletions
```diff
@@ -1,47 +1,11 @@
 ---
-content: |
-  Helloi Test
-  {{< team >}}
 title: Submit a case
 subtitle: >
   Algorithm Audit conducts solicited and unsolicited audits. Fill in the below
   form to provide us with preliminary information required to review your
   algorithm. Or submit a carefully documented normative judgement for inlcusion
   in our case repository.
 image: /images/svg-illustrations/case_repository.svg
-form:
-  title: Submit a case
-  button_text: Send
-  backend_link: 'https://formspree.io/f/xzbnrlan'
-  id: submit-a-case
-  questions:
-    - label: Name of the algorithm
-      id: name
-      type: text
-    - label: >-
-        Short description – Define the specific task of the algorithm and its
-        the context in which it operates (max. 200 words)
-      id: description
-      type: textarea
-    - label: >-
-        Technical dimension – Description of data collection, used statistical
-        methodologies and used evaluation criteria
-      id: dimensions
-      type: textarea
-    - label: >-
-        Legal framework – Applicable laws and open legal norms, e.g., GDPR, EU
-        non-discrimination law
-      id: legal-framework
-      type: textarea
-    - label: >-
-        Ethical issues – Description of the identified ethical issue given it's
-        technical and legal framework
-      id: ethical-issues
-      type: textarea
-    - label: Contact details
-      id: contact-details
-      type: email
-      placeholder: Email address
 team:
   title: Team Algoprudence
   icon: fas fa-user-friends
@@ -96,38 +60,52 @@ form1:
   backend_link: 'https://formspree.io/f/xzbnrlan'
   id: submit-a-case
   questions:
-    - label: Name algorithm
+    - label: |
+        Name algorithm
       id: name
       required: true
       type: text
-    - label: >-
+    - label: >
         Short description – Define the specific task of the algorithm and its
         the context in which it operates (max. 200 words)
       id: description
       required: true
       type: textarea
-    - label: >-
+    - label: >
         Technical dimension – Description of data collection, used statistical
         methodologies and used evaluation criteria
       id: technical-dimension
       type: textarea
-    - label: >-
+    - label: >
         Legal framework – Applicable laws and open legal norms, e.g., GDPR, EU
         non-discrimination law
       id: legal-framework
       required: false
       type: textarea
-    - label: >-
+    - label: >
         Ethical issues – Description of the identified ethical issue given it’s
         technical and legal framework
       id: ethical-issue
       required: true
       type: textarea
-    - label: Contact details
+    - label: |
+        Contact details
       id: contact-details
       required: false
       type: email
       placeholder: Mail address
+    - label: >
+        Terms and conditions <br> <span style="font-size:12px;
+        color=#777;">Submitted data will only be processed for the purpose
+        described above, kept for the minimum necessary duration, and is
+        securely stored in a protected environment</span>
+      id: terms-conditions
+      values:
+        - label: Agree
+          value: agree
+          id: agree
+      required: true
+      type: checkbox
 form2:
   title: Case information
   content: ''
@@ -158,6 +136,18 @@ form2:
       file_type: .pdf
       type: file
       placeholder: ''
+    - label: >
+        Terms and conditions <br> <span style="font-size:12px;
+        color=#777;">Submitted data will only be processed for the purpose
+        described above, kept for the minimum necessary duration, and is
+        securely stored in a protected environment</span>
+      id: terms-conditions
+      values:
+        - label: Agree
+          value: agree
+          id: agree
+      required: true
+      type: checkbox
 ---
 
 {{< tab_header width="6" tab1_id="case-for-review" default_tab="case-for-review" tab1_title="Submit a case for normative review" tab2_id="case-repository" tab2_title="Add a case to algoprudence repository" >}}
```
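Most of the relabeling above only changes the YAML block-scalar indicator: plain and `>-` labels become `|` or `>`. A minimal sketch of how the three styles parse (the keys are illustrative only, not part of the commit):

```yaml
folded: >          # folds newlines into spaces, keeps one trailing newline
  Short description that
  parses as a single line
folded_strip: >-   # same folding, but strips the trailing newline
  No trailing
  newline here
literal: |         # preserves line breaks exactly, keeps trailing newline
  Contact details
```

The hunks deliberately leave the label text itself untouched, so pre-existing typos such as "inlcusion", "its the context" and "given it’s technical and legal framework" survive the commit.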

content/english/technical-tools/BDT.md

Lines changed: 12 additions & 10 deletions
```diff
@@ -52,22 +52,24 @@ text_field5:
   id: faq
   content: "##### Why this bias detection tool?\n\n* No data needed on protected attributes of users (unsupervised bias detection);\n* Model-agnostic (AI binary classifiers only);\n* Connecting quantitative tools with qualitative methods to assess fair AI;\n* Developed open-source and not-for-profit.\n\n##### By whom can the bias detection tool be used?\_\n\nThe bias detection tool allows the entire ecosystem involved in auditing AI, e.g., data scientists, journalists, policy makers, public- and private auditors, to use quantitative methods to detect bias in AI systems.\n\n##### What does the tool compute?\_\n\nA statistical method is used to compute which clusters are relatively often misclassified by an AI system. A cluster is a group of data points sharing similar features. The tool returns a report in which identified differences (between feature means) are visualized and statistical significant feature differences are tested (Welch’s two-samples t-test for unequal variances).\n\n##### The tool detects prohibited discrimination in AI?\_\n\nNo. The bias detection tool serves as a starting point to assess potentially unfair AI classifiers with the help of subject-matter expertise. The features of identified clusters are examined on critical links with protected grounds, and whether the measured disparities are legitimate. This is a qualitative assessment for which the context-sensitive legal doctrine provides guidelines, i.e., to assess the legitimacy of the aim pursued and whether the means of achieving that aim are\_appropriate\_and\_necessary.\n\n##### For what type of AI does the tool work?\_\n\nCurrently, only\_binary classification\_algorithms can be reviewed. For instance, prediction of loan approval (yes/no), disinformation detection (true/false) or disease detection (positive/negative).\n\n##### What happens with my data?\n\nYour .csv file is uploaded to a AWS bucket, where it is processed. Once the clustering algorithm is finised the data is immediately deleted.\n\n##### &#xA;In sum\_\n\nQuantitative methods, such as unsupervised bias detection, are helpful to discover potentially unfair treated groups of similar users in AI systems in a scalable manner. Automated identification of cluster disparities in AI models allows human experts to assess observed disparities in a qualitative manner, subject to political, social and environmental traits. This two-pronged approach bridges the gap between the qualitative requirements of law and ethics, and the quantitative nature of AI (see figure). In making normative advice, on identified ethical issues publicly available, over time a repository of 'techno-ethical jurisprudence' emerges; from which data scientists and public authorities can distill best practices to build fairer AI (see our\_case reviews).\_\n"
 reports_preview:
-  title: Example reports
+  title: Example output bias detection tool
   icon: fas fa-file
-  button_text: ''
-  button_link: ''
+  button_text: Case repository
+  button_link: /algoprudence
   id: example-reports
   feature_item:
+    - name: Normative judgement commission
+      image: /images/algoprudence/AA202301/Cover.png
+      link: /algoprudence/cases/aa202301_bert-based-disinformation-classifier/
+      content: >
+        An advice commission believes there is a low risk of
+        (higher-dimensional) proxy discrimination by the BERT-based
+        disinformation classifier
     - name: FPR clustering results
       image: /images/BDT/Example_report.png
       link: >-
         https://static-files-pdf.s3.amazonaws.com/bias_scan_FPR_test_pred_BERT.pdf
       content: "An example report for the\_[BERT-based disinformation detection (FPR) case study](https://static-files-pdf.s3.amazonaws.com/bias_scan_FPR_test_pred_BERT.pdf)\n"
-    - name: FNR clustering results
-      image: /images/BDT/Example_report.png
-      link: >-
-        https://static-files-pdf.s3.amazonaws.com/bias_scan_FPR_test_pred_BERT.pdf
-      content: "An example report for the\_[BERT-based disinformation detection (FNR) case study](https://static-files-pdf.s3.amazonaws.com/bias_scan_FNR_test_pred_BERT.pdf)\n"
 team:
   title: Bias Detection Tool Team
   icon: fas fa-user-friends
```
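The `\_` sequences that pepper the long `content:` scalar above are not extraction damage: in YAML's double-quoted style, `\_` is the escape for U+00A0, a non-breaking space. A one-line illustration (hypothetical key):

```yaml
example: "case\_reviews"   # parses as "case reviews" joined by a U+00A0 non-breaking space
```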
```diff
@@ -117,6 +119,8 @@ Try the tool below ⬇️
 
 {{< promo_bar content="Do you appreciate the work of Algorithm Audit? ⭐️ us on" id="promo" >}}
 
+{{< reports_preview >}}
+
 {{< container_open title="Finalist Stanford’s AI Audit Challenge 2023" icon="fas fa-medal" id="finalist" >}}
 
 Under the name Joint Fairness Assessment Method (JFAM) our bias scan tool has been selected as a finalist in [Stanford’s AI Audit Competition 2023](https://hai.stanford.edu/ai-audit-challenge-2023-finalists).
@@ -154,8 +158,6 @@ What input does the bias scan tool need? A csv file of max. 5GB with feature col
 
 {{< container_close >}}
 
-{{< reports_preview >}}
-
 {{< container_open title="FAQ" icon="fas fa-question-circle" >}}
 
 ##### Why this bias detection tool?
```
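The two body-level hunks simply move the `{{< reports_preview >}}` call from below the input-format container to directly under the promo bar, so the example reports render earlier on the page. Presumably the theme's shortcode reads the `reports_preview` map in the front matter; a sketch of that assumed contract, with keys taken from this commit (which of them are required is an assumption):

```yaml
reports_preview:
  title: Example output bias detection tool   # section heading
  icon: fas fa-file                           # Font Awesome icon class
  button_text: Case repository                # call-to-action label
  button_link: /algoprudence                  # call-to-action target
  id: example-reports                         # HTML anchor for the section
  feature_item:                               # one entry per preview card
    - name: Normative judgement commission    # card title
      image: /images/algoprudence/AA202301/Cover.png
      link: /algoprudence/cases/aa202301_bert-based-disinformation-classifier/
      content: >
        One or two lines of card body text
```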

content/nederlands/about/vacancies.md

Lines changed: 6 additions & 3 deletions
```diff
@@ -12,6 +12,7 @@ form1:
     - label: |
         Name
       id: name
+      required: true
       type: text
     - label: |
         Contact details
@@ -49,9 +50,11 @@ form1:
      file_type: '.docx, .pdf'
       type: file
     - label: >
-        <a href="/privacy_policy/" target="_blank">Terms and conditions
-        (link)</a>
-      id: terms
+        Terms and conditions <br> <span style="font-size:12px;
+        color=#777;">Submitted data will only be processed for the purpose
+        described above, kept for the minimum necessary duration, and is
+        securely stored in a protected environment</span>
+      id: terms-conditions
       values:
         - label: Agree
           value: agree
```
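Worth flagging: this Dutch-language vacancies form receives the English consent text verbatim, while the Dutch submit-a-case forms below get a translated "Voorwaarden" label (though even there the checkbox option itself stays `label: Agree` / `value: agree`). If that is unintentional, the localized label from the next file would slot in as a sketch:

```yaml
# Dutch-language variant of the same consent question, as added to the
# Dutch submit-a-case forms in this commit (not applied to this file).
- label: >
    Voorwaarden <br> <span style="font-size:12px;
    color=#777;">Verstrekte gegevens worden alleen verwerkt voor het hierboven beschreven doel,
    de gegevens worden niet langer opgeslagen dan strikt noodzakelijk en worden opgeslagen in
    een beschermde omgeving</span>
  id: terms-conditions
```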

content/nederlands/algoprudence/submit-a-case.md

Lines changed: 24 additions & 31 deletions
```diff
@@ -30,37 +30,6 @@ team:
       name: Samaa Mohammad-Ulenberg
       bio: |
         Bestuurder
-form:
-  title: Dien een case in
-  button_text: Verstuur
-  backend_link: 'https://formspree.io/f/xzbnrlan'
-  questions:
-    - label: Naam algoritme
-      id: naam
-      type: text
-    - label: Korte beschrijving van algoritme (max. 100 woorden)
-      id: beschrijving
-      type: textarea
-    - label: >-
-        Technische aspecten – Beschrijving van o.a. verzamelde data, gehanteerde
-        statistische methode en gebruikte evaluatie criteria
-      id: technische-dimensie
-      type: textarea
-    - label: >-
-        Juridisch kader – Relevante wet- en regelgeving, beschrijving van open
-        juridische normen, bijvoorbeeld in de Algemene Verordening
-        Gegevensbescherming (AVG) of de Algemene Wet Gelijke Behandeling (AWGB)
-      id: juridische-dimensie
-      type: textarea
-    - label: >-
-        Ethische aspecten – Beschrijving van geïdentificeerd ethische kwesties
-        gegeven de technische en juridische achtergrond
-      id: ethische-dimensie
-      type: textarea
-    - label: Contactgegevens
-      id: contactgegevens
-      type: email
-      placeholder: Emailadres
 reports_preview:
   title: Recente audits
   icon: fas fa-file
@@ -118,6 +87,18 @@ form1:
       id: reaction
       type: email
       placeholder: Emailadres
+    - label: >
+        Voorwaarden <br> <span style="font-size:12px;
+        color=#777;">Verstrekte gegevens worden alleen verwerkt voor het hierboven beschreven doel,
+        de gegevens worden niet langer opgeslagen dan strikt noodzakelijk en worden opgeslagen in
+        een beschermde omgeving</span>
+      id: terms-conditions
+      values:
+        - label: Agree
+          value: agree
+          id: agree
+      required: true
+      type: checkbox
 form2:
   title: Informatie over het oordeel
   content: ''
@@ -143,6 +124,18 @@ form2:
       id: documents
       file_upload_text: Kies bestand
       type: file
+    - label: >
+        Voorwaarden <br> <span style="font-size:12px;
+        color=#777;">Verstrekte gegevens worden alleen verwerkt voor het hierboven beschreven doel,
+        de gegevens worden niet langer opgeslagen dan strikt noodzakelijk en worden opgeslagen in
+        een beschermde omgeving</span>
+      id: terms-conditions
+      values:
+        - label: Agree
+          value: agree
+          id: agree
+      required: true
+      type: checkbox
 ---
 
 {{< tab_header width="6" tab1_id="case-for-review" default_tab="case-for-review" tab1_title="Dien een case in voor beoordeling" tab2_id="case-repository" tab2_title="Voeg oordeel toe aan algoprudentie uitspraken" >}}
```

content/nederlands/technical-tools/BDT.md

Lines changed: 21 additions & 2 deletions
```diff
@@ -16,6 +16,25 @@ web_app:
   icon: fas fa-cloud
   id: web-app
   content: ''
+reports_preview:
+  title: Voorbeeld output bias detectie tool
+  icon: fas fa-file
+  button_text: Overzicht casuïstiek
+  button_link: /nl/algoprudence
+  id: example-reports
+  feature_item:
+    - name: Normatief oordeel commissie
+      image: /images/algoprudence/AA202301/Cover.png
+      link: /algoprudence/cases/aa202301_bert-based-disinformation-classifier/
+      content: >
+        Adviescommissie oordeelt dat er een laag risico is op
+        (hoger-dimensionale) proxydiscriminatie bij gebruik van de
+        BERT-gebaseerde desinformatie detectie-algoritme
+    - name: FPR clustering resultaten
+      image: /images/BDT/Example_report.png
+      link: >-
+        https://static-files-pdf.s3.amazonaws.com/bias_scan_FPR_test_pred_BERT.pdf
+      content: "Voorbeeld van automatisch gegenereerde biasgegevens over \_[BERT-gebaseerde desinformatie detectie-algoritme (FPR) case study](https://static-files-pdf.s3.amazonaws.com/bias_scan_FPR_test_pred_BERT.pdf)\n"
 team:
   title: Bias Detectie Tool Team
   icon: fas fa-user-friends
@@ -59,6 +78,8 @@ Gebruik de tool hieronder ⬇️
 
 {{< promo_bar content="Waardeer je het werk van Algorithm Audit? ⭐️ ons op" id="promo" >}}
 
+{{< reports_preview >}}
+
 {{< container_open title="Finalist Stanford’s AI Audit Challenge 2023" icon="fas fa-medal" id="finalist" >}}
 
 Met de inzending Joint Fairness Assessment Method (JFAM) is Algorithm Audit's bias detectie tool geselecteerd als finalist voor [Stanford’s AI Audit Competition 2023](https://hai.stanford.edu/ai-audit-challenge-2023-finalists).
@@ -96,8 +117,6 @@ Welke input data kan de bias detectie tool verwerken? Een csv-bestand van maxima
 
 {{< container_close >}}
 
-{{< reports_preview >}}
-
 {{< container_open title="Veelgestelde vragen" icon="fas fa-question-circle" >}}
 
 ##### Waarom deze bias detectie tool?
```
