Commit 5626dc4

Release EN NL new home page
1 parent 929df64 commit 5626dc4

File tree

33 files changed: +368 −367 lines changed


content/english/_index.md

Lines changed: 41 additions & 52 deletions
@@ -8,11 +8,6 @@ Banner:
   title_mobile_line2: knowledge for
   title_mobile_line3_underline: responsible
   title_mobile_line3_after: algorithms
-  phonetica: /æl.ɡə-ˈpruː.dəns/
-  type: noun
-  description1: Case-based normative advice for ethical algorithms
-  description2: Guidance for decentralised self-assessment of fair AI
-  description3: Jurisprudence for algorithms
 slogan:
   title: A European knowledge platform for
   labels:
@@ -30,22 +25,22 @@ About:
 overview_block:
   activities:
   - title: Knowledge platform
-    subtitle: Statistical and legal expertise
+    subtitle: Expertise in statistics, software development, legal framework and ethics
     url: /knowledge-platform/
     icon: fa-light fa-layer-group
     color: "#E3F0FE"
   - title: Algoprudence
-    subtitle: Case-based normative advice
+    subtitle: Case-based normative advice about responsible AI
     url: /algoprudence/
     icon: fa-light fa-scale-balanced
     color: "#F7CDBF"
   - title: Technical tools
-    subtitle: Open source AI auditing tools
+    subtitle: Open source tools for validating algorithmic systems
     url: /technical-tools/
     icon: fa-light fa-toolbox
     color: "#FFFDE4"
   - title: Project work
-    subtitle: "Validation, AI Act etc."
+    subtitle: Validation, AI Act implementation, organisational control measures etc.
     url: /knowledge-platform/project-work/
     icon: fa-light fa-magnifying-glass-plus
     color: "#E3F0FE"
@@ -55,20 +50,20 @@ Activity_Feed:
 - title: >-
     Local-only tools for AI validation
   intro: >
-    Slides explaining the concept of local-only tools. Highlighting similarities and differences with cloud computing and providing examples of architerctural set-up for Algorithm Audit's unsupervised bias detection and synthetic data generation tool.
+    Slides explaining the concept of 'local-only' tools. Highlighting similarities and differences with cloud computing, including examples of how Algorithm Audit's open source software can be used for unsupervised bias detection and synthetic data generation.
   link: >-
     /technical-tools/bdt/#local-only
   image: /images/BDT/20250605_carrousel_local-only.png
   date: 05-06-2025
-  type: white paper
+  type: open source code
 - title: >-
     Public standard 'Meaningful human intervention for risk profiling
     algorithms'
   intro: >
     Step-by-step guide to prevent prohibited automated decision-making
-    solely based on profilings, as stated in Article 22 GDPR. Based on
+    solely based on profiling, as stated in Article 22 GDPR. Based on
     case-based experiences with risk profiling algorithms and aligned with
-    recent literature.
+    recent (scientific) publications.
   link: >-
     /knowledge-platform/knowledge-base/public_standard_meaningful_human_intervention/
   image: /images/knowledge_base/Public_standard_meaningful_human_intervention.png
@@ -106,84 +101,78 @@ Areas_of_AI_expertise:
 - name: Sociotechnical evaluation of generative AI
   icon: fas fa-robot
   content: >
-    Auditing data-analysis methods and algorithms used for decision support.
-    Among others by checking organizational checks and balances, and
-    assessing the quantitative dimension
-- name: AI Act implementation
+    Evaluating Large Language Models (LLMs) and other general-purpose AI models for robustness, privacy and AI Act compliance. Based on real-world examples, we are developing a framework to analyze content filters, guardrails and user interaction design choices. <a
+    href="/knowledge-platform/project-work/#LLM-validation"
+    style="text-decoration: underline;">Learn more</a> about our evaluation framework.
+- name: AI Act implementation and standards
   icon: fas fa-certificate
   content: >
-    Algorithm Audit is well-informed about techno-legal concepts in the AI Act. As a member of Dutch and Europen standardization organisations NEN and CEN-CENELEC, Algorithm Audit monitors and contributes to the development of standards for AI systems. See also our public <a
-    href="https://algorithmaudit.eu/knowledge-platform/standards/"
+    Our open-source <a
+    href="/technical-tools/implementation-tool/"
+    style="text-decoration: underline;">AI Act Implementation Tool</a> helps organizations identify AI systems and assign the right risk category. As a member of Dutch and European standardization organisations NEN and CEN-CENELEC, Algorithm Audit monitors and contributes to the development of standards for AI systems. See also our public <a
+    href="/knowledge-platform/standards/"
     style="text-decoration: underline;">knowledge base</a> on
     standardization
 - name: Bias analysis
   icon: fas fa-chart-pie
   content: >
-    Auditing data-analysis methods and algorithms used for decision support.
-    Among others by checking organizational checks and balances, and
-    assessing the quantitative dimension
+    We evaluate algorithmic systems along both qualitative and quantitative dimensions. Besides expertise in data analysis and AI engineering, we have in-depth knowledge of legal frameworks concerning non-discrimination, automated decision-making and organizational risk management. See our <a
+    href="/knowledge-platform/knowledge-base/"
+    style="text-decoration: underline;">public standards</a> on how to deploy algorithmic systems responsibly.
 button_text: Discuss collaboration
-button_link: /knowledge-platform/project-work/
+button_link: /knowledge-platform/project-work/#form
 Distinctive_in:
 title: Distinctive in
 enable: true
 width_m: 4
 width_s: 2
 feature_item:
-- name: Independence
-  icon: fas fa-star-of-life
+- name: Multi-disciplinary expertise
+  icon: fas fa-brain
   content: >
-    By working nonprofit and under explicit terms and conditions, we ensure
-    the independence and quality of our audits and normative advice
-- name: Normative advice
-  icon: fas fa-search
+    We are pioneering the future of responsible AI by bringing together expertise in statistics, software development, law and ethics. Our work is widely read throughout Europe and beyond.
+- name: Not-for-profit
+  icon: fas fa-seedling
   content: >
-    Mindful of societal impact our commissions provide normative advice on
-    ethical issues that arise in algorithmic use cases
-- name: Public knowledge
-  icon: fab fa-slideshare
+    We work closely together with private and public sector organisations, regulators and policy makers to foster knowledge exchange about responsible AI. Working nonprofit suits our activities and goals best.
+- name: Public knowledge building
+  icon: fas fa-box-open
   content: >
-    Audits and corresponding advice (*algoprudence*) are made <a
-    href="https://algorithmaudit.eu/algoprudence/" style="text-decoration:
-    underline;">publicly available</a>, increasing collective knowledge how
-    to deploy and use algorithms in an responsible way
-button_text: Project work
+    We make our reports, software and best practices publicly available, contributing to collective knowledge on the responsible deployment and use of AI. We prioritize public knowledge building over protecting our intellectual property.
+button_text: Our projects
 button_link: /knowledge-platform/project-work/
 Supported_by:
-title: Collaborating with
+title: Working together with
 funders:
-- image: /images/supported_by/CoE.png
+- image: /images/partner logo-cropped/CoE.png
   link: "https://www.coe.int/en/web/portal/home"
   alt_text: Council of Europe
 - image: /images/partner logo-cropped/EAISF.png
   link: "https://europeanaifund.org/announcing-our-2022-open-call-grantees/"
   alt_text: European AI & Society Fund
 - image: /images/partner logo-cropped/CEN.jpg
   link: "https://www.cencenelec.eu"
-  alt_text: "Europees standardisation committee "
-- image: /images/supported_by/HAI.png
-  link: "https://hai.stanford.edu/ai-audit-challenge-2023-finalists"
-  alt_text: Stanford University Human-Centered Artificial Intelligence Lab
-- image: /images/supported_by/BZK.jpg
+  alt_text: "European standardisation committee"
+- image: /images/partner logo-cropped/MinBZK.png
   link: >-
     https://www.rijksoverheid.nl/ministeries/ministerie-van-binnenlandse-zaken-en-koninkrijksrelaties
   alt_text: Dutch Ministry of the Interior
+- image: /images/partner logo-cropped/MinJenV.png
+  link: "https://www.rijksoverheid.nl/ministeries/ministerie-van-justitie-en-veiligheid"
+  alt_text: Dutch Ministry of Justice and Security
 - image: /images/partner logo-cropped/DUO.png
   link: "https://duo.nl"
   alt_text: Dutch Executive Agency for Education
 - image: /images/partner logo-cropped/GemeenteAmsterdam.png
   link: "https://www.amsterdam.nl"
   alt_text: Municipality of Amsterdam
+- image: /images/partner logo-cropped/SIDN.png
+  link: "https://www.sidnfonds.nl/excerpt"
+  alt_text: Foundation Internet Domain registration Netherlands
 - image: /images/partner logo-cropped/NEN.svg
   link: "https://www.nen.nl"
   alt_text: Dutch standardisation institute
-- image: /images/supported_by/sidn.png
-  link: "https://www.sidnfonds.nl/projecten/open-source-ai-auditing"
-  alt_text: Foundation for Internet and Democracy Netherlands
-
-button_text: And more
-button_link: /funded-by
 Title_video:
-title: The Movie
+title: Video
 video_mp4: /videos/AA_video_(1080p).mp4
 ---

content/english/algoprudence/cases/aa202301_bert-based-disinformation-classifier.md

Lines changed: 3 additions & 3 deletions
@@ -26,7 +26,7 @@ Applying our self-build unsupervised [bias detection tool](/technical-tools/bdt
 
 This case study, in combination with our [bias detection tool](/technical-tools/bdt/), has been selected as a finalist for [Stanford’s AI Audit Challenge 2023](https://hai.stanford.edu/ai-audit-challenge-2023-finalists).
 
-{{< image id="stanford" width_desktop="6" width_mobile="12" image1="/images/supported_by/HAI.png" link1="https://hai.stanford.edu/ai-audit-challenge-2023-finalists" alt1="Stanford University" caption1="Stanford University" >}}
+{{< image id="stanford" width_desktop="6" width_mobile="12" image1="/images/partner logo-cropped/StanfordHAI.png" link1="https://hai.stanford.edu/ai-audit-challenge-2023-finalists" alt1="Stanford University" caption1="Stanford University" >}}
 
 #### Presentation
 
@@ -51,15 +51,15 @@ A visual presentation of this case study can be found in this [slide deck](http
 
 {{< accordions_area_open id="actions" >}}
 
-{{< accordion_item_open image="/images/supported_by/sidn.png" title="Funding for further development" id="sidn" date="01-12-2023" tag1="funding" tag2="open source" tag3="AI auditing tool" >}}
+{{< accordion_item_open image="/images/partner logo-cropped/SIDN.png" title="Funding for further development" id="sidn" date="01-12-2023" tag1="funding" tag2="open source" tag3="AI auditing tool" >}}
 
 ##### Description
 
 [SIDN Fund](https://www.sidnfonds.nl/projecten/open-source-ai-auditing) is supporting Algorithm Audit for further development of the bias detection tool. On 01-01-2024, a [team](/nl/about/teams/#bdt) has started that is further developing and testing the tool.
 
 {{< accordion_item_close >}}
 
-{{< accordion_item_open title="Finalist selection Stanford's AI Audit Challenge 2023" image="/images/supported_by/HAI.png" id="ai_audit_challenge" date="28-04-2023" tag1="finalist" >}}
+{{< accordion_item_open title="Finalist selection Stanford's AI Audit Challenge 2023" image="/images/partner logo-cropped/StanfordHAI.png" id="ai_audit_challenge" date="28-04-2023" tag1="finalist" >}}
 
 ##### Description
 
content/english/algoprudence/cases/aa202401_preventing-prejudice.md

Lines changed: 1 addition & 1 deletion
@@ -75,7 +75,7 @@ Report *Preventing prejudice* has been <a href="https://www.rijksoverheid.nl/doc
 
 {{< accordion_item_close >}}
 
-{{< accordion_item_open title="DUO apologizes for indirect discrimination in college allowances control process" image="/images/supported_by/DUO.png" id="DUO-apologies" date="01-03-2024" tag1="press release" >}}
+{{< accordion_item_open title="DUO apologizes for indirect discrimination in college allowances control process" image="/images/partner logo-cropped/DUO.png" id="DUO-apologies" date="01-03-2024" tag1="press release" >}}
 
 ##### Description
 
content/english/events/activities.md

Lines changed: 2 additions & 2 deletions
@@ -310,7 +310,7 @@ events:
 
 ![](</images/events/20240818_BZK.jpg>)
-    image: /images/supported_by/BZK.jpg
+    image: /images/partner logo-cropped/MinBZK.png
     date: 18-09-2024
     facets:
     - value: year_2024
@@ -419,7 +419,7 @@ events:
 
 Interview:
 [https://www.sidnfonds.nl/nieuws/niet-elke-beslissing-is-te-kwantificeren](https://www.sidnfonds.nl/nieuws/niet-elke-beslissing-is-te-kwantificeren)
-    image: /images/supported_by/sidn.png
+    image: /images/partner logo-cropped/SIDN.png
     date: 31-05-2024
     facets:
    - value: year_2024

content/english/events/press_room.md

Lines changed: 2 additions & 2 deletions
@@ -17,7 +17,7 @@ quick_navigation:
 
 {{< accordions_area_open id="DUO_CBS" >}}
 
-{{< accordion_item_open title="DUO control process biased towards students with a non-European migration background" id="DUO_CBS" background_color="#ffffff" date="22-05-2024" tag1="DUO" tag2="CBS" tag3="bias analysis" image="/images/supported_by/DUO.png" >}}
+{{< accordion_item_open title="DUO control process biased towards students with a non-European migration background" id="DUO_CBS" background_color="#ffffff" date="22-05-2024" tag1="DUO" tag2="CBS" tag3="bias analysis" image="/images/partner logo-cropped/DUO.png" >}}
 
 **THE HAGUE - In its inspection of the legitimate use of student finance for students living away from home, DUO selected students for control with a non-European migration background significantly more often. This demonstrates an unconscious bias in DUO's control process. Students with a non-European migration background were assigned a higher risk score by a risk profile and were more often manually selected for a home visit. This is evident from follow-up research that NGO Algorithm Audit carried out on behalf of DUO, which was sent by the minister to the House of Representatives on May 22. The results of the research strengthen the outcomes of previous research, on the basis of which the minister apologized on behalf of the cabinet on March 1, 2024 for indirect discrimination in the control process.**
 
@@ -35,7 +35,7 @@ The Bias Prevented report (addendum) can be found [here](https://algorithmaudit.
 
 {{< accordion_item_close >}}
 
-{{< accordion_item_open title="Irregularities identified in college allowances control process by Dutch public sector organization DUO" id="DUO" background_color="#ffffff" date="01-03-2024" tag1="DUO" tag2="audit report" tag3="" image="/images/supported_by/DUO.png" >}}
+{{< accordion_item_open title="Irregularities identified in college allowances control process by Dutch public sector organization DUO" id="DUO" background_color="#ffffff" date="01-03-2024" tag1="DUO" tag2="audit report" tag3="" image="/images/partner logo-cropped/DUO.png" >}}
 
 **THE HAGUE – In its control process into misuse of the allowances for students living away from home, Dutch public sector organization DUO selected individuals who lived close to their parent(s) significantly more often. The risk-taxation algorithm, that was used as an assisting tool for selecting students, worked as expected. However, the combination of the algorithm and manual selection resulted in a large overrepresentation of certain groups. Selected students were visited at home to inspect whether allowances were unduly granted. This is the main conclusion of research carried out by NGO Algorithm Audit on behalf of DUO. DUO’s control process was discredited in 2023 after media attention, in which was mentioned that students with a migration background were accused of misuse more often than other students.**
 
content/english/knowledge-platform/collaboration.md

Lines changed: 1 addition & 1 deletion
@@ -41,7 +41,7 @@ Over the course of 2023-24, Algorithm Audit has paired up with Dutch public sect
 * collaboratively submitted a request to the Dutch National Office of Statistics in order to conduct a large-scale study on proxy characteristics;
 * finding solutions to deal with the normative dimension of auditing and evaluating algorithmic-driven decision-making processes, for instance involvement of various stakeholders to qualitatively interpret quantitative measurements.
 
-{{< image id="DUO" width_desktop="4" width_mobile="12" image1="/images/supported_by/DUO.png" alt1="Dienst Uitvoering Onderwijs (DUO)" caption1="Dienst Uitvoering Onderwijs (DUO)" >}}
+{{< image id="DUO" width_desktop="4" width_mobile="12" image1="/images/partner logo-cropped/DUO.png" alt1="Dienst Uitvoering Onderwijs (DUO)" caption1="Dienst Uitvoering Onderwijs (DUO)" >}}
 
 {{< button button_text="Read the full audit report" button_link="/algoprudence/cases/aa202401_bias-prevented/" >}}
 