Commit 6ce8a7a

Merge pull request #141 from NGO-Algorithm-Audit/feature/structural_edits
Feature/structural edits
2 parents ecc20a6 + 1a3910b commit 6ce8a7a

31 files changed: +123 −24 lines

config/_default/menus.NL.toml

Lines changed: 2 additions & 2 deletions

```diff
@@ -63,8 +63,8 @@ url = "/nl/technical-tools/BDT"
 icon = "fa-table"
 [[main]]
 parent = "Technische tools"
-name = "Documentatie"
-url = "nl/technical-tools/documentation"
+name = "AI-verordening Implementatie Tool"
+url = "nl/technical-tools/implementation-tool"
 weight = 4
 [[main.params]]
 icon = "fa-file"
```

config/_default/menus.en.toml

Lines changed: 2 additions & 2 deletions

```diff
@@ -61,8 +61,8 @@ url = "/technical-tools/BDT"
 icon = "fa-table"
 [[main]]
 parent = "Technical tools"
-name = "Documentation"
-url = "/technical-tools/documentation"
+name = "AI Act Implementation Tool"
+url = "/technical-tools/implementation-tool"
 weight = 4
 [[main.params]]
 icon = "fa-file"
```

content/english/algoprudence/cases/aa202201_type-of-sim.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -90,7 +90,7 @@ Anonymized large multinational company with e-commerce platform.
 
 The problem statement and advice report can be downloaded <a href="https://drive.google.com/file/d/1fSETUhxOz0nF2nznsWq-4TyngP6lU7yH/preview" target="_blank">here</a>.
 
-{{< embed_pdf url="/pdf-files/algoprudence/Report_Type_SIM.pdf" url2="" width_desktop_pdf="" width_mobile_pdf="" >}}
+{{< embed_pdf url="/pdf-files/algoprudence/ALGO_AA202201/ALGO_AA202201_Report_Type_SIM.pdf" >}}
 
 #### Normative advice commission
```
content/english/algoprudence/cases/aa202301_bert-based-disinformation-classifier.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -114,7 +114,7 @@ A visual presentation of this case study can be found in this [slide deck](http
 
 Download the full report and problem statement [here](https://drive.google.com/file/d/1GHPwDaal3oBJZluFYVR59e1_LHhP8kni/view?usp=sharing).
 
-{{< pdf_frame articleUrl1="https://drive.google.com/file/d/1GHPwDaal3oBJZluFYVR59e1_LHhP8kni/preview" articleUrl2="" >}}
+{{< embed_pdf url2="/pdf-files/algoprudence/ALGO_AA202301/ALGO_AA202301A_Case_study_disinfo.pdf" url="/pdf-files/algoprudence/ALGO_AA202301/ALGO_AA202301P_Case_study_disinfo.pdf" >}}
 
 #### Normative advice commission
```

content/english/algoprudence/cases/aa202302_risk-profiling-for-social-welfare-reexamination.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -80,7 +80,7 @@ The advice report (AA:2023:02:A) has been presented to the Dutch Minister of Dig
 
 Download the full report and problem statement [here](https://drive.google.com/file/d/1GHPwDaal3oBJZluFYVR59e1_LHhP8kni/view?usp=sharing).
 
-{{< embed_pdf url="/pdf-files/algoprudence/AA202302A_EN.pdf" url2="/pdf-files/algoprudence/AA202302P_EN.pdf" >}}
+{{< embed_pdf url="/pdf-files/algoprudence/ALGO_AA202302/ALGO_AA202302P_EN.pdf" url2="/pdf-files/algoprudence/ALGO_AA202302/ALGO_AA202302A_EN.pdf" >}}
 
 #### Normative advice commission
```

content/english/algoprudence/cases/aa202401_preventing-prejudice.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -62,7 +62,7 @@ Education Executive Agency of The Netherlands (DUO)
 
 The technical audit report (TA:AA:2024:01) can be downloaded [here](https://drive.google.com/file/d/17dwU4zAqpyixwVTKCYM7Ezq1VM5_kcDa/preview).
 
-{{< pdf_frame articleUrl1="https://drive.google.com/file/d/1waGi9gduj10AOGSrSDqKWOMfqcRWwyGy/preview" articleUrl2="" width_desktop_pdf="6" width_mobile_pdf="6" >}}
+{{< embed_pdf url="/pdf-files/algoprudence/TA_AA202401/TA_AA202401_Preventing_prejudice.pdf" >}}
 
 <!-- #### AI Act standards
```
content/english/algoprudence/cases/aa202402_preventing-prejudice_addendum.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -66,7 +66,7 @@ Education Executive Agency of The Netherlands (DUO)
 
 The full report (TA:AA:2024:02) can be found <a href="https://drive.google.com/file/d/1uOhR9qXHW6P0i4uP7RNhil2G2dXzFjrp/preview" target="_blank">here</a>.
 
-{{< pdf_frame articleUrl1="https://drive.google.com/file/d/1uOhR9qXHW6P0i4uP7RNhil2G2dXzFjrp/preview" articleUrl2="" width_desktop_pdf="6" width_mobile_pdf="12" >}}
+{{< embed_pdf url="/pdf-files/algoprudence/TA_AA202402/TA_AA202402_Addendum_Preventing_prejudice.pdf" >}}
 
 #### Financed by
```

content/english/algoprudence/how-we-work.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -54,7 +54,7 @@ Advice of commission is published together with problem statement on our website
 
 Over the years, we have developed our own deliberative audit methodology. Algorithm Audit's guidelines for convening a normative advice commission can be consulted in the below document. [Get in touch](/about/contact/) to share feedback about these guidelines.
 
-{{< pdf_frame articleUrl1="https://drive.google.com/file/d/1qu-dv1ZJvmqjYi6tVC7bbpxuzQdThhOk/preview" width_desktop_pdf="6" width_mobile_pdf="12" >}}
+{{< embed_pdf url="/pdf-files/algoprudence/Guidelines normative advice commission EN.pdf" >}}
 
 {{< container_close >}}
```

content/english/technical-tools/BDT.md

Lines changed: 36 additions & 7 deletions

```diff
@@ -64,19 +64,48 @@ type: bias-detection-tool
 
 <br>
 
-##### What is the tool about?
+#### What is the tool about?
+
+The tool identifies groups of similar users that are potentially treated unfairly by an AI system. It returns clusters of users for which the system is underperforming compared to the rest of the data set. The tool makes use of <a href="https://en.wikipedia.org/wiki/Cluster_analysis" target="_blank">clustering</a> – an unsupervised statistical learning method. This means that no data on protected attributes of users, e.g., gender, nationality or ethnicity, are required to detect indirect discrimination, also referred to as higher-dimensional proxy or intersectional discrimination. The metric by which bias is defined can be chosen manually and is referred to as the `performance metric`.
+
+#### What data can be processed?
+
+Numerical and categorical data can be analysed. The type of data is detected automatically by the tool. The `performance metric` column should always contain numerical values. The user should indicate in the app whether a higher or lower value of the `performance metric` is considered to be better.
+
+The tool contains a demo data set and a 'Try it out' button. More information can be found in the app.
+
+<div>
+<p><u>Example of a numerical data set</u>:</p>
+<style type="text/css">.tg{border-collapse:collapse;border-spacing:0}.tg td{border-color:#000;border-style:solid;border-width:1px;font-size:14px;overflow:hidden;padding:10px 5px;word-break:normal}.tg th{border-color:#000;border-style:solid;border-width:1px;font-size:14px;font-weight:400;overflow:hidden;padding:10px 5px;word-break:normal}.tg .tg-uox0{border-color:#grey;font-weight:700;text-align:left;vertical-align:top}.tg .tg-uoz0{border-color:#grey;text-align:left;vertical-align:top} .tg-1wig{font-weight:700;text-align:left;vertical-align:top}.tg .tg-0lax{text-align:left;vertical-align:top}</style>
+<table class="tg">
+<thead>
+<tr>
+<th class="tg-uox0">feat_1</th><th class="tg-uox0">feat_2</th><th class="tg-uox0">...</th><th class="tg-uox0">feat_n</th><th class="tg-uox0">perf_metr</th>
+</tr>
+</thead>
+<tbody>
+<tr><td class="tg-uoz0">10</td><td class="tg-uoz0">1</td><td class="tg-uoz0">...</td><td class="tg-uoz0">0.1</td><td class="tg-uoz0">1</td></tr>
+<tr><td class="tg-uoz0">20</td><td class="tg-uoz0">2</td><td class="tg-uoz0">...</td><td class="tg-uoz0">0.2</td><td class="tg-uoz0">1</td></tr>
+<tr><td class="tg-uoz0">30</td><td class="tg-uoz0">3</td><td class="tg-uoz0">...</td><td class="tg-uoz0">0.3</td><td class="tg-uoz0">0</td></tr>
+</tbody>
+</table>
+</div>
+<br>
 
-The tool identifies potentially unfairly treated groups of similar users by an AI system. The tool returns clusters of users for which the system is underperforming compared to the rest of the data set. The tool makes use of <a href="https://en.wikipedia.org/wiki/Cluster_analysis" target="_blank">clustering</a> – an unsupervised statistal learning method. This means that no data is required on protected attributes of users, e.g., gender, nationality or ethnicity, to detect higher-dimensional forms of apparently neutral differentiation, also referred to as higher-dimensional proxy or intersectional discrimination. The metric by which bias is defined can be manually chosen in advance and is referred to as the `performance metric`.
+#### How is my data processed?
 
-##### How is my data processed?
+The tool is privacy preserving. It uses the computing power of your own computer to analyze the attached data set. In this architectural setup, data is processed entirely on your device and is not uploaded to any third party, such as cloud providers. This computing approach is called *local-first* and allows organisations to use tools securely on their own infrastructure. Instructions on how to host the tool locally, including source code, can be found <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">here</a>.
 
-The tool is privacy preserving. It uses computing power of your own computer to analyze a dataset. In this architectural setup, data is processed entirely on your device and it not uploaded to any third party, such as cloud providers. This local-only feature allows organisations to securely use the tool with proprietary data. The used software is also available as <a href="https://pypi.org/project/unsupervised-bias-detection/" target="_blank">pip package</a> `unsupervised-bias-detection`. [![!pypi](https://img.shields.io/pypi/v/unsupervised-bias-detection?logo=pypi&color=blue)](https://pypi.org/project/unsupervised-bias-detection/)
+[![!pypi](https://img.shields.io/pypi/v/unsupervised-bias-detection?logo=pypi\&color=blue)](https://pypi.org/project/unsupervised-bias-detection/)
+The software for the statistical methods used is available in a separate <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection" target="_blank">GitHub repository</a>, and also as a <a href="https://pypi.org/project/unsupervised-bias-detection/" target="_blank">pip package</a>, `unsupervised-bias-detection`.
 
-##### What does the tool return?
+#### What does the tool return?
 
-The tool returns a report which presents the cluster with the highest bias and describes this cluster by the features that characterizes it. This is quantitatively expressed by the (statistically significant) differences in feature means between the identified cluster and the rest of the data. These results serve as a starting point for a deliberative assessment by human experts to evaluate potential discrimination and unfairness in the AI system under review. The tool also visualizes the outcomes.
+The tool returns a PDF report or `.json` file with the identified clusters. It focuses on the cluster with the highest bias and describes that cluster by the features that characterize it. These results serve as a starting point for a deliberative assessment by human experts to evaluate potential discrimination and unfairness in the AI system under review. The tool also visualizes the outcomes.
 
 Try the tool below ⬇️
+<!-- This is quantitatively expressed by the (statistically significant) differences in feature means between the identified cluster and the rest of the data. -->
+
 
 {{< container_close >}}
 
```
```diff
@@ -104,7 +133,7 @@ Algorithm Audit's bias detection tool is part of OECD's <a href="https://oecd.ai
 
 {{< container_open title="Hierarchical Bias-Aware Clustering (HBAC) algorithm" icon="fas fa-code-branch" id="HBAC" >}}
 
-The bias detection tool currently works for tabular numerical and categorical data. The _Hierarchical Bias-Aware Clustering_ (HBAC) algorithm processes input data according to the k-means or k-modes clustering algorithm. The HBAC-algorithm is introduced by Misztal-Radecka and Indurkya in a [scientific article](https://www.sciencedirect.com/science/article/abs/pii/S0306457321000285) as published in *Information Processing and Management* (2021). Our implementation of the HBAC-algorithm can be found on <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection/blob/master/README.md" target="_blank">Github</a>.
+The bias detection tool utilizes the _Hierarchical Bias-Aware Clustering_ (HBAC) algorithm. HBAC processes input data with the k-means (for numerical data) or k-modes (for categorical data) clustering algorithm. The HBAC algorithm was introduced by Misztal-Radecka and Indurkya in a [scientific article](https://www.sciencedirect.com/science/article/abs/pii/S0306457321000285) published in *Information Processing and Management* (2021). Our implementation of the HBAC algorithm can be found on <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection/blob/master/README.md" target="_blank">GitHub</a>. The methodology has been reviewed by a team of machine learning engineers and statisticians, and is continuously being evaluated.
 
 {{< container_close >}}
 
```
content/english/technical-tools/documentation.md

Lines changed: 37 additions & 1 deletion

```diff
@@ -31,4 +31,40 @@ overview_block:
 
 {{< iframe src="https://ai-documentation.s3.eu-central-1.amazonaws.com/index.html" id="forms" height="500px" >}}
 
-{{< webapp id="webapp" appId="AIActWizard" stylesheet="https://ai-documentation.s3.eu-central-1.amazonaws.com/AI-Act-Questionnaire-v1.0.0.css" src="https://ai-documentation.s3.eu-central-1.amazonaws.com/AI-Act-Questionnaire-v1.0.0.js" title="" >}}
+{{< webapp id="webapp" appId="AIActWizard" src="https://ai-documentation.s3.eu-central-1.amazonaws.com/AI-Act-Questionnaire-v1.0.0.js" title="" >}}
+
+<style>
+/* Styling for form-group elements inside #AIActWizard */
+#AIActWizard .form-group {
+  display: block;
+}
+
+/* Styling for form-group header labels inside #AIActWizard */
+#AIActWizard .form-group .form-label {
+  margin-left: 0;
+  color: black;
+}
+
+/* Styling for intermediate-output labels in #AIActWizard */
+#AIActWizard .intermediate-output label {
+  font-weight: 700;
+}
+
+/* Styling for intermediate-output textareas in #AIActWizard */
+#AIActWizard .intermediate-output textarea {
+  border: none;
+  background-color: transparent;
+  resize: none;
+  width: 100%;
+  height: auto;
+  padding: 0;
+  margin: 0;
+  font-size: inherit;
+  font-family: inherit;
+  line-height: inherit;
+  color: inherit;
+  overflow: hidden;
+  white-space: pre-wrap;
+  word-wrap: break-word;
+}
+</style>
```
