
Commit 27261e9

Merge pull request #253 from NGO-Algorithm-Audit/feature/container-add-accordion-option
Feature/container add accordion option
2 parents 821338f + e712107 commit 27261e9

8 files changed: +123 −74 lines changed

content/english/technical-tools/BDT.md

Lines changed: 43 additions & 43 deletions
@@ -1,10 +1,34 @@
 ---
 title: Unsupervised bias detection tool
 subtitle: >
-  Local-first tool using statistical analysis to identify groups that may be subject to unfair treatment by algorithms or AI. The tool informs
-  the qualitative doctrine of law and ethics which disparities need to be
+  Local-first tool using statistical analysis to identify groups that may be
+  subject to unfair treatment by algorithms or AI. The tool informs the
+  qualitative doctrine of law and ethics which disparities need to be
   scrutinised manually by domain experts.
 image: /images/svg-illustrations/illustration_cases.svg
+quick_navigation:
+  title: Content overview
+  links:
+    - title: Introduction
+      url: '#info'
+    - title: Web app
+      url: '#web-app'
+    - title: Source code
+      url: '#source-code'
+    - title: Anomaly detection algorithm
+      url: '#HBAC'
+    - title: Scientific paper and audit report
+      url: '#scientific-paper'
+    - title: Local-first computing
+      url: '#local-first'
+    - title: Supported by
+      url: '#supported-by'
+    - title: Awards and acknowledgements
+      url: '#awards-acknowledgements'
+    - title: Summary
+      url: '#summary'
+    - title: Team
+      url: '#team'
 reports_preview:
   title: Example output bias detection tool
   icon: fas fa-file
@@ -52,41 +76,18 @@ team:
     name: Mackenzie Jorgensen PhD
     bio: |
       Researcher Alan Turing Institute, London
-quick_navigation:
-  title: Content overview
-  links:
-    - title: Introduction
-      url: "#info"
-    - title: Web app
-      url: "#web-app"
-    - title: Source code
-      url: "#source-code"
-    - title: Anomaly detection algorithm
-      url: "#HBAC"
-    - title: Scientific paper and audit report
-      url: "#scientific-paper"
-    - title: Local-first computing
-      url: "#local-first"
-    - title: Supported by
-      url: "#supported-by"
-    - title: Awards and acknowledgements
-      url: "#awards-acknowledgements"
-    - title: Summary
-      url: "#summary"
-    - title: Team
-      url: "#team"
 type: bias-detection-tool
 ---
 
 <!-- Introduction -->
 
-{{< container_open title="Introduction – Unsupervised bias detection tool" icon="fas fa-search" id="info" >}}
+{{< container_open title="Introduction – Unsupervised bias detection tool" icon="fas fa-search " id="info" isAccordion="" >}}
 
 <br>
 
 #### What is the tool about?
 
-The tool identifies groups where an algorithm or AI system shows variations in performance. This type of monitoring is referred to as _anomaly detection_. To identify anomalous patterns, the tool uses <a href="https://en.wikipedia.org/wiki/Cluster_analysis" target="_blank">clustering</a>. Clustering is a form of _unsupervised learning_. This means detecting disparate treatment (bias) does not require any data on protected attributes of users, such as gender, nationality, or ethnicity. The metric used to measure bias can be manually selected and is referred to as the `bias metric`.
+The tool identifies groups where an algorithm or AI system shows variations in performance. This type of monitoring is referred to as *anomaly detection*. To identify anomalous patterns, the tool uses <a href="https://en.wikipedia.org/wiki/Cluster_analysis" target="_blank">clustering</a>. Clustering is a form of *unsupervised learning*. This means detecting disparate treatment (bias) does not require any data on protected attributes of users, such as gender, nationality, or ethnicity. The metric used to measure bias can be manually selected and is referred to as the `bias metric`.
 
 #### What data can be processed?
 
@@ -125,7 +126,7 @@ The tool identifies deviating clusters. A summary of the results is made availab
 
 #### How is my data processed?
 
-The tool is privacy-friendly because the data is processed entirely within the browser. The data does not leave your computer or the environment of your organization. The tool utilizes the computing power of your own computer to analyze the data. This type of browser-based software is referred to as _local-first_. The tool does not upload data to third parties, such as cloud providers. Instructions on how to host the tool and local-first architecture can be hosted locally within your own organization can be found on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">Github</a>.
+The tool is privacy-friendly because the data is processed entirely within the browser. The data does not leave your computer or the environment of your organization. The tool utilizes the computing power of your own computer to analyze the data. This type of browser-based software is referred to as *local-first*. The tool does not upload data to third parties, such as cloud providers. Instructions on how to host the tool and local-first architecture can be hosted locally within your own organization can be found on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">Github</a>.
 
 Try the tool below ⬇️
 
@@ -143,18 +144,17 @@ Try the tool below ⬇️
 
 {{< container_open title="Source code" id="source-code" icon="fas fa-toolbox" >}}
 
-- The source code of the anolamy detection-algorithm is available on <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection" target="_blank">Github</a> and as a <a href="https://pypi.org/project/unsupervised-bias-detection/" target="_blank">pip package</a>: `pip install unsupervised-bias-detection`.
-  [![!pypi](https://img.shields.io/pypi/v/unsupervised-bias-detection?logo=pypi&color=blue)](https://pypi.org/project/unsupervised-bias-detection/)
-
-- The architecture to run web apps local-first is also available on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">Github</a>.
+* The source code of the anolamy detection-algorithm is available on <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection" target="_blank">Github</a> and as a <a href="https://pypi.org/project/unsupervised-bias-detection/" target="_blank">pip package</a>: `pip install unsupervised-bias-detection`.
+  [![!pypi](https://img.shields.io/pypi/v/unsupervised-bias-detection?logo=pypi\&color=blue)](https://pypi.org/project/unsupervised-bias-detection/)
+* The architecture to run web apps local-first is also available on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">Github</a>.
 
 {{< container_close >}}
 
 <!-- Anolamy detection algorithm -->
 
 {{< container_open title="Anolamy detection algorithm – Hierarchical Bias-Aware Clustering (HBAC)" icon="fas fa-code-branch" id="HBAC" >}}
 
-The tool uses the _Hierarchical Bias-Aware Clustering_ (HBAC) algorithm. HBAC processes input data according to the k-means (for numerical data) or k-modes (for categorical data) clustering algorithm. The HBAC-algorithm is introduced by Misztal-Radecka and Indurkya in a [scientific article](https://www.sciencedirect.com/science/article/abs/pii/S0306457321000285) as published in *Information Processing and Management* (2021). Our implementation of the HBAC-algorithm, including additional methodological checks to distinguish real bias from noise, such as sample splitting, statistical hypothesis testing and measuring cluster stability, can be found in the <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection/blob/master/README.md" target="_blank">unsupervised-bias-detection</a> pip package.
+The tool uses the *Hierarchical Bias-Aware Clustering* (HBAC) algorithm. HBAC processes input data according to the k-means (for numerical data) or k-modes (for categorical data) clustering algorithm. The HBAC-algorithm is introduced by Misztal-Radecka and Indurkya in a [scientific article](https://www.sciencedirect.com/science/article/abs/pii/S0306457321000285) as published in *Information Processing and Management* (2021). Our implementation of the HBAC-algorithm, including additional methodological checks to distinguish real bias from noise, such as sample splitting, statistical hypothesis testing and measuring cluster stability, can be found in the <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection/blob/master/README.md" target="_blank">unsupervised-bias-detection</a> pip package.
 
 {{< container_close >}}
 
@@ -188,11 +188,11 @@ Local-first computing is the opposite of cloud computing: the data is not upload
 
 <!-- Supported by -->
 
-{{< container_open title="Supported by" icon="fas fa-toolbox" id="supported-by">}}
+{{< container_open title="Supported by" icon="fas fa-toolbox" id="supported-by" >}}
 
 This tool is developed with support of public and philanthropic organisations.
 
-{{< accordions_area_open id="supported-by-accordion">}}
+{{< accordions_area_open id="supported-by-accordion" >}}
 
 {{< accordion_item_open title="Innovation grant Dutch Ministry of the Interior" image="/images/supported_by/BZK.jpg" tag1="2024-25" >}}
 
@@ -218,7 +218,7 @@ In 2024, the SIDN Fund <a href="https://www.sidnfonds.nl/projecten/open-source-a
 
 <!-- Awards and acknowledgements -->
 
-{{< container_open title="Awards and acknowledgements" icon="fas fa-medal" id="awards-acknowledgements">}}
+{{< container_open title="Awards and acknowledgements" icon="fas fa-medal" id="awards-acknowledgements" >}}
 
 This tool has received awards and is acknowledged by various <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection?tab=readme-ov-file#contributing-members" target="_blank">stakeholders</a>, including civil society organisations, industry representatives and academics.
 
@@ -246,16 +246,16 @@ The unsupervised bias detection tool is part of OECD's <a href="https://oecd.ai/
 
 <!-- Summary -->
 
-{{< container_open title="Summary" icon="fas fa-dot-circle" id="summary">}}
+{{< container_open title="Summary" icon="fas fa-dot-circle" id="summary" >}}
 
 Key take-aways about unsupervised bias detection tool:
 
-- <span style="color:#005AA7">Quantitative-qualitative research method</span>: Data-driven bias testing combined with the balanced and context-sensitive judgment of human experts;
-- <span style="color:#005AA7">Unsupervised bias detection</span>: No user data needed on protected attributes (_unsupervised learning_);
-- <span style="color:#005AA7">Anolamy detection</span>: Scalable method based on statistical analysis;
-- <span style="color:#005AA7">Detects complex bias</span>: Identifies unfairly treated groups characterized by mixture of features, detects intersectional bias;
-- <span style="color:#005AA7">Model-agnostic</span>: Works for all binary classification algorithms and AI systems;
-- <span style="color:#005AA7">Open-source and not-for-profit</span>: User friendly and free to use for the entire AI auditing community.
+* <span style="color:#005AA7">Quantitative-qualitative research method</span>: Data-driven bias testing combined with the balanced and context-sensitive judgment of human experts;
+* <span style="color:#005AA7">Unsupervised bias detection</span>: No user data needed on protected attributes (*unsupervised learning*);
+* <span style="color:#005AA7">Anolamy detection</span>: Scalable method based on statistical analysis;
+* <span style="color:#005AA7">Detects complex bias</span>: Identifies unfairly treated groups characterized by mixture of features, detects intersectional bias;
+* <span style="color:#005AA7">Model-agnostic</span>: Works for all binary classification algorithms and AI systems;
+* <span style="color:#005AA7">Open-source and not-for-profit</span>: User friendly and free to use for the entire AI auditing community.
 
 {{< container_close >}}
 
Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
 {{ $_hugo_config := `{ "version": 1 }` }}
-
+</div>
 </div>
 </div>
Lines changed: 28 additions & 6 deletions
@@ -1,16 +1,38 @@
 {{ $_hugo_config := `{ "version": 1 }` }}
-<div id={{.Get "id" }} class="container-fluid mt-5 p-0">
-  <div class="shadow mobile-desktop-container-layout rounded-lg">
-    <div class="row">
-      <div class="col-12">
+{{ $id := .Get "id" }}
+{{ $isAccordion := .Get "isAccordion" }}
+{{ if $isAccordion }}
+<script>
+  docReady(function () {
+    new Accordion('.accordion_container_{{$id}}');
+
+    $(".accordion_container_{{$id}} .ac-trigger").click(function() {
+      const isExpanded = $(this).attr("aria-expanded");
+      const elementToSetChevronOn = $(this).find("h3 > span.fas");
+      if (isExpanded === "true") {
+        elementToSetChevronOn.removeClass("fa-chevron-up");
+        elementToSetChevronOn.addClass("fa-chevron-down");
+      } else {
+        elementToSetChevronOn.removeClass("fa-chevron-down");
+        elementToSetChevronOn.addClass("fa-chevron-up");
+      }
+    });
+  });
+</script>
+{{ end }}
+<div id={{$id}} class="{{ if $isAccordion }} accordion_container_{{$id}} {{ end }} container-fluid mt-5 p-0">
+  <div class="{{ if $isAccordion }} ac {{ end }} shadow mobile-desktop-container-layout rounded-lg ">
+    <div class="{{ if $isAccordion }} ac-header {{ end }} ">
+      <div class="col-12 row ac--has-own-icon {{ if $isAccordion }} ac-trigger {{ end }}">
 
         <!-- Title and icon -->
         <div>
           <h3>
-            <span class="{{.Get "icon" }} icon mb-4 pl-5"></span>
+            <span class="{{if $isAccordion }} fas fa-chevron-up {{else}} {{.Get "icon" }} {{end}} icon mb-4 pl-5"></span>
             {{ .Get "title" }}
           </h3>
         </div>
 
       </div>
-    </div>
+      </div>
+      <div class="{{ if $isAccordion }} ac-panel {{ end }} ">
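For orientation, a minimal sketch of how a content file would opt into the new behaviour, using the shortcode pair already present in BDT.md above (the parameter name and the "true" value come from this commit; the concrete title, icon and id values are illustrative):

```markdown
{{< container_open title="Supported by" icon="fas fa-toolbox" id="supported-by" isAccordion="true" >}}

Container body, rendered inside the new ac-panel wrapper and collapsed or expanded by the accordion.

{{< container_close >}}
```

When isAccordion is set, the template swaps the configured icon for a fas fa-chevron-up span and the jQuery handler flips the chevron as the trigger's aria-expanded state changes; the script assumes a global Accordion constructor and jQuery are already loaded on the page. The extra closing </div> added in the one-line shortcode change earlier in this commit appears to close the new ac-panel wrapper.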

static/css/accordion.min.css

Lines changed: 3 additions & 3 deletions
@@ -20,7 +20,7 @@
     padding: 0
 }
 
-.ac .ac-trigger {
+.ac .ac-trigger:not( .ac--has-own-icon) {
     text-align: left;
     width: 100%;
     padding: 8px 32px 8px 8px;
@@ -34,7 +34,7 @@
     border: 0
 }
 
-.ac .ac-trigger::after {
+.ac .ac-trigger:not( .ac--has-own-icon)::after {
     content: "+";
     text-align: center;
     width: 15px;
@@ -68,6 +68,6 @@
     visibility: visible
 }
 
-.ac.is-active>.ac-header .ac-trigger::after {
+.ac.is-active>.ac-header .ac-trigger.ac-trigger:not(.ac--has-own-icon)::after {
     content: "–"
 }

static/js/quicknavigation.js

Lines changed: 7 additions & 3 deletions
@@ -5,16 +5,20 @@ docReady(function () {
     var sections = quickLinks.map(function () {
         return $(this).attr("href");
     });
-    var sectionOffsets = sections.map(function (index, section) {
-        return $(section).offset().top - 300; // Adjust offset as needed
+    var sectionOffsets = sections.map(function (_, section) {
+        const sectionElement = $(section);
+        if (sectionElement.length === 0) {
+            return -1;
+        }
+        return sectionElement.offset().top - 250; // Adjust offset as needed
     });
     let highestActiveIndex = -1;
     var scrollPos = $(this).scrollTop();
     quickLinks.each(function () {
         $(this).removeClass("highlight-red-sm");
     });
     sectionOffsets.each(function (index) {
-        if (scrollPos >= this) {
+        if (this > 0 && scrollPos >= this) {
             highestActiveIndex = index;
         }
     });

tina/collections/shared/templates/container_open.ts

Lines changed: 18 additions & 0 deletions
@@ -7,6 +7,24 @@ let template: RichTextTemplate = {
     end: ">}}",
   },
   fields: [
+    {
+      type: "string",
+      name: "isAccordion",
+      label: "Is Accordion",
+      description:
+        "Is this a collapsible accordion?, if so, the icon below is ignored",
+      required: false,
+      options: [
+        {
+          label: "No",
+          value: "",
+        },
+        {
+          label: "Yes",
+          value: "true",
+        },
+      ],
+    },
     {
       type: "string",
       name: "icon",
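In the TinaCMS editor this field presumably appears as a Yes/No dropdown whose selected value is written into the shortcode call as the isAccordion attribute. A hypothetical sketch of the two resulting shortcode openings (title, icon and id elided):

```markdown
<!-- "Yes" → the container becomes a collapsible accordion -->
{{< container_open title="…" icon="…" id="…" isAccordion="true" >}}

<!-- "No" → empty string; Hugo's `if` treats "" as falsy, so the container renders as before -->
{{< container_open title="…" icon="…" id="…" isAccordion="" >}}
```

The second form matches the isAccordion="" added to BDT.md in this commit.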
Lines changed: 22 additions & 17 deletions
@@ -1,19 +1,24 @@
-import { RichTextTemplate } from "@tinacms/schema-tools/dist/types/index"
-let pdf_frame : RichTextTemplate = {
-    name: 'team',
-    label: 'Team',
-    match: {
-        start: '{{<',
-        end: '>}}'
+import { RichTextTemplate } from "@tinacms/schema-tools/dist/types/index";
+let pdf_frame: RichTextTemplate = {
+  name: "team",
+  label: "Team",
+  match: {
+    start: "{{<",
+    end: ">}}",
+  },
+  fields: [
+    {
+      name: "id",
+      label: "ID",
+      type: "string",
     },
-    fields: [
-        {
-            name: 'title',
-            label: 'DONT USE',
-            type: 'string',
-            description: 'Use top level template',
-            required: false,
-        }
-    ]
+    {
+      name: "title",
+      label: "DONT USE",
+      type: "string",
+      description: "Use top level template",
+      required: false,
+    },
+  ],
 };
-export default pdf_frame;
+export default pdf_frame;

tina/tina-lock.json

Lines changed: 1 addition & 1 deletion
Large diffs are not rendered by default.

0 commit comments
