Commit e712107 (1 parent: 1fe8a6a)

fix issue where accordion containers could not be closed.
3 files changed: +35, -25 lines

content/english/technical-tools/BDT.md (23 additions, 23 deletions)
```diff
@@ -10,25 +10,25 @@ quick_navigation:
   title: Content overview
   links:
     - title: Introduction
-      url: "#info"
+      url: '#info'
     - title: Web app
-      url: "#web-app"
+      url: '#web-app'
     - title: Source code
-      url: "#source-code"
+      url: '#source-code'
     - title: Anomaly detection algorithm
-      url: "#HBAC"
+      url: '#HBAC'
     - title: Scientific paper and audit report
-      url: "#scientific-paper"
+      url: '#scientific-paper'
     - title: Local-first computing
-      url: "#local-first"
+      url: '#local-first'
     - title: Supported by
-      url: "#supported-by"
+      url: '#supported-by'
     - title: Awards and acknowledgements
-      url: "#awards-acknowledgements"
+      url: '#awards-acknowledgements'
     - title: Summary
-      url: "#summary"
+      url: '#summary'
     - title: Team
-      url: "#team"
+      url: '#team'
   reports_preview:
     title: Example output bias detection tool
     icon: fas fa-file
```
```diff
@@ -81,13 +81,13 @@ type: bias-detection-tool
 
 <!-- Introduction -->
 
-{{< container_open title="Introduction – Unsupervised bias detection tool" icon="fas fa-search" id="info" isAccordion={true} >}}
+{{< container_open title="Introduction – Unsupervised bias detection tool" icon="fas fa-search " id="info" isAccordion="" >}}
 
 <br>
 
 #### What is the tool about?
 
-The tool identifies groups where an algorithm or AI system shows variations in performance. This type of monitoring is referred to as _anomaly detection_. To identify anomalous patterns, the tool uses <a href="https://en.wikipedia.org/wiki/Cluster_analysis" target="_blank">clustering</a>. Clustering is a form of _unsupervised learning_. This means detecting disparate treatment (bias) does not require any data on protected attributes of users, such as gender, nationality, or ethnicity. The metric used to measure bias can be manually selected and is referred to as the `bias metric`.
+The tool identifies groups where an algorithm or AI system shows variations in performance. This type of monitoring is referred to as *anomaly detection*. To identify anomalous patterns, the tool uses <a href="https://en.wikipedia.org/wiki/Cluster_analysis" target="_blank">clustering</a>. Clustering is a form of *unsupervised learning*. This means detecting disparate treatment (bias) does not require any data on protected attributes of users, such as gender, nationality, or ethnicity. The metric used to measure bias can be manually selected and is referred to as the `bias metric`.
 
 #### What data can be processed?
 
```

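The paragraph in this hunk explains the tool's core mechanic: cluster the data, then score each cluster with a user-selected bias metric. As a minimal TypeScript sketch of that idea (illustrative only, not the tool's actual code; the function name and the concrete error-rate-difference metric are assumptions):

```ts
// Score each cluster with a simple bias metric: the cluster's
// misclassification rate minus the misclassification rate of all
// other records. Other metrics (e.g. a false-positive rate) would
// plug in the same way.
function clusterBias(clusters: number[], misclassified: boolean[]): Map<number, number> {
  const scores = new Map<number, number>();
  for (const id of new Set(clusters)) {
    let inErr = 0, inN = 0, outErr = 0, outN = 0;
    clusters.forEach((c, i) => {
      if (c === id) { inN++; if (misclassified[i]) inErr++; }
      else { outN++; if (misclassified[i]) outErr++; }
    });
    scores.set(id, inErr / inN - (outN > 0 ? outErr / outN : 0));
  }
  return scores;
}

// A clearly positive score marks a cluster that is misclassified more
// often than the rest of the data: a candidate for qualitative review.
const scores = clusterBias([0, 0, 1, 1, 1], [true, false, true, true, false]);
console.log(scores); // Map { 0 => -0.167, 1 => 0.167 } (approximately)
```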
```diff
@@ -126,7 +126,7 @@ The tool identifies deviating clusters. A summary of the results is made availab
 
 #### How is my data processed?
 
-The tool is privacy-friendly because the data is processed entirely within the browser. The data does not leave your computer or the environment of your organization. The tool utilizes the computing power of your own computer to analyze the data. This type of browser-based software is referred to as _local-first_. The tool does not upload data to third parties, such as cloud providers. Instructions on how the tool and local-first architecture can be hosted locally within your own organization can be found on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">GitHub</a>.
+The tool is privacy-friendly because the data is processed entirely within the browser. The data does not leave your computer or the environment of your organization. The tool utilizes the computing power of your own computer to analyze the data. This type of browser-based software is referred to as *local-first*. The tool does not upload data to third parties, such as cloud providers. Instructions on how the tool and local-first architecture can be hosted locally within your own organization can be found on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">GitHub</a>.
 
 Try the tool below ⬇️
 
```

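The local-first claim in this hunk is an architectural statement: the analysis runs on the user's own CPU and the raw data never crosses the network. A sketch of that pattern, assuming a standard Web Worker setup (file names and message shapes are illustrative, not the repository's actual code):

```ts
// main.ts: hand the user's CSV to a worker on this machine; no fetch()
// or upload is ever issued, so the raw data stays in the browser.
const worker = new Worker(new URL("./analysis.worker.js", import.meta.url), { type: "module" });

function analyzeLocally(file: File): Promise<unknown> {
  return new Promise((resolve) => {
    worker.onmessage = (e: MessageEvent) => resolve(e.data); // results only
    file.text().then((csv) => worker.postMessage(csv));      // stays in-process
  });
}

// analysis.worker.js: parse and analyze off the main thread, still locally.
// self.onmessage = (e) => {
//   const rows = e.data.split("\n").map((line) => line.split(","));
//   self.postMessage({ rowCount: rows.length }); // summary, never raw data
// };
```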
```diff
@@ -144,17 +144,17 @@ Try the tool below ⬇️
 
 {{< container_open title="Source code" id="source-code" icon="fas fa-toolbox" >}}
 
-- The source code of the anomaly detection algorithm is available on <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection" target="_blank">GitHub</a> and as a <a href="https://pypi.org/project/unsupervised-bias-detection/" target="_blank">pip package</a>: `pip install unsupervised-bias-detection`.
-[![!pypi](https://img.shields.io/pypi/v/unsupervised-bias-detection?logo=pypi&color=blue)](https://pypi.org/project/unsupervised-bias-detection/)
-- The architecture to run web apps local-first is also available on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">GitHub</a>.
+* The source code of the anomaly detection algorithm is available on <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection" target="_blank">GitHub</a> and as a <a href="https://pypi.org/project/unsupervised-bias-detection/" target="_blank">pip package</a>: `pip install unsupervised-bias-detection`.
+[![!pypi](https://img.shields.io/pypi/v/unsupervised-bias-detection?logo=pypi\&color=blue)](https://pypi.org/project/unsupervised-bias-detection/)
+* The architecture to run web apps local-first is also available on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">GitHub</a>.
 
 {{< container_close >}}
 
 <!-- Anomaly detection algorithm -->
 
 {{< container_open title="Anomaly detection algorithm – Hierarchical Bias-Aware Clustering (HBAC)" icon="fas fa-code-branch" id="HBAC" >}}
 
-The tool uses the _Hierarchical Bias-Aware Clustering_ (HBAC) algorithm. HBAC processes input data according to the k-means (for numerical data) or k-modes (for categorical data) clustering algorithm. The HBAC algorithm was introduced by Misztal-Radecka and Indurkhya in a [scientific article](https://www.sciencedirect.com/science/article/abs/pii/S0306457321000285) published in *Information Processing and Management* (2021). Our implementation of the HBAC algorithm, including additional methodological checks to distinguish real bias from noise, such as sample splitting, statistical hypothesis testing and measuring cluster stability, can be found in the <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection/blob/master/README.md" target="_blank">unsupervised-bias-detection</a> pip package.
+The tool uses the *Hierarchical Bias-Aware Clustering* (HBAC) algorithm. HBAC processes input data according to the k-means (for numerical data) or k-modes (for categorical data) clustering algorithm. The HBAC algorithm was introduced by Misztal-Radecka and Indurkhya in a [scientific article](https://www.sciencedirect.com/science/article/abs/pii/S0306457321000285) published in *Information Processing and Management* (2021). Our implementation of the HBAC algorithm, including additional methodological checks to distinguish real bias from noise, such as sample splitting, statistical hypothesis testing and measuring cluster stability, can be found in the <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection/blob/master/README.md" target="_blank">unsupervised-bias-detection</a> pip package.
 
 {{< container_close >}}
 
```

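Since this hunk's paragraph summarizes how HBAC works, a compact sketch may help: start from one cluster, repeatedly split the highest-bias cluster in two with k-means (k=2), and stop at a size or count limit. Everything below is an illustration of the published idea, not the pip package's API; the sample splitting, hypothesis testing and stability checks the text mentions are deliberately omitted.

```ts
type Point = number[];

const dist = (a: Point, b: Point): number =>
  Math.hypot(...a.map((v, i) => v - b[i]));

const mean = (pts: Point[]): Point =>
  pts[0].map((_, d) => pts.reduce((s, p) => s + p[d], 0) / pts.length);

// Plain 2-means: split one set of points into two candidate sub-clusters.
function twoMeans(pts: Point[], iters = 25): number[] {
  const centers: Point[] = [pts[0], pts[pts.length - 1]];
  let labels: number[] = [];
  for (let it = 0; it < iters; it++) {
    labels = pts.map((p) => (dist(p, centers[0]) <= dist(p, centers[1]) ? 0 : 1));
    for (const k of [0, 1]) {
      const members = pts.filter((_, i) => labels[i] === k);
      if (members.length > 0) centers[k] = mean(members);
    }
  }
  return labels;
}

// Bias metric: error rate inside a cluster minus error rate outside it.
function bias(idx: number[], errors: boolean[]): number {
  const inSet = new Set(idx);
  const inErr = idx.filter((i) => errors[i]).length / idx.length;
  const out = errors.filter((_, i) => !inSet.has(i));
  const outErr = out.filter(Boolean).length / Math.max(out.length, 1);
  return inErr - outErr;
}

// HBAC loop: repeatedly split the most biased cluster that is still
// large enough to split, until the requested cluster count is reached.
function hbac(X: Point[], errors: boolean[], maxClusters = 4, minSize = 2): number[][] {
  let clusters: number[][] = [X.map((_, i) => i)];
  while (clusters.length < maxClusters) {
    clusters.sort((a, b) => bias(b, errors) - bias(a, errors));
    const target = clusters.find((c) => c.length >= 2 * minSize);
    if (!target) break;
    const labels = twoMeans(target.map((i) => X[i]));
    const left = target.filter((_, j) => labels[j] === 0);
    const right = target.filter((_, j) => labels[j] === 1);
    if (left.length < minSize || right.length < minSize) break;
    clusters = clusters.filter((c) => c !== target).concat([left, right]);
  }
  return clusters; // each inner array holds the row indices of one cluster
}
```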
```diff
@@ -250,12 +250,12 @@ The unsupervised bias detection tool is part of OECD's <a href="https://oecd.ai/
 
 Key take-aways about the unsupervised bias detection tool:
 
-- <span style="color:#005AA7">Quantitative-qualitative research method</span>: Data-driven bias testing combined with the balanced and context-sensitive judgment of human experts;
-- <span style="color:#005AA7">Unsupervised bias detection</span>: No user data needed on protected attributes (_unsupervised learning_);
-- <span style="color:#005AA7">Anomaly detection</span>: Scalable method based on statistical analysis;
-- <span style="color:#005AA7">Detects complex bias</span>: Identifies unfairly treated groups characterized by a mixture of features, detects intersectional bias;
-- <span style="color:#005AA7">Model-agnostic</span>: Works for all binary classification algorithms and AI systems;
-- <span style="color:#005AA7">Open-source and not-for-profit</span>: User-friendly and free to use for the entire AI auditing community.
+* <span style="color:#005AA7">Quantitative-qualitative research method</span>: Data-driven bias testing combined with the balanced and context-sensitive judgment of human experts;
+* <span style="color:#005AA7">Unsupervised bias detection</span>: No user data needed on protected attributes (*unsupervised learning*);
+* <span style="color:#005AA7">Anomaly detection</span>: Scalable method based on statistical analysis;
+* <span style="color:#005AA7">Detects complex bias</span>: Identifies unfairly treated groups characterized by a mixture of features, detects intersectional bias;
+* <span style="color:#005AA7">Model-agnostic</span>: Works for all binary classification algorithms and AI systems;
+* <span style="color:#005AA7">Open-source and not-for-profit</span>: User-friendly and free to use for the entire AI auditing community.
 
 {{< container_close >}}
 
```

tina/collections/shared/templates/container_open.ts (11 additions, 1 deletion)
```diff
@@ -8,12 +8,22 @@ let template: RichTextTemplate = {
   },
   fields: [
     {
-      type: "boolean",
+      type: "string",
       name: "isAccordion",
       label: "Is Accordion",
       description:
         "Is this a collapsible accordion?, if so, the icon below is ignored",
       required: false,
+      options: [
+        {
+          label: "No",
+          value: "",
+        },
+        {
+          label: "Yes",
+          value: "true",
+        },
+      ],
     },
     {
       type: "string",
```

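The field-type change above is the substance of the fix. A plausible reading (an assumption, not verified against Tina's source): Tina serializes a boolean rich-text prop as `isAccordion={true}`, a JSX-style token that Hugo's shortcode parser does not read as an ordinary named parameter, so the accordion's open/closed state could not be controlled; a string option serializes as a plain quoted parameter the shortcode can test. Sketching the two serializations:

```ts
// Hypothetical illustration of the two serialization behaviors; not
// Tina's actual implementation.
function serializeProp(name: string, value: boolean | string): string {
  return typeof value === "boolean"
    ? `${name}={${value}}`    // JSX-style token, opaque to Hugo shortcodes
    : `${name}="${value}"`;   // plain quoted parameter, Hugo-friendly
}

console.log(serializeProp("isAccordion", true));   // isAccordion={true}
console.log(serializeProp("isAccordion", ""));     // isAccordion="" (No)
console.log(serializeProp("isAccordion", "true")); // isAccordion="true" (Yes)
```

This matches the content change in BDT.md above, where `isAccordion={true}` became `isAccordion=""`.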
tina/tina-lock.json (1 addition, 1 deletion)

Large diffs are not rendered by default.
