Commit ac997c5

Update UBDT EN

1 parent c91eddd commit ac997c5

File tree: 2 files changed (+16 −12 lines)


content/english/technical-tools/BDT.md

Lines changed: 16 additions & 12 deletions
@@ -14,8 +14,8 @@ quick_navigation:
     url: '#source-code'
   - title: Scientific paper and audit report
     url: '#scientific-paper'
-  - title: Local-first architecture
-    url: '#local-first'
+  - title: Local-only architecture
+    url: '#local-only'
   - title: Supported by
     url: '#supported-by'
   - title: Awards and acknowledgements
@@ -102,7 +102,11 @@ type: bias-detection-tool
 The tool helps find groups where an AI system or algorithm performs differently, which could indicate unfair treatment. It does this using a technique called <a href="https://en.wikipedia.org/wiki/Cluster_analysis" target="_blank">clustering</a>, which groups similar data points together (in a cluster). The tool doesn’t need information like gender, nationality, or ethnicity to find these patterns. Instead, it uses a `bias score` to measure deviations in the performance of the system, which you can choose based on your data.
 
 #### What results does it give?
-The tool finds groups (clusters) in which the performance of the algorithmic system deviates significantly. It highlights the group with the worst `bias score` and creates a bias analysis report, which you can download as a PDF. You can also download all the identified groups (clusters) in a .json file. Additionally, the tool provides visual summaries of the results, helping experts dive deeper into the identified deviations.
+The tool finds groups (clusters) in which the performance of the algorithmic system deviates significantly. It highlights the group with the worst `bias score` and creates a bias analysis report, which you can download as a PDF. You can also download all the identified groups (clusters) in a .json file. Additionally, the tool provides visual summaries of the results, helping experts dive deeper into the identified deviations. Example below ⬇️.
+
+<div style="margin-bottom:50px; display: flex; justify-content: center;">
+<img src="/images/BDT/example_COMPAS.png" alt="drawing" width="600px"/>
+</div>
 
 #### What kind of data does it work with?
 The tool works with data in a table format, consisting solely of numbers or categories. You just need to pick one column in the data to use as the `bias score`. This column should have numbers only, and you’ll specify whether a higher or lower number is better. For example, if you’re looking at error rates, lower numbers are better. For accuracy, higher numbers are better. The tool also comes with a demo dataset you can use by clicking "Try it out."
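For illustration, here is a minimal sketch of the idea above: group rows of a table into clusters, then compare the average `bias score` per cluster. It uses pandas and scikit-learn's KMeans rather than the tool's own HBAC implementation, and the column names (including `error_rate` as the bias score, where lower is better) are hypothetical.

```python
# Minimal sketch of per-cluster bias-score analysis, assuming pandas and
# scikit-learn. Illustration only; not the tool's own HBAC implementation.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical tabular data: feature columns plus an 'error_rate' column
# chosen as the bias score (lower is better).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.integers(18, 80, 500),
    "income": rng.normal(35_000, 10_000, 500),
    "error_rate": rng.uniform(0.0, 0.4, 500),
})

# Cluster on the feature columns only; the bias score stays out of the
# distance computation.
features = df.drop(columns=["error_rate"])
X = StandardScaler().fit_transform(features)
df["cluster"] = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

# Average bias score per cluster; since lower error rates are better,
# the cluster with the highest mean error rate deviates most.
per_cluster = df.groupby("cluster")["error_rate"].mean()
worst = per_cluster.idxmax()
print(per_cluster)
print(f"Cluster {worst} has the worst bias score: {per_cluster[worst]:.3f}")
```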
@@ -127,7 +131,7 @@ The tool works with data in a table format, consisting solely of numbers or cate
 <br>
 
 #### Is my data safe?
-Yes! Your data stays on your computer and never leaves your organization’s environment. The tool runs directly in your browser, using your computer’s power to analyze the data. This setup, called 'local-first', ensures no data is sent to cloud providers or third parties. Instructions for hosting the tool securely within your organization are available on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">GitHub</a>.
+Yes! Your data stays on your computer and never leaves your organization’s environment. The tool runs directly in your browser, using your computer’s power to analyze the data. This setup, called 'local-only', ensures no data is sent to cloud providers or third parties. Instructions for hosting the tool securely within your organization are available on <a href="https://github.com/NGO-Algorithm-Audit/local-only-web-tool" target="_blank">GitHub</a>.
 
 Try the tool below ⬇️
 
@@ -174,7 +178,7 @@ The HBAC algorithm maximizes the difference in the bias score between clusters.
 
 <!-- Web app -->
 
-{{< iframe title="Web app – Unsupervised bias detection tool" icon="fas fa-cloud" id="web-app" src="https://local-first-bias-detection.s3.eu-central-1.amazonaws.com/bias-detection.html?lang=en" height="770px" >}}
+{{< iframe title="Web app – Unsupervised bias detection tool" icon="fas fa-cloud" id="web-app" src="https://local-only-bias-detection.s3.eu-central-1.amazonaws.com/bias-detection.html?lang=en" height="770px" >}}
 
 <!-- Promo bar -->
 
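As the hunk context above notes, the HBAC algorithm maximizes the difference in the bias score between clusters. Below is a simplified sketch of that splitting criterion; the binary k-means split, the `min_gap` threshold, and the minimum cluster size are assumptions for illustration, not the reference implementation.

```python
# Sketch of a hierarchical bias-aware split in the spirit of HBAC:
# split a cluster in two and keep the split only if it widens the
# difference in mean bias score between the resulting clusters.
# Simplified illustration; not the tool's reference implementation.
import numpy as np
from sklearn.cluster import KMeans

def split_if_biased(X, scores, min_gap=0.05, min_size=20):
    """Split rows into two clusters; return labels if the mean bias
    scores of the halves differ by more than min_gap, else None."""
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    a, b = scores[labels == 0], scores[labels == 1]
    if min(len(a), len(b)) < min_size:
        return None  # reject splits that create tiny clusters
    gap = abs(a.mean() - b.mean())
    return labels if gap > min_gap else None

# Hypothetical data: two feature columns and a bias score per row.
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 2))
scores = rng.uniform(0, 1, 400)
result = split_if_biased(X, scores)
print("split accepted" if result is not None else "split rejected")
```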
@@ -186,7 +190,7 @@ The HBAC algorithm maximizes the difference in the bias score between clusters.
 
 * The source code of the anomaly detection algorithm is available on <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection" target="_blank">GitHub</a> and as a <a href="https://pypi.org/project/unsupervised-bias-detection/" target="_blank">pip package</a>: `pip install unsupervised-bias-detection`.
 
-* The architecture to run web apps local-first is also available on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">GitHub</a>.
+* The architecture to run web apps local-only is also available on <a href="https://github.com/NGO-Algorithm-Audit/local-only-web-tool" target="_blank">GitHub</a>.
 
 {{< container_close >}}
 
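For the pip package mentioned above, a hypothetical usage sketch follows. The import path, class name, parameters, and attributes below are assumptions made for illustration and should be checked against the package README.

```python
# Hypothetical sketch of using the unsupervised-bias-detection package
# (pip install unsupervised-bias-detection). The import path, class name,
# parameters, and attributes are assumptions; consult the package README
# for the real API.
import numpy as np
from unsupervised_bias_detection.clustering import BiasAwareHierarchicalKMeans  # assumed import path

X = np.random.default_rng(2).normal(size=(300, 4))  # feature matrix
y = np.random.default_rng(3).uniform(0, 1, 300)     # bias score per row

hbac = BiasAwareHierarchicalKMeans(n_iter=10, min_cluster_size=10)  # assumed parameters
hbac.fit(X, y)
print(hbac.labels_)  # cluster assignment per row (assumed attribute)
```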
@@ -200,19 +204,19 @@ The unsupervised bias detection tool has been applied in practice to audit a Dut
 
 {{< container_close >}}
 
-<!-- Local-first architecture -->
+<!-- Local-only architecture -->
 
-{{< container_open title="Local-first architecture" icon="fas fa-map-pin" id="local-first" >}}
+{{< container_open title="Local-only architecture" icon="fas fa-map-pin" id="local-only" >}}
 
 <br>
 
-#### What is local-first?
-Local-first computing is the opposite of cloud computing: the data is not uploaded to third parties, such as cloud providers, but is processed by your own computer. The data attached to the tool therefore doesn’t leave your computer or the environment of your organization. The tool is privacy-friendly because the data can be processed within the mandate of your organization and doesn’t need to be shared with new parties. The unsupervised bias detection tool can also be hosted locally within your organization. Instructions, including the source code of the web app, can be found on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">GitHub</a>.
+#### What is local-only?
+Local-only computing is the opposite of cloud computing: the data is not uploaded to third parties, such as cloud providers, but is processed by your own computer. The data attached to the tool therefore doesn’t leave your computer or the environment of your organization. The tool is privacy-friendly because the data can be processed within the mandate of your organization and doesn’t need to be shared with new parties. The unsupervised bias detection tool can also be hosted locally within your organization. Instructions, including the source code of the web app, can be found on <a href="https://github.com/NGO-Algorithm-Audit/local-only-web-tool" target="_blank">GitHub</a>.
 
-#### Overview of local-first architecture
+#### Overview of local-only architecture
 
 <div style="margin-bottom:50px; display: flex; justify-content: center;">
-<img src="/images/BDT/local-first EN.png" alt="drawing" width="100%"/>
+<img src="/images/BDT/local-only EN.png" alt="drawing" width="100%"/>
 </div>
 
 {{< container_close >}}
Binary image file changed (38.7 KB).
