content/english/technical-tools/BDT.md
16 additions & 12 deletions
@@ -14,8 +14,8 @@ quick_navigation:
     url: '#source-code'
   - title: Scientific paper and audit report
     url: '#scientific-paper'
-  - title: Local-first architecture
-    url: '#local-first'
+  - title: Local-only architecture
+    url: '#local-only'
   - title: Supported by
     url: '#supported-by'
   - title: Awards and acknowledgements
@@ -102,7 +102,11 @@ type: bias-detection-tool
 The tool helps find groups where an AI system or algorithm performs differently, which could indicate unfair treatment. It does this using a technique called <a href="https://en.wikipedia.org/wiki/Cluster_analysis" target="_blank">clustering</a>, which groups similar data points together (in a cluster). The tool doesn’t need information like gender, nationality, or ethnicity to find these patterns. Instead, it uses a `bias score` to measure deviations in the performance of the system, which you can choose based on your data.

 #### What results does it give?
-The tool finds groups (clusters) where the performance of the algorithmic system deviates significantly. It highlights the group with the worst `bias score` and creates a bias analysis report, which you can download as a PDF. You can also download all the identified groups (clusters) as a .json file. Additionally, the tool provides visual summaries of the results, helping experts dive deeper into the identified deviations.
+The tool finds groups (clusters) where the performance of the algorithmic system deviates significantly. It highlights the group with the worst `bias score` and creates a bias analysis report, which you can download as a PDF. You can also download all the identified groups (clusters) as a .json file. Additionally, the tool provides visual summaries of the results, helping experts dive deeper into the identified deviations. Example below ⬇️.
 The tool works with data in a table format, consisting solely of numbers or categories. You just need to pick one column in the data to use as the `bias score`. This column should have numbers only, and you’ll specify whether a higher or lower number is better. For example, if you’re looking at error rates, lower numbers are better. For accuracy, higher numbers are better. The tool also comes with a demo dataset you can use by clicking "Try it out."
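
To make the clustering-plus-`bias score` workflow above concrete, here is a minimal sketch in Python. It is not the tool's own code: the demo data, the column names, and the use of plain k-means (rather than the tool's hierarchical clustering) are all illustrative assumptions.

```python
# Minimal sketch of clustering-based bias detection (illustrative only;
# the data, column names, and plain k-means are assumptions, not the tool's code).
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "feature_a": rng.normal(size=500),
    "feature_b": rng.normal(size=500),
    # The chosen `bias score` column: numeric, and here lower is better.
    "error_rate": rng.uniform(0.0, 0.3, size=500),
})

# Cluster on the features only; the bias score column is held out.
X = StandardScaler().fit_transform(df[["feature_a", "feature_b"]])
df["cluster"] = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

# Compare each cluster's mean bias score against the overall mean.
overall = df["error_rate"].mean()
per_cluster = df.groupby("cluster")["error_rate"].mean()

# Lower error rate is better, so the highest mean marks the worst cluster.
worst = per_cluster.idxmax()
print(f"Worst cluster: {worst} "
      f"(mean error {per_cluster[worst]:.3f} vs overall {overall:.3f})")
```

The real tool runs this kind of logic directly in your browser, but the basic recipe (cluster on the features, then rank clusters by the chosen bias score) is the same.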
@@ -127,7 +131,7 @@ The tool works with data in a table format, consisting solely of numbers or cate
 <br>

 #### Is my data safe?
-Yes! Your data stays on your computer and never leaves your organization’s environment. The tool runs directly in your browser, using your computer’s power to analyze the data. This setup, called 'local-first', ensures no data is sent to cloud providers or third parties. Instructions for hosting the tool securely within your organization are available on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">Github</a>.
+Yes! Your data stays on your computer and never leaves your organization’s environment. The tool runs directly in your browser, using your computer’s power to analyze the data. This setup, called 'local-only', ensures no data is sent to cloud providers or third parties. Instructions for hosting the tool securely within your organization are available on <a href="https://github.com/NGO-Algorithm-Audit/local-only-web-tool" target="_blank">Github</a>.

 Try the tool below ⬇️
@@ -174,7 +178,7 @@ The HBAC algorithm maximizes the difference in the bias score between clusters.

@@ -186,7 +190,7 @@ The HBAC algorithm maximizes the difference in the bias score between clusters.

 * The source code of the anomaly detection algorithm is available on <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection" target="_blank">Github</a> and as a <a href="https://pypi.org/project/unsupervised-bias-detection/" target="_blank">pip package</a>: `pip install unsupervised-bias-detection`.

-* The architecture to run web apps local-first is also available on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">Github</a>.
+* The architecture to run web apps local-only is also available on <a href="https://github.com/NGO-Algorithm-Audit/local-only-web-tool" target="_blank">Github</a>.

 {{< container_close >}}
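
The hunk headers above quote the key line: the HBAC algorithm maximizes the difference in the bias score between clusters. As a rough illustration of that idea, the sketch below repeatedly bisects the cluster with the worst mean bias score; the function name, parameters, and stopping rule are assumptions made for illustration, not the pip package's actual API.

```python
# HBAC-style sketch: repeatedly bisect the cluster whose mean bias score is
# worst, aiming to grow the spread in bias score between clusters.
# Illustrative only; names and stopping rules are assumptions, not the
# unsupervised-bias-detection package's real interface.
import numpy as np
from sklearn.cluster import KMeans

def hbac_sketch(X, bias_score, max_splits=5, min_cluster_size=20, seed=0):
    """Return integer cluster labels; bias_score is per-row, higher = worse."""
    labels = np.zeros(len(X), dtype=int)
    for _ in range(max_splits):
        # Pick the cluster whose mean bias score is currently the worst.
        means = {c: bias_score[labels == c].mean() for c in np.unique(labels)}
        target = max(means, key=means.get)
        idx = np.where(labels == target)[0]
        halves = KMeans(n_clusters=2, n_init=10, random_state=seed).fit_predict(X[idx])
        if min((halves == 0).sum(), (halves == 1).sum()) < min_cluster_size:
            break  # stop rather than create a cluster that is too small
        # Keep the split: one half becomes a new cluster.
        labels[idx[halves == 1]] = labels.max() + 1
    return labels

# Example usage on random data: four features plus a synthetic bias score.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4))
score = rng.uniform(size=300)
print(np.bincount(hbac_sketch(X, score)))
```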
@@ -200,19 +204,19 @@ The unsupervised bias detection tool has been applied in practice to audit a Dut
-#### What is local-first?
-Local-first computing is the opposite of cloud computing: the data is not uploaded to third parties, such as cloud providers, but is processed by your own computer. The data attached to the tool therefore doesn't leave your computer or the environment of your organization. The tool is privacy-friendly because the data can be processed within the mandate of your organization and doesn't need to be shared with new parties. The unsupervised bias detection tool can also be hosted locally within your organization. Instructions, including the source code of the web app, can be found on <a href="https://github.com/NGO-Algorithm-Audit/local-first-web-tool" target="_blank">Github</a>.
+#### What is local-only?
+Local-only computing is the opposite of cloud computing: the data is not uploaded to third parties, such as cloud providers, but is processed by your own computer. The data attached to the tool therefore doesn't leave your computer or the environment of your organization. The tool is privacy-friendly because the data can be processed within the mandate of your organization and doesn't need to be shared with new parties. The unsupervised bias detection tool can also be hosted locally within your organization. Instructions, including the source code of the web app, can be found on <a href="https://github.com/NGO-Algorithm-Audit/local-only-web-tool" target="_blank">Github</a>.
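
Because the tool is local-only and runs entirely client-side, hosting it within your organization amounts to serving its static build from your own infrastructure. Below is a minimal sketch using only Python's standard library; the `dist` directory name is a placeholder, and the repository's own instructions take precedence.

```python
# Minimal sketch: serve the tool's pre-built static bundle inside your own
# network using only the standard library. "dist" is a placeholder for the
# build output directory; see the Github repository for real instructions.
import functools
from http.server import HTTPServer, SimpleHTTPRequestHandler

handler = functools.partial(SimpleHTTPRequestHandler, directory="dist")
server = HTTPServer(("127.0.0.1", 8080), handler)
print("Serving on http://127.0.0.1:8080 (local only: data never leaves this machine)")
server.serve_forever()
```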