content/english/technical-tools/BDT.md (+21 −9)
@@ -23,8 +23,8 @@ quick_navigation:
     url: "#local-only"
   - title: Supported by
     url: "#supported-by"
-  - title: Awards and acknowledgements
-    url: "#awards-acknowledgements"
+  - title: Awards and publications
+    url: "#awards-publications"
   - title: Summary
     url: "#summary"
   - title: Team
@@ -64,7 +64,11 @@ team:
   - image: /images/people/MJorgensen.jpeg
     name: Mackenzie Jorgensen PhD
     bio: |
-      Researcher, Alan Turing Institute, London
+      Postdoctoral Research Fellow, Northumbria University
+  - image: /images/people/JParie.jpg
+    name: Jurriaan Parie
+    bio: |
+      Director, Algorithm Audit
 ---
 
 <!-- Promobar -->
@@ -204,7 +208,7 @@ The HBAC algorithm maximizes the difference in the bias variable between clusters. T
 
 The unsupervised bias detection tool has been applied in practice to audit a Dutch public sector risk profiling algorithm. Our [team](/technical-tools/bdt/#team) documented this case in a scientific paper. The tool identified proxies for students with a non-European migration background in the risk profiling algorithm. Specifically, the most deviating cluster contains an above-average share of students in vocational education and of students living far from their parent(s)' address, both of which turned out to correlate significantly with having a non-European migration background. Deviations in the control process could therefore also have been found if aggregation statistics on the origin of students had not been available. The results are also described in Appendix A of the report below, which was sent to <a href="https://www.tweedekamer.nl/kamerstukken/detail?id=2024D20431&did=2024D20431" target="_blank">Dutch parliament</a> on 22-05-2024.
 
-{{< embed_pdf url="/pdf-files/technical-tools/UBDT/20250205 Auditing a Dutch Public Sector Risk Profiling Algorithm.pdf" url2="/pdf-files/algoprudence/TA_AA202402/TA_AA202402_Addendum_Preventing_prejudice.pdf" width_mobile_pdf="12" width_desktop_pdf="6" >}}
+{{< embed_pdf url="/pdf-files/technical-tools/UBDT/20260215 Auditing a Dutch Public Sector Risk Profiling Algorithm.pdf" url2="/pdf-files/algoprudence/TA_AA202402/TA_AA202402_Addendum_Preventing_prejudice.pdf" width_mobile_pdf="12" width_desktop_pdf="6" >}}
 
 {{< container_close >}}
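The hunk above refers to the HBAC algorithm splitting the data into clusters and maximizing the difference in a bias variable (e.g. an error or selection rate) between them, then flagging the most deviating cluster. The following pure-Python sketch illustrates that idea under stated assumptions: the helper names `kmeans2` and `most_deviating_cluster` are hypothetical, and this is a simplified illustration of the clustering approach, not the API of the `unsupervised-bias-detection` package.

```python
# Illustrative HBAC-style sketch: repeatedly split the cluster with the
# highest mean bias, then report the most deviating cluster.
# Hypothetical helper names; NOT the unsupervised-bias-detection API.
import random


def kmeans2(points, iters=20, seed=0):
    """Plain 2-means on a list of numeric feature vectors; returns 0/1 labels."""
    rng = random.Random(seed)
    centers = [list(c) for c in rng.sample(points, 2)]
    labels = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            d0 = sum((a - b) ** 2 for a, b in zip(p, centers[0]))
            d1 = sum((a - b) ** 2 for a, b in zip(p, centers[1]))
            labels[i] = 0 if d0 <= d1 else 1
        for c in (0, 1):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:  # keep the old center if the cluster emptied out
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels


def most_deviating_cluster(points, errors, depth=3):
    """Recursively split the data; return (as index list) the cluster whose
    mean of the bias variable (here: a per-record error flag) is highest."""
    clusters = [list(range(len(points)))]
    for _ in range(depth):
        # Pick the cluster with the highest mean error as the split target.
        clusters.sort(key=lambda idx: -sum(errors[i] for i in idx) / len(idx))
        target = clusters.pop(0)
        if len(target) < 4:  # too small to split further
            clusters.append(target)
            break
        labels = kmeans2([points[i] for i in target])
        left = [i for i, lab in zip(target, labels) if lab == 0]
        right = [i for i, lab in zip(target, labels) if lab == 1]
        if left and right:
            clusters += [left, right]
        else:  # degenerate split; stop refining
            clusters.append(target)
            break
    return max(clusters, key=lambda idx: sum(errors[i] for i in idx) / len(idx))
```

On synthetic data with two blobs, one of which carries all the errors, the flagged cluster has an above-average error rate; in the audit described above, the analogous step is inspecting which features (education level, distance to parents' address) characterize that cluster.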
@@ -260,19 +264,19 @@ In 2024, the SIDN Fund <a href="https://www.sidnfonds.nl/projecten/open-source-a
 
 {{< container_close >}}
 
-<!-- Awards and acknowledgements-->
+<!-- Awards and publications-->
 
-{{< container_open title="Awards and acknowledgements" icon="fas fa-medal" id="awards-acknowledgements" >}}
+{{< container_open title="Awards and publications" icon="fas fa-medal" id="awards-publications" >}}
 
-This tool has received awards and is acknowledged by various <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection?tab=readme-ov-file#contributing-members" target="_blank">stakeholders</a>, including civil society organisations, industry representatives and academics.
+This tool has received awards and is acknowledged by various <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection?tab=readme-ov-file#contributing-members" target="_blank">stakeholders</a>, including civil society organisations, industry representatives and academic outlets.
 Under the name Joint Fairness Assessment Method (JFAM) the unsupervised bias detection tool has been selected as a finalist in <a href="https://hai.stanford.edu/ai-audit-challenge-2023-finalists" target="_blank">Stanford’s AI Audit Competition 2023</a>.
+The [scientific paper](/#scientific-paper) on the tool was presented at the International Association for Safe and Ethical Artificial Intelligence conference (<a href="https://www.iaseai.org" target="_blank">IASEAI’26</a>).
 
 {{< accordion_item_close >}}
 
@@ -284,6 +288,14 @@ The unsupervised bias detection tool is part of OECD's <a href="https://oecd.ai/
 Under the name Joint Fairness Assessment Method (JFAM) the unsupervised bias detection tool has been selected as a finalist in <a href="https://hai.stanford.edu/ai-audit-challenge-2023-finalists" target="_blank">Stanford’s AI Audit Competition 2023</a>.
content/nederlands/technical-tools/BDT.md (+22 −10)
@@ -22,8 +22,8 @@ quick_navigation:
     url: "#local-only"
   - title: Ondersteund door
     url: "#supported-by"
-  - title: Prijzen en ondersteuning
-    url: "#awards-acknowledgements"
+  - title: Prijzen en publicaties
+    url: "#awards-publications"
   - title: Samenvatting
     url: "#summary"
   - title: Team
@@ -64,7 +64,11 @@ team:
   - image: /images/people/MJorgensen.jpeg
     name: Mackenzie Jorgensen PhD
     bio: |
-      Onderzoeker, Alan Turing Institute, Londen
+      Postdoctorale onderzoeker, Northumbria University
+  - image: /images/people/JParie.jpg
+    name: Jurriaan Parie
+    bio: |
+      Directeur, Algorithm Audit
 ---
 
 <!-- Promobar -->
@@ -206,7 +210,7 @@ Het HBAC-algoritme maximaliseert het verschil in bias variabele tussen clusters.
 
 De unsupervised bias detectie tool is in de praktijk toegepast om een risicoprofileringsalgoritme van de Dienst Uitvoering Onderwijs (DUO) te auditen. Ons [team](/technical-tools/bdt/#team) heeft deze casus uitgewerkt in een wetenschappelijke paper. De tool identificeerde proxies voor studenten met een niet-Europese migratieachtergrond in het risicoprofileringsalgoritme, specifiek opleidingsniveau en de afstand tussen het adres van de student en dat van hun ouder(s). Afwijkingen in het controleproces hadden dus ook gevonden kunnen worden als CBS-data over de herkomst van studenten niet beschikbaar was geweest. De resultaten worden ook beschreven in Appendix A van het onderstaande rapport. Dit rapport is op 22-05-2024 naar de <a href="https://www.tweedekamer.nl/kamerstukken/detail?id=2024D20431&did=2024D20431" target="_blank">Tweede Kamer</a> gestuurd.
 
-{{< embed_pdf url="/pdf-files/technical-tools/UBDT/20250205 Auditing a Dutch Public Sector Risk Profiling Algorithm.pdf" url2="/pdf-files/algoprudence/TA_AA202402/TA_AA202402_Addendum_Vooringenomenheid_voorkomen.pdf" width_mobile_pdf="12" width_desktop_pdf="6" >}}
+{{< embed_pdf url="/pdf-files/technical-tools/UBDT/20260215 Auditing a Dutch Public Sector Risk Profiling Algorithm.pdf" url2="/pdf-files/algoprudence/TA_AA202402/TA_AA202402_Addendum_Vooringenomenheid_voorkomen.pdf" width_mobile_pdf="12" width_desktop_pdf="6" >}}
 
 {{< container_close >}}
@@ -262,19 +266,19 @@ In 2024 ondersteunde het SIDN Fonds <a href="https://www.sidnfonds.nl/projecten/
 
 {{< container_close >}}
 
-<!-- Prijzen en ondersteuning-->
+<!-- Prijzen en publicaties-->
 
-{{< container_open title="Prijzen en ondersteuning" icon="fas fa-medal" id="awards-acknowledgements">}}
+{{< container_open title="Prijzen en publicaties" icon="fas fa-medal" id="awards-publications">}}
 
-De tool heeft prijzen ontvangen en wordt ondersteund door verschillende <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection?tab=readme-ov-file#contributing-members" target="_blank">belanghebbenden</a>, waaronder maatschappelijke organisaties, vertegenwoordigers uit de industrie en academici.
+De tool heeft prijzen ontvangen en is erkend door verschillende <a href="https://github.com/NGO-Algorithm-Audit/unsupervised-bias-detection?tab=readme-ov-file#contributing-members" target="_blank">belanghebbenden</a>, waaronder maatschappelijke organisaties, vertegenwoordigers uit de industrie en academische outlets.
 Onder de naam Joint Fairness Assessment Method (JFAM) is de unsupervised bias detectie tool geselecteerd als finalist voor <a href="https://hai.stanford.edu/ai-audit-challenge-2023-finalists" target="_blank">Stanford’s AI Audit Competition 2023</a>.
+De [wetenschappelijke paper](/#scientific-paper) over de tool werd gepresenteerd tijdens de conferentie van de International Association for Safe and Ethical Artificial Intelligence (<a href="https://www.iaseai.org" target="_blank">IASEAI’26</a>).
 
 {{< accordion_item_close >}}
 
@@ -286,6 +290,14 @@ De unsupervised bias detectie tool maakt deel uit van de <a href="https://oecd.a
 Onder de naam Joint Fairness Assessment Method (JFAM) is de unsupervised bias detectie tool geselecteerd als finalist voor <a href="https://hai.stanford.edu/ai-audit-challenge-2023-finalists" target="_blank">Stanford’s AI Audit Competition 2023</a>.