 ---
+icon: fa-poll-h
+layout: case
 title: Risk Profiling for Social Welfare Reexamination
 subtitle: |
   Problem statement (AA:2023:02:P) and advice document (AA:2023:02:A)
 image: /images/algoprudence/AA202302/AA202302A_cover_EN.png
-layout: case
-icon: fa-poll-h
-key_takeaways:
-  - title: Algorithmic profiling is possible under strict conditions
-    content: >
-      The use of algorithmic profiling to re-examine whether social welfare
-      benefits have been duly granted is acceptable if applied responsibly.
-  - title: Profiling must not equate to suspicion
-    content: |
-      Re-examination needs to be based more on service and less on distrust.
-  - title: Diversity in selection methods
-    content: >
-      To avoid tunnel vision and negative feedback loops, algorithmic profiling
-      ought to be combined with expert-driven profiling and random sampling.
-  - title: Well-considered use of profiling criteria
-    content: >
-      Taking care to avoid (proxy) discrimination and other undesirable forms of
-      differentiation, the normative advice commission assessed variables
-      individually on their eligibility for profiling (see Infographic).
-  - title: Explainability requirements for machine learning
-    content: >
-      It is necessary that the sampling of residents can be explained throughout
-      the entire decision-making process. Training methods for variable selection,
-      such as the xgboost algorithm discussed in this case study, are considered
-      too complex to meet explainability requirements.
-summary: >
-  The commission judges that algorithmic risk profiling can be used under strict
-  conditions for sampling residents receiving social welfare for re-examination.
-  The aim of re-examination is a leading factor in judging profiling criteria.
-  If re-examination is based less on distrust and adopts a more service-oriented
-  approach, the advice commission judges a broader use of profiling variables
-  permissible, enabling more precise targeting of individuals in need of
-  assistance. For various variables used by the Municipality of Rotterdam during
-  the period 2017-2021, the commission gives a reasoned judgement on why these
-  variables are or are not eligible as a profiling selection criterion (see
-  Infographic). A combined use of several sampling
-  methods (including expert-driven profiling and random sampling) is recommended
-  to avoid tunnel vision and negative feedback loops. The commission advises
-  stricter conditions for the selection of variables for use by algorithms than
-  for selection by domain experts. The commission states that algorithms used to
-  sample citizens for re-examination must be explainable. Complex training
-  methods, such as the xgboost model used by the Municipality of Rotterdam, do
-  not meet this explainability criterion. This advice is directed towards all
-  Dutch and European municipalities that use or consider using profiling methods
-  in the context of social services.
-sources: >
-  Unsolicited research, built upon [freedom of information
-  requests](https://www.vpro.nl/argos/media/luister/argos-radio/onderwerpen/2021/In-het-vizier-van-het-algoritme-.html)
-  submitted by investigative journalists.
-additional_content:
-  - title: Presentation
-    content: "The advice report (AA:2023:02:A) was presented to the Dutch Minister of Digitalization on November 29, 2023. A press release can be found [here](https://algorithmaudit.eu/pressroom).\n"
-    image: /images/algoprudence/AA202302/Algorithm audit presentatie BZK FB-18.jpg
-    width: 8
-algoprudence:
-  title: Algoprudence
-  intro: "Download the full advice report (AA:2023:02:A) [here](https://drive.google.com/file/d/1zRRUYRfaIzdKFA2hQtW9yeM4jrD-Abef/view?usp=sharing) and problem statement (AA:2023:02:P) [here](https://drive.google.com/file/d/11sQMVJQd3ZJlW0R6HjU01b4N4CmuFw2q/view?usp=sharing).\n"
-  reports:
-    - url: >-
-        https://drive.google.com/file/d/1zRRUYRfaIzdKFA2hQtW9yeM4jrD-Abef/preview
-    - url: >-
-        https://drive.google.com/file/d/11sQMVJQd3ZJlW0R6HjU01b4N4CmuFw2q/preview
-normative_advice_members:
-  - name: |
-      Abderrahman El Aazani, Researcher at the Ombudsman Rotterdam-Rijnmond
-  - name: >
-      Francien Dechesne, Associate Professor Law and Digital Technologies,
-      Leiden University
-  - name: >
-      Maarten van Asten, Alderman Finance, Digitalization, Sports and Events
-      Municipality of Tilburg
-  - name: |
-      Munish Ramlal, Ombudsman Metropole region Amsterdam
-  - name: >
-      Oskar Gstrein, Assistant Professor Governance and Innovation, University
-      of Groningen
-funded_by:
-  - url: 'https://www.sidnfonds.nl/projecten/ethical-risk-assessment-tool'
-    image: /images/supported_by/sidn.png
-  - url: 'https://europeanaifund.org/'
-    image: /images/supported_by/EUAISFund.png
-  - url: >-
-      https://www.rijksoverheid.nl/ministeries/ministerie-van-binnenlandse-zaken-en-koninkrijksrelaties
-    image: /images/supported_by/BZK.jpg
-actions:
-  - title: >-
-      Questions raised in the city council of Amsterdam as a result of the
-      advice report
-    description: >
-      Council members submitted
-      [questions](https://amsterdam.raadsinformatie.nl/document/13573898/1/236+sv+Aslami%2C+IJmker+en+Garmy+inzake+toegepaste+profileringscriteria+gemeentelijke+algoritmes)
-      asking whether the machine learning (ML)-driven risk profiling algorithm
-      currently tested by the City of Amsterdam satisfies the requirements as
-      set out in AA:2023:02:A, including:
-
-
-      * (in)eligible selection criteria fed to the ML model
-
-      * explainability requirements for the explainable boosting algorithm used
-
-      * implications of the AI Act for this particular form of risk profiling.
-    image: /images/algoprudence/AA202302/Actions/images.png
-    date: 04-12-2023
-    facets:
-      - value: political action
-        label: political action
-  - title: Binnenlands Bestuur
-    description: "The news website for Dutch public sector administration reported on AA:2023:02:A. See [link](https://www.binnenlandsbestuur.nl/digitaal/algoritmische-profilering-onder-strikte-voorwaarden-mogelijk).\n"
-    image: /images/algoprudence/AA202302/logo-bb.svg
-    date: 01-12-2023
-    facets:
-      - value: type
-        label: News
-  - title: Presentation of advice report to the Dutch Minister of Digitalization
-    description: "Advice report AA:2023:02:A was presented to the Dutch Minister of Digitalization on November 29, 2023. A press release can be found [here](/events/press_room).\n"
-    image: /images/algoprudence/AA202302/Actions/presentatie_BZK.jpg
-    date: 29-11-2023
-    facets:
-      - value: presentation
-        label: Presentation
-      - value: publication
-        label: Publication
 form1:
   title: React to this normative judgement
   content: >-
@@ -157,7 +37,7 @@ form1:
   placeholder: Mail address
 ---
 
-{{< tab_header width="6" tab1_id="description" tab1_title="Description of algoprudence" tab2_id="actions" tab2_title="Actions following algoprudence" tab3_id="" tab3_title="" default_tab="description" >}}
+{{< tab_header width="4" tab1_id="description" tab1_title="Description of algoprudence" tab2_id="actions" tab2_title="Actions following algoprudence" tab3_id="discussion" tab3_title="Discussion & debate" default_tab="description" >}}
 
 {{< tab_content_open icon="fa-poll-h" title="Risk Profiling for Social Welfare Reexamination" id="description" >}}
 
@@ -248,4 +128,26 @@ News website for Dutch public sector administration reported on AA:2023:02:A. Se
 
 {{< tab_content_close >}}
 
+{{< tab_content_open id="discussion" >}}
+
+{{< accordions_area_open id="discussion" >}}
+
+{{< accordion_item_open title="Reaction Netherlands Human Rights Institute on age discrimination" id="cvrm" background_color="#eef2f6" date="12-04-2024" tag1="reaction" image="/images/algoprudence/AA202302/Discussion&debate/CvRM.svg" >}}
+
+#### Age Discrimination
+
+Policies, such as those implemented by public sector agencies investigating (un)duly granted social welfare or by employers seeking new employees, can intentionally or unintentionally lead to differentiation between certain groups of people. If an organization makes this distinction on grounds that are legally protected, such as gender, origin, sexual orientation, or a disability or chronic illness, and there is no valid justification for doing so, the organization is making a prohibited distinction. We refer to this as discrimination.
+
+But what about age? Both the Rotterdam algorithm and the DUO algorithm studied by Algorithm Audit differentiated on the basis of age. In these cases, however, no age discrimination occurred.
+
+EU non-discrimination law also prohibits discrimination on the basis of age. For instance, arbitrarily rejecting a job applicant for being too old is unlawful. However, legislation regarding age differentiation allows more room for a justifying argument than for the aforementioned personal characteristics. This is especially true when the algorithm is not applied in the context of labor.
+
+Therefore, in the case of detecting unduly granted social welfare or misuse of college loans, it is not necessarily prohibited for an algorithm to consider someone's age. However, there must be a clear connection between age and the aim pursued. Unless it is shown that someone's age increases the likelihood of misuse or fraud, age is ineligible as a selection criterion in algorithm-driven selection procedures. For disability allowances for young people (Wajong), for example, such a clear connection does exist and an algorithm can lawfully differentiate on age.
+
+{{< accordion_item_close >}}
+
+{{< accordions_area_close >}}
+
+{{< tab_content_close >}}
+
 {{< form1 >}}