Minister of Education apologizes for DUO’s discriminatory algorithm


The outgoing Dutch Minister of Education, Robbert Dijkgraaf, has apologized for an algorithm used by the Education Executive Agency. The algorithm assigned risk scores to potential fraudsters, but turned out to result in discrimination.

The outgoing Minister of Education, Culture and Science writes in a letter to the House of Representatives that he apologizes for the algorithm. This follows a report by consultancy firm PwC on the use of that algorithm by the Education Executive Agency, or DUO, which is responsible for paying out student grants. Last year, NOS already reported on the controversial algorithm that DUO used to detect grant fraud.

DUO used a rule-based algorithm to assign a risk score to students. The score was based on, among other things, type of education, the distance a student lived from their parents, and age; over time, more and more grounds for further investigation were added to determine which students received home visits. “Specifically, PwC concludes that students who lived in neighborhoods with a high share of residents with a migration background were checked more often compared to others,” Dijkgraaf writes. “This despite the use of apparently neutral selection criteria.”
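
Neither the article nor the quoted PwC findings disclose DUO's actual rules, weights, or thresholds. The following is a minimal hypothetical sketch in Python of how a rule-based risk score over the apparently neutral criteria named above (type of education, distance from parents, age) could work; every weight, threshold, and category used here is an assumption for illustration only.

from dataclasses import dataclass

@dataclass
class Student:
    education_type: str           # e.g. "mbo", "hbo", "wo"
    distance_to_parents_km: float # distance between student's and parents' addresses
    age: int

def risk_score(s: Student) -> int:
    # Hypothetical rules: DUO's real criteria and weights were not published.
    score = 0
    if s.education_type == "mbo":        # assumption: some education types scored higher
        score += 2
    if s.distance_to_parents_km < 5.0:   # assumption: living close to parents raises the score
        score += 3
    if s.age < 21:                       # assumption: younger students raise the score
        score += 1
    return score

def select_for_home_visit(students: list[Student], threshold: int = 4) -> list[Student]:
    # Students at or above the (assumed) threshold are flagged for an additional check.
    return [s for s in students if risk_score(s) >= threshold]

if __name__ == "__main__":
    sample = [Student("mbo", 2.0, 19), Student("wo", 40.0, 24)]
    print([risk_score(s) for s in sample])   # e.g. [6, 0]
    print(select_for_home_visit(sample))

Even though no criterion here refers to origin, such rules can still act as proxies: education type, age, and living close to parents can correlate with neighborhood composition and migration background, which is the kind of indirect discrimination PwC describes.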

Between 2012 and 2023, DUO selected approximately 26,800 students for an additional check. DUO stopped using the algorithm in June 2023 following the controversy caused by media reports.

Benefits scandal

In its report, PwC notes that the algorithm was overlooked even after the cabinet fell due, in part, to another discriminatory algorithm. The Rutte III cabinet collapsed over the childcare benefits scandal, in which the Tax Administration turned out to have used discriminatory algorithms to detect benefits fraud. In the aftermath of that affair, the government began investigating which other Dutch government institutions were using potentially discriminatory algorithms.

DUO’s algorithm was not included in that inventory. According to PwC, this was because the algorithm did not look at ‘origin-related data’.