
The Algorithm Addiction

Mass profiling system SyRI resurfaces in the Netherlands despite ban

The Netherlands has been among a growing club of nations experimenting with algorithms in pursuit of efficiencies in detecting welfare fraud. It is also among the countries accused of building a “digital dystopia”, having lost a landmark human rights case and seen its government resign in 2021 over a child benefits scandal in which it falsely accused thousands of parents of fraud.

When SyRI, a ‘mass profiling’ system whose name is short for “system risk indication”, was found in 2020 to be in breach of European human rights law following a legal challenge brought by a coalition of civil society organisations, the ruling was greeted as a breakthrough for digital rights.

The SyRI decision came as the outlines of the Dutch childcare benefits scandal were emerging. That scandal saw false allegations of fraud against thousands of parents, based in part on a machine learning model deployed by the national benefits authorities that illegally used dual nationality as a risk characteristic.

When the full extent of the harms from the “Toeslagenaffaire” emerged — ranging from bankruptcy to trauma for more than 20,000 parents — the administration of Mark Rutte stepped down in January 2021. An official admission of “institutional racism” followed, but much of the same government was re-elected and the pursuit of welfare cuts and efficiency savings has continued, leaving open the question of whether potentially harmful technologies were still in use.

METHODS

In the wake of the political scandal over child benefits, we sent a freedom of information request to one of the Netherlands’ most progressive cities, Utrecht, to find out whether it was using algorithms to predict welfare fraud. The response was unnerving: Utrecht was not using such a system city-wide, but it was using one in Overvecht, a low-income neighbourhood on the city’s northern outskirts.

Documents we obtained revealed that it was not just the city government trying to predict fraud in this comparatively deprived neighbourhood but also the national benefits authorities, the tax authorities and the employment agency. Their tech and data-sharing practices were uncannily similar to those of the notorious SyRI.

With partner VPRO Argos, we discovered that dozens of low-income neighbourhoods across the Netherlands had been singled out for data-driven profiling in the same way as Overvecht. Documents containing lists of ‘risk indicators’ showed that being a single mother who had a baby while on welfare and having a partner who lives abroad could result in residents being flagged. Other types of data analysed included age, gender, water and electricity consumption, children, partner relationships, vehicles, bank accounts and debts.

At the heart of these so-called “neighbourhood-oriented” projects was the LSI group, a collection of government ministries, steered by the Ministry of Social Affairs, that collaborate to combat welfare fraud and “abuses”. Project plans obtained by Lighthouse Reports and Argos mentioned an algorithmic ‘risk model’ used by the national benefits authorities to flag suspicious addresses for LSI projects. We later confirmed that this algorithm was the same one used in the childcare benefits scandal, which wrongly labelled thousands of parents as fraudsters and prompted the resignation of the Dutch government.

When presented with our findings, legal experts, including the original SyRI coalition, confirmed our suspicions: the government had quietly continued to deploy a slightly adapted version of SyRI in some of the country’s most vulnerable neighbourhoods.

“Our major objection to the SyRI method was that unsuspecting citizens were exposed to massive amounts of opaque surveillance by linking their personal data. These documents show that after the SyRI ruling, these practices simply continued,” said Tijmen Wisman, chair of Platform Bescherming Burgerrechten, a Dutch civil rights organisation that was one of the parties in the lawsuit.

STORYLINES

Just north of Amsterdam, in the small city of Zaandam, authorities decided to single out a “problem” apartment building, rather than a whole neighbourhood, as the target for an LSI project.

One of the flagged residents, a young mother with three children, recalls fraud controllers barging into her apartment and telling her to cooperate or risk losing her benefits. The woman, who spoke on condition of anonymity, claims the controllers sifted through the family’s laundry, checked the electricity meter, and asked whether her husband was having an affair.

“You feel as if you are responsible for a very serious act against the whole society in which you live,” she said.

For Laura Lazaro Cabrera, a legal officer at Privacy International, the Dutch case exemplifies the risks of the unchecked digitalisation of welfare systems, including in the UK, where some of society’s most vulnerable people serve as a testing ground for new technology.

“These intrusive technologies risk putting people already in vulnerable situations under excessive surveillance, effectively punishing their poverty,” Cabrera said.

In response to the findings, Minister of Social Affairs Carola Schouten announced that she would be launching an “external review” of the LSI’s working method. “It goes without saying that we must check whether benefits are rightly claimed and fraud must be tackled,” she wrote. “However, I also think it is important that we take an open and critical look at current practices.”


BEHIND THE STORY

Every month we’ll be interviewing a lead contributor behind one of our investigations. Here is Gabriel Geiger talking to Beatriz Ramalho Da Silva about the story behind The Algorithm Addiction.



The Impact

Our investigations don’t end when we publish a story with media partners. Reaching big public audiences is an important step, but these investigations have an afterlife which we both track and take part in. Our work can lead to swift results, from court cases to resignations; it can also have a slow-burn impact through public campaigns, political debates or community actions. Where appropriate, we want to be part of the conversations that investigative journalism contributes to and to make a difference on the topics we cover. Check back here in the coming months for an update on how this work is having an impact.