Paula's life was ruined by artificial intelligence: – I was deprived of the opportunity to be a functioning human being

– I was deprived of the opportunity to be a functioning human being, Paula Boyer tells NRK.

She was forced to pay several hundred thousand kroner to the Dutch authorities after she was falsely accused of fraud.

– I worked in a bank during the day and in a restaurant in the evening so that I could pay the money to the authorities, she says.

She could no longer get a loan from the bank. To survive, she had to sell much of what she owned.

The algorithm selected people who had done nothing wrong.

Illustration: Tom Bob Perrault Aronson/NRK

This happened because of an error in a public computer system. The Dutch authorities had used artificial intelligence with machine learning to find tax evaders.

But the tool also selected tens of thousands of other people who had done nothing wrong.

– It is always people from minorities who end up being labeled as benefit fraudsters, Boyer says.

Based on residential addresses, income, who is married, and many other factors, the system selected people who could be at risk of engaging in fraud.

– We have never found out why the AI flagged us as fraudsters.
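
A simplified, purely hypothetical sketch of how such risk scoring can work could look like the example below. The features, weights and threshold are invented for illustration and have nothing to do with the actual Dutch model, which has not been made public; the point is only to show how a score built on factors such as income, address and household composition can flag people without offering any explanation of why.

```python
# Purely hypothetical sketch of feature-based risk scoring.
# The features, weights and threshold are invented for illustration;
# the actual Dutch model has not been made public.

WEIGHTS = {
    "low_income": 1.2,        # low income pushes the score up
    "flagged_district": 0.9,  # living in certain districts pushes the score up
    "single_parent": 0.7,     # household composition pushes the score up
}
THRESHOLD = 1.5               # above this, the case is sent to fraud control


def risk_score(person: dict) -> float:
    """Sum the weights of the features that apply to this person."""
    return sum(weight for name, weight in WEIGHTS.items() if person.get(name))


def flagged(person: dict) -> bool:
    """True if the score crosses the threshold; no reason is recorded."""
    return risk_score(person) >= THRESHOLD


applicants = [
    {"name": "A", "low_income": True, "flagged_district": True},
    {"name": "B", "single_parent": True},
]

for applicant in applicants:
    if flagged(applicant):
        # The caseworker only sees the flag, not which factors triggered it.
        print(applicant["name"], "selected for fraud control")
```

Whatever correlations are baked into such weights automatically hit everyone who shares those characteristics, which is how people who had done nothing wrong could end up on a fraud list.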

Dutch Prime Minister Mark Rutte announced his government's resignation on 15 January 2021 over a scandal in which parents were falsely accused of benefit fraud, admitting that the computer system had gone "terribly wrong".

Photograph: Remco de Waal/AFP

Boyer learned in 2021 that she had been cleared. The same year, the Dutch government was forced to resign over the scandal.

Fears this could also happen in Norway

There is a risk that algorithms could lead to discrimination in Norway as well if the legislation is not changed, believes Equality and Discrimination Ombudsman Bjørn Erik Thon.

– The discrimination legislation must be changed now. These laws were made almost before we knew the word artificial intelligence, Thon says.

Blind artificial intelligence

The Ombudsman for Equality and Discrimination, Bjørn Erik Thon, believes it is necessary to change the discrimination legislation.

Photo: Tom Balgaard/NRK

The government wants 80% of the public sector to adopt artificial intelligence by next year.

Digitalization Minister Karianne Tung believes each individual organization must find a variant of AI that suits it.

Thon believes the municipalities do not have the prerequisites for this.

– This is a recipe for something going wrong. I think issuing a kind of order that everyone must now go out and use AI is probably not a particularly smart thing to do, Thon says.

Digitalization Minister Tung believes it is important to think in new ways in order to renew the public sector. She says that everyone who uses AI has a responsibility to do so in a responsible way, and points to guidance from the LDO, the Norwegian Data Protection Authority and the Norwegian Digitalisation Agency.

– I expect everyone who uses AI to familiarize themselves with this in order to avoid potential risks, she writes in an email to NRK.

She emphasizes that what is illegal, such as discrimination, is also illegal when artificial intelligence is used.

New report: The law must be changed

If algorithmic discrimination is to be detected better than it is today, the anti-discrimination legislation must be changed. That is the conclusion of a recent report.

– The law is built on the premise that discrimination happens in relationships between people. But AI will make its way into all areas of society, says Vibeke Blaker Strand, professor of law at the University of Oslo.

University of Oslo professor Vibeke Blaker Strand has looked for blind spots in the equality and discrimination legislation.

Photo: University of Oslo

She has been tasked with identifying gaps in the legislation when it comes to artificial intelligence.

– AI can easily sort people into different groups. That can become a challenge if it is not handled properly.

For example, there is nothing in current law that prohibits discrimination based on address or educational background, Strand says.

In the Netherlands, it was precisely this kind of data that was used to sort people into categories.

Norway is quick to adopt new technology. Strand believes we should also be early in putting strong legislation in place so that people are not discriminated against by algorithms.

– The review showed that there are several points in the legislation where amendments should be considered.

Serious consequences

In the Netherlands, the cleanup after the benefits scandal is still ongoing.

The error in the public computer system was first uncovered in 2018. In the years before that, a number of people had been falsely accused of fraud.

Not everyone has received compensation yet. But according to Dutch media, 33,000 families have so far been compensated.

– Artificial intelligence put us on a list of fraudsters. But if you forget the ethics of such tools, they can ruin people's lives, says Paula Boyer.

Several women stand outside holding signs reading "The tax authority broke my heart".

Since the tax scandal, Paula Boyer has been fighting for justice. She has also become involved in politics.

Photo: private

If the authorities are going to use AI tools, people must know how the systems work, Boyer believes.

– People must also be involved. You have to make sure the algorithm actually does what it is intended to do, she says.



07.05.2024 at 07.01
