Berk’s algorithm would predict who could become an offender
Written by: Fernando Aguero
At the age of twelve, Gustav Baumeister, after several visits to the principal’s office, heard about Berk’s algorithm. “It’s software that makes my job emotionally easier,” the principal said with a satisfied look. “Every piece of mischief you get up to is recorded in the system, and the machine will at some point be in control of your future.”
Young Gustav was incredulous. The code system that had been haunting him for five years was not just in his head. He was about to leave Eliot Elementary School in Boston.
The afternoon was cool when he took his bike. He slowly rolled down Charter Street pushing a black cloud in front of the handlebars.
He stopped at Copp’s Hill Terrace, the best place to review how many times he had pulled Noreen’s hair in fifth grade. He looked up at the treetops and found no answers. He didn’t know where this need to misbehave came from. No doubt it was a hidden rage, but, terrifyingly, it was now under the gaze of Berk’s algorithm.
He took a tablet out of his backpack and asked Google who had invented school. The boy was convinced that everything bad in his life came from there. A quick search led him to the name Hugh Balsham, “a pest who was born in the United Kingdom in 1284.”
He couldn’t stop thinking. “They invented it in Cambridge, and instead of loading more water from England they used part of the Mayflower’s hold to bring over that idea of school.”
The eye of God and Berk’s algorithm
By the end of the next fall, pre-teen Gustav Baumeister would find himself at Clarence R. Edwards Middle School. That’s where he was due to arrive, according to his record. However, the principal’s words still stirred up the foliage in the park: “Maybe, Gustav, maybe. Remember that you’re on your way to being a fraud!”
“And who is this Berk guy?” The question came as the afternoon ran deer-colored along the horizon. Gustav turned again to the search engine.
Richard Berk, the creator of Berk’s algorithm, is a professor of criminology and statistics at the University of Pennsylvania. Five years ago, he developed the code that justice departments in several U.S. states use to keep records on troubled teens and parolees.
Gustav turned to his smartphone and tapped the nickname “pink hat hacker.”
“Quick, quick! What’s a predictive algorithm?”

“Gustav, don’t bother … the device you’re using to talk to me right now is full of predictive algorithms. WhatsApp, Facebook, TikTok, Snapchat: they’re control software that nobody forced us to use and that we begged to own with tears and even screams. Goodbye!”
The creative mind
The boy thought of Berk’s algorithm as an abomination. Only an Ivy League mind could have created something so sinister. Another fact Gustav found online: almost every state in the United States has used this new breed of government algorithm, he read on the website of the Electronic Privacy Information Center, a nonprofit dedicated to digital rights.
“It’s certainly a conspiracy!” The boy quickly got on his bike and headed home. A strange sensation overtook him. He stopped pedaling, looked up at the sky, and wondered whether someone from the Justice Department was watching him from above.
Anxious and attached to his memories, he thought that his future lay in a criminal act. He couldn’t see himself in any greater action. He thought that he would be arrested for repeatedly disturbing the peace and that his life would pass through the vicissitudes of a parolee.
When he arrived home, he tried to find out whether society was happy with the use of Berk’s algorithm. He noted that many are angry about the growing reliance on automated systems. The algorithms are taking humans, and transparency, out of the judicial process. It is often unclear how the systems make their decisions. Is gender a factor? Age? Zip code? It is hard to say, since many states and countries have few rules requiring algorithm makers to disclose their formulas.
“Hallelujah,” said Gustav, and shouted, “Long live civil rights!” According to a fringe comment he found in The New York Times, algorithms are supposed to reduce the burden on understaffed agencies. The software seeks to cut government costs and eliminate human bias.
But opponents say governments have not shown much interest in learning what it means to remove humans from decision-making. A recent United Nations report warned that governments risked “stumbling zombie-like into a digital welfare dystopia.”
What if I make parole?
Gustav Baumeister looked at himself in the mirror and said to himself: “I am angry with the world.” The words were a decree, and in his imagination he was already leaving the penitentiary. It was not difficult; hundreds of films he had seen throughout his life began with a man walking out of prison.
After all, prediction algorithms, in their most basic form, work by using historical data to calculate the probability of future events, much as a sportsbook sets the odds on a game or pollsters forecast an election result.

The technology is based on statistical techniques that have been used for decades, often to determine risk, and it has been supercharged by increases in affordable computing power and available data.
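In its most basic form, the idea above can be sketched as a toy frequency model: the estimated probability of a future event is simply the rate at which it occurred among similar historical cases, just as a sportsbook derives odds from past outcomes. Everything here is a minimal illustration with invented data; the bucket names and records are hypothetical and have nothing to do with Berk’s actual system.

```python
# A toy "risk score" from historical frequencies (hypothetical data).
from collections import defaultdict

# Hypothetical history: (prior-incidents bucket, whether a new incident followed)
history = [
    ("0-1", False), ("0-1", False), ("0-1", True),
    ("2-4", True), ("2-4", False), ("2-4", True),
    ("5+", True), ("5+", True), ("5+", False), ("5+", True),
]

# bucket -> [events observed, total cases]
counts = defaultdict(lambda: [0, 0])
for bucket, event in history:
    counts[bucket][1] += 1
    if event:
        counts[bucket][0] += 1

def risk_score(bucket):
    """Estimated probability of a future event: its observed frequency in history."""
    events, total = counts[bucket]
    return events / total

print(risk_score("0-1"))  # 1/3: one event in three historical cases
print(risk_score("5+"))   # 3/4: three events in four historical cases
```

Real systems replace the raw frequency with a fitted statistical model (logistic regression, random forests) over many features, which is precisely what makes it hard to say from the outside whether gender, age, or zip code is driving a score.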
Once out on the street, he considered going to find Richard Berk at the University of Pennsylvania so the professor could explain everything about the algorithm. In his daydream he found Professor Berk delivering a monologue in which he said the original design was never meant to be used this way.
“One of the things I make very clear about this algorithm – and all the others – is that they are handmade for a particular decision,” he said. “If you move them to another decision, the warranty no longer applies.”
Dr. Berk looked into the eyes of a grown-up Gustav Baumeister and faced him. “All this controversy will fade away as the algorithms are used more widely.”
He finally compared the algorithms to the autopilot systems on commercial planes. “The autopilot is an algorithm,” he said. “We’ve learned that the autopilot is reliable, more reliable than an individual human pilot. The same thing is going to happen here.”
11 February, 2020