What Happens When Algorithms Go Wrong

Our Lives in Their Code

Machines will take your job.

Machines will create new jobs.

It’s a brave new world.

It’s the beginning of the end.

Technology will raise us up and throw open the gates of a climate-controlled utopia.

Technology will crush us all beneath its shiny titanium jackboot.

However you feel about technological development, what can’t be argued is that change is happening. And fast.

Algorithms (step-by-step problem-solving procedures, typically run by computers) are influencing your life right now. They shape what you buy, what you watch and where you drive. They run stock markets, airports, hospitals, you name it. They’re everywhere. Most of the time they work unseen and effectively, churning through vast amounts of data and making the world safer and richer. But when algorithms go wrong the results can be hilarious. Or tragic. In this post we look at some shocking examples.

In the following scenarios you’ll see that it wasn’t evil intent on the part of the algorithms that led to disaster; it was human error, or humans exploiting the system. When we create these processes we unwittingly instil in them our own biases and shortcomings, which then play out in silly or awful ways. We hope humans will always retain control over technology. But whether the intentions of those humans are good or bad will determine the kind of world our children inherit.

The Flash Crash of 2:45

On 6 May 2010 the US stock market lost around 9% of its value (roughly a trillion dollars) in just 36 minutes. The cause of the flash crash was traced to Hounslow, an unassuming London suburb, where Navinder Singh Sarao traded from a bedroom in his parents’ house. Sarao had used algorithms to ‘spoof’ the market, placing huge sell orders he never intended to execute, and so contributed to the crash. The extent of his culpability is fiercely contested, but the inherent shakiness of a system that can be so easily disrupted is not.

Alexa Plays Porn to a Toddler

There’s nothing like a special family moment. And this was nothing like a special family moment. In a video uploaded to YouTube, a cute toddler asks Alexa, housed inside a US family’s new Echo Dot, to “Play Digger Digger”. Alexa dutifully responds by rattling off a list of porn terms. The best part of this whole sorry scenario is the innocent expression on the toddler’s face compared to the rising panic in his parents’ voices. “Stop, stop, STOP!” they holler, forgetting to say “Alexa” first. Classic.

Microsoft AI Bot Goes Full Hitler

In 2016 Microsoft launched Tay, a Twitter-based AI chatbot designed to interact with young people. It wasn’t the first time an attempt to be down with the kids had gone wrong. Users quickly started trolling Tay, who learnt fast and responded in kind with all manner of racist, sexist and generally loathsome tweets. Microsoft pulled the plug within 16 hours and deleted the offensive messages, but thankfully they’ve been preserved for our amusement in the form of screenshots all over the internet.

Man Receives $206,000 Child Support Bill

Algorithmic mayhem ain’t nothing new. Back in 1998, 56-year-old mechanic Walter Vollmer was mistakenly sent a child support bill for $206,000. Vollmer’s wife freaked, as you would: convinced he’d been leading a double life for much of their marriage, she became suicidal. Turns out it was an error by Los Angeles County’s automated system for tracking down deadbeat dads. Even with a lawyer, it took Vollmer two and a half months to convince the authorities they had the wrong man. Gotta love bureaucracy.

Amazon Recommends Bomb Making Materials

Ever fancied building a bomb? No, we thought not. But at one time Amazon might have tempted you. In 2017 Channel 4 News revealed that the online retailer’s ‘frequently bought together’ function was recommending combinations of items that could be used to make an explosive device, including ball bearings and the chemical ingredients for black powder (better known as gunpowder). Amazon has since ‘reviewed’ this pesky little algorithm.

Offensive T-shirt Enrages the World

“Keep calm and carry on” is a well-known British propaganda slogan from WW2. You can picture the design in your head, can’t you? Well, when T-shirts with mutant versions such as “Keep calm and rape a lot” started appearing online in 2013, people were understandably horrified. The trouble started when Solid Gold Bomb, a clothing company, began using an algorithm that pulled from an unchecked database of words to generate thousands of designs for its online store. The company folded in the wake of the scandal, even though not one of the objectionable shirts was ever printed. The company’s owner became a traffic warden.

There you have it. Conclusive proof that rogue algorithms cause chaos. What’s the solution? “More algorithms!” the boffins cry. And that’s almost certainly the way we’ll go. For those who don’t agree, we’ll be handing out tinfoil hats at Huckle next week, along with maps to the secret bunker in the woods. It’s time, people. It’s time.

***

Huckle the Barber provides high-quality grooming services, including haircuts and wet shaves, for discerning London gents. Visit us in Shoreditch or Holborn. Book your Huckle appointment online.
