When algorithms cause harm, should the analyst who builds them go to prison?
For years, social media companies have argued that they bear no responsibility for the harmful content that they promote. Citing technical challenges, freedom of speech and a range of excuses about the limits of their responsibilities, they have successfully argued that they would like to do something, but that the problem is out of their hands. Until now: in the UK, tech leaders who ignore warnings to take down content may find themselves facing criminal charges.
The stronger stance in the UK may be partly down to the tireless work of two parents, Judy and Andy Thomas, who have been campaigning fiercely for more accountability in the search algorithms that social media companies use. They had followed all the guidelines when supervising the online activity of their 15-year-old daughter, Frankie: they monitored her computer at home and checked her phone regularly. Teenagers have a tendency to get around rules, though, and a report on her school iPad activity showed that Frankie had been accessing graphic articles about self-harm and suicide in the two hours before she took her own life. For her parents, the time had come to take back control of social media.
Tech companies have long argued that they cannot and should not control the content shared on their platforms. Most of their excuses come back to two claims: that the problem cannot be solved technically, and that they have no legal responsibility to attempt to solve it through human processes.
It is true that deep search algorithms have little human oversight, and react to whatever entices people to keep clicking. Human nature tends towards the pessimistic side, and negative or inflammatory content does attract outrage, and with it more attention. This argument misses the point, though. There is no reason the algorithm has to be programmed to optimise clicks and only clicks. And to argue that the solution must be technical breaks with a long-standing business principle: a company should not put its profit model above compliance with the rule of law. Inconvenience is not a good reason to break the law.
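To make that point concrete, here is a minimal sketch in Python of how a ranking objective is a design choice rather than a technical inevitability. Everything in it is hypothetical: the item names, the probabilities, and the `penalty` and `hard_cutoff` parameters are invented for illustration and do not describe any company's actual system. The same sorting machinery can optimise clicks alone, or clicks minus a weighted harm estimate.

```python
from dataclasses import dataclass


@dataclass
class Item:
    title: str
    p_click: float    # estimated probability the user clicks (engagement model)
    harm_risk: float  # estimated probability the content is harmful, 0..1


def rank_by_clicks(items):
    """Click-only objective: whatever entices people to keep clicking wins."""
    return sorted(items, key=lambda i: i.p_click, reverse=True)


def rank_with_harm_penalty(items, penalty=2.0, hard_cutoff=0.8):
    """Same ranking machinery, different objective: engagement minus a
    weighted harm term, with the riskiest items excluded outright."""
    eligible = [i for i in items if i.harm_risk < hard_cutoff]
    return sorted(eligible,
                  key=lambda i: i.p_click - penalty * i.harm_risk,
                  reverse=True)


if __name__ == "__main__":
    feed = [
        Item("Cookery video", p_click=0.30, harm_risk=0.01),
        Item("Outrage-bait thread", p_click=0.55, harm_risk=0.40),
        Item("Graphic self-harm content", p_click=0.60, harm_risk=0.95),
    ]
    print([i.title for i in rank_by_clicks(feed)])         # most harmful item first
    print([i.title for i in rank_with_harm_penalty(feed)]) # harmful item filtered out
```

Under the click-only objective the most harmful item ranks first precisely because it is the most clickable; changing one line of the objective reverses that outcome.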
This week, the UK government passed the Online Harms Bill with one notable amendment: executives at tech companies who wilfully ignore orders from the media regulator, Ofcom, will now face criminal charges. That may be good news for many analysts. I remember a survey some years back that found that over half of analysts had been asked to do something in their work that they felt deeply uncomfortable about, and there was a time when ethical unease was the main reason analysts changed jobs.
It is also worth noting that it would be difficult to stumble into breaking this new law. It covers only executives who refuse to do what they have clearly been told to do, and the blame is aimed squarely at those at the top of the decision-making chain who are responsible for these toxic business practices.