When algorithms cause harm, should the analyst who builds them go to prison?

For years, social media companies have argued that they bear no responsibility for the harmful content that they promote. Citing technical challenges, freedom of speech and a range of other excuses, they have successfully argued that they would like to act but that the problem is out of their hands – until now. In the UK, tech leaders who ignore warnings to take down content may find themselves facing criminal charges.

The stronger stance in the UK may be partly down to the tireless work of two parents, Judy and Andy Thomas, who have been campaigning fiercely for more accountability in the search algorithms that social media companies use. They had followed all the guidelines when supervising the online activity of their 15-year-old daughter, Frankie: they monitored her computer at home and checked her phone regularly. Teenagers have a tendency to get around rules, though, and a report on her school iPad activity showed that Frankie had been accessing graphic articles about self-harm and suicide in the two hours leading up to her decision to take her own life. For her parents, it was time to take back control of social media.

Tech companies have long argued that they cannot and should not control the content shared on their platforms. Many of their excuses come back to the idea that the problem cannot be solved technically, and that they have no legal responsibility to try to solve it through human processes.

It is true that deep search algorithms have little human oversight and react to whatever entices people to keep clicking. Human nature has a pessimistic side, and negative or inflammatory content does attract outrage and more attention. This argument misses the point, though: there is no reason that the algorithm has to be programmed to optimise for clicks and nothing else. And to argue that the solution has to be technical breaks with a long-standing business principle that a company should not put its profit model above compliance with the rule of law. Inconvenience is not a good reason to break the law.
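To make the point concrete, here is a purely illustrative sketch of what an engagement-only objective versus a blended one might look like. The function name, weights and fields are assumptions for the example, not any platform's real ranking system.

```python
# Illustrative only: a hypothetical ranking score that does not optimise
# for clicks alone. Weights and field names are assumptions, not any
# platform's actual algorithm.

def rank_score(predicted_engagement: float,
               predicted_harm: float,
               harm_weight: float = 2.0) -> float:
    """Blend predicted engagement with a penalty for predicted harm."""
    return predicted_engagement - harm_weight * predicted_harm

# A post likely to provoke outrage scores highest on engagement alone,
# but is down-ranked once the harm penalty is applied.
posts = [
    {"id": "calm_news",    "engagement": 0.40, "harm": 0.02},
    {"id": "outrage_bait", "engagement": 0.90, "harm": 0.60},
]

ranked = sorted(
    posts,
    key=lambda p: rank_score(p["engagement"], p["harm"]),
    reverse=True,
)
for post in ranked:
    print(post["id"], round(rank_score(post["engagement"], post["harm"]), 2))
```

Nothing about this is technically difficult; the choice of objective is a business decision, not an engineering constraint.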

This week, the UK government passed the Online Harms Bill with one notable amendment: executives at tech companies who wilfully ignore orders from the communications regulator, Ofcom, will now face criminal charges. It may be good news for many analysts. I remember a survey some years back that found that over half of analysts had been asked to do something in their work that they felt deeply uncomfortable about, and there was a time when ethical unease was the main reason analysts changed jobs.

It is also important to note that it would be difficult to stumble into breaking this new law. It only covers executives who refuse to do what they have clearly been told to do, and the blame is aimed squarely at those at the top of the decision-making chain who are responsible for these toxic business practices.

With more legal scrutiny of analytics work, it is important to operate ethically. See our Governance and Professionalism Training to make sure you understand how to stay within the law.

DataCamp - Learning Tracks

All IoA members can use the installation-free DataCamp environments to build, practise and test their skills. We have two custom-built tracks to keep your training on course to fulfil your career goals, each recommending a programme of knowledge and analytics study aligned to the first seven years of the Data Competency Framework.

Which Track is for me?

Business analyst with R: This track takes you through spreadsheet skills and BI tools in the early years, then builds up your coding skills so that you can tackle more challenging data projects in R environments in the later years.

Python analyst: This track goes straight into Python coding and will take you all the way to working with unstructured data and deep learning techniques.

Look for the track name and year when you search for a course.

With our custom tracks, we've selected the skills that we know employers are looking for, but remember that you can also take any of the 300 courses, assessments and projects at any time and add them to your CPD records, too. You can find a post discussing the aims and structure of the tracks here.

View Learning Tracks


Go To DataCamp