How Predictive Policing Can Criminalize Kids’ Online Searches

An algorithm-based program operated by the UK police identifies youth who have displayed an online interest in cybercrime tools, with a view to staging an early intervention. Such “predictive policing” technologies, which are also used to identify potential gang members, have potentially harmful impacts, writes a London solicitor.

Using a computer to predict criminal behavior sounds like the stuff of science fiction. Yet police are already using algorithmic technology to identify and deter the offenders of tomorrow.

To date, there has been no systematic review of these efforts to forecast crime in the UK. A recent report has provided valuable insight into how the UK’s National Crime Agency (NCA) is adopting these technologies in a cybercrime context, with some interesting case study examples.

Through the Cyber Choices program, the NCA identifies “at-risk” young people, based on online activity which indicates a potential interest in cybercrime forums or the purchase of cybercrime tools. Using a set of risk characteristics, the NCA then targets these young people before they engage in serious illegal activity.

Once identified, NCA officers visit these young people to discuss their behavior with them and with their parents.

Data gleaned from this program is also utilized in a complementary “influencer operations” project, in which young people, some as young as 14, who have googled cybercrime services receive targeted Google advertisements informing them that these services are illegal and that they face NCA action if they purchase them.

The NCA also linked these advertisements to hashtags for major gaming conventions.

Data-Driven Policing in the UK

This is not the first time UK law enforcement agencies have used data-driven technology to forecast crime involving young people. The Gang Violence Matrix, a system maintained by the Metropolitan Police Service (MPS), ranks individuals on a scale using data drawn from arrests, convictions and intelligence; anyone who meets the threshold for inclusion is added to the list as a gang “nominal.”

The system identifies victims as well as perpetrators of gang-related crime, with the aim of ensuring they receive the support needed to prevent further victimization and to divert them from gang activity.

As of July 2016, 15 percent of those on the Matrix were minors, the youngest being just 12 years old.

Proponents argue these tools help reduce crime and allow police to target resources to the benefit of the wider community. Concerns center on the technology’s long-term consequences, including the effect of contact with law enforcement at such a young age and the risk of stigmatization by peers, educational establishments and future employers were they to learn of it.

When Things Go Wrong

When things go wrong, the consequences can be serious.

In 2017, a document with unredacted personal data called ‘Newham Gang Matrix 10th January 2017’ was leaked and subsequently found to be circulating via Snapchat. The document contained the details of individuals on the Matrix. It had been disclosed by Newham Council, which was fined £145,000 (about US $197,300 at current exchange rates) by the Information Commissioner’s Office (ICO) for the data breach.

The ICO also issued the MPS with an Enforcement Notice. Of serious concern to the ICO was how victims of gang-related crime were presumed to have gang associations themselves. The MPS had no system for accurately noting that a victim had been included in the Matrix solely or primarily because of their victim status.

Another major concern was how social media intelligence was being used by police officers, including intelligence derived from posting or commenting on YouTube videos. The MPS had failed to assess the relevance of this data.

The ICO said these breaches of data protection law had the “potential to cause damage and distress to the disproportionate number of young, black men on the Matrix.”

At the time, over 100 children under the age of 16 had been profiled. The ICO required 16 actions to be taken by the MPS, including introducing a system for distinguishing victims of gang-related violence, and developing guidance on the use of social media as an intelligence source. A 2019 freedom of information request revealed that children as young as 13 were still listed on the Matrix. Of those listed, 611 were 18 or under, making up just under 20 percent of the total.

The application of predictive policing has yet to be challenged in the UK Courts.

In 2020, the use of surveillance technology was considered by the UK Court of Appeal in a challenge to the use of live facial recognition technology by South Wales Police. In that case, the Court found that too much discretion had been left to individual officers. Predictive policing goes even further, with very significant potential intrusions on individual privacy rights.

These issues are currently under consideration by the UK House of Lords. In May 2021, the Justice and Home Affairs Committee in the British Parliament launched an inquiry into new technologies such as algorithmic policing and how the law applies to them.

There is no record of NCA involvement in the inquiry.

In July, members of West Midlands Police attended a closed session to demonstrate the National Data Analytics Solution, a tool being developed by a consortium of police agencies to create a new shared, central data and analytics capability.

This is said to be a true machine learning-based predictive tool, drawing on nearly 1,400 data indicators that could help predict crime. These indicators include previous crime records and association with known offenders. The machine-learning component will use these indicators to predict which individuals may be on a trajectory of violence similar to that observed in past cases.
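
To make this concrete, the sketch below shows, in purely illustrative form, what an indicator-based risk model of this general kind can look like. The indicator names, training data, labels and choice of logistic regression are all hypothetical assumptions made for illustration; nothing here reflects the actual National Data Analytics Solution.

```python
# Purely illustrative sketch: a toy indicator-based risk model.
# All features, data and labels below are hypothetical assumptions,
# not the actual NDAS indicators or model.
from sklearn.linear_model import LogisticRegression

# Each row represents a person as binary indicators, e.g.
# [prior offence on record, known association with an offender, prior violence].
X = [
    [1, 1, 0],
    [0, 0, 0],
    [1, 1, 1],
    [0, 1, 0],
]
# Hypothetical labels: 1 = later followed a violent trajectory, 0 = did not.
y = [1, 0, 1, 0]

model = LogisticRegression().fit(X, y)

# The model produces a risk-style score for a new individual; any bias or
# error in the historical labels feeds straight into that score.
new_person = [[1, 0, 0]]
print(model.predict_proba(new_person)[0][1])
```

The point of the sketch is that the output score is only as reliable as the historical labels it is trained on: if past policing data over-represents particular groups, or mislabels victims as offenders, the predictions inherit that skew.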

The tool also uses social network analysis to identify people at risk of becoming “influencers” of crime, with a model that assesses the likelihood of a particular individual becoming influential in co-offending.
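
Again purely for illustration, the following sketch shows one way a social-network-analysis step of this kind could work, assuming a small co-offending graph and using betweenness centrality as a crude proxy for “influence.” The individuals, the edges and the centrality measure are assumptions, not the actual model used by West Midlands Police.

```python
# Purely illustrative sketch of social network analysis on a hypothetical
# co-offending graph; the people, edges and centrality measure are assumptions.
import networkx as nx

# Each edge links two people recorded as having offended together.
co_offending = nx.Graph()
co_offending.add_edges_from([
    ("A", "B"), ("A", "C"), ("B", "C"),
    ("C", "D"), ("D", "E"),
])

# Betweenness centrality flags who sits on paths between otherwise separate
# groups, one crude way to score potential "influence" in a network.
scores = nx.betweenness_centrality(co_offending)
for person, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(person, round(score, 2))
```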

A PowerPoint presentation delivered by West Midlands Police at the 2017 Excellence in Policing Conference indicates that the National Data Analytics Solution uses data from young people in the “up to 11” and “11 to 16” age groups, noting that “The younger you are when you commit a co-offending offence the more likely for you to go ahead to commit other crimes.”

A briefing note prepared by West Midlands Police Ethics Committee argues for the “social benefit of being able to identify the circumstances and reasons (particularly for young people) that lead individuals to commit their first violent offence ….”

Future Risks

The development of predictive policing gives rise to complex and controversial issues. In its favor, it has the potential to protect the public and to divert potential offenders from a path that would, in all probability, lead to far harsher future involvement with the criminal justice system. In times of scarce resources, the savings to taxpayers will also prove attractive.

However, the risks of widespread adoption of such technology are concerning. They include the potential for algorithmic bias and error, and the stigmatization of young people who have committed no offense but who have merely exercised childlike curiosity.

Importantly, under Article 3 of the United Nations Convention on the Rights of the Child, adults are expected to consider the best interests of children when making choices that affect them. This is particularly important when children are still forming mental associations between their behavior, their environment and the consequences of their actions.

The trial and roll-out of such technology without proper scrutiny or public debate poses acute risks to public trust and jeopardizes the UK’s model of policing by consent.

The genie is almost certainly out of the bottle in terms of the development of such technology, but the adoption of policing methods borrowed from science fiction first requires extensive public debate and the formulation and enforcement of a strict legal framework if it is to be a success.

Suzanne Gallagher is a solicitor at BCL Solicitors LLP in London, where she specializes in corporate crime. Since qualifying in 2017, she has worked in a variety of regulatory and criminal investigations. She has developed expertise in surveillance, cybercrime, data protection and online harm.