Community-centred design and predictive policing applications in England and Wales: socio-legal dimensions, benefits and challenges
Conference paper
Lydon, D. 2025. Community-centred design and predictive policing applications in England and Wales: socio-legal dimensions, benefits and challenges.
Authors | Lydon, D. |
---|---|
Type | Conference paper |
Description | The integration of predictive technologies into UK policing is fast gaining momentum, promising solutions for crime prevention, detection and resource efficiency (see McDaniel and Pease, 2021; NPCC, 2023). Predictive policing applications require vast datasets to forecast potential criminal activity and to facilitate strategic deployments (Home Office, 2024). However, such advancements are fraught with concerns about, inter alia, privacy, bias, and the risk of entrenching systemic inequalities (Veliz, 2024). The challenge therefore lies in developing predictive policing applications that are not only operationally effective but demonstrably aligned with appropriate ethical and legal standards. This paper argues for a ‘community-centred’ design approach to predictive policing, emphasising the active involvement of local communities at every stage of the decision-making process, from development to deployment and evaluation.

Defining Community-Centred Design in Predictive Policing

Community-centred design is a normative participatory framework that prioritises the needs, concerns, and values of the communities likely to be affected by technological innovations. In the context of predictive policing, this means shifting from an approach in which algorithms and hardware are developed by the police and their partners and deployed without community consultation, to a more collaborative model in which residents, community leaders, human rights organisations and advocates, and policing officials work together to discuss and co-design systems and deployment practices. This engagement can take various forms, including community advisory boards, participatory workshops, and real-time feedback mechanisms that ensure community oversight and ‘voice’.

Socio-Legal Dimensions

The socio-legal landscape of predictive policing in the UK is complex, sitting at the intersection of human rights, data protection, equality, and governance and accountability legislation and mechanisms. Yet these laws and processes face new and emergent challenges when applied to innovative AI systems. For instance, how can the principle of non-discrimination be upheld if predictive models disproportionately identify certain demographic groups as higher risk? How can transparency and accountability be guaranteed when algorithms operate as ‘black boxes’ whose decision-making processes are opaque and difficult to scrutinise? The paper examines these questions, emphasising the need for legal interpretations that are adaptive and forward-thinking. It also discusses the role of law in establishing procedural safeguards, such as mandatory algorithmic impact assessments and a public right to contest decisions based on predictive analytics (see Floridi, 2023). Furthermore, it highlights the importance of socio-legal research in understanding the experiences of communities subjected to predictive policing, informing more equitable regulatory responses.

Benefits of Community-Centred Design

Adopting a community-centred approach can yield several benefits. First, it has the potential to reduce algorithmic bias and discriminatory outcomes: when communities are involved in the design and testing phases, they can highlight data sources that may perpetuate historical injustices and suggest a more balanced approach to data collection and use. Second, it fosters greater transparency and the potential to build trust. Communities that feel heard and respected are more likely to view predictive policing as a legitimate tool rather than an intrusive surveillance mechanism. Third, community-centred design promotes a holistic view of public safety, one that attends to the root causes of crime, such as poverty, lack of social services, or inadequate infrastructure.

Challenges and Limitations

Despite its promise, community-centred design faces challenges. A primary obstacle is the resource-intensive nature of genuine community engagement: building trust and creating meaningful participatory opportunities require time, funding, resources and commitment from both the police and technology developers and providers. There is also a risk of community fatigue or scepticism, especially in areas where relationships with the police have historically been contentious. Additionally, ensuring that all voices are represented, particularly those of marginalised or underrepresented groups, can be difficult. Without intentional efforts to engage these groups, there is a danger that community-centred processes could simply reinforce existing power imbalances. A further challenge is reconciling community-driven recommendations with policing objectives. Predictive policing tools are often deployed with the aim of increasing efficiency and reducing crime rates, but community priorities may include much broader concerns. This tension requires thoughtful negotiation and, in some cases, a ‘reimagining’ of what public safety and security mean.

Policy, Legal, and Practical Recommendations

To overcome these challenges and fully realise the benefits of community-centred design, this paper advances several policy and legal recommendations. First, the UK government should legally mandate algorithmic impact assessments (AIAs) for any predictive policing tool prior to its deployment; AIAs are currently a matter of advisory guidance, although a bill is before the UK House of Lords (Public Authority Algorithmic and Automated Decision-Making Systems Bill, 2024). These assessments would evaluate the potential social and legal consequences of the technology and should include community consultation as a compulsory component (an aspect not currently under consideration). Second, legally recognised community-level oversight boards should be established to provide a structured mechanism for community input and police accountability in the development, practice and use of systems and hardware. These boards could have the authority to review algorithmic outputs, suggest modifications, and even pause or halt their use if systems are found to harm the community. Third, the paper recommends creating funding streams dedicated to pilot projects that integrate predictive policing with, for example, social services and local authorities.

Conclusion

The rise of predictive policing in the UK presents many opportunities and profound challenges. By adopting a community-centred design approach, police agencies can build more ethical, equitable, and effective AI systems. This paper concludes that community involvement is not merely a procedural formality but a fundamental requirement for the legitimate and responsible use of predictive technologies. Through legal and procedural reforms, community partnerships, and a commitment to transparency and accountability, policing in the UK has the potential to set a global benchmark for how AI applications can enhance, rather than erode, public trust and confidence, community well-being and police legitimacy.

References

Floridi, L. (2023). The ethics of artificial intelligence: principles, challenges and opportunities. Oxford: Oxford University Press. |
Keywords | Predictive policing; AI; Digital ethics; Community policing |
Year | 2025 |
Conference | Predictive Policing Network Workshop |
Related URL | https://www.sheffield.ac.uk/ccr/our-research |
File access level | Open |
Publication process dates | |
Deposited | 13 Feb 2025 |
https://repository.canterbury.ac.uk/item/9q4zv/community-centred-design-and-predictive-policing-applications-in-england-and-wales-socio-legal-dimensions-benefits-and-challenges