Artificially Intelligent, Genuinely Harmful: AI and Age Assessments in the UK Asylum System

The government is trialling facial recognition AI to assess the age of asylum-seeking children, repurposing tech used for online age checks to make life-changing decisions about who gets child protection.

The Home Office has announced plans to trial the use of Artificial Intelligence (AI) in age assessments through Facial Age Estimation (FAE). This technology is being proposed for use on unaccompanied asylum-seeking children: young people seeking safety who arrive in the UK without a parent or guardian. Young people often do not have documents to prove their age, so the government conducts an ‘age assessment’ to decide whether they should be treated as a child or an adult. This decision determines access to child protection, education, and housing support. It is a complex and challenging process, especially for the young person at the centre of it.

Adding another “initiative” to their ever-expanding portfolio of cruelty, the government now intends to test FAE later this year, aiming to fully integrate the technology into the age assessment process by 2026. This is the same facial recognition technology used for online age checks, such as verifying access to adult websites, now repurposed to decide who qualifies for child protection. The difference is that here the consequences of mistakes are life-changing.

Technology is not neutral 

AI systems are essentially computer programs which are fed large amounts of data and designed to recognise patterns, solve problems and make decisions. Crucially, they are created and programmed by humans, with structural and societal biases baked in. Facial Age Estimation (FAE) works by analysing facial features from a photograph and comparing them to a set of images where the person’s age is already known. The AI identifies patterns across these images to predict how old someone is. However, FAE cannot take into account experiences such as sleeping in refugee camps, visible ageing due to grief, sun exposure, poor nutrition, or carrying the physical marks of trauma and displacement. When a Black teenager is told they “look too old” to qualify for protection, that is not a technical error but racism and bias built into the system from the start.
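
To make the pattern-matching idea concrete, here is a minimal toy sketch in Python (using scikit-learn). It is emphatically not the Home Office’s system: real FAE tools use deep neural networks trained on face photographs, whereas everything below is invented purely for illustration – the synthetic 16-number “face” vectors, the pretend skin-texture signal and the choice of a simple random forest model.

```python
# Toy illustration of how facial age estimation "learns" age from
# labelled examples. NOT a real FAE system: genuine tools use deep
# neural networks on face images. Here, synthetic feature vectors
# stand in for the numbers such a network would extract from a photo.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training set: 1,000 "faces", each reduced to 16 numbers
# and labelled with a known age between 10 and 60.
features = rng.normal(size=(1000, 16))
ages = 10 + 50 * rng.random(1000)

# Bake a weak pattern into the data so there is something to learn:
# pretend the first feature loosely tracks age (e.g. skin texture).
features[:, 0] += (ages - 35) / 25

X_train, X_test, y_train, y_test = train_test_split(
    features, ages, random_state=0
)

model = RandomForestRegressor(random_state=0)
model.fit(X_train, y_train)        # learn age patterns from labelled faces

predicted = model.predict(X_test)  # estimate ages of unseen faces
errors = np.abs(predicted - y_test)
print(f"Average error: {errors.mean():.1f} years")
print(f"Worst error:   {errors.max():.1f} years")
```

Even in this toy setup, the model can only reproduce patterns present in its labelled examples, and individual predictions can be years off. If the training data under-represents certain faces or life experiences, those errors fall hardest on exactly the people the data misses.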

The critical question remains: what images and data are the FAE tools trained on? There are already many examples of AI perpetuating discrimination and racism with real-life consequences. Even if FAE is trained on extensive and diverse data, how can the complexities of human life, and how our experiences manifest in our faces and bodies, be reduced to processes and numbers?

The government’s plan to use AI for assessing age isn’t happening in isolation. It’s part of a broader pattern of the government using technology to monitor, discriminate against, criminalise and dehumanise people, and to outsource decision-making under the illusion of objectivity and fast solutions. AI can mimic human judgement, but it cannot empathise – it cannot feel compassion, understand cultural and historical context or draw from genuine lived experience.

The ongoing challenge of assessing age 

To understand how problematic this new direction is, we have to look at the troubled history of age assessments in the asylum system. Assessing age is complicated. The existing processes range from visual checks at the border by Home Office staff to full holistic assessments carried out over several days by social workers. In 2022, the Home Office introduced so-called “scientific methods” (e.g. dental records, x-rays and scans) to determine age, despite having publicly ruled out the use of dental checks in 2016 as “inaccurate, inappropriate and unethical”. The announcement of trialling FAE in age assessments seems to signal an end to these “scientific” methods, and the Home Office have confirmed they are not commissioning any of them – which tracks with the government’s habit of switching between harmful and ineffective policies.

The Independent Chief Inspector of Borders and Immigration (ICIBI), an independent body that inspects immigration and asylum procedures, recently released a report on the Home Office’s use of age assessments between July 2024 and February 2025. Unsurprisingly, the ICIBI highlighted a culture of disbelief, racial bias, poor data systems and a lack of coordination with safeguarding professionals, which places vulnerable children at serious risk. The report concluded that “many of the concerns about policy and practice that have been raised for more than a decade remain unanswered”.

Outsourcing humanity? 

UK law and Home Office guidance are clear that age cannot be decided on appearance alone except in “the most obvious cases” (when someone is clearly a child or clearly an adult). In theory, FAE can only lawfully be used as a tool to support an assessment and should not be solely relied upon in making decisions. In practice, however, the Home Office has a long history of failing to follow its own policies and guidance and of relying heavily on flawed methods to discredit young people and deny them support – as when border staff decide which young people are “clearly an adult”.

We cannot ignore the government’s increasing use of AI and technology as a way to conveniently “solve” complex and deeply human issues. AI has the potential to act as a tool to uphold human rights rather than erode them, but only if it is guided by values of justice, accountability and care. It is important that we continue to centre empathy, compassion and humanity in our fight for migrant justice.

Has someone told you that you are older than 18, when you know that you are younger than 18?

  • You can text Humans for Rights Network for help on +447506 663 089
  • Read the toolkit for young people written by other young people who have been through age assessment processes here

Are you supporting a young person who is under 18 but is being treated as an adult (over 18)?

  • Read the age assessment toolkit for practitioners created by organisations across the UK here.
  • Learn more about age assessments here.

-Ally, Legal Education Officer

