The Home Secretary, Priti Patel, has announced that the Home Office will stop using an algorithm deemed racist in determining visa applications.
The announcement follows a judicial review challenge from the Joint Council for the Welfare of Immigrants (JCWI) and Foxglove.
Why is the algorithm deemed to be racist?
The algorithm, which had been in use since 2015, used a ‘traffic-light system’ to rank visa applications to the UK. In a nutshell, the system filtered applications by assigning applicants a Red, Amber or Green rating depending on the perceived level of risk.
JCWI has argued that the system was programmed to automatically assign applicants from ‘suspect nationalities’ a higher risk score. These applicants were then subjected to intense scrutiny and were more likely to be refused entry.
Whilst the system cracked down on applicants from so-called risky countries, it favoured other applicants, allowing ‘speedy boarding for white people’.
Whilst the Home Office has not conceded the claims made in the judicial review challenge, it has confirmed the streaming tool will no longer be used in a bid to address issues surrounding unconscious bias in processing applications.
In the absence of a new system, the Home Office will for the time being determine visa applications on the basis of information about the specific applicant, rather than focusing on nationality.