The first legal challenge against the government's use of an algorithm is currently going through the UK courts. The algorithm, which was rolled out in secret, is being used by the Home Office to “stream” visa applications according to their supposed level of risk.
In assessing the level of risk, the Home Office takes into account the applicant's nationality. This essentially leads to "speedy boarding for white people", say the Joint Council for the Welfare of Immigrants and the not-for-profit Foxglove, who are bringing the litigation.
In the fourth and final episode of our podcast series, we spoke to Cori Crider, a lawyer and the Director of Foxglove, about how algorithms can lead to poor-quality and biased decision making. We also hear about the barriers civil society faces in scrutinising automated decision-making and why this is a major challenge of our time.
The High Court recently found a fee charged to children registering their British citizenship to be unlawful. In conversation with Amnesty's Steve Valdez-Symonds, openJustice gets an insider's view of the case: the historical and political context and some of the practicalities of bringing it.
openJustice spoke to Abi Brunswick and Clare Jennings of Project 17 about their strategies for overcoming unlawful practices by local authorities that refuse to provide support to destitute migrant families.