Systems and Asylum Procedures

After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies have revived these systems. From lie-detection tools tested at the border to programs that verify documents and transcribe interviews, a wide range of systems is now used in asylum applications. This article explores how these technologies have reshaped the way asylum procedures are conducted. It shows how asylum seekers are transformed into forced, hindered techno-users: they are required to follow a series of techno-bureaucratic steps and to keep up with unforeseen changes in criteria and deadlines. This obstructs their capacity to operate these systems and to pursue their right to protection.

It also shows how these technologies are embedded in refugee governance: they facilitate the ‘circuits of financial-humanitarianism’ that operate through a web of dispersed technological requirements. These requirements increase asylum seekers’ socio-legal precarity by hindering their access to channels of protection. The article further argues that studies of securitization and victimization should be combined with insight into the disciplinary mechanisms of these technologies, through which migrants are turned into data-generating subjects who are disciplined by their dependence on technology.

Drawing on Foucault’s notion of power/knowledge and territorial understanding, the article argues that these technologies have an inherent obstructiveness. They have a double impact: while they help to expedite the asylum process, they also make it difficult for refugees to navigate these systems. Refugees are placed in a ‘knowledge deficit’ that leaves them vulnerable to erroneous decisions made by non-governmental actors and to ill-informed and unreliable narratives about their cases. Moreover, these technologies pose new risks of ‘machine errors’ that may result in incorrect or discriminatory outcomes.
