After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie-detection tools tested at the border to software that verifies documents and transcribes asylum interviews, a wide range of technologies is being deployed in asylum applications. This article explores how these technologies have reshaped the ways asylum procedures are conducted. It shows how asylum seekers are turned into forced, hindered techno-users: they are required to comply with a series of techno-bureaucratic steps and to keep up with unpredictable, minute changes in criteria and deadlines. This obstructs their capacity to navigate these systems and to pursue their legal right to protection.

The article also illustrates how these technologies are embedded in refugee governance: they facilitate the ‘circuits of financial-humanitarianism’ that operate through a flurry of dispersed technological requirements. These requirements deepen asylum seekers’ socio-legal precarity by hindering their access to the channels of protection. The article further argues that analyses of securitization and victimization should be combined with attention to the disciplinary mechanisms of these technologies, through which migrants are turned into data-generating subjects who are disciplined by their reliance on technology.

Drawing on Foucault’s notion of power/knowledge and a territorial understanding of control, the article argues that these technologies have an inherent obstructiveness. They have a double effect: while they help to expedite the asylum process, they also make it difficult for refugees to navigate these systems. Asylum seekers are placed in a ‘knowledge deficit’ that leaves them vulnerable to false decisions made by non-governmental actors and to ill-informed and unreliable narratives about their situations. Moreover, the technologies pose new risks of ‘machine mistakes’ that may result in erroneous or discriminatory outcomes.