What is Technosolutionism in Human Rights?

Technosolutionism describes the way the world prioritises and values engineered solutions to human problems. It is the belief that a machine, software program or mobile phone application can improve the way something is done. Typically, this improvement is measured in speed, matching the Western world's unsavoury appetite for rapid development. However, the rush to embrace immediate technological fixes brings with it new human rights violations (Mattix, 2021). Creators of technology view these as unfortunate side effects; end-users ignore them; and they are justified by the ultimate problem that has been sped up and solved.

Furthermore, modern technology is driven by an ideology of competition that parallels the neoliberal agendas of Western civilisation. This pushes the architects of technology to release a product faster than their competitors, disregarding that it may not be finished or fully assessed by the end-user. With this comes the belief that technical issues, and even human rights violations, can be patched on the run.

There are two other important issues to consider in the invention of technology. Firstly, there appears to be a belief that repairing problems with technology will lead to a finish line where all the world's dilemmas are solved. The Western world overlooks the need for comprehensive assessments of how technology influences the human rights discourse. Without this understanding there is an inherent risk of creating new human rights violations that must then be solved by yet another solution.

Secondly, technology is skewed by the biases of those who create it. Silicon Valley is a long way from the heart of where most refugees come from, and its confidence outweighs its capability to solve issues it does not completely understand. Several scholars and activists have identified this, noting that there is "tremendous denial on the part of Silicon Valley elites and governments to acknowledge and act upon the myriad social harms that emanate from their products and services" (UCLA Center for Critical Internet Inquiry, 2020, p. 5).

A prime example is the documentary Coded Bias (Kantayya, 2020). In it, Kantayya outlines that artificial intelligence features such as facial recognition services are "based on data, and data is a reflection of our history". The statement follows the discovery of systemic failures in this software to identify physical features, failures caused by racial bias in the data input during the creation phase. The documentary further explains how machine learning algorithms used in the asylum seeker application process can perpetuate society's existing racial, class and gender-based inequalities.

This is a frightening reminder that technology is not neutral: it embodies the assumptions and prejudices of those who build it.


Data Extraction, Refugees and Human Rights Violations

The Pitfalls of Technosolutionism and Refugee Rights
