Artificial Intelligence and its Imperialist Roots
Natasha Karanja | July 21, 2022 | Access To Information, Artificial Intelligence
Introduction
According to a United Nations Special Rapporteur report, “emerging digital technologies” are held to “exacerbate and compound existing inequalities” that exist along “racial, ethnic and national origin grounds.”1 AI is held to be a technological advancement that reshapes “modern societies” and the power relations within them.2 Despite the positive change AI yields, it poses significant threats that may further aggravate the position of vulnerable individuals.3 These risks are held to be a natural progression of the ‘colonial system’ embedded within the current global order.4 Within this progression, Mbembe argues that we are currently facing a third shift in the “arrangement of race and ethnicity in global society.”5 The first shift was slavery and colonisation; the second looked towards the development of writing and text that articulated the formal process of decolonisation; and the third looks towards the “proliferation” of digital technologies representative of the “latest phase of high-modernity.”6
AI and Power Relations
AI and its innovations have a large influence on the cultural, political and economic segments of modern society, and appreciating this assists in understanding the role and impact of AI.7 AI possesses a dual role of “object and subject”, where we view AI as “systems of networks and institutions.”8 Within these systems and institutions, questions of values and power relations come into play: what sort of values are implanted, and what sort of values and norms should be embraced when utilising AI?9 In what ways does AI mitigate asymmetrical power dynamics, and do “unacknowledged and unquestioned systems of values and power inhibit assessment of the harms and failures present”?10
These questions highlight the concern that AI ‘obscures’ power relations in a manner that prevents concerned stakeholders from addressing various needs during the formulation of AI systems and technologies. Looking towards previous global orders, “the intention to deepen racial inequalities was more explicit”, while in today’s global order inequity is preserved through the negligence of those who design the systems and technologies, as no careful consideration is given to the inequalities already in place.11
Algorithmic Colonization
Applying this, we look towards key illustrations such as algorithmic colonization, defined as the presence of “coloniality features” in algorithmic decision-making systems.12 Algorithmic colonization is motivated by capitalistic objectives of “profit maximisation at any cost”, with values and ethics held to be secondary concerns.13 This is illustrated by the activity of the Global North, which aims to control the Global South’s digital ecosystem through the presence of Western monopoly powers.14 These monopolies aim to “liberate” and present solutions to disadvantaged groups.15 The solutions are held to be “colonial tales” under the “guise of technology”.16 Evidence of this is found within initiatives from Western monopolies such as Facebook and its strategic initiative of “connecting the unconnected.”17
Context here comes into play, as it is hard to conceptualise how Western monopolies can assist with solutions when they have no understanding of, or experience with, the realities of the Global South.18 The above is a clear example of algorithmic colonization presented as “technological solutions” for the “developing world”.19 These solutions usually receive a positive reception and are rarely analysed.20 Birhane argues that context matters in mitigating the above, as we first have to appreciate that “systems vary from culture to culture”, including “what is considered to be a vital problem and a successful solution.”21 Therefore, solutions formulated by one culture may not be implemented effectively within another.22
Furthermore, applying the context argument within AI discourse, one can observe an under-representation of the Global South. Power imbalances are present as “more economically developed countries are shaping discourse more than others, which raises concerns about neglecting local knowledge, cultural pluralism and the demands of global fairness.”23 Therefore, exclusion of the Global South leads to “coloniality features” in algorithmic decision-making systems that create “new labour markets, impact geopolitical power dynamics and influence ethics discussion.”24 In order to steer AI discourse towards a more inclusive approach, there is a need to ‘decolonise’ the system.
Way Forward
The “language of decoloniality” assists with dismantling the clear hierarchies of power present.25 It offers AI discourse a starting point for assessing the imbalances of power in AI, shedding light on the “structural dependencies” of the Global South, the assessment of infrastructures and the investigation of power imbalances in the “design/development/deployment” of the system.26 This can be done through adopting and implementing strategies that are inclusive of African research and expertise and that acknowledge the ‘African context’.27 Stakeholders (governments, intergovernmental organisations, AI entrepreneurs and corporate actors) should partner with one another to make conscious, informed decisions regarding the regulation, formulation, deployment and use of AI within the African context.28 This will assist in providing “fairness, accountability and transparency” within AI.29 Therefore, decolonisation here involves adopting a contextual and conscious approach to the deployment of AI within the Global South. Consciousness here means AI that is designed and developed with the intention of meeting and serving the needs and wants of the Global South, so that the interests of the Global South are included within the main discourse of AI.
1 United Nations General Assembly (UNGA), Racial discrimination and emerging digital technologies: a human rights analysis; Report of the Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia and related intolerance (15th June – 3rd July 2020) A/HRC/44/57.
2 Mohamed S, Png M-T & Isaac W, Decolonial AI: Decolonial Theory as Sociotechnical Foresight in Artificial Intelligence [2020] Philosophy & Technology 33.
3 ibid.
4 Adams R, Can Artificial Intelligence Be Decolonized? [2021] Interdisciplinary Science Reviews 46(1–2); Artificial Intelligence and its Discontents.
5 Mbembe A, Critique of Black Reason (Witwatersrand University Press, 2017).
6 ibid.
7 Mohamed (n2).
8 ibid.
9 ibid.
10 ibid.
11 Benjamin R, Race After Technology: Abolitionist Tools for the New Jim Code (Polity 2019).
12 Adams (n4).
13 Birhane A, The Algorithmic Colonization of Africa [2020] SCRIPTed 17(2) 391.
14 ibid.
15 ibid.
16 Michael Kimani, “5 Reasons Why Facebook’s New Cryptocurrency ‘Libra’ is Bad News for Africa” (Kioneki, 28 June 2019), available at <https://kioneki.com/2019/06/28/5-reasons-whyfacebooks-new-cryptocurrency-libra-is-bad-news-for-africa> last accessed 1st July 2022.
17 Abecassis D, Korsukova E, Kende M, Morgan R & Novik S, Analysys Mason: The Impact of Facebook’s Connectivity Initiatives in Sub-Saharan Africa, Report for Facebook (June 2020); expands on the various initiatives Facebook has undertaken to tackle the barriers to connectivity and ensure the unconnected are connected.
18 ibid.
19 Birhane (n13) 393.
20 ibid.
21 ibid 395.
22 ibid.
23 Dutton T, An Overview of National AI Strategies [2018] <https://medium.com/politics-ai/an-overview-of-national-ai-strategies-2a70ec6edfd> last accessed 4th July 2022.
24 Mohamed (n10).
25 ibid.
26 Irani L, Vertesi J, Dourish P, Philip K & Grinter RE, Postcolonial Computing: A Lens on Design and Development [2010] in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (New York: ACM).
27 Besaw C & Filitz J, Artificial Intelligence in Africa is a Double-Edged Sword [2019] Governance, Technology, Urban Development <https://ourworld.unu.edu/en/ai-in-africa-is-a-double-edged-sword> last accessed 5th July 2022.
28 ibid.
29 ibid.