Introduction

In a social experiment, I asked ChatGPT: which publications would you recommend on gender and student activism in Africa? Several of the scholars on the resulting list were non-African.

As an emerging scholar in the field, I went further and asked why my own publications on gender and student activism were left out. The response I received was: “Leaving out African-based scholars — especially contemporary African women scholars working directly on these themes — reproduces the very silences that feminist and decolonial scholarship critiques.” This is an admission of AI’s lapses: inventing data, misattributing specific titles, silencing young African women’s scholarship, and producing forms of epistemic injustice. Most AI tools lack sufficient representation of ethnic, cultural, gender, age, geographic, or economic diversity because of the epistemic biases embedded in them.

Global dependence on digital technologies has increased tremendously in recent years. In research, Artificial Intelligence (AI) tools such as language models, automated citation managers, literature review tools, and data-analysis systems provide cognitive assistance to researchers in executing intellectual tasks. Higher education institutions, research institutions, government and private bodies, and international organisations have raised concerns about overdependence on AI and how it facilitates the spread of unverified, biased data and deepens epistemic injustice. Ethically, digital media in Africa must ensure epistemic justice through a conceptual decolonisation of Western-fabricated knowledge systems about Africa. Digital data on Africa must be centered on local practitioners, women, and marginalized groups.

The myth of technological neutrality

The use of AI raises several issues. It prompts ethical questions about authorship, originality, accountability, epistemic authority, epistemic justice, and power relations in knowledge production. AI software often reinforces dominant epistemologies while silencing local, indigenous, and feminist knowledges. AI tools frequently reproduce existing structural biases embedded in their training data, including Eurocentrism, gender and racial bias, and the marginalization of Global South and African perspectives, scholarship, languages, epistemologies, and social realities. Describing this as a twenty-first-century form of capitalism, Couldry and Mejias (2018) note that developers of social media platforms, while serving as digital data brokers, remain largely unregulated in collecting personal, family, medical, financial, and criminal data.1 Questions have also been raised about the digital female AI assistants of Amazon (Alexa) and Apple (Siri), which reinforce a narrative of female subservience. Additionally, Noble (2018) observes that negative biases against women of color are embedded in search engine results and algorithms. In a test case, the author ran a Google search for “black girls” and found sexually explicit terms, such as “big booty,” among the top results.2

African(ist) scholars such as Ndlovu-Gatsheni (2023) have called for African-centric evaluations that recognize African communities as legitimate producers of knowledge.3 To decolonise knowledge production in the digital age, we must also decolonise the datasets and model parameters built into AI software. African scholars must critique Western-generated AI epistemologies and theories, along with their one-size-fits-all approach.

The Use of AI in Peacebuilding

In peacebuilding, the one-size-fits-all agenda is perpetuated by neoliberal Western donors and funders, who pay inadequate attention to African-centered local frameworks for peacebuilding.4 There is also the danger that AI-driven misinformation escalates violent conflict by promoting top-down liberal peacebuilding approaches.

There is a broad call by local peacebuilding scholars for the adaptation of “local” solutions to conflict situations, such as the use of indigenous institutions and philosophies like Ubuntu.5 In their view, solutions to poverty, economic mismanagement, undemocratic political systems, religious and ethnic tensions, and corruption must be locally derived. African Solutions to African Problems (AfSol) is one such model, proposed by Thabo Mbeki, who advocated the African Renaissance as a path to resolving Africa’s political and security challenges.6 Using the Dagbon crisis as an example, Issifu and Bubakri (2022) call for the adoption of a home-grown peace mechanism through the involvement of traditional actors in conflict resolution and peacebuilding.7 Murithi likewise calls for the use of Ubuntu, restorative justice, and community-based reconciliation in peacebuilding practice; Ubuntu places greater emphasis on reconciliation as a means of restoring harmony. To use AI in African peacebuilding, AI tools and models should equally take indigenous institutions and philosophies into consideration so as not to replicate Western-grown peace models.

Ethics of African peacebuilding scholarship

As established above, AI tools are not neutral: they replicate epistemic injustice, gender, racial, and ethnic inequalities, and epistemicide. African scholars and researchers must constantly question the authorship and originality of data produced about Africa. To ensure accountability, epistemic authority, epistemic justice, and balanced power relations in peacebuilding, scholars must decenter AI-driven liberal peace models by adopting home-grown African peacebuilding models. While it may be difficult to prevent scholars, researchers, and policymakers from using Western-generated AI tools, it is essential to ensure originality in thought and writing through the use of indigenous knowledge systems. These indigenous knowledge systems should, in turn, be integrated into African-modelled AI tools.

AI tools for peacebuilding would need to be locally owned, adapted to community-based indigenous knowledge systems, transparent, and inclusive. African higher education institutions should spearhead the development of African AI models that rely on indigenous knowledge data rather than replicating Western AI models. African leaders must highlight the uniqueness of African socio-cultural, economic, and political settings by digitalizing African initiatives to prevent data theft and misappropriation.

Endnotes

  1. Couldry, Nick, and Ulises Mejias. “Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject.” Television & New Media (2018).
  2. Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press, 2018.
  3. Ndlovu-Gatsheni, Sabelo J. “Intellectual Imperialism and Decolonisation in African Studies.” Third World Quarterly (2023): 1–18.
  4. Richmond, Oliver P. A Post-Liberal Peace. London: Routledge, 2011.
  5. Aning, Kwesi, and Naila Salihu. “Regional Approaches to Statebuilding II: The African Union and ECOWAS.” In The Political Economy of State-Building: Power After Peace, edited by Mats Berdal, 74–88. London: Routledge, 2013.
  6. Aning, Kwesi, and Naila Salihu. “The Dog That Did Not Bark: Why Has Sierra Leone Not Returned to War After Peacekeepers Left?”
  7. Adebajo, Adekeye. In Terence McNamee and Monde Muyangwa, eds. The State of Peacebuilding in Africa: Lessons Learned for Policymakers and Practitioners. Cham: Palgrave Macmillan.