Feminist AI in MENA – A Neutral Technology?

By: Nada Nassar and Nagla Rizk

Ever wondered if Artificial Intelligence (AI) is truly neutral? Is AI genuinely free from human prejudice? Is the technology impartial?

These questions and more are tackled by the <A+> Alliance's f<a+i>r Global Network.

The f<a+i>r network is a global, multidisciplinary feminist coalition of academics, activists and technologists committed to supporting the skill and imagination of Global South feminists in producing effective, innovative and interdisciplinary models that harness emerging technologies to correct real-life biases and barriers to women’s representation and equality.

As the <A+> Alliance’s Middle East and North Africa (MENA) Hub, the Access to Knowledge for Development Center (A2K4D) has been engaging with multiple stakeholders in the region through a series of f<a+i>r hub network meetings. Based at the American University in Cairo’s School of Business, A2K4D is a research hub exploring the role of digital technologies and innovation in promoting inclusive development, decent work standards in the platform economy, and responsible AI and data governance, all with a focus on Egypt, the Arab World and Africa. A2K4D embraces an interdisciplinary approach across all research projects and capacity-building efforts and adopts a gendered lens when investigating the challenges faced in the region.

Our workshop series helps us take steps towards answering the questions posed above.

For AI to be a truly neutral intelligence, it must be fully representative of all people, both in design and in the data sets fed into its system. This, however, is not always the case in the real world, and MENA is no exception.

To start with, a discussion of AI necessitates a critical look at the data fed into the technology. Data is a major building block of AI, as it is used to create and train AI and Algorithmic Decision-Making (ADM) systems, as our webinar speaker from Lebanon, Manal Jalloul, tells us. These data sets tend to exclude minorities and groups who are absent from national statistics, such as women working in the informal economy and those living in informal communities. This exclusion is rampant in different parts of the MENA region and typically translates into AI bias that exacerbates biases inherent in historical and cultural norms. Data sets need to be more inclusive and representative of all people in the region for ADM systems to work fairly and produce a more equitable future for all.
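
One concrete way to surface this kind of exclusion before any training happens is a simple representation audit: compare each group’s share in a dataset against a population benchmark and flag the gaps. The sketch below is a minimal, hypothetical illustration in Python; the group labels, benchmark shares and records are placeholders invented for the example, not data from the webinar or from the region.

```python
# Hypothetical sketch: auditing a training dataset for representation gaps
# before it is used to build an ADM system. All records and benchmark shares
# below are invented placeholders for illustration only.
from collections import Counter

def representation_gaps(records, group_key, population_shares):
    """Return each group's share in the dataset minus its population share."""
    counts = Counter(r[group_key] for r in records)
    total = sum(counts.values())
    gaps = {}
    for group, pop_share in population_shares.items():
        data_share = counts.get(group, 0) / total if total else 0.0
        gaps[group] = data_share - pop_share
    return gaps

# Illustrative records in which women are underrepresented, e.g. because
# informal workers never made it into the source statistics.
records = [
    {"gender": "male", "employment": "formal"},
    {"gender": "male", "employment": "formal"},
    {"gender": "female", "employment": "formal"},
]
population_shares = {"male": 0.5, "female": 0.5}

print(representation_gaps(records, "gender", population_shares))
# -> roughly {'male': 0.17, 'female': -0.17}: women are underrepresented
```

An audit like this does not fix the underlying exclusion, but it makes the gap visible and measurable before a biased dataset is turned into a biased system.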

The MENA region also lacks sufficient region-specific data, as regional and localized datasets are sparse. This scarcity is compounded by the limited amount of Arabic-language content compared to English content, and it results in outcomes that are not relevant to the region’s unique needs and context.

While addressing bias in datasets is crucial, additional biases can be ingrained in the design of the algorithms themselves. Algorithms are designed by humans who carry their own biases, and those biases tend to seep into the algorithms’ blueprints. The teams of AI designers and data scientists doing this work usually exclude minority groups, especially women. If women are excluded and neglected in the process of developing AI, then the technology is not neutral: AI is a gendered intelligence. This challenges the common perception of emerging technologies as completely “bias-free”.
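
One way such non-neutrality shows up in practice is as unequal outcomes across groups. The minimal sketch below, again a hypothetical illustration rather than anything drawn from the webinar series, computes a basic demographic parity gap: the difference in approval rates an ADM system gives two groups. A large gap does not prove intent, but it signals that bias has travelled from design or data into decisions.

```python
# Hypothetical sketch: measuring a demographic parity gap in ADM decisions.
# Each entry pairs an applicant's group with the automated outcome
# (1 = approved, 0 = rejected); the data below is invented for illustration.

def approval_rate(decisions, group):
    """Share of positive outcomes for one group."""
    outcomes = [d for g, d in decisions if g == group]
    return sum(outcomes) / len(outcomes) if outcomes else 0.0

def parity_gap(decisions, group_a, group_b):
    """Difference in approval rates between two groups (demographic parity)."""
    return approval_rate(decisions, group_a) - approval_rate(decisions, group_b)

decisions = [
    ("men", 1), ("men", 1), ("men", 0), ("men", 1),
    ("women", 0), ("women", 1), ("women", 0), ("women", 0),
]

gap = parity_gap(decisions, "men", "women")
print(f"Approval-rate gap (men - women): {gap:.2f}")  # 0.75 - 0.25 = 0.50
```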

The way we have historically recognized patterns and produced knowledge is gendered and has long been seen through a male lens. This brings up the need for a feminist epistemology, as our webinar speaker from Egypt, Aliah Yacoub, tells us. Epistemology is the branch of philosophy concerned with knowledge, asking questions such as: What is knowledge? Where does it originate, and how is it constructed? Feminist epistemology refers to knowledge that reflects the unique perspectives and experiences of women. More than ever, AI highlights the dire need to adopt a feminist epistemology in developing AI, in MENA and elsewhere.

In the case of MENA, the theoretical and socio-technical exclusion of women’s knowledge is exacerbated by their underrepresentation in the technology workforce. In Egypt, for example, women represent 38 percent of the workforce even though they account for 52 percent of students in STEM-related fields, largely because gender discrimination makes it difficult for them to translate their degrees into employment. Structural challenges facing women in STEM thus include the weak carry-over from STEM education to the workforce, socio-cultural norms and gendered labor, limited internet penetration, the digital divide and financial exclusion. These analogue and digital inequalities are intertwined and can be aggravated by AI technologies if not addressed.

The MENA region also has a significant digital gap, with low ICT and internet access, some of the highest internet prices in the world and limitations on which websites one may access. The region also lacks adequate infrastructure, which has delayed its development. All of these factors impact women. While there are high-level plans in the region to address these issues and invest in AI for smart cities, pensions, healthcare and more, not enough women or youth are present at the table. There is a huge disconnect between the different stakeholders who could play a role in bringing about change, such as those working in feminist spaces, those working in tech, and those involved in decision and policy making. Not enough major stakeholders are looking at AI through a feminist lens. It is necessary to bring all of these groups together to share knowledge and discuss ways to create a more inclusive future.

If we want AI systems to approximate general human intelligence, then we have to acknowledge that we are not only embodied beings; we are also gender-relational beings. This means that for AI systems to fully imitate the human brain in all its possibilities, female epistemology must be included within the algorithms. To avoid exacerbating historical biases through AI, the teams developing these solutions need to be more diverse and inclusive of minority groups, including women and people of color, so that human biases are taken into account and prevented from shaping the technology.

Nevertheless, there are signs of progress towards the inclusion of female epistemology in the MENA region, with a rise in female-led tech startups focused on FemTech. There have also been initiatives across the region to correct data biases by updating datasets and addressing female underrepresentation, to offer AI-curated tech training programs for women, and to run campaigns incentivizing quality investments in female-led startups.

Impactful, context-specific and women-led research on feminist AI is also emerging in the region. In Egypt, MENA Feminist AI researcher Marwa Soudi and her team are designing an “Explainable AI Based Tutoring System” grounded in Human-Centered Artificial Intelligence (HCAI) approaches to reduce dropout rates for girls in Egyptian community schools. The book ‘Arabic Glitch’ by Feminist AI researcher Laila Shereen Sakr “explores the concept of the glitch within the context of posthuman techno-feminist theory and practice and the implications of the interplay between technology and society.” Moreover, Raya Sharbain, from Jordan, advocates for digital self-defense to support sexual rights and reproductive justice movements amid growing crackdowns and threats.

Given the nascent state of research on this topic regionally, and the many questions that persist about the expected trajectory of AI with regard to women’s inclusion in MENA, there is a clear need for further solutions grounded in empirical research and for awareness raising around the societal and ethical issues of AI. As such, A2K4D continues to deepen its outreach and to develop a MENA-wide network of institutions and individuals who come together to share insights about women’s inclusion in the digital economy against a backdrop of increasing social inequalities in the region, and specifically to correct the biases, inequalities and implications of women’s representation (or lack thereof) in data and in algorithms. Our aim is to support the efforts of women and men working on transformational solutions, shaping technology and coding for more inclusive policy making as new technologies continue to permeate all aspects of our daily lives.
