AFFECTING MACHINES:
A TOOLBOX FOR EQUITABLE AI
ABOUT US
In Spring 2023, Concordia’s Applied Artificial Intelligence Institute was awarded a research grant through the Commission des partenaires du marché du travail to improve gender equity within AI/STEM sectors. The research followed a community-based action methodology to collaboratively establish meaningful trajectories for the project. To that end, an interdisciplinary working group, Affecting Machines, was launched. The working group is made up of researchers, community representatives, and AI professionals who create practical tools for those interested in improving gender equity.
We affect machines: We create the data they learn from and develop the algorithms that guide them. Machines affect us in benign, positive, and harmful ways: they recommend the songs we listen to and the movies we watch, they identify the food we buy in self-checkout aisles, and they can (mis)identify our faces in crowds and report our location to police. Through this project, we seek to affect machines through intersectional feminist methodologies, engaging data ethics of care, to contribute to the development and deployment of equitable systems.
WHY GENDER EQUITY?
Women have historically been underrepresented in Science, Technology, Engineering and Mathematics (STEM) fields and careers, especially in computer science and artificial intelligence (AI). For instance, in 2017, women constituted merely 35% of students enrolled in STEM-related fields in higher education worldwide (UNESCO, 2017). The underrepresentation of women in STEM extends beyond higher education into the professional world. In 2023, only 29.2% of all STEM workers on LinkedIn around the world were women, and that number dropped to 12% for executive roles (World Economic Forum, 2023). In the global AI labor market, women constituted 30% of the AI workforce in 2023 (World Economic Forum, 2023).
This rate of participation is much lower for Black women and women of color. Gender non-conforming individuals are also likely underrepresented in STEM, computer science, and AI fields and careers, though this cannot be confirmed because no public data exists on trans and gender-diverse worker representation. To address these inequities, we adopt an intersectional approach to machine learning, highlighting how sexism is reproduced both in the workplace and through AI systems.
Beyond questions of representation among researchers and professionals in the field, gender is a conceptually fraught category within AI systems themselves. In order to recommend products or music, or to generate text or images, algorithms make a variety of assumptions about gender that are often misaligned with current understandings of what gender is, how it should be encoded, and how a gender variable can be used ethically.
– TOOLS & RESOURCES –
- NORMATIVE PRINCIPLES
- TRADING CARDS + TIMELINE
- EQUITABLE HIRING PRACTICES
- GEMinAI MENTORING PROGRAM