Living in the shadow of the Reference Man

Ola Gwozdz
Women in Technology
5 min read · Mar 6, 2024


Imagine a 20–30-year-old, white, six-foot man who weighs 11 stone (about 70 kg). You're likely to think of a character from a Hollywood movie or a model on the cover of GQ, and you would not be wrong. The 'person' I want to introduce you to, however, is the Reference Man.

The Reference Man, who stands tall with his arm raised boldly, ready to take on the world, was the vision at the heart of Le Corbusier's system for re-ordering the universe. The Modulor (as named and envisioned by Le Corbusier in 1943) was designed to distil a single human form into a universal mathematical, aesthetic and design reference (El Modulor de Le Corbusier, 1943–1954). By the early 1950s, the first Modulor-inspired building had been erected. From then on, the Reference Man became a standard in architecture, medicine and anatomy, and even the dimensional reference for furniture and seat design. Every step he takes, every door he opens and every chair he sits on is perfectly designed to ensure his comfort.
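
To see how one body could become a 'universal' ruler, here is a rough sketch of the Modulor's golden-ratio arithmetic. This is my reconstruction from its commonly cited dimensions (183 cm tall, navel at about 113 cm, raised arm at about 226 cm), not Le Corbusier's published tables:

```python
# A back-of-the-envelope reconstruction of the Modulor's arithmetic,
# anchored to the six-foot Reference Man.

PHI = (1 + 5 ** 0.5) / 2  # the golden ratio, ~1.618

height = 183.0        # six feet, in centimetres
navel = height / PHI  # ~113 cm: the base unit of the system
raised_arm = 2 * navel  # ~226 cm: double the base unit

# Descending the series: each dimension is the previous one divided by
# phi, generating the measurements applied to doors, chairs and desks.
dims, d = [], navel
for _ in range(6):
    dims.append(round(d, 1))
    d /= PHI

print(f"navel = {navel:.1f} cm, raised arm = {raised_arm:.1f} cm")
print("descending series (cm):", dims)
```

Every value in that cascade is derived from one idealised male body, which is precisely why everything it sizes fits him so well.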

But the Reference Man represents far more than a set of measurements (measurements that don't represent an average man, let alone a human being). He embodies an ideology of inequality: the supremacy of a single expression of what it means to be human, one that dominates our built and natural environments today and has seamlessly seeped into the digital models and technologies operating in our cities.

According to the Cities Alive: Designing Cities that Work for Women report, produced by Arup in partnership with the United Nations Development Programme and the University of Liverpool, gender bias is built into the fabric of our cities. The report shows that "the way our cities are planned, built and managed can significantly restrict women's ability to move around, be economically active", feel safe or be represented (only 2–3% of statues are of women) (Candiracci and Power 2022).

Feminist standpoint theory recognises that knowledge (i.e. data) is situated in a particular location, shaped by social, political and cultural viewpoints, and can therefore never be truly neutral (Harding 2004; Haraway 1988). As Dr Buolamwini's research (amongst many others) shows, "the past dwells in our data", and as a result many of our digital technologies are coded with sexism, racism and other forms of discrimination (Buolamwini 2023; Crawford and Calo 2016). Buolamwini calls this the "coded gaze": forms of digital representation created (consciously or unconsciously) with male users in mind. For example, darker-skinned women are 32 times more likely to be misclassified than lighter-skinned men by the facial recognition tools used by CCTV cameras in our airports and cities (Buolamwini 2023). The 2018 "Face Off" report by the British civil liberties organisation Big Brother Watch found that "a staggering 95% of 'matches' wrongly identified innocent people", with the Metropolitan Police holding the worst record at less than 2% accuracy (https://bigbrotherwatch.org.uk/research/). The "coded gaze" of the Reference Man influences not only the biased selection of training data but also the algorithmic bias in some generative AI tools.
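
The arithmetic behind such disparity figures is simple to illustrate. Below is a minimal sketch, in the spirit of audits like Gender Shades, of comparing error rates across demographic subgroups; the records are invented placeholders, not any study's actual data:

```python
# A minimal demographic error-rate audit: count misclassifications per
# subgroup and report the disparity ratio between the worst- and
# best-served groups. All records below are illustrative placeholders.

from collections import Counter

results = [
    {"subgroup": "darker-skinned women", "correct": False},
    {"subgroup": "darker-skinned women", "correct": True},
    {"subgroup": "lighter-skinned men", "correct": True},
    {"subgroup": "lighter-skinned men", "correct": True},
    # ... a real audit would use thousands of labelled examples
]

totals, errors = Counter(), Counter()
for r in results:
    totals[r["subgroup"]] += 1
    if not r["correct"]:
        errors[r["subgroup"]] += 1

# Error rate per subgroup, then the headline disparity ratio.
rates = {g: errors[g] / totals[g] for g in totals}
worst, best = max(rates.values()), min(rates.values())
for g, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{g}: {rate:.1%} error rate")
print(f"disparity ratio: {worst / max(best, 1e-9):.1f}x")
```

The point is that an overall accuracy figure can look respectable while hiding exactly this kind of gap, which is why audits break the numbers down by subgroup.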

Nicoletti and Bass's investigation of text-to-image AI platforms reveals just how much these models are informed by stereotypes rather than statistics (Bloomberg, 2023). Having analysed over 5,000 images generated by Stable Diffusion (a text-to-image AI platform), they found the racial and gender disparities to be far more extreme than those observed in the real world. According to Stable Diffusion, "the world is run by white male CEOs, women are rarely doctors, lawyers or judges (…) men with dark skin commit crimes, while women with dark skin flip burgers" (Nicoletti and Bass, Bloomberg, 2023).
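
The shape of such an audit is worth making concrete. Here is a simplified sketch along the lines of the Bloomberg investigation, assuming the Hugging Face diffusers library and a GPU; the prompts paraphrase those reported in the piece, and estimate_demographics is a hypothetical stand-in for the annotation step a real audit would use:

```python
# A skeleton of a text-to-image bias audit: generate many images per
# occupation prompt, label the perceived demographics of each, and
# compare the distribution against real-world occupational statistics.

import torch
from collections import Counter
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

def estimate_demographics(image):
    # Placeholder: a real audit would use human annotators (or a
    # separately validated classifier) to label perceived gender and
    # skin tone. A dummy label keeps this skeleton runnable end to end.
    return "unlabelled"

prompts = ["a photo of a CEO", "a photo of a doctor", "a photo of a judge"]
counts = {p: Counter() for p in prompts}

for prompt in prompts:
    for _ in range(100):  # the Bloomberg audit generated thousands
        image = pipe(prompt).images[0]
        counts[prompt][estimate_demographics(image)] += 1

# Comparing `counts` with labour statistics is what reveals whether the
# model amplifies stereotypes beyond what the real world looks like.
print(counts)
```

The crucial design choice is the final comparison: the model is judged not against an abstract ideal but against documented reality, which is how the investigation could show disparities "more extreme" than the world it claims to depict.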

Some experts predict that within the next few years, 90% of content on the internet could be artificially generated, further obstructing our view of reality. Google, in an attempt to 'fix' the algorithmic bias in its AI model Gemini, went in the opposite direction, overcompensating for political correctness. As a result, Gemini ended up generating wildly inaccurate historical content, and with it erasing the history of gender and racial discrimination (Kleinman, BBC, 2024). The after-the-fact reflection by Gemini's Senior Director of Product on the importance of nuance in historical contexts shows promise (Reuters, 2024). However, it is not just historical context that matters: the situatedness of knowledge (who produced it, and where) must also be recognised in those contexts and allowed to inform any future designs.

If we truly want to claim the neutrality of the data within our digital and physical structures, we must first consider their social, political and ideological origins. This realisation, and the informed practice that follows from it, is critical for ensuring the effectiveness and relevance of the physical and digital structures we build and shape.

Equally, by paying attention to situated knowledge and acknowledging the origins of our data, we can help create meaningful, diverse and impactful solutions.

PolArctic is the first AI platform to train its model on a blend of Indigenous Knowledge, satellite data and scientific research. Sanikiluaq, an Inuit community in Nunavut, Canada, is shaping modern technology with a thousand years of wisdom. "AI and Indigenous culture are often positioned as if in conflict with each other, but this project achieved a level of success that would not have been possible without the benefit of both." (Canavera, WWF, 2021)

In French Polynesia, an Indigenous-led conservation project rests on another powerful, mutual partnership: local wisdom and experience powered by AI. An AI-mediated analysis of the reef's recorded soundscape tells the local team whether additional restoration efforts are needed (Martinescu, 2023).
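
As a toy illustration only (not the project's actual model), a soundscape check of this kind might reduce to computing simple acoustic statistics from a recording and flagging unusually quiet sites. The file name and threshold below are invented for the sketch:

```python
# A crude soundscape proxy: healthy reefs tend to be noisy (snapping
# shrimp, fish calls), so low acoustic energy can flag a site for
# follow-up by the local team.

import numpy as np
import librosa

# Hypothetical recording from a monitored reef site.
y, sr = librosa.load("reef_site_recording.wav", sr=None)

rms = float(np.mean(librosa.feature.rms(y=y)))
flatness = float(np.mean(librosa.feature.spectral_flatness(y=y)))

# The threshold is invented; a real system would be trained and
# validated against recordings labelled by the people who know the reef.
QUIET_THRESHOLD = 0.01
if rms < QUIET_THRESHOLD:
    print("Low acoustic activity: site may need additional restoration")
else:
    print(f"Acoustic activity OK (rms={rms:.4f}, flatness={flatness:.3f})")
```

Even in this simplified form, the design mirrors the project's premise: the algorithm supplies measurement at scale, while the judgement about what a healthy reef sounds like comes from the community.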

This equal meeting of Western knowledge and Indigenous Wisdom acknowledges the importance of nuance and mutual learning, and with it unlocks meaningful, responsible innovation that can tackle challenges at both the local and the global scale.

As a Japanese proverb teaches: “Knowledge without wisdom is a load of books on the back of an ass”.

Maybe the universal truth that Le Corbusier sought so desperately does not reside in one universal form (or a single AI platform) but rather in embracing the many shapes and shades that hold the secret to solving the universal challenges we share. This requires knowledge and wisdom that expand beyond those of the Reference Man, calling for a much-needed diversity of scientific disciplines and of human and non-human stakeholders to inform both training data and algorithms.

The shadow of the Reference Man hovers over us, narrowing our vision and creating blind spots. But if we dare to step outside of it, our horizons will expand and our minds will broaden.

All of the content was created by real humans and industry experts.

Want to learn more? Here are just some of the resources that inspired this article:

https://www.arcticwwf.org/the-circle/stories/blending-indigenous-knowledge-and-artificial-intelligence-to-enable-adaptation/

https://www.arup.com/perspectives/publications/research/section/cities-alive-designing-cities-that-work-for-women

https://www.bbc.co.uk/news/technology-68412620

https://bigbrotherwatch.org.uk/research/

https://www.bloomberg.com/graphics/2023-generative-ai-bias/

https://data-feminism.mitpress.mit.edu/

https://oxfordinsights.com/insights/ai-indigenous-intelligence/

https://youtu.be/jcbt6V_fp44?si=SydnuB3ase6o2AXv

https://www.unmasking.ai/

Ola Gwozdz is a Data Philosopher, Doctoral Candidate, MSc in Innovation, Leadership and Management, music producer and a co-founder of a non-profit.