Almost two years ago, I picked up Caroline Criado Perez’s book Invisible Women and devoured its contents. It is not an exaggeration to say that I have not stopped thinking (and subsequently telling everyone I meet) about this book. They may be sick of hearing about it, but I will not stop talking about this book until the data gap between men and women disappears.
In my own words, the book explores how data is often collected only for and about men, and consequently the world is simply not built with women in mind — at the extreme, this means women die more often than men in otherwise similar situations.
A couple of examples from memory:
1. Up until very recently, there were no female crash test dummies at all, which helps explain why women die more often than men in similar crash situations.
*I did check for updates and found that, according to the US Department of Transportation’s National Highway Traffic Safety Administration, there are two 50th-percentile-height male crash dummies (to simulate the average man), but the two female crash dummies are both, inexplicably to my mind, 5th percentile. As a 5'10" woman, I find it highly concerning to be represented only by a 4'11" crash dummy. I would hope to be represented by a dummy of more average height (the average height of women in the US being 5'4").
2. Drugs and medicines are typically tested only on men (because scientists don’t want to have to account for women’s pesky monthly hormonal shifts), and the typical dose is usually calibrated for a 30-year-old white man. This can lead to drugs treating women less effectively than men, and to woman-specific medical issues not being researched with the same frequency as “universal” or male-specific ones.
Basically, when the world is created by men, men come to be seen as the Default Human. Headlines and studies proclaiming some new finding “for people” often did the research only on men — and this despite the fact that women make up 51% of the US population.
So why is this the case?
It is typically a result of who is in charge and who is present in STEM rooms. Even as gender gaps shrink in many fields, men are often still the ones in charge — the ones who decide what is important enough to study. Until women hold these positions of power (or we have male allies willing to use theirs to work towards equal data gathering and analysis), we will never close the data gap.
So how does this all relate to software engineering?
Okay, Dafoe, I will tell you.
These same gender gaps influence the internet, web applications, and the technology that we humans use every day. It becomes a problem when the creators of this technology are homogeneous — and not merely in terms of sex.
In her book, Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Umoja Noble analyzes six years of academic research on Google searches. To quote the great Wikipedia (because they say it better than I could):
Noble argues that search algorithms are racist and perpetuate societal problems because they reflect the negative biases that exist in society and the people who create them. Noble dismantles the idea that search engines are inherently neutral by explaining how algorithms in search engines privilege whiteness by depicting positive cues when key words like “white” are searched as opposed to “asian,” “hispanic,” or “Black.” (source 2)
If we only have white men creating “the internet,” the internet will be inherently biased towards white men. And the statistics don’t look good.
As of 2021, the five largest tech companies (Amazon, Apple, Facebook, Google, Microsoft) employed only 34.4% women. And that is across all types of employees — the stats for software engineering in particular are bleak:
Women currently remain highly underrepresented in software engineering (14% of total workforce) and computer science-related jobs (25% of total workforce). In fact, women software engineer hires have only increased 2% over the last 21 years. (source 3)
2% increase in the last twenty-one years!
And during COVID, women were twice as likely to be furloughed or laid off as their male counterparts. We will have to wait and see the updated data on women’s representation in the field.
So that is one of the reasons why I am here. I want to be the woman in the room (or, preferably, one of many) who can remind everyone that half the world is female, and to create accordingly.
In the past year, I heard of a COVID contact-tracing app in the UK that accidentally made it fairly easy for men to stalk women through the app. This safety issue could have been avoided if a woman had been in the room saying, “yeah… maybe we shouldn’t make the location/sex/names of people so clearly visible in our app…”.
Let’s also take a moment to remember that the precursor to the incredibly influential Facebook was FaceMash — a site where users ranked and rated women’s physical attractiveness. So… yah.
Technology has great power.
And with great power comes great responsibility.
Create responsibly, fellow software engineers :)
Sources & Resources:
*** I highly recommend signing up for Caroline Criado Perez’s newsletter at https://carolinecriadoperez.com/ ***