(this blog post is part of a DEV celebration of the annual #wecoded campaign - formerly known as #shecoded)
As I'm nearing my fourth year in tech, I see the same discussions around gender in tech happen over and over again - and I'm tired. In the past, I found that writing about these problems always left me exhausted and deflated for a few days, so now I'll try something different: I turned the reasons for my fatigue into ideas for your next-generation AI app ✨
Salary negotiation
Contrary to common belief, women are just as good (or, actually, as bad) as men at salary negotiation - that is, when they actually negotiate. Sadly, a number of factors can impact a woman's willingness to negotiate an offer, including how the job description is written:
When jobs advertisements contained female-threatening language, women lowered their salary minimums compared to when jobs were described in a female-supportive way, whereas men were unaffected. (...) If women have lower aspirations, they may feel less motivation to negotiate and simply accept the first offer.
Consistent with my own experience, another study found that women are especially good at negotiation when advocating for the salary of others:
Our research suggests that (...) women perform better when negotiating on behalf of others than they do when negotiating for themselves; no such difference emerges among male negotiators. (...) Female executives negotiating as the mentor secured compensation that was 18 percent higher than the compensation female executives negotiated when they were playing the candidate. Meanwhile, male executives performed consistently across both roles, at the level of female executives negotiating as the candidate.
(...) It's not that our female participants felt less entitled to a good salary. (...) Nor were women more or less competent at the negotiation itself. Rather, it appears that the women executives were particularly energized when they felt a sense of responsibility to represent another person's interests.
⭐️ Here are some ideas for what AI could do:
- a real-time salary negotiation assistant that provides the applicant with a pep talk, sound bites, and arguments for a successful negotiation
- estimating what folks in a given role might be making (not only via Glassdoor or Blind), assessing the candidate's skill set and experience, and suggesting acceptable salary ranges based on that
- (for HR) scanning job descriptions for gendered trigger words and suggesting alternatives (see the sketch after this list)
- (for candidates) an extension that replaces gender triggers in job descriptions with gender-neutral language
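To make the job-description scanner idea concrete, here's a minimal sketch, assuming a simple word-list approach. The trigger words and alternatives below are illustrative stand-ins, not a vetted lexicon - a real tool would want a research-backed word list or a trained model:

```python
import re

# Illustrative trigger words with gender-neutral alternatives.
# These are placeholders, not a research-backed lexicon.
MASCULINE_CODED = {
    "ninja": "expert",
    "rockstar": "skilled developer",
    "aggressive": "proactive",
    "dominant": "leading",
    "competitive": "goal-oriented",
}

def scan_job_description(text: str) -> list[tuple[str, str]]:
    """Return (trigger word, suggested alternative) pairs found in the ad."""
    return [
        (word, alternative)
        for word, alternative in MASCULINE_CODED.items()
        if re.search(rf"\b{word}\b", text, flags=re.IGNORECASE)
    ]

def neutralize(text: str) -> str:
    """Rewrite the ad, swapping trigger words for neutral alternatives."""
    for word, alternative in MASCULINE_CODED.items():
        text = re.sub(rf"\b{word}\b", alternative, text, flags=re.IGNORECASE)
    return text

if __name__ == "__main__":
    ad = "We want a rockstar engineer - aggressive, dominant, competitive."
    print(scan_job_description(ad))  # [('rockstar', ...), ('aggressive', ...), ...]
    print(neutralize(ad))
```

The same word-list core could power both the HR-facing scanner and the candidate-facing browser extension; only the packaging differs.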
At work
When it comes to gender issues in the workplace, there's probably enough to fill every letter of the alphabet - and perhaps a whole encyclopedia. Well, let me just mention five things my friends and I noticed this past year:
- I see a lot of my amazing female friends partake in glue work, either voluntarily or because they are expected to.
- I have watched too many sessions where men talk a lot and engage in killgissa (Swedish slang for a man's overconfident guessing) and mansplaining in 1:1 conversations with renowned women or non-binary folks who happen to be experts on the subject
- I have also heard my share of "well, actually"s and seen loads of passive-aggressive, nit-picky PR reviews that my female friends share with me to ask for a sanity check.
- I read a lot about how performance reviews for women focus mostly on their personality, while reviews for men focus on their work (the same study found that men are much more likely to be called "genius" or "brilliant", and women to be called "overachievers")
- I see so many people who have no problem remembering the names of obscure retro games, whole lineages of elves in some books, or niche 80s bands, but whose amazing memory fails them when it comes to personal pronouns
⭐️ Here are some ideas for what AI could do:
- a Notion/Confluence/Asana extension that shows stats for glue work for different team members
- a Zoom app that measures air time, including air time spent just taking up space (EDIT: I heard that Vowel does that already! Awesome in case you don't want to reinvent the wheel.)
- an extension that scans a performance review for gendered biases (see the sketch after this list)
- an app that scans all performance reviews from the company for a given year to make sure that the same metrics are used for all folks at a given level/role/team
- Grammarly, but for correct pronouns
- a Slack app that not only checks for exclusionary language but also reminds you to use the correct pronouns - before you even send the message! (even if you can't retain that knowledge, at least you don't create a situation that's awkward at best and damaging at worst)
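Here's a minimal sketch of the performance-review scanner, assuming a crude keyword-counting approach. The word lists are illustrative placeholders; a real tool would need a classifier trained on labeled reviews:

```python
import re
from collections import Counter

# Illustrative keyword lists - placeholders, not validated instruments.
PERSONALITY_WORDS = {"helpful", "bubbly", "abrasive", "emotional", "overachiever"}
WORK_WORDS = {"shipped", "architected", "delivered", "optimized", "brilliant"}

def review_focus(review: str) -> dict[str, int]:
    """Count how often a review mentions personality vs. work output."""
    tokens = Counter(re.findall(r"[a-z']+", review.lower()))
    return {
        "personality": sum(tokens[w] for w in PERSONALITY_WORDS),
        "work": sum(tokens[w] for w in WORK_WORDS),
    }

if __name__ == "__main__":
    print(review_focus("She is helpful and bubbly - a true overachiever."))
    # {'personality': 3, 'work': 0}
    print(review_focus("He architected the pipeline and shipped it early."))
    # {'personality': 0, 'work': 2}
```

Run this over a whole year of reviews, grouped by gender, and the skew described in the study above would show up as a simple ratio.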
Tech conferences, meetups, streams, events, Twitter
Time and time again, we as a tech community learn about conferences offering unequal perks and payment for talks to different speakers.
We've also seen numerous conferences that completely forgot that there are plenty of experts in tech who are women and non-binary people. To alleviate that, women (oftentimes experts themselves) frequently end up moderating all-male panels.
⭐️ Here are some ideas for what AI could do:
- create a list of women and non-binary speakers, together with examples of their most recent talks and contact info
- create a list of male experts to moderate panels featuring women and non-binary experts
- fetch a list of male tech community leaders who could deliver talks on the value of empathy, collaboration, inclusivity, and team culture
- find out whether a male speaker you're considering is known to talk about gender equality, so you can ask him to mentor a first-time speaker who is a woman or a non-binary person
- get a report on whether your tech event location might be dangerous for any minority, such as LGBTQIAP+ folks in the UAE or trans folks in many US states
- (for male tech community leaders who state allyship with women and non-binary folks) checking stats on whom you are retweeting, quote-tweeting positively, and liking - it's really no longer enough to just say you support women, trans women, and non-binary folks; in times of wild Twitter algorithms (for some) and life-threatening trans-hatred (for others), you just gotta amplify their voices and recommend them as experts
Surveys
For at least as long as I have been in tech, the "State of JS" and "State of CSS" surveys have not only done a poor job of survey design, data collection, and analysis - they have also done a lot of damage to women in tech. For context, the survey consistently gets only around 5% of its responses from women and non-binary folks, and year after year, there's some discussion around it.
There's a lot I could say about the survey, given that I taught research methods for two years as part of my Sociology PhD program. For example, the presentation of the data on the most popular community leaders does not factor in the gender of the respondents, giving the impression that this is the ultimate list of who's who in the JS ecosystem - and it features only four women. Adjust the data for demographics, though, and you'd already see 10 women creators there. And if the survey actually collected more responses from women and non-binary respondents, the list would only get more diverse.
But the survey's design flaws are less important than its outcomes. I don't think there's any data on the actual consequences, so for now, three anecdotes from my experience will have to suffice:
- A male friend lamented how he can't get any women into the hiring pipeline "but then again, there are fewer than 10% women in web dev".
- A conference organizer pondered if there's even any point in diversity efforts with regards to the lineup.
- A few women streamers or community leaders were disheartened by not making it to the "Top 50" list of the survey.
The organizers say they sent dozens of emails to women-in-tech organizations and received almost no responses. This sounds almost incredible given how easy it is to get a response for a project like that. Was there a problem with how the emails were written? Maybe with when they were sent? Why emails, anyway? Why not a DM on Twitter or an intro through a friend? Weirdly enough, none of my six community organizer friends received such an email, nor did I find one in my inbox - even though I have been a co-organizer of a React community for women and non-binary folks for the past four years, and even though I myself offered such help, for example here on DEV. It's really odd, because I have seen Sacha Greif, the main author of the survey, advocate vehemently for women, trans folks, and all other minorities - and I actually do trust that his intentions are good.
⭐️ Here are some ideas for AI to fix it:
- following the advice from sociology handbooks and research findings, evaluate the survey from the perspective of research design before it is released, with special emphasis on potential biases (not only gender)
- OR: given the goals of the survey, design one based on the advice from sociology handbooks and research findings
- given the scope of the survey (the JavaScript ecosystem), define a representative sample that is large enough to support statistically meaningful conclusions, as well as the sampling method
- generate ideas for tweets, toots, blog posts, Discord and Slack messages that the organizers can post in various communities when asking for help
- having received responses, assign appropriate weights to respondents when presenting the data (see the sketch after this list)
- scan the final report for inaccuracies and include context information
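For the weighting idea, here's a minimal sketch of post-stratification: weight each respondent so that the weighted sample matches the population. The population shares below are placeholder assumptions for illustration - only the roughly 5% observed share comes from the survey discussion above:

```python
# Post-stratification: weight = population share / observed share.
OBSERVED_SHARE = {"women_nb": 0.05, "men": 0.95}    # roughly what the survey gets
POPULATION_SHARE = {"women_nb": 0.25, "men": 0.75}  # assumed target - a placeholder

def respondent_weight(group: str) -> float:
    """How much one respondent from this group should count."""
    return POPULATION_SHARE[group] / OBSERVED_SHARE[group]

def weighted_mentions(responses: list[dict]) -> dict[str, float]:
    """Tally 'favorite creator' mentions, weighted by respondent group."""
    totals: dict[str, float] = {}
    for r in responses:
        totals[r["creator"]] = totals.get(r["creator"], 0.0) + respondent_weight(r["group"])
    return totals

if __name__ == "__main__":
    responses = [
        {"group": "men", "creator": "creator_a"},
        {"group": "men", "creator": "creator_a"},
        {"group": "women_nb", "creator": "creator_b"},
    ]
    print(weighted_mentions(responses))
    # creator_a: 2 * 0.79 ≈ 1.58, creator_b: 1 * 5.0 = 5.0
```

With these weights, the "top creators" list stops simply mirroring the majority group among the respondents - which is exactly the adjustment that surfaces more women creators in the ranking.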
Is a better world possible with AI?
As much as I'd love to believe otherwise, AI is just an extension of the biases of its creators.
For example, earlier in this post I mentioned performance reviews - in fact, ChatGPT writes longer and harsher performance reviews when it assumes that the employee is a woman. It views nurses, receptionists, and kindergarten teachers as women; mechanics as men; and bankers and engineers as men or neutral. Here's also a tweet about ChatGPT recommendation letters:
male name reports "spotty attendance but very bright" -- with female name, we hear about their kindness & empathy and the children they tutored.
All the above app ideas could (and should) be done not by AI but by a human - if only the humans in question cared more, were more self-aware, and were possibly less exhausted (here I'm giving the benefit of the doubt to everyone who is well-meaning but has only so many spoons). For example, I spoke about better conference lineups - if you are curious about how to design one, read this great writeup about DevOps Days authored by Bridget Kromhout.
And before you actually go and build the next AI-driven app that will change the world - check whether the change will be for the better. AI has been designed with huge biases, and that goes largely unchallenged.
⭐️ If you'd like to learn more about the biases of AI, follow:
- Timnit Gebru, who runs the Distributed AI Research Institute
- Kestral Gaian, who teaches ethics at the London Interdisciplinary School
- Safiya Umoja Noble, who wrote "Algorithms of Oppression"
Cover photo was generated by DALL-E with the prompt: "if gender equality was a kaleidoscope image"