Photo by @neuvalence

Artificial Intelligence and Fairness

Inequality in the field of AI, Gender Theory and Action

The Current Inequality in the field of AI

I would like to present a few key statistics from EqualAI’s website. EqualAI is an initiative focused on correcting and preventing unconscious bias in the development of AI. Together with leaders across business, technology and academia, it is developing guidelines, standards and tools for equal representation in the field of AI.

  • In 1984, 37 percent of computer science majors were women, but by 2014 that number had dropped to 18 percent.
  • The computing industry’s rate of U.S. job creation is three times the national average, yet if current trends continue, women will hold only 20 percent of computing jobs by 2025, according to one study.
  • According to some estimates, women make up as little as 13.5% of the machine learning field, only 18% of software developers, and 21% of computer programmers.
  • Only 4% of Americans could name a female leader in tech — and a quarter of those named “Alexa” or “Siri”.
  • All 4 of the top in-home virtual assistants have been given a female gender. Gender theory suggests two broad responses to this kind of gendered design:
  1. To value traditional femininity.
  2. To deny differences between men and women.
  • There is a diversity crisis in the AI sector across gender and race.
  • The focus on women in tech is too narrow and likely to privilege white women over others.
  • The vast majority of AI studies assume gender is binary, and commonly assign people as ‘male’ or ‘female’ based on physical appearance and stereotypical assumptions, erasing all other forms of gender identity.
  • The focus on pipeline in recruitment has not addressed deeper issues in working cultures, power asymmetries, harassment, exclusionary hiring practices, unfair compensation, and tokenisation that cause people to leave or avoid the sector.
  • AI systems’ classification, detection, and prediction of race and gender need urgent re-evaluation. Appearance-based classification is scientifically flawed and easily abused. Systems that predict sexuality, criminality or competence (for example, through micro-expressions) replicate patterns of bias that may justify historical inequality.

Algorithmic Accountability Act

US lawmakers are proposing a bill that would require large companies to audit machine learning-powered systems. “Large” in this case means a company that: (A) has more than $50,000,000 in average annual gross receipts over the preceding 3-taxable-year period; (B) possesses or controls personal information on more than 1,000,000 consumers or 1,000,000 consumer devices; (C) is owned by a company meeting the previous requirements; or (D) is a data broker or commercial entity. The bill would thus require such companies to: “study and fix flawed computer algorithms that result in inaccurate, unfair, biased, or discriminatory decisions impacting Americans.”
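To make the audit requirement concrete, here is a minimal sketch of one check such an audit might include: comparing a model’s rate of positive decisions across demographic groups (demographic parity). The group labels, sample data, and the “four-fifths” threshold mentioned in the comments are illustrative assumptions on my part, not anything specified in the bill.

```python
def positive_rates(decisions, groups):
    """Return the share of positive decisions (1s) per group."""
    totals, positives = {}, {}
    for d, g in zip(decisions, groups):
        totals[g] = totals.get(g, 0) + 1
        positives[g] = positives.get(g, 0) + d
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest to the highest group rate (1.0 = perfect parity)."""
    return min(rates.values()) / max(rates.values())

# Hypothetical decisions (e.g. loan approvals) for two groups.
decisions = [1, 0, 1, 1, 0, 0, 1, 0]
groups    = ["a", "a", "a", "a", "b", "b", "b", "b"]

rates = positive_rates(decisions, groups)
print(rates)                          # {'a': 0.75, 'b': 0.25}
print(disparate_impact_ratio(rates))  # ~0.33 — well below the 0.8 "four-fifths" rule of thumb
```

A real audit would go much further (error rates per group, calibration, proxy variables), but even this simple ratio surfaces the kind of disparity the bill asks companies to study and fix.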

Gender Theory in Machine Learning

In May 2018, Susan Leavy argued that the over-representation of men in the design of artificial intelligence can quietly undo decades of advances in gender equality. The argument was published in a paper at the International Workshop on Gender Equality in Software Engineering. If big data is ‘laden’ with stereotypical concepts of gender, it will perpetuate a bias that may disadvantage women. At the time of writing she raised a few issues:

  • Facial recognition works best for white men.
  • AI reflects the embedded values of its creators, and developers of artificial intelligence are overwhelmingly male.
  • Scoring systems are used to make decisions about finance and insurance, which may be problematic under these conditions.
  To address these issues, she proposed:
  1. Improved assessment of training data.
  2. Incorporation of concepts of fairness in algorithms.
  3. Draw on decades of existing research on the relationship between gender ideology and language, which can be incorporated into machine learning from textual data.
  4. Review of learned gender-based associations.

The Making of Computer Scientists

We can change a lot within companies, but there is also an argument for understanding in more depth how we approach the education of computer scientists and related disciplines. After all, what we learn and the way these professionals are conditioned shape the products, biases and strategies being made.



Alex Moltzau

AI Policy, Governance, Ethics and International Partnerships. All views are my own.