International Women’s Day 2023
Today, March 8, is International Feminist Fight Day, also known as International Women’s Day. Here are some of our thoughts on how this day relates to data and digital literacy.
2023-03-08 | Zoé, Verena, Roven & Camille

International Feminist Fight Day
Today, March 8, is International Feminist Fight Day, also known as International Women’s Day. Since 1911, this day has drawn attention every year to women’s rights, the equality of all genders, and the networking of groups oppressed by patriarchy. In Germany, International Women’s Day is a public holiday in two states: Berlin and Mecklenburg-Western Pomerania. Here are some thoughts on how this day relates to data and digital education.
The Gender Data Gap
In addition to the Gender Pay Gap, Gender Pension Gap, Gender Lifetime Earnings Gap, Gender Care Gap, and other gender gaps, there is also the Gender DATA Gap. As the name implies, this is about the gender gap in data: data on the many factors that affect women and girls are largely missing. Women make up just 24 percent of participants in heart-related studies, with the result that women’s heart attack symptoms are considered atypical and are often recognized too late, or not at all. This is a problem that could easily be solved.
Gender Equality, Technology, and Digital Education
The United Nations has placed this year’s Women’s Day under the theme “DigitALL: Innovation and technology for gender equality”. 37 percent of women do not use the Internet, and only 22 percent of jobs in the field of artificial intelligence are held by women. How can the barriers that keep women from being active in the digital world be removed?
Violence on the Internet
70 percent - yes, 70 percent of girls and women in Germany have already experienced threats, insults, and discrimination on social media. It is hardly surprising that women often stay out of the discourse there, with all the resulting problems for society and democracy.
Artificial Intelligence (AI) Discrimination
Some examples: AI image recognition on social media leads to pictures of women being deleted more frequently, making these women invisible. An algorithm used by the Labor Market Service in Austria predicts that women have lower chances on the labor market, because that is what the algorithm was taught from past labor market data. An AI can only be as good as the data it “learns” from, so we urgently need diversity in this data.
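To see how quickly past inequality turns into an “objective-looking” score, here is a minimal sketch with entirely made-up numbers (the data, the 70/40 placement rates, and the 0.5 threshold are our illustrative assumptions, not the actual Austrian system): a model that simply learns historical placement rates will reproduce the discrimination contained in them.

```python
# A toy "labor market" model trained on hypothetical historical data.
# The point: the model faithfully learns the past, bias included.
from collections import defaultdict

# Hypothetical past records: (gender, was_placed_in_a_job)
history = (
    [("m", True)] * 70 + [("m", False)] * 30   # men placed 70% of the time
    + [("f", True)] * 40 + [("f", False)] * 60  # women only 40% of the time
)

# "Training": estimate the placement rate per gender from the past.
outcomes = defaultdict(list)
for gender, placed in history:
    outcomes[gender].append(placed)
score = {g: sum(v) / len(v) for g, v in outcomes.items()}

print(score)  # {'m': 0.7, 'f': 0.4}
# An (assumed) decision threshold of 0.5 would now rank every woman
# below every man, regardless of individual qualifications.
```

The model is not “wrong” about its data; it is wrong about the world it is then used to shape, which is exactly why the data it learns from matters.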
Conclusion
Women are still not treated equally in the areas of data and artificial intelligence, which has consequences for health, the labor market, and security. Especially in artificial intelligence, which is fed with data from the past, this becomes a problem of equality. The examples considered here refer mainly to “women”; more precise information is usually not given in the sources. It is not difficult to imagine the extent to which these inequalities affect other oppressed groups. Black women and women of color, disabled women, queer women, non-binary people, trans* and inter* people, and women who are disadvantaged because of their social background are often affected by multiple forms of discrimination at once (see intersectionality). Systems that disregard these dimensions learn patriarchal structures and automatically perpetuate this discrimination. That is why we are particularly committed to training and raising awareness among women and other people oppressed by patriarchy in these areas, in order to provide better data for better AI in the future.