
Thursday, June 22, 2023

Gender bias on Wikipedia

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Gender_bias_on_Wikipedia

 

The Wikipedia Monument in Słubice, Poland, features both male and female editors. The initial model for the sculpture featured only men.

Gender bias on Wikipedia, also known as the Wikipedia gender gap, refers to the fact that Wikipedia contributors are mostly male, that relatively few biographies on Wikipedia are about women, and that topics of interest to women are less well-covered.

In a 2018 survey covering 12 language versions of Wikipedia and some other Wikimedia Foundation projects, 90% of 3,734 respondents reported their gender as male, 8.8% as female, and 1% as other; among contributors to the English Wikipedia, 84.7% identified as male, 13.6% as female, and 1.7% as other (total of 88 respondents). In 2019, Katherine Maher, then CEO of Wikimedia Foundation, said her team's working assumption was that women make up 15–20% of total contributors.

Wikipedia's articles about women are less likely to be included, expanded, and detailed. A 2021 study found that, in April 2017, 41% of biographies nominated for deletion were about women, even though only 17% of published biographies were about women. The visibility and reachability of women on Wikipedia is limited, with a 2015 report finding that pages about women generally "tend to be more linked to men". Language that is considered sexist, loaded, or otherwise gendered has been identified in articles about women. Gender bias is among the most frequent criticisms of Wikipedia, sometimes as part of a more general criticism of systemic bias in Wikipedia.

In 2015, Wikipedia founder Jimmy Wales announced that the encyclopedia had failed to reach its goal to retain 25% female editorship. Programs such as edit-a-thons and Women in Red have been developed to encourage female editors and increase the coverage of women's topics.

Gender bias in participation

Efforts to measure gender disparity

The first worldwide study, conducted in 2008, found that 13% of all editors were female; a 2011 follow-up study revised the global figure down to 9%. In the United States, particularly within the English Wikipedia, a 2015 study found that 15% of contributors were women.

In 2009, a Wikimedia Foundation survey revealed that 6% of editors who made more than 500 edits were female, with the average male editor having twice as many edits.

Comparison of results for the proportion of Wikipedia readers and editors from the nationally representative Pew survey and the WMF/UNU-MERIT survey (UNU) for a series of dichotomous variables in both surveys. Adjusted numbers for editors assume that response bias for editors is identical to observed response bias for readers and, in the rightmost column, that this bias is stable for editors outside the United States.

Variable  | Readers US (Pew) | Readers US (UNU) | Editors US (UNU) | Editors US (Adj.) | Editors (UNU) | Editors (Adj.)
female    |             49.0 |             39.9 |             17.8 |              22.7 |          12.7 |           16.1
married   |             60.1 |             44.1 |             30.9 |              36.3 |          33.2 |           38.4
children  |             36.0 |             29.4 |             16.4 |              27.6 |          14.4 |           25.3
immigrant |             10.1 |             14.4 |             12.1 |               9.8 |           8.2 |            7.4
student   |             17.7 |             29.9 |             46.0 |              38.5 |          47.7 |           40.3

In 2010, United Nations University and UNU-MERIT jointly presented an overview of the results of a global Wikipedia survey. A 30 January 2011 New York Times article cited this Wikimedia Foundation collaboration, which indicated that fewer than 13% of contributors to Wikipedia are women. Sue Gardner, then executive director of the foundation, said that increasing diversity was about making the encyclopedia "as good as it could be". Factors the article cited as possibly discouraging women from editing included the "obsessive fact-loving realm", associations with the "hard-driving hacker crowd", and the necessity to be "open to very difficult, high-conflict people, even misogynists". In 2013, Hill and Shaw challenged the survey's results, applying corrective estimation techniques that suggested upward revisions to the figures: 22.7% for adult female editors in the United States and 16.1% overall.
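The "adjusted" editor figures rest on the assumption that the opt-in survey under- or over-samples editors in the same proportion as it does readers, where reader bias can be measured against the representative Pew data. A deliberately naive version of that correction can be sketched as follows; note that Hill and Shaw's actual method used a multivariate propensity-based estimator, so these simple per-variable ratios do not reproduce the published figures exactly.

```python
# Naive response-bias correction: rescale the opt-in (UNU) editor percentage
# by how much the same survey mis-measured readers relative to the
# representative Pew benchmark. This is a simplification of the multivariate
# approach Hill and Shaw actually used, shown only to illustrate the idea.

def adjust(editor_unu: float, reader_pew: float, reader_unu: float) -> float:
    """Scale an opt-in editor percentage by the reader-response-bias ratio."""
    return editor_unu * (reader_pew / reader_unu)

# Percentages for the "female" variable: US readers per Pew and UNU,
# US editors per UNU (from the comparison table above).
female_adjusted = adjust(17.8, 49.0, 39.9)
print(round(female_adjusted, 1))  # prints 21.9, in the neighbourhood of,
                                  # but not identical to, the published 22.7
```

The gap between this simple ratio estimate and the published 22.7% reflects the fact that the real correction models response bias jointly across several demographic variables rather than one at a time.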

In February 2011, The New York Times followed up with a series of opinions on the subject under the banner, "Where Are the Women in Wikipedia?" Susan C. Herring, a professor of information science and linguistics, said that she was not surprised by the Wikipedia contributors' gender gap. She said that the often contentious nature of Wikipedia article "talk" pages, where article content is discussed, is unappealing to many women, "if not outright intimidating". Joseph M. Reagle reacted similarly, saying that the combination of a "culture of hacker elitism", combined with the disproportionate effect of high-conflict members (a minority) on the community atmosphere, can make it unappealing. He said, "the ideology and rhetoric of freedom and openness can then be used (a) to suppress concerns about inappropriate or offensive speech as 'censorship' and (b) to rationalize low female participation as simply a matter of their personal preference and choice." Justine Cassell said that although women are as knowledgeable as men, and as able to defend their point of view, "it is still the case in American society that debate, contention, and vigorous defense of one's position is often still seen as a male stance, and women's use of these speech styles can call forth negative evaluations."

In April 2011, the Wikimedia Foundation conducted its first semi-annual Wikipedia survey. It suggested that 9% of Wikipedia editors are women. It also reported, "Contrary to the perception of some, our data shows that very few women editors feel like they have been harassed, and very few feel Wikipedia is a sexualized environment." However, an October 2011 paper at the International Symposium on Wikis and Open Collaboration found evidence that suggested that Wikipedia may have "a culture that may be resistant to female participation".

A study published in 2014 found that there is also an "Internet skills gap" with regard to Wikipedia editors. The authors found that the most likely Wikipedia contributors are high-skilled men and that there is no gender gap among low-skilled editors, and concluded that the "skills gap" exacerbates the gender gap among editors. During 2010–14, women made up 61% of participants of the college courses arranged by the Wiki Education Foundation program that included editing Wikipedia as part of the curriculum. Their contributions were found to shift the Wikipedia content from pop-culture and STEM towards social sciences and humanities.

In 2015, Katherine Maher, Gardner's successor as the director of the Wikimedia Foundation, argued that Wikipedia's gender bias "reflects society as a whole". For example, she noted that Wikipedia is dependent on secondary sources which have similar biases. She agreed that Wikipedia's editing process introduces biases of its own, especially as topics that are popular among its predominantly male editors draw more edits.

A 2017 study found that women participating in an experiment involving editing a Wikipedia-like site tended to view other editors as male, and to perceive their responses as more critical than when the other editor was presented as gender-neutral. The study concluded that:

...visible female editors on Wikipedia and broader encouragement of the use of constructive feedback may begin to alleviate the Wikipedia gender gap. Furthermore, the relatively high proportion of anonymous editors may exacerbate the Wikipedia gender gap, as anonymity may often be perceived as male and more critical.

A 2017 study by Heather Ford and Judy Wajcman observes that research on the gender bias continues to frame the problem as a deficit in women. In contrast, their central argument is that infrastructure studies in feminist technoscience allow the gender analysis to be taken further. The study examines three issues within Wikipedia's infrastructure: content policies, software, and the legalistic framework of operation. It suggests that progress can be made by altering the culture of knowledge production: encouraging alternative forms of knowledge, reducing the technical barriers to editing, and addressing the complexity of Wikipedia policies.

In their February 2018 article, "Pipeline of Online Participation Inequalities: The Case of Wikipedia Editing", Shaw and Hargittai concluded that solving problems of participation inequality, including gender bias, requires a focus broader than inequality alone. They recommended that encouraging participants of all educational backgrounds, skill levels, and age groups would help Wikipedia improve, and that informing more women that Wikipedia is free to edit and open to everyone is critical to eliminating gender bias.

In March 2018, mathematician Marie A. Vitulli wrote in Notices of the American Mathematical Society, "The percentage of women editors on Wikipedia remains dismally low."

In 2014, Noopur Raval, a PhD candidate at UC Irvine, wrote in "The Encyclopedia Must Fail! – Notes on Queering Wikipedia" that "making a platform open access does not automatically translate to equality of participation, ease of access, or cultural acceptance of the medium." In 2017, researchers Matthew A. Vetter and Keon Mandell Pettiway argued that the white, cisgender male dominance among Wikipedia editors has led to the "erasure of non-normative gender and sexual identities", as well as of cisgender women. The "androcentric and heteronormative discourses" of Wikipedia editing insufficiently allow "marginalized gender and sexual identities to take part in language use and the construction of knowledge."

Causes

Former Wikimedia Foundation executive director Sue Gardner provided nine reasons, offered by female Wikipedia editors, "Why Women Don't Edit Wikipedia."

Some gender research literature suggests that the difference in contribution rates could be due to three factors: (1) the high levels of conflict in discussions, (2) dislike of critical environments, and (3) lack of confidence in editing other contributors' work.

The New York Times pointed out that Wikipedia's female participation rate may be in line with other "public thought-leadership forums". A 2010 study found a Wikipedia female participation rate of 13 percent, close to the 15 percent overall female participation rate of other "public thought-leadership forums". Wikipedia research fellow Sarah Stierch acknowledged that it is "fairly common" for Wikipedia contributors to remain gender-anonymous. A perceived unwelcoming culture and tolerance of violent and abusive language are also reasons put forth for the gender gap. According to a 2013 study, another cause of the gender gap is Wikipedia's failure to attract and retain female editors, which has a negative impact on the site's coverage. Additionally, Wikipedia "editors that publicly identify as women face harassment" from other Wikipedia editors.

Former Wikimedia Foundation executive director Sue Gardner cited nine reasons why women don't edit Wikipedia, culled from comments by female Wikipedia editors:

  1. A lack of user-friendliness in the editing interface.
  2. Not having enough free time.
  3. A lack of self-confidence.
  4. Aversion to conflict and an unwillingness to participate in lengthy edit wars.
  5. Belief that their contributions are too likely to be reverted or deleted.
  6. Some find its overall atmosphere misogynistic.
  7. Wikipedia culture is sexual in ways they find off-putting.
  8. Being addressed as male is off-putting to women whose primary language has grammatical gender.
  9. Fewer opportunities for social relationships and a welcoming tone compared to other sites.

Though Wikipedia's readership is roughly evenly split between women and men (women make up about 47% of readers), women are far less likely to become editors (16%). Several studies suggest that a culture may have formed on Wikipedia that discourages women from participating. Lam et al. attribute this culture to a disparity in the representation and editing of male-centric versus female-centric topics, the tendency of female users to be more active in the social and community aspects of Wikipedia, an increased likelihood that edits by new female editors are reverted, and the observation that articles with high proportions of female editors are more contentious.

In 2019, Shlomit Aharoni Lir described a "vicious circle" model in which five layers (negative reputation, anonymity, fear, alienation, and rejection) reinforce one another in a way that deters women from contributing to the website. To bring more women to Wikipedia, she proposed implementing a "virtuous circle" consisting of nonymity (editing under one's real name), connection to social media, inclusionist policy, soft deletion, and the red-flagging of harassment.

In Wikimedia's Gender Equity Report in 2018, 14% of interviewees identified poor community health as a significant challenge in being an editor on Wikipedia. In the study, community health was defined as harassment, a general lack of support for gender equity work and a lack of diversity in leadership.

After reviewing testimonies that ranged from microaggressions to direct attacks, the Wikimedia Foundation Board of Trustees voted in May 2020 to adopt a more formal moderation process to combat harassment and uphold Wikipedia's community standards. The foundation was tasked with finishing a draft of this plan by the end of 2020; it is to include banning users who engage in gender harassment, providing support and communities for all gender identities, and putting more resources into the Trust and Safety team, among other measures.

In 2012, Collier and Bear summarized the barriers to women's participation in Wikipedia in three words: conflict, criticism, and confidence. The authors suggested that "If a community tolerates a culture of conflict that males perceived to be simply 'competitive' or witty and sarcastic they are likely to find themselves losing the many benefits female contributors can bring to the table." Criticism refers to women's unwillingness both to edit someone else's work and to let their own work be edited; confidence refers to the finding that women are often not confident about their own expertise and ability to edit and contribute. Wikipedia's free-to-edit policy gives Internet users an open platform, but it also inadvertently breeds a competitive and critical environment that limits women's incentive to participate.

Through examining the power infrastructure of Wikipedia, Ford and Wajcman pointed out another cause that may reinforce Wikipedia's gender bias. Editing on Wikipedia requires "particular forms of sociotechnical expertise and authority that constitute the knowledge or epistemological infrastructure of Wikipedia". People who are equipped with this expertise and skill are considered more likely to reach positions with power in Wikipedia. These are proposed to be predominantly men.

Studies have also considered the gender bias in Wikipedia from a historical perspective. Konieczny and Klein indicated that Wikipedia is just a part of our biased society which has a long history of gender inequality. As Wikipedia records daily activities by individual editors, it serves as both "a reflection of the world" and "a tool used to produce our world".

An example of a direct account of gender bias comes from Wikipedia user Lightbreather, who recounted having pornographic images linked to her username as a way to discredit her Wikipedia contributions.

Harassment also affects LGBT people, who are typically subjected to it if their identities are made public. In one case, a Wikipedia administrator blocked an editor merely because the person's username implied they were part of the LGBT community.

Gender bias in content

In 2016, Wagner et al. found that gender inequality manifests itself in Wikipedia's biographical content in multiple ways, including unequal thresholds for including an article on the person, topical bias, linguistic bias, and structural inequalities. The authors found that women with biographies on Wikipedia are slightly more notable than men on Wikipedia, and proposed three possible explanations for future research: 1) that editors are more likely to write about their own gender, 2) that men are more likely to create articles about themselves, and 3) that external sources make women less visible. As for topical bias, biographies about women tend to focus more on family-, gender-, and relationship-related topics. This is especially true for biographies of women born before 1900. The authors also found structural differences in terms of meta-data and hyperlinks, which have consequences for information-seeking activities.

Article creation and deletion

Of the roughly 1.5 million biographical articles on the English Wikipedia in 2021, only 19% were about women. The biographies of women that do exist are considerably more likely to be nominated for deletion than biographies of men.

In the English Wikipedia and five other language editions that were studied by researchers, the ratio of articles about women to articles about men was higher than in three other databases. However, analysis with computational linguistics concluded that the way women and men are described in articles demonstrates bias, with articles about women more likely to use more words relating to gender and family. The researchers believe that this is a sign Wikipedia editors consider male the "null gender" (in other words, that "male" is assumed unless otherwise specified, an example of male as norm). Another critique of Wikipedia's approach, from a 2014 Guardian editorial, is that it has difficulty making judgments about "what matters". To illustrate this point they noted that the page listing pornographic actresses was better organized than the page listing women writers.

The International Journal of Communication published research by Reagle and Lauren Rhue that examined the coverage, gender representation, and article length of thousands of biographical subjects on the English-language Wikipedia and the online Encyclopædia Britannica. They concluded that Wikipedia provided better coverage and longer articles in general, that Wikipedia typically has more articles on women than Britannica in absolute terms, but Wikipedia articles on women were more likely to be missing than articles on men relative to Britannica. That is, Wikipedia dominated Britannica in biographical coverage, but more so when it comes to men. Similarly, one might say that Britannica is more balanced in whom it neglects to cover than Wikipedia. For both reference works, article length did not consistently differ by gender. A 2011 study by researchers from the University of Minnesota and three other universities found that articles edited by women, "which presumably were more likely to be of interest to women", were "significantly shorter" on average than those worked on by men or by both men and women.

A side-by-side comparison of the proportion of available Wikipedia biographies about women versus the proportion of women's biographies nominated for deletion, January 2017 to February 2020 (Francesca Tripodi)

According to a 2021 study by sociologist Francesca Tripodi, biographies on Wikipedia about women are disproportionately nominated for deletion as non-notable. In October 2018, when Donna Strickland won a Nobel Prize in Physics, numerous write-ups mentioned that she did not previously have a Wikipedia page. A draft had been submitted, but was rejected for not demonstrating "significant coverage (not just passing mentions) about the subject".

In July 2006, Stacy Schiff wrote a New Yorker essay about Wikipedia entitled "Know It All". The Wikipedia article about her was created the very same day; according to Timothy Noah, she had apparently not been notable by Wikipedia standards before then, despite having received a Guggenheim Fellowship and a Pulitzer Prize years earlier. Her essay and the article about her are now featured in the WikiProject to counter systemic bias.

Article content

While most attention falls on the gap between biographies of men and women on Wikipedia, some research also focuses on linguistics and differences in topics covered. In 2020, the Association for Computational Linguistics performed a textual analysis of gender biases within Wikipedia articles. The study found that articles about women contain more gender-specific phrases such as "female scientist" while men are referenced using more gender-neutral terms such as "scientist". The study concluded that overall gender bias is decreasing for science and family oriented articles, while increasing for artistic and creative content.

A 2015 study found that, on the English Wikipedia, the word "divorced" appears more than four times as often in biographical articles about women than men. According to the Wikimedia Foundation, "We don't fully know why, but it's likely a multitude of factors, including the widespread tendency throughout history to describe the lives of women through their relationships with men."

A 2020 study of the coverage of Fortune 1000 CEOs found a gender bias in favour of women. The study compared contributions from brand-new versus more established editors and found that new editors are more likely to introduce information biased against women, but that established users tend to overcompensate when reacting to these edits. Articles written by a more diverse mix of new and established editors were found to be the most neutral.

Gender bias harassment also goes beyond those who identify as cisgender on Wikipedia. For example, when celebrities come out and identify as transgender, they are commonly subjected to discrimination and their pronouns are then put up for debate. Notable examples of these debates include Chelsea Manning in 2013 and Caitlyn Jenner in 2015, when their self-declared pronouns were vandalized and reverted to their previous pronouns. A 2021 study found that articles about transgender women and non-binary people tend to have a higher percentage of their article devoted to the "Personal Life" section, which often focuses on the person's gender identity: "The implication that gender identity is a noteworthy trait for just these groups is possibly indicative of 'othering', where individuals are distinguished or labeled as not fitting a societal norm, which often occurs in the context of gender or sexual orientation."

Reactions

Wikipedia has been criticized by some academics and journalists for having primarily male contributors, and for having fewer and less extensive articles about women or topics important to women.

Writing for Slate in 2011, conservative political commentator Heather Mac Donald called Wikipedia's gender imbalance a "non-problem in search of a misguided solution." Mac Donald asserted, "The most straightforward explanation for the differing rates of participation in Wikipedia—and the one that conforms to everyday experience—is that, on average, males and females have different interests and preferred ways of spending their free time."

In August 2014, Wikipedia co-founder Jimmy Wales announced in a BBC interview the Wikimedia Foundation's plans for "doubling down" on the gender content gap at Wikipedia. Wales said the Foundation would be open to more outreach and more software changes.

In Invisible Women: Exposing Data Bias in a World Designed for Men, Caroline Criado Perez notes that many Wikipedia pages that refer to men's teams or occupations are listed as gender neutral (England national football team), while pages for similar teams or occupations for women are specifically gendered (England women's national football team).

Efforts to address gender bias

Attendees at the 2013 Women in the Arts edit-a-thon in Washington, DC

Wikimedia Foundation

The Wikimedia Foundation has officially held the stance, since at least 2011 when Gardner was executive director, that gender bias exists in the project. It has made some attempts to address it but Gardner has expressed frustration with the degree of success achieved. She has also noted that "in the very limited leisure time women had, they tended to be more involved in social activities instead of editing Wikipedia. 'Women see technology more as a tool they use to accomplish tasks, rather than something fun in itself.'" In 2011, the Foundation set a target of having 25 percent of its contributors identifying as female by 2015. In August 2013, Gardner said, "I didn't solve it. We didn't solve it. The Wikimedia Foundation didn't solve it. The solution won't come from the Wikimedia Foundation."

In 2017, the Wikimedia Foundation committed $500,000 in funding to building a more encouraging environment for diversity on Wikipedia.

VisualEditor, a project funded by the Wikimedia Foundation that allows for WYSIWYG-style editing on Wikipedia, is said to be aimed in part at closing the gender gap.

Thanks to a Wikimedia Foundation grant, in March 2021 an alpha version of Humaniki was released, providing a wide variety of gender gap statistics based on Wikidata. The stats are automatically updated as new information is made available.

User-led efforts

Dedicated edit-a-thons have been organized to increase the coverage of women's topics in Wikipedia and to encourage more women to edit Wikipedia. These events are supported by the Wikimedia Foundation, which sometimes provides mentors and technology to help guide newer editors through the process. Recent edit-a-thons have given specific focus to topics such as Australian female neuroscientists and women in Jewish history.

An early-2015 initiative to create a "women-only" space for Wikipedia editors was strongly opposed by Wikipedians.

Some users have tried to combat this male dominated space by creating support groups for female Wikipedia users, a prominent one being the WikiWomen's User Group. This group is used not only to promote women editing and contributing on more pages, but to also add more pages about women who contribute to society at large.

The Wikipedia Teahouse project was launched with the goal to provide a user-friendly environment for newcomers, with a particular goal of boosting women's participation in Wikipedia.

In the summer of 2015, the WikiProject Women in Red was launched on the English-language version of Wikipedia, focusing on the creation of new articles about notable women. Mainly through its monthly virtual editathons, Women in Red encourages editors to participate in extending Wikipedia's coverage. Thanks in part to the efforts of this project, by June 2018 some 17,000 new women's biographies had been added to Wikipedia.

Many Wikiprojects are committed to promoting editors' contribution on gender and women studies, which include "WikiProject women, WikiProject feminism, WikiProject gender studies, and the WikiProject countering systemic bias/gender gap task force".

Expanding beyond the male/female gender binary, Wikiproject LGBT creates a space for "re/writing the inclusion and representation of LGBTQ culture into Wikipedia mainspace."

In 2018, one edit-a-thon organizer named Sarah Osborne Bender explained to The Guardian how men take down Wikipedia pages about women leaders. "I wrote a Wikipedia article about a woman gallerist and the next day, I got a message saying it was deleted because she is not a 'noteworthy person', but someone in our community gave me advice on how to edit it to make the page stay."

In 2022, an article in VICE detailed how British scientist Jessica Wade has created over 1,700 Wikipedia entries on women scientists since 2017, many of them women whose contributions had gone unnoticed.

Third parties

In 2013, FemTechNet launched "Wikistorming" as a project that offers feminist scholarship and encourages Wikipedia editing as part of school and college teaching.

In July 2014, the National Science Foundation announced that it would spend $200,000 to study systemic gender bias on Wikipedia.

In 2015, Jennifer C. Edwards, history department chairperson at Manhattan College, explained that educational institutions can use Wikipedia assignments such as encyclopedia's gender gap analysis and coverage of female topics to inspire students to alter the current gender imbalance.

In 2022, Angela Fan, a researcher at Meta Platforms, announced an open-source software artificial intelligence model that will be able to create Wikipedia-style biographical rough drafts that "will one day help Wikipedia editors create many thousands of accurate, compelling biography entries for important people who are currently not on the site", including women.
