Unlike a lot of email signatures these days, Gmail doesn't specify its preferred pronoun.
To avoid perpetuating gender bias, Gmail stopped its "Smart Compose" text prediction feature — which provides likely ends of sentences and other phrases for Gmail users while composing emails — from suggesting pronouns, Reuters reported Tuesday.
Google told Mashable that Smart Compose launched in May with that bias-averting policy already in place. However, Gmail product manager Paul Lambert only recently revealed this intentional move in interviews with Reuters.
Apparently, during product testing, a company researcher noticed that Smart Compose was assigning gendered pronouns in a way that mirrored some real-world gender bias: It automatically ascribed a "him" pronoun to a person only previously described as an "investor." In other words, it assumed that the investor — a role in a largely male-dominated field — was a man.
Studies show that in language, gender bias — or assuming someone's gender based on stereotypes or tendencies associated with men or women — has the power to both "perpetuate and reproduce" bias in the way people treat each other, and the way we think of ourselves.
"Gender-biased language is harmful because it limits all of us," Toni Van Pelt, the president of the National Organization for Women (NOW) said. "If a woman is using AI, and it refers to an engineer as a 'him,' it may get in her brain that only men make good engineers. It limits our scope of dreaming. That’s why it sets us back so far."
Gmail reportedly attempted several fixes for its own subtle gender bias, but none of them were perfect. So the Smart Compose architects decided the best solution was to remove pronoun suggestions altogether.
"At Google, we are actively researching unintended bias and mitigation strategies because we are committed to making products that work well for everyone," a Google spokesperson told Mashable over email. "We noticed the pronoun bias in January 2018 and took measures to counter it (as reported by Reuters) before launching Smart Compose to users in May 2018."
But an inherently sexist AI is not to blame for the gender bias within the algorithm. As with other AI tools, the bias at the root of Google's pronoun problem is a human one.
"Algorithms are reproducing the biases that we already have in our language," Calvin Lai, a Washington University in St. Louis professor and research director for the implicit bias research center Project Implicit told Mashable. "The algorithm doesn’t have a sense of what’s socially or morally acceptable."
Both Lai and Saska Mojsilovic, IBM's AI Science fellow specializing in algorithmic bias, explained that bias usually enters algorithms through the data algorithms learn from, also known as "training data."
Mojsilovic said, "Training data can reflect bias in some way, shape, or form, because as a society, this is what we generate."
A Natural Language Generator (NLG) like Smart Compose learns how to "speak" by reading and replicating the words of humans. So if data contains overt or subconscious bias, expressed in language, then AI learning from that data will reproduce those tendencies.
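This learning-by-co-occurrence dynamic can be illustrated with a deliberately simple sketch. The snippet below is not Google's model; it is a hypothetical toy that counts which gendered pronouns co-occur with a role word (like "investor") in a tiny, intentionally skewed corpus, then "suggests" the majority pronoun, reproducing the skew in its data exactly as the article describes.

```python
from collections import Counter

# Toy illustration (NOT Smart Compose's actual architecture): a trivial
# "predictor" that learns pronoun associations purely from co-occurrence
# counts in a small, deliberately biased training corpus.
corpus = [
    "the investor said he would invest next week",
    "the investor confirmed he signed the deal",
    "the nurse said she would check in later",
]

PRONOUNS = {"he", "she", "him", "her"}

def pronoun_for(role, sentences):
    """Return the pronoun most often co-occurring with `role`, or None."""
    counts = Counter()
    for sentence in sentences:
        words = sentence.split()
        if role in words:
            counts.update(w for w in words if w in PRONOUNS)
    return counts.most_common(1)[0][0] if counts else None

# Every "investor" sentence in the data uses "he", so the model
# mechanically reproduces that skew -- it has no notion of fairness.
print(pronoun_for("investor", corpus))  # -> he
print(pronoun_for("nurse", corpus))     # -> she
```

Real NLG systems are vastly more sophisticated, but the failure mode is the same in kind: the model optimizes for what the data shows, not for what is socially acceptable.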
Another sticking point is that bias in text generation is often difficult to articulate, and very dependent on context. And because the idea of bias and gender can be more interpretive or subjective, it can be harder to teach a machine to recognize and eradicate it.
"For us, as scientists and researchers, text is a more difficult category to master than other data types," Mojsilovic said. "Because text is fluid, and it's very hard to define what it means to be biased."
"A lot of times we think about gender bias in an old-school explicit way," Lai said. "But a lot of it happens much more subtly, on the basic assumptions that we have of other people."
Google is aware of the challenges that arise from training data. The company confirmed that it tests its algorithm training data for bias before deploying it. This is a continual process.
"As language understanding models use billions of common phrases and sentences to automatically learn about the world, it can also reflect human cognitive biases by default," a Google spokesperson told Mashable over email. "Being aware of this is a good start, and the conversation around how to handle it is ongoing."
Moreover, Gmail's Smart Compose presents its own challenges beyond other NLG tools. At the launch of Smart Compose's predecessor, Smart Reply, Google wrote that its NLG tools learn from and tailor their suggestions to individual Gmail users. So even if the algorithm was trained on data tested for bias, the very real and flawed humans it continues to learn from may have prejudices they subconsciously express through text.
"They’re ultimately based on how people are using the language," Lai said. "And sometimes that might reflect something accurate about the world. And sometimes it might not."
At this point, removing pronoun suggestions may be the best option to avoid gender bias, or to avoid prescribing a pronoun that doesn't match someone's gender identity. NOW's Toni Van Pelt applauds the decision, and sees sensitivity around pronouns as an admirable move for an industry leader like Google.
"I think it’s really important that they were aware of their prejudice, they were aware of their bias, and did the right thing in being conservative in eliminating this," Van Pelt said. "They are leading by example for the other AI companies."
But it's also a temporary fix to the pervasive problem of making sure AI doesn't reflect and enhance our own biases.
"It leaves it up to the user to make up their own minds, rather than put the responsibility on the algorithm’s shoulders," Lai said. "That seems to be one way to absolve or remain a neutral party."
This is a problem Google is proactively working on. The company has released multiple studies, tools, and other initiatives to help developers eradicate bias. And it's working to define criteria for "fairness," a prerequisite for removing bias from AI NLG tools in the first place.
Other researchers are also leading the way. IBM has built a tool anyone can use to assess training data for bias. Lai's consortium, Project Implicit, studies the phenomenon of implicit bias and potential ways to prevent it. And, crucially, hiring a diverse workforce — one that reflects the real world — is paramount to creating equitable and moral AI.
"We hold these algorithms, perhaps rightfully so, to a higher standard than we hold every day people," Lai said. "There is a vested interest in terms of our society’s values and morals to be gender neutral in many of these cases."
The silver lining: The extent to which these biases are so deeply ingrained in our collective language is coming to the fore because of the development of AI. Recognizing bias as we build these tools provides the opportunity to help correct it.
"We are living in a world that is full of biases, the biases we created as humans," Mojsilovic said. "If we are really diligent about it, think about the outcome that we can end up with the technology that can actually be better than us, or help us be better, because it will teach us or point out what we ourselves might have missed."