Amazon can’t block its shareholders from a vote on the company’s controversial facial recognition technology.
In a letter to the e-commerce behemoth this week, the Securities and Exchange Commission (SEC) struck down an appeal from Amazon to block voting on proposals related to sales of Rekognition to government entities. The SEC says that Amazon must allow its shareholders to vote on these proposals.
The two proposals would address the concerns of shareholders, activists, and civil rights groups over the company’s facial recognition tool. The first proposal would stop Amazon from selling its Rekognition technology to governments unless the board approved the sales. The second proposal would require an audit of the technology to research Rekognition’s effects on privacy and civil rights.
Amazon has been subject to intense criticism over Rekognition, particularly concerning who the company has been selling the facial recognition service to. The Seattle-based tech giant has sold Rekognition to local police forces and the FBI. It even sought to sell its services to U.S. Immigration and Customs Enforcement.
The company continues to sell Rekognition to governments despite a study by the ACLU which found that the technology could not correctly identify members of Congress. Even worse, the study discovered that the facial recognition tech suffered from racial bias.
It’s unlikely either proposal will receive the support it needs to pass. Amazon CEO Jeff Bezos alone owns a sizeable share in the company. Nonetheless, it is noteworthy that the company took the unusual step of trying to block these proposals from even coming to a vote.
In an attempt to address these issues, Amazon has released a set of guideline suggestions for possible future facial recognition regulation. However, as the ACLU points out, the company’s guidelines put the onus on the entities using the technology, such as law enforcement, and not on service providers like Amazon.
Just this week, Amazon received a separate letter -- this one from dozens of top AI researchers from Facebook, Microsoft, Google, and even Amazon itself -- raising civil rights concerns with Rekognition and supporting the proposals laid out by shareholders.
Amazon shareholders will meet to vote on these proposals in May.
Topics: Amazon, Artificial Intelligence, Facial Recognition