
A seven-second self-driving Tesla video posted to Twitter this past weekend shows a likely illegal driving event in San Francisco that, unfortunately, happens all the time in ordinary situations. It’s the poster’s reaction that has left many online unsettled. 

In the video, we watch from inside the car’s cabin as the robo-Tesla edges close to a pedestrian who enters an intersection. The display clearly shows the software picking up on the presence of the completely unprotected and vulnerable human, but Tesla’s software, it appears, doesn’t trigger compliance with the law requiring cars to yield to pedestrians.

The text of the tweet couldn’t be more jolly about the victimless (in this case) moving violation, calling it "One of the most bullish / exciting things I've seen on Tesla Full Self-Driving Beta 11.4.1," and describing the failure to yield as proceeding "like a human would," rather than "slamming on the brakes."

SEE ALSO:Tesla's fastest-ever Model S gets tested in Top Gear video

Whole Mars Catalog, which posted the video, is the online identity of a Tesla and SpaceX fan apparently named Omar Qazi. The Whole Mars Catalog Twitter account has hundreds of thousands of followers, and draws approving replies from Elon Musk.


For a rundown of this not-at-all obscure law, check out the third second of the video, in which a sign reading "STATE LAW YIELD TO [PICTURE OF PEDESTRIAN] WITHIN CROSSWALK" can be seen just to the right of the pedestrian in the crosswalk not being yielded to.

Elon Musk recently touted the latest update to Tesla Full Self-Driving beta, version 11.4.1, as a steep improvement. It’s been slowly rolling out to Tesla consumers this month, and it’s supposedly so good that it should be called version 12, except that 12 will, Musk claims, be the release of an "end-to-end AI" version of FSD, whatever that means.

The video has some Twitter users concerned.

The Whole Mars Catalog Twitter account is, so far, full-throated in its defense of FSD 11.4.1, claiming that "people who don't live in cities aren't getting this," and that prior FSD updates would have caused the Tesla in question to stop within the crosswalk in this scenario. "That is wrong, continuing forward is right," claims Whole Mars Catalog. Indeed, stopping within a crosswalk blocks it, and no, that isn’t safe for pedestrians. The issue, however, is that this Tesla quite clearly has ample time to stop well before the crosswalk. 

That a California driver may well ignore the law in a similar situation is irrefutable (the author of this article is a reluctant Los Angeles driver). However, whether we should program our robotic cars to drive this way should, perhaps, be a matter for public debate, and, maybe, for the legal system to decide. Such a democratically informed decision could perhaps be made before this new norm is rolled out as a software update, and exuberantly lapped up by fans of a billionaire who then get to experiment with it on public roads.

Topics: Tesla