Tesla’s self-driving technology fails to detect children in the road, tests find

A professional test driver using Tesla’s Full Self-Driving mode repeatedly hit a child-sized mannequin placed in the vehicle’s path

A safe-technology advocacy group claimed on Tuesday that Tesla’s Full Self-Driving software represents a potentially lethal threat to child pedestrians, the latest in a series of claims and investigations into the technology to hit the world’s leading electric carmaker.

According to a safety test conducted by the Dawn Project, the latest version of Tesla’s Full Self-Driving (FSD) Beta software repeatedly hit a stationary, child-sized mannequin in its path. The claims that the technology apparently has trouble recognizing children form part of an ad campaign urging the public to pressure Congress to ban Tesla’s self-driving technology.

