Tesla Robotaxis Reportedly Crashing at a Rate That's 4x Higher Than Humans
-
They’re 4 times as capable ~of~ ~crashing~ as a human driver. How efficient!
Whaaa how do you do subscript (?) text! Aaaaah!
-
Self-driving cars are often marketed as safer than human drivers, but new data suggests that may not always be the case.
Citing data from the National Highway Traffic Safety Administration (NHTSA), Electrek reports that Tesla disclosed five new crashes involving its robotaxi fleet in Austin. The new data raises concerns about how safe Tesla’s systems really are compared to the average driver.
The incidents included a collision with a fixed object at 17 miles per hour, a crash with a bus while the Tesla vehicle was stopped, a crash with a truck at four miles per hour, and two cases where Tesla vehicles backed into fixed objects at low speeds.
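For context on what a headline multiplier like “4x” actually means, here’s a minimal rate-comparison sketch. The crash count is the article’s; the fleet mileage and the human baseline rate are placeholder assumptions, not reported figures:

```python
# Back-of-the-envelope crash-rate comparison.
# The crash count comes from the article; the mileage and the human
# baseline rate are HYPOTHETICAL placeholders, not reported figures.

robotaxi_crashes = 5          # from the article: five new reported crashes
robotaxi_miles = 250_000      # assumption: fleet miles in the same window

human_rate_per_million = 5.0  # assumption: human crashes per million miles

robotaxi_rate_per_million = robotaxi_crashes / robotaxi_miles * 1_000_000
ratio = robotaxi_rate_per_million / human_rate_per_million

print(f"Robotaxi rate: {robotaxi_rate_per_million:.1f} crashes per million miles")
print(f"Ratio vs assumed human baseline: {ratio:.1f}x")
```

Swap in the real mileage and baseline figures and the same two divisions give the actual multiplier.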
I do (sarcastically) love knowing Leave the World Behind is a documentary.

-
a crash with a bus while the Tesla vehicle was stopped
Uuh… wouldn’t that be the fault of the bus? I mean, the system is faulty as fuck, so there’s really no need to mix in shit like this; it reduces the legitimacy of the otherwise very valid criticism.
Entirely possible, but all incidents are counted, since it would be difficult to produce reliable stats if you left some out based on some kind of assessment of blame.
And because Tesla, unlike the competition, hides most of the details, we can’t really look at a specific incident and know.
-
Regular FSD has the driver (you) monitoring the car, so there will be fewer accidents IF you stay properly attentive, as you’re supposed to.
The FSD rides with a safety monitor (in the passenger seat) have a button to stop the ride.
The driverless, no-monitor cars have nothing.
So you get more accidents as you remove that supervision.
Edit: this would be on the same software versions… it will obviously get better to some extent, so comparing old versions to new versions really only tells us it’s getting better or worse relative to past rates, but in all 3 scenarios there should still be different accident rates on the same software.
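As a concrete illustration of the three-tier comparison described above: the tier names follow the comment, but every number below is a made-up placeholder, not real data.

```python
# Crash rates per supervision tier on the SAME software version.
# All counts and mileages are HYPOTHETICAL -- for illustration only.

tiers = {
    "driver supervising (regular FSD)": (2, 1_000_000),  # (crashes, miles)
    "safety monitor in passenger seat": (4, 1_000_000),
    "driverless, no monitor":           (8, 1_000_000),
}

for name, (crashes, miles) in tiers.items():
    rate = crashes / miles * 1_000_000
    print(f"{name}: {rate:.0f} crashes per million miles")
```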
The unsupervised cars are very unlikely to be involved in these crashes yet, because according to the Robotaxi tracker there was only a single one operational, and only for the final week of January.
As you suggest, though, there’s a difference in how much the monitor can really do about FSD misbehaving compared to a driver in the driver’s seat. On the other hand, they’re still forced to have the monitor behind the wheel in California, so you wouldn’t expect a difference in accident rate on that basis there; it would be interesting to compare.
-
Smh they should have paid for the ‘not killed spontaneously’ package. Their fault, really.
Username checks out
-
He’s right in that if current AI models were genuinely intelligent in the way humans are, then cameras would be enough to achieve at least human-level driving skills. The problem, of course, is that AI models are not nearly at that level yet.
I am a Human, and there have been occasions where I couldn’t tell if something was an obstacle on the road or a weird shadow…
-
“So long as the AI has the same intelligence as a human brain” is a pretty big assumption. That assumption is in sci-fi territory.
Yeah, that’s my point.
-
Use lidar, you ketamine-saturated motherfucker.
Can’t do that. Then he would have to upgrade all legacy cars. And he is missing the lidar dataset.
-
I am a Human, and there have been occasions where I couldn’t tell if something was an obstacle on the road or a weird shadow…
Yes. In theory, cameras should be enough to get you up to human-level driving competence, but even that is a low bar.
-
Cameras are inferior to human vision in many ways. Especially the ones used on Teslas.
Lower dynamic range for one.
-
Eh, not really though. Generally, if your car is stopped, even in the middle of the road, you are not at fault if someone else hits you. You can still get fined for obstruction of traffic, but the incident is entirely the fault of the moving vehicle.
If you stop in the middle of a highway, you absolutely are at fault.
-
I do (sarcastically) love knowing Leave the World Behind is a documentary.

Thanks Obama.
-
Can’t do that. Then he would have to upgrade all legacy cars. And he is missing the lidar dataset.
The best time to add lidar would have been years ago; the second best time is right now. I don’t think he would have to update the old cars; it could just be part of the hardware V5 package. He’s obviously comfortable with having customers beta test production vehicles, so he can start building a lidar dataset now, or he can continue failing to make reliable self-driving cars.
-
And Newsom doesn’t give a shit.
-
The best time to add lidar would have been years ago; the second best time is right now. I don’t think he would have to update the old cars; it could just be part of the hardware V5 package. He’s obviously comfortable with having customers beta test production vehicles, so he can start building a lidar dataset now, or he can continue failing to make reliable self-driving cars.
I agree with you. Musk’s ego doesn’t.
-
I agree it would be better. I’m just saying that, in theory, cameras are all that would be required to achieve human-level performance, so long as the AI was capable enough.
Except humans have self-cleaning lenses. Cars don’t.
-
@spaghettiwestern @AngelaPreston But but but. They told us self-driving cars would be safer than human drivers.
-
Musk = POS Nazi.
-
Are they even insured with typical insurance?
If Tesla owns the car, don’t they just pay out of pocket as needed? They wouldn’t actually have a monthly payment to themselves or anything.
There’s no way that’s allowed; otherwise, why couldn’t I drive around “uninsured” with the promise that I’ll pay out of pocket for any damage?
