Hacker News

Please stop using anecdotes and provide safety statistics in your argument, like this: https://electrek.co/wp-content/uploads/sites/3/2023/10/Tesla...


Please stop using statistics the manufacturer has intentionally falsified to push product. NHTSA has already stated that Tesla's numbers make no basic attempt at being comparable.

Tesla only counts crashes with pyrotechnic (airbag) deployments. NHTSA has stated that these account for only ~18% of police-reported crashes on average [1], a figure that can be derived from publicly available datasets. No competent statistician or scientist would miss a literal ~5x underestimation that laypeople frequently flag as a source of uncorrected bias and that is easily derivable from well-known public datasets. Tesla makes no attempt to account for other, subtler and harder-to-compute forms of bias before blasting these numbers at the top of its lungs to convince customers to risk their lives.
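To make the "5x" concrete, here is a minimal sketch of the arithmetic. The only input is the 18% airbag-deployment share cited in the NHTSA report [1]; the undercount factor is just the implied scaling, assuming Tesla learns of crashes only via airbag deployment:

```python
# Sanity check on the undercount factor implied by the NHTSA figure.
# Assumption: Tesla's telemetry captures only crashes with airbag
# (pyrotechnic) deployment, ~18% of police-reported crashes per [1].

airbag_deployment_share = 0.18

# True crash count ≈ observed crash count / share observed
undercount_factor = 1 / airbag_deployment_share

print(f"Observed crashes understate true crashes by ~{undercount_factor:.1f}x")
```

That comes out to roughly 5.6x, which is where the "literal 5x" above comes from.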

That is intentional falsification meant to push product and has no place in civil society.

[1] https://static.nhtsa.gov/odi/inv/2022/INCR-EA22002-14496.pdf


But... Your comment was an anecdote too?


safety arguments need to be statistical in nature

there's no such thing as perfect safety


"Tesla vehicles not using Autopilot technology" did not even have basic AEB


I would also prefer more granularity on the data, 100%


If you care about statistics, check those numbers yourself; you'll find they are basically bullshit.

Tesla does not release any verifiable data on FSD / Autopilot, and for a good reason (good for them).


They're legally required to provide accurate accident data to the government; that's what these figures are based on.

If you have information that Tesla is faking the statistics it gives to the government, you should make your evidence public.


No, they are not.

Tesla is pretty much the only car maker that does NOT provide data on autonomous car testing. It's not mandatory.

As for the concerns about data accuracy, they are very much public now. Here is the latest ongoing NHTSA investigation into Tesla:

https://static.nhtsa.gov/odi/inv/2022/INCR-EA22002-14496.pdf

Some quotes:

"Gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes. Tesla is not aware of every crash involving Autopilot even for severe crashes because of gaps in telematic reporting. Tesla receives telematic data from its vehicles, when appropriate cellular connectivity exists and the antenna is not damaged during a crash, that support both crash notification and aggregation of fleet vehicle mileage. Tesla largely receives data for crashes only with pyrotechnic deployment, which are a minority of police reported crashes. A review of NHTSA’s 2021 FARS and Crash Report Sampling System (CRSS) finds that only 18 percent of police-reported crashes include airbag deployments."

"ODI uses all sources of crash data, including crash telematics data, when identifying crashes that warrant additional follow-up or investigation. ODI’s review uncovered crashes for which Autopilot was engaged that Tesla was not notified of via telematics. Prior to the recall, Tesla vehicles with Autopilot engaged had a pattern of frontal plane crashes that would have been avoidable by attentive drivers, which appropriately resulted in a safety defect finding.

Peer Comparison: Data gathered from peer IR letters helped ODI document the state of the L2 market in the United States, as well as each manufacturer’s approach to the development, design choices, deployment, and improvement of its systems. A comparison of Tesla’s design choices to those of L2 peers identified Tesla as an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot’s permissive operating capabilities."



