A Chinese TV channel spent a bunch of money running ADAS tests, and Tesla came out on top of all the Chinese brands, including all the LIDAR-equipped systems — though all the tests were run in daytime.
There's also been talk of companies pushing a hybrid LIDAR+vision approach on custom hardware, since merging the two data streams is complex. So the answer might eventually land somewhere in between, rather than companies choosing one or the other purely on cost.
- Tesla's vision only approach seems a lot more competent than the Lidar suites from smaller Chinese makers. Perhaps I misjudged how necessary Lidar was to achieve safe driving.
- Virtually all of the Chinese cars' infotainment systems were basically 1:1 copies of Tesla's. I couldn't find any that genuinely tried something unique lol
> - Tesla's vision only approach seems a lot more competent than the Lidar suites from smaller Chinese makers. Perhaps I misjudged how necessary Lidar was to achieve safe driving.
Three things can be simultaneously true:
* Tesla's cameras are sufficient for some scenarios.
* Tesla's cameras are insufficient for other scenarios.
* A system with good data and bad algorithmic processing is still going to be bad. The Chinese vehicles almost always fail the tests because they see the obstacle but drive into it anyway.
Yeah, it's interesting hearing their engineering logic: fewer sensor types means fewer sensor conflicts and faster iteration, and iteration speed is what really matters. I also think people overhyped lidar because they don't understand it, and human nature is to associate things we don't understand with magic. It's not magic: it performs poorly in inclement weather and can have issues with resolution over range and with data processing (although lidar does do a lot of things well).
All of this said, once Karpathy left they have slowly looked at adding new sensors (recently radar), so who knows what the future for Tesla's sensor suite holds.
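The sensor-conflict argument above can be sketched in a few lines. This is purely illustrative — toy numbers and a made-up arbitration policy, not anyone's actual stack — but it shows why every extra sensor type adds another disagreement case to tune:

```python
# Toy illustration of "sensor conflict": when two sensor types disagree
# about an obstacle's range, the fusion layer needs an arbitration policy,
# and every policy branch is another thing to test and tune.
# All numbers and the policy itself are hypothetical.

def fuse(camera_range_m, lidar_range_m, max_disagreement_m=2.0):
    """Naive fusion: average when the sensors roughly agree,
    fall back to the more conservative (shorter) reading otherwise."""
    if camera_range_m is None:
        return lidar_range_m
    if lidar_range_m is None:
        return camera_range_m
    if abs(camera_range_m - lidar_range_m) <= max_disagreement_m:
        return (camera_range_m + lidar_range_m) / 2
    return min(camera_range_m, lidar_range_m)  # conservative fallback

print(fuse(30.0, 29.0))  # agree -> 29.5
print(fuse(30.0, 12.0))  # disagree (e.g. a lidar ghost in rain) -> 12.0
```

With a single sensor type, the disagreement branches simply don't exist — which is the iteration-speed point.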
I wonder if we're going to see a different spin on Dieselgate in the future, where a car company collects data from the NHTSA's test environment through its cars' cameras/sensors and then folds that data into the training datasets for other cars/software updates. (I'm not implying this has happened, but I imagine it will at some point.)
Neat. I wonder which others will pass. I wonder if Safety Sense 3 cars will pass too. Speaking of which, it's insane that a Sienna doesn't have that. I wish Tesla made a van instead of the Cybertruck. Americans and their truck obsession…
Looks like they replace the entire drivetrain. Some older Ford vans, like the '90s Windstars, were built on small truck platforms, so you could probably do one. But I'm not sure they'd sell one to ya.
Autopilot (no longer for sale) is so unsafe I’m surprised there’s no class action for owners to force Tesla to upgrade it to FSD for free.
Especially in right-hand-drive markets (non-US), it's even worse than Toyota's radar cruise.
I’ve nearly been killed by it about 5 times because it randomly steers into fences and things. It also randomly fails to change lanes (about 1 in 100), and then just steers full lock and goes out of control. (Model 3 Highland.)
Autopilot in my 2024 Model Y never changes lanes. That has always been a feature restricted to “Full” Self Driving. Autopilot is just lane assist and cruise control.
I can’t recall anytime either Autopilot or FSD put me in danger though.
The branding is confusing. I’m talking about the paid version of Autopilot, which was for sale for $5000, sometimes called “Autopilot Plus”.
For right hand drive markets, it seems to be a stripped down version of FSD 10 or 11. It automatically changes lanes, takes corners and highway exits, but does not stop at traffic lights. It drives exactly in the middle of the lane, doesn’t shuffle over for trucks, and is easily confused.
So, that's not my experience with current FSD versions. But whatever, sure. Let's accept your data point as measured:
Every... TWO HOURS?! I mean, come on. Put a camera on yourself or another human driver. There's an unexpected braking event at least that often, almost always in a more dangerous situation. The human failure tends to be failing to detect a real obstacle, vs. slowing for a phantom one.
This is just too much. If you don't like it don't use it. But to pretend that stomps-the-brakes-every-few-hours is a stop ship kind of safety bug is quite frankly ridiculous.
> Every... TWO HOURS?! I mean, come on. Put a camera on yourself or another human driver. There's an unexpected braking event at least that often, almost always in a more dangerous situation
Wait... what are you counting as an "unexpected braking event"? I can't think of anything I do with the brakes that would not count as ordinary braking and that happens anywhere near as often as every two hours.
The article is vague, but I suspect this refers to FMVSS 127, which makes certain active safety features mandatory in 2029 and also increases the difficulty of some of the required test scenarios. The new scenarios require responding from higher initial speeds, which effectively requires longer sensor ranges and/or lower latency.
There is a big difference between "something like" and actually passing the tests. I would be surprised if any non-vision-based system has the reaction time needed to pass the new pedestrian tests.
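A back-of-envelope calculation shows why higher initial test speeds translate into longer required sensor range and/or lower latency. The latency and deceleration figures below are my own assumptions, not values from FMVSS 127:

```python
# Sketch: required detection range to stop before an obstacle is the
# distance covered during system latency plus the braking distance.
# Braking distance grows with the square of speed, so higher-speed test
# scenarios demand disproportionately longer sensor range.
# Assumed numbers: 0.5 s end-to-end latency, 7 m/s^2 deceleration.

def required_range_m(speed_kmh: float, latency_s: float, decel_ms2: float) -> float:
    """Latency distance (v*t) plus braking distance (v^2 / 2a)."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * latency_s + v * v / (2 * decel_ms2)

for speed in (40, 60, 80, 100):
    print(f"{speed} km/h -> {required_range_m(speed, 0.5, 7.0):.1f} m")
```

Going from 60 km/h to 100 km/h more than doubles the required detection range under these assumptions, which is the crux of the new-scenario difficulty.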
Apparently some HW3 cars can get it. It's listed as available for my 2022 Model 3 (Australia/Sydney). However, the cost is twice what they charge for HW4, I believe.
It seems other HW3 cars might get an FSD-lite version. There's no official way to upgrade from HW3 to HW4.
I'm in the same boat; this is a whole thing right now. There is some kind of class action in Europe which will hopefully make them pay up or deliver something useful. I think a refund plus interest plus a hefty fine for lying would be a good start.
China is more repressive than the United States on basically every metric of fascism/authoritarianism that political scientists actually use. Do we need to elaborate?
A Tesla still can't detect a motorcycle next to it, so I can't see how it would ace the blind spot warning test.
Any other administration and I would be willing to grant the benefit of the doubt, but Musk's spent a lot of money to corrupt government agencies over the past year and a half so that he could get silly pronouncements that the most dangerous "advanced" driving system in the world is somehow also the safest. (More people have been killed by Tesla's ADAS systems than every other automaker's ADAS systems, in the world, combined.)
https://www.youtube.com/watch?v=0xumyEf-WRI&t=1203s
https://electrek.co/2025/07/29/another-huge-chinese-self-dri...
XPENG (a major Chinese ADAS brand) recently decided to copy Tesla's vision-only + AI world-gen data approach, after originally focusing on LIDAR https://electrek.co/2026/04/29/xpeng-vla-2-test-drive-tesla-...
Yes Waymo exists, but the amount of training data they have is a few orders of magnitude lower.
Cause if not, it would be hilarious to do that to a clapped out van...
Still would love to see it. The idea reminds me of farm truck https://okcfarmtruck.com/pages/about
Don't most cars do something like that now? I'm curious what's different between Tesla and, say, a Honda Accord?