It’s not hard to find videos of self-driving Teslas wilding in bus lanes. Check the videos out, then consider:
"There was an interesting side-note in Tesla’s last earnings call, where they explained the main challenge of releasing Full-Self Driving (supervised!) in China was a quirk of Chinese roads: the bus-only lanes.
Well, jeez, we have bus-only lanes here in Chicago, too. Like many other American metropolises… including Austin TX, where Tesla plans to roll out unsupervised autonomous vehicles in a matter of weeks…"
It’s one of those regional differences in driving that make a generalizable self-driving platform an exceedingly tough technical nut to crack… unless you’re willing to just plain ignore the local rules.
Rest of the world’s solution: remove Tesla
American solution: remove bus lanes
“Not reliably, no” sums up Tesla pretty well.
It can’t stop when running directly at a wall but you expect it to handle special lanes?
Rober got absolutely destroyed, and rightly so, for that BS video and “test”. He didn’t even use FSD - he just drove straight into the fake wall and then claimed FSD did it. We could literally see the car’s screen showing FSD wasn’t on, or even Autopilot.
It switched autopilot off when the parking sensors (ultrasonic/radar range finder with a very narrow range) detected the wall.
If Tesla isn’t doing that to hide deficiencies from NHTSA investigations, I’ll eat my shoe.
Also, FSD is an app running on the same hardware and camera systems as Autopilot - what would make it better at “seeing” through mist, or through a reasonable facsimile of the road up ahead?
No it didn’t switch it off because of that.
FSD and autopilot do different things using the same data. It’s fact that they behave differently.
From the NHTSA in 2022:
“The agency’s analysis of these sixteen subject first responder and road maintenance vehicle crashes indicated that Forward Collision Warnings (FCW) activated in the majority of incidents immediately prior to impact and that subsequent Automatic Emergency Braking (AEB) intervened in approximately half of the collisions. On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact,” the report reads.
—https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF
How else would they suddenly know to hit the brakes and switch off ~1 second before impact? Coincidence, or negligent coding?
So they tried to hide it from them by explicitly logging when it switched on and off in the data that they report to them? Huh?
deleted by creator
Is the taste of shoe leather getting any better?
I’m not kink shaming, but do you have a humiliation fetish? Everywhere you comment you get dragged, and yet you are one of the most prolific commentators on this site.
“you get dragged” lol
People slinging personal insults doesn’t bother me. It amuses me tbh because they get so riled up and make up all these imaginary things in their mind - like what you just did.
People who are wrong, or who don’t understand what they’re talking about, trying to correct me isn’t getting “dragged”. Just because other people in the circle jerk agree with them doesn’t mean they’re right.
I don’t care what any of you think btw. It doesn’t affect me, I don’t pay it a second thought. I could have had a dozen conversations with you and I wouldn’t even recognise your username because it means nothing to me.
But keep trying, whatever makes you feel big on the internet :)
Yessss meme war
Fuck me man over 800 comments in less than a month?
Just put the phone down and go get some air
I work on a computer 8-10 hours a day. I can multitask very well. 800 comments in a month, most which take like 10 seconds to type and hit post, is not some insanely difficult challenge.
How about instead of trying to shame people - while posting on the same site, no less, lol - you try and post something on topic or productive? Or maybe use that time for something better?
Well thank god we have you to be productive and on topic. Clearly doing a service to the world, and totally not just shitposting out of boredom.
Just block this maroon, he’s jacking off to consternation instead of working his help desk job
FSD wouldn’t have done any better, it can’t even figure out shadows on the road properly as seen in this crash 3 days ago:
That guy is lying through his teeth lol.
- The crash happened in February, within 2 weeks of him buying the car. He never thought to bring it up with Tesla until just now, apparently. Try and make that make sense.
- There’s no proof that he was using FSD, and that this wasn’t driver error. He has posted footage of every camera except for the interior camera which would show if he was driving or not. I wonder why?
- He claims that he had a version of FSD that wasn’t released to the public at the time.
- Tesla have a program you can download to get all your telemetry data, showing exactly when FSD is enabled and disabled. He never tried to get that.
- Again - he never even contacted Tesla to ask them to investigate.
- 3 months later he’s posting his story to 10+ reddit subs and tagging Mark Rober and every anti-tesla person on X, and promoting his X account on Reddit.
He’s fame chasing. He saw an opportunity to turn his user error crash into 15 mins of fame and money.
Did you watch the video? There’s no way that it was just “user error”, nobody randomly swerves into a tree when nothing’s there. Maybe you’re implying it was insurance fraud?
Tesla gives out beta access to users, so I wouldn’t put too much weight on that claimed version they were using.
Did you watch the video? There’s no way that it was just “user error”, nobody randomly swerves into a tree when nothing’s there.
You’re assuming that he was paying attention and driving normally. He could have dozed off and pulled the wheel.
All I’m asking for is evidence to support his claim other than “trust me bro”.
Tesla gives out beta access to users, so I wouldn’t put too much weight on that claimed version they were using.
As others have pointed out, that version wasn’t out to regular people on that day.
It all points to a guy who crashed his car and has now seen an opportunity months later to get his 15 minutes of fame. Why didn’t he post about this in FEBRUARY WHEN IT HAPPENED? Why did he not reach out to Tesla in the last 3 months? It just doesn’t add up. He has provided zero evidence that FSD caused this.
Where are you getting February from? As far as I know this happened last week. I don’t blame the person not wanting to reveal their face.
The guy posting it everywhere all over reddit and twitter himself says it happened in February. People on here, including myself, have linked to his posts saying it happened in February.
He could simply blur his face…
He didn’t use FSD because he was on a track and FSD requires a destination. It was using Autopilot, according to his statement. Are you suggesting that Autopilot is inherently less safe than FSD? I’m confused about your position on this.
He claimed in the video he was using FSD, but then mainly used Autopilot - which was one of the biggest issues people had with his video in the first place. Autopilot is not as good as FSD. We also saw FSD engaged for a brief second before disengaging - from what looked like him either turning the wheel or accelerating, or possibly because he activated it 2 seconds before manually driving through a wall and it realized that as soon as he turned it on.
FSD doesn’t require a destination btw.
Are you suggesting that Autopilot is inherently less safe than FSD?
It’s not “less safe”, it’s far less advanced and serves a different purpose.
I genuinely don’t understand what FSD has to do with any of it. My car’s front collision sensor works regardless of whether cruise control is enabled.
If I’m understanding your argument correctly, the driver needs to enable a setting first for a Tesla not to plow directly into a wall? I would say that makes it less safe.
Mark Rober drove a Tesla manually into the fake wall that he made specifically so he could promote his friend’s LiDAR company. He lied and said he had FSD enabled, then changed his story to say only Autopilot was on when called out on his lies.
What car do you have? Are you saying that in normal everyday manual driving your car would automatically stop itself from 60mph and not hit a wall because of a collision sensor? Collision sensors are for slow-moving things that are like 1m in front of/behind you.
I’m saying Rober lied, intentionally and deceptively so. FSD has everything to do with it because he said FSD drove him into the wall, but it wasn’t even enabled - he manually drove straight through the wall.
What car do you have?
Volkswagen Group vehicle.
Are you saying that just in normal everyday manual driving your car would stop your car automatically from 60mph and not hit a wall because of a collision sensor?
My car’s AEBS will apply braking, shake the steering wheel, sound a loud alarm and flash the dashboard. I can’t say for sure if it applies full braking, or if that only applies at lower speeds.
Collision sensors are for slow moving things that are like 1m in front/behind you.
Perhaps I’ve not described the system accurately, because I’m not referring to parking sensors. My car’s owner’s manual states that AEBS works at speeds up to 220 km/h, and I’ve personally experienced it trigger while going over 120 km/h.
My take on Rober’s video is simply that Tesla’s automated driver safety systems are sub-par compared to other manufacturers. Perhaps somebody could perform another test with FSD enabled, but I personally don’t think it’s safe to require a driver to first enable a specific mode in order to avoid an accident—then they might as well just press the brakes themselves.
I’m not aware of any cars that will use radar, cameras, or lidar all the time and automatically stop your car to a complete standstill while you’re manually operating it. Yours does this? It physically won’t let you run into something, ever?
Can AI driven systems handle anything? Not reliably, no.
There, fixed it for ya.
Teslas don’t have self-driving software.
Stop calling things names that don’t reflect reality. A Tesla couldn’t get itself out of a parking lot, let alone drive itself across the US, despite all the lies that Elmo has been telling
If Tesla’s “AI” was used in airplane autopilots, there would be a crash every 12 hours and the entire fleet of Tesla-AI aircraft would be defunct by the end of the year.
You may not have much experience with autopilots, so no. There are different levels of autopilot in aviation, not just the full-control-with-autoland kind you may be thinking of. I used to fly a small prop plane with a single-axis autopilot, much less capable than Tesla Full Self-Driving. However, it was safe and useful because I understood its capabilities and limitations. I knew what to use it for and what not to, so even an extremely simple analog autopilot successfully reduced pilot workload, improving safety.
Ok. But they’re talking about Tesla’s FSD AI. It would have killed you.
Well yeah, because Tesla don’t make planes so their FSD wouldn’t know what to do when in control of a plane.
It would also have killed you in one of their cars probably
How many people have died from Tesla’s FSD making mistakes?
At least 54 https://www.tesladeaths.com/
Funny, because one random one of those that I picked:
was marked as “Verified Tesla Autopilot Death”… when all it actually is is a lawsuit that was filed claiming Autopilot resulted in the death. Funnily enough, almost every one that I look at is not actually “verified”. This one:
https://web.archive.org/web/20240104001740/
Never even mentions autopilot or FSD.
The fine print down the bottom of the page basically says “yeah nah, we don’t actually know if Autopilot or FSD played any part at all in these accidents or deaths, but we don’t care because the NHTSA says that the cars have Autopilot” lol. The “SGO” they put next to the ones marked “confirmed autopilot death” as proof is rubbish, because the NHTSA, who collect the SGO data, say this about the ADS/ADAS that this spreadsheet claims caused the crash:
It is important to note that these crashes are categorized based on what driving automation system was reported as being equipped on the vehicle, not on what system was reported to be engaged at the time of the incident.
Source: https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting
So basically the people making this site have gone “OMG, the SGO categorized this as an Automated Driving System crash!!! That means Autopilot/FSD caused the crash!!!” before reading what that category actually means lol. Well, either that or they do know (most likely) and have a bit of an agenda.
BTW here is another lawsuit over a death where Autopilot was claimed to have caused it:
What was the truth? The driver was drunk and didn’t even have Autopilot activated. Tesla won the court case. It’s almost like people lie when faced with the consequences of their own, or their loved ones’, actions and want someone else to be to blame.
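The equipped-vs-engaged distinction the NHTSA quote draws is easy to see in miniature. A quick sketch - the field names and the sample rows here are made up for illustration, not the actual SGO CSV headers:

```python
# Toy crash records illustrating why "equipped" != "engaged".
# Field names are hypothetical; the real NHTSA SGO data uses
# different column headers and many more fields.
crashes = [
    {"id": 1, "adas_equipped": True, "adas_engaged": True},
    {"id": 2, "adas_equipped": True, "adas_engaged": False},
    {"id": 3, "adas_equipped": True, "adas_engaged": None},  # unreported
]

# Counting every crash of an "equipped" car overstates the system's role.
equipped = sum(1 for c in crashes if c["adas_equipped"])

# Only rows where the system was actually engaged implicate the system.
engaged = sum(1 for c in crashes if c["adas_engaged"] is True)

print(equipped, engaged)  # 3 1
```

Same spreadsheet, very different headline depending on which column you count.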
Tesla’s AI is trained on crowdsourced data from its drivers, who are assholes who break the law. Tesla “self driving” will NEVER be safe.
It’s actually extremely straightforward to mark lanes as “bus-only” when you map out roads, but that’s assuming you’re not being micromanaged by a moronic egomaniac.
It’s actually extremely straightforward to mark lanes as “bus-only” when you map out roads
Yeah… until they change. And municipalities are not known for thoroughly documenting their changes, nor companies keeping their info up to date even if those changes are provided.
Sooooo … You people don’t have signs? Because my car can read those.
If the road is marked as bus only, surely it’s still bus only? That’s how it works here at least, road signs etc. are applicable until removed. Doesn’t matter if it’s still there because government forgot to remove something. You obey the signs because they keep the roads safe.
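For what it’s worth, per-lane bus restrictions are already representable in map data - OpenStreetMap, for instance, has a per-lane tagging scheme. A minimal sketch, with tag names following OSM conventions as I understand them and an invented sample way:

```python
# Sketch: deciding which lanes a private car may use from OSM-style
# per-lane tags. Tag names ("lanes", "bus:lanes", value "designated")
# follow OpenStreetMap conventions; the sample data is made up.

def drivable_lanes(tags: dict) -> list:
    """Return one bool per lane: True if a private car may use it."""
    lane_count = int(tags.get("lanes", 1))
    bus_lanes = tags.get("bus:lanes")
    if bus_lanes is None:
        # No per-lane bus tagging: assume all lanes are general-purpose.
        return [True] * lane_count
    # In OSM's per-lane syntax, "designated" marks a bus-only lane.
    return [v != "designated" for v in bus_lanes.split("|")]

# Example: three lanes, rightmost reserved for buses.
way = {"lanes": "3", "bus:lanes": "yes|yes|designated"}
print(drivable_lanes(way))  # [True, True, False]
```

Of course, as the thread points out, the hard part isn’t representing the restriction - it’s keeping the data current when municipalities change lanes without telling anyone.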
I wouldn’t set foot in anything associated with Musk after he sold out the country to the Russians to enrich himself. The man could not spend his money in a million lifetimes, and what does he do to us? Grabs more.
First video was from somewhere around Toronto, Canada (my guess is Mississauga). Some places have very unclear signage about when bus lanes are active and when they’re not, sometimes disappearing in and out of existence, but that was a clearly marked red lane - combined with lane data, the Tesla’s camera-based navigation system should have been equipped enough to handle it…
Sucks for the driver letting Tesla net him a fine and demerit points.
Sucks for the driver letting Tesla net him a fine and demerit points.
Nah, they are also responsible for not correcting the situation when the defective self-driving feature put them in a bus lane. Same as if they let their cruise control rear-end another car.
I agree with you that often the signage/lane marking suck and it takes a bit to figure it out. But me as a human, I’ll remember the next time I travel that route. The Tesla continues to have zero prior knowledge every time.
oh hello Betteridge’s law of headlines
“Can Kay Write a Decent Headline?”
yes as it turns out!
Oh thank you :D