Driver just prevents Tesla from running into an IM train

Apparently Tesla software cannot detect items moving side to side? So what happens when some vehicle busts through a stop light in front of your Tesla?

Tesla owner says his car’s ‘self-driving’ technology didn’t detect moving train (nbcnews.com)

Tesla will not be able to dispute the video evidence. The video seems to indicate the driver turned the car into the crossing gate pole? Tell it to the judge?

I notice that the headlights were not bright enough to get the reflective strips on the cars to ‘flash’, and the reflective strips seemed to be ‘covered’ by the gates in the down position.

I don’t know what algorithm Tesla is using to identify something passing that is moving at roughly 90 degrees relative to the Tesla. Is it looking for the reflective strips to identify something moving across the path of the vehicle?
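For what it’s worth, here is one classical way a camera system could flag that kind of cross traffic: look for regions of strong sideways optical flow between frames. This is purely a sketch of the general technique in Python/OpenCV; the function name and thresholds are invented, and it is not a claim about what Tesla actually runs.

```python
# Hypothetical sketch: flag cross-traffic via horizontal optical flow.
# This is NOT Tesla's algorithm, just one classical way a vision system
# could notice something moving ~90 degrees across the car's path.
import cv2
import numpy as np

def crossing_motion_detected(prev_frame, frame,
                             flow_thresh=2.0, area_thresh=0.02):
    """Return True if a large image region shows mostly sideways motion."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Dense optical flow: flow[..., 0] is horizontal motion in pixels,
    # flow[..., 1] is vertical motion.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    horiz = np.abs(flow[..., 0])
    vert = np.abs(flow[..., 1])

    # Pixels whose motion is fast and dominantly sideways (a train or
    # cross traffic) rather than radial (normal forward driving).
    sideways = (horiz > flow_thresh) & (horiz > 2.0 * vert)

    # Trigger if enough of the frame is moving laterally.
    return sideways.mean() > area_thresh
```

Whatever the system actually keys on, reflective strips or raw motion, fog and low contrast would starve a scheme like this of signal, which may be part of what happened here.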

https://www.tesla.com/support/autopilot

Apparently the driver does not understand the named driving features of Tesla, and is overestimating the abilities of these various driver assistance features. As I understand it, there is no legal technology that allows a driver to set the vehicle to make a trip and then hop into the back seat and take a nap, although that is the product vision for the future. Musk refers to that as “Autonomous Driving.” That is the goal, but we are not there yet. However, it is easy for purchasers of Teslas to assume they have purchased the real deal, and crashes have resulted. Autopilot and Full Self-Driving are not what they imply to many purchasers of Teslas.

Seems like Autopilot and Full Self-Driving are not named properly and are likely to lose the legal case, as “a pound of flesh” did. “Driver assist” seems more descriptive.

Doubtful. From the page you linked.

“Autopilot and Full Self-Driving capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.”

What judge? Why would they dispute the evidence? What are you talking about?

Here’s how the Tesla ‘driver assist’ system sees a train at a railroad crossing. Note that this video was uploaded to YouTube over a year ago.

https://www.youtube.com/watch?v=V_v7bZfNapE

Given Musk’s dislike of everything train related I suppose it’s not surprising that his software design team made a system that isn’t capable of comprehending a railroad crossing.

Hope it doesn’t see a gap in all that ‘truck traffic’ and then try to drive through it…

https://www.theverge.com/2024/4/26/24141361/tesla-autopilot-fsd-nhtsa-investigation-report-crash-death

There is a lot of controversy surrounding Tesla’s marketing of Full Self-Driving and Autopilot. Naturally buyers will easily believe these are feature names for the actual hands-free, 100% automatic driving that is implied by the Tesla brand.

The position of Musk seems to be that the names are just words for features and not specific enough to establish exactly what they mean. For the precise information on what the vehicle is capable of, one must read the owner’s manual and abide by it.

I believe there is litigation underway over who is to blame for recent crashes and deaths surrounding this issue.

Only one of my several friends who own Teslas really knows how to drive. And he is the only one who did not order the “autopilot” option. The rest are scary without their autopilot. Now I can’t tell whether they became dumb drivers after they started relying on the autopilot or whether they ordered the autopilot because they knew they were lousy drivers.

My little town in Maryland now has a Tesla as one of its marked Police Cars.

First thought that came to mind was whether the ‘self-driving’ software package had a map database with locations of RR grade crossings. Second thought was why the self-driving software did not recognize the flashing lights at the crossing gate; I’ve seen driver assist packages that will recognize speed limit signs, not to mention pedestrians on the side of the road.
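A lookup like that first idea would be simple to sketch. Assuming a hypothetical list of crossing coordinates (the FRA does publish a national grade crossing inventory, though I have no idea what Tesla’s map data contains), a rough Python version might look like this:

```python
# Hypothetical sketch of a grade-crossing alert from a map database.
# The crossing list and alert logic are invented for illustration;
# no claim is made about what Tesla's map stack actually checks.
import math

# (lat, lon) pairs, e.g. loaded from the FRA grade crossing inventory.
GRADE_CROSSINGS = [
    (39.2904, -76.6122),   # example coordinates only
    (39.4143, -77.4105),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def crossing_ahead(lat, lon, alert_radius_m=300.0):
    """True if any known grade crossing is within the alert radius."""
    return any(haversine_m(lat, lon, clat, clon) <= alert_radius_m
               for clat, clon in GRADE_CROSSINGS)

# e.g. if crossing_ahead(fix.lat, fix.lon): slow down and watch the gates.
```

Even without the cameras recognizing the flashers, a check like this could force a speed cap and extra caution near every known crossing.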

Tesla sure ships a lot of their vehicles by rail for their CEO to “dislike everything rail related”.

The driver should get a ticket for driving into the gate mechanism, resulting in property damage to the RR, as the car damage proves. Tell it to the judge and try to get away with no fine or points!

Just some charges.

1. Driving too fast for conditions (fog).

2. Distracted driving; suspicious because he did not recognize until too late that a collision was imminent.

3. Failure to yield to the train.

4. Damage to private property in excess of $1k, or whatever the threshold is in that jurisdiction.

5. Failure to properly control the car.

Perhaps others can think of more possible charges.

I never said he wasn’t financially savvy, and even Musk can’t deny that rail freight is currently the cheaper option for shipping new cars.

But I’m sure Elon views this as a temporary situation until his self-driving semis are able to take over. That is, if he even knows how his new cars get shipped across the continent.

Blue Streak 1:

How about adding reckless driving.

All those items are violations of the law, but if Musk is found to be liable, it could shift the blame onto him and off of the driver. It depends on whether the driver is found at fault for not complying with the vehicle manual, or whether Musk is found guilty of misleading the market by implying that the Tesla would take full control of the driving.

Euclid:

I agree, and very well stated.

Great, and none of that would include Tesla for them to “dispute” the evidence. It would fall on the driver.

Tesla is still fighting 4 very serious fatal accidents that happened with their self-driving programs, where their cars literally drove under semi trailers in various states. Somehow the radar they used could not tell the difference between the side of a 53-foot trailer and the sky.

TESLA FULL SELF-DRIVING/AUTO PILOT vs. AUTONOMOUS DRIVING

The futuristic vision of autonomous driving is to have Artificial Intelligence drive motor vehicles on public roads and never make a mistake. A large market enthusiastically welcomes this futuristic vision, but it has yet to arrive. So in the meantime, the vision is delivered incrementally, in a way that makes car owners feel like they have it even though they don’t. In a way, it becomes a status symbol for an advanced feature that is itself only a symbol: a symbolic magic wand to eliminate car accidents caused by human error.

But for now, what could be more dangerous than people driving cars having make-believe safety features?

The way this pretend self-driving works is that drivers are allowed to rely on a system of somewhat automatic driving, but they must be alert and ready to take over hands-on manual driving if the automatic system gives any hint of failure to do the job that it is intended (or erroneously expected) to do. And there is also an underlying assumption that the automatic system can and will give such a hint in time for the driver to react.
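That handoff assumption amounts to a little state machine. Here is a toy sketch, entirely my own framing (the states, timings, and names are invented, not Tesla’s), of the alert-and-escalate loop such systems are supposed to run:

```python
# Toy sketch of the attention nag-and-escalate loop. States and timings
# are invented for illustration; they are not any real system's logic.
from enum import Enum, auto

class Handoff(Enum):
    ASSISTING = auto()    # system steering, driver presumed attentive
    NAGGING = auto()      # attention lost, warn with chimes and flashing
    DISENGAGED = auto()   # warnings ignored, control handed back

def handoff_state(hands_on_wheel: bool, seconds_inattentive: float,
                  nag_limit: float = 10.0) -> Handoff:
    """Decide what the assist system should be doing this tick."""
    if hands_on_wheel:
        return Handoff.ASSISTING
    if seconds_inattentive < nag_limit:
        return Handoff.NAGGING
    return Handoff.DISENGAGED
```

The whole safety case rests on that middle warning step firing early enough for a human to re-engage, which is exactly the assumption being questioned here.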