Hacker News

> In my opinion, this is yet another case of people being lulled into a false sense of security that the car drives itself, when in reality the Tesla's driver should have been attentive and should have reacted.

In every one of these stories, I fail to imagine how the driver can possibly stay in a state of "active attention" 100% of the time while not actively doing anything. That's not how most of the human race works.

At that point it's a lot easier to just actively drive than to have to make a sudden split-second decision while dulled by inaction. I feel like partially autonomous driving just doesn't make sense at all.



It's also why airplane pilots do much more than they theoretically would have to – under normal conditions, a modern autopilot should be able to handle the whole flight without human supervision.

But then a human won't react fast enough when needed to handle an unforeseen problem, so pilots are kept in the loop and active, just in case.


I wonder how many flaps you could possibly check on your Tesla during your commute (I'm being sarcastic).

Also, I'd imagine that an airplane is, from a totally theoretical and simplistic point of view, much easier for a computer to manage, if only because it has a lot of space to maneuver and constantly knows what other airplanes are doing.

But driving a car? How can a computer possibly handle a scenario where edge cases make up a good 85% of every trip, unless you drive in the desert?


It doesn't have to be a 1:1 replacement. The computer has advantages that can make up for its lack of brains: lower latency, no distraction, better sensors, better math when it comes to motion ("objects in the mirror may be closer than they appear"), and more training than a human will ever experience.
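To illustrate the "better math when it comes to motion" point: a computer can continuously recompute something like time-to-collision from sensor data and react within milliseconds, while humans judge closing speeds poorly and add a second or more of reaction time. This is a minimal sketch of that idea, not any manufacturer's actual algorithm; the function name and numbers are hypothetical.

```python
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed.

    Returns infinity when the gap is holding steady or opening.
    """
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

# Following a car 30 m ahead while closing at 10 m/s (36 km/h faster):
ttc = time_to_collision(30.0, 10.0)
print(f"Time to collision: {ttc:.1f} s")  # 3.0 s
# A typical human needs roughly 1-1.5 s just to start braking;
# a computer re-evaluating this every frame can react almost immediately.
```

The point isn't that the formula is clever (it's just distance over speed); it's that a machine evaluates it tens of times per second without ever getting bored.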


I totally agree. But also, critically, an autopilot is precisely expected to have "a brain"; it's implicit in the name.

Maybe we should just agree that autopilot, in its common interpretation, is currently just a sci-fi concept, and communicate more honestly what a computer inside a car can and can't do.

Apparently, dodging weird obstacles is something it struggles with. Which is a problem, since all driving is, essentially, dodging weird obstacles.


I agree with you. So maybe the system could expect the driver to actually drive.

I've never driven a Tesla, but I've seen a similar thing in a Citroën I rented once. On the highway, with cruise control, the car could practically drive itself. It would follow the lane, slow down if cars in front got too close, accelerate back to the set speed when they went away, etc.

Once or twice I took my hands completely off the wheel, and pretty quickly the car would start complaining. So with this system, the driver actually had to stay engaged.


I'm pretty sure a Tesla also starts alerting and alarming if you take your hands off the wheel.





