
Tesla on Autopilot Drove 7 Miles With Drunk Driver Sleeping

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Dec 4, 2018.

  1. tunejunky

    tunejunky Master Guru

    Messages:
    686
    Likes Received:
    191
    GPU:
    gtx 1070 / gtx 1080ti

    This asshat is (soon to be was) chairman of the planning commission in Los Altos (a super-rich Silicon Valley town... even more so than where I live).
    Plebes need not bother looking for a place to live there... upper management and venture capital exclusively (except for their millennial kids).
     
    K.S. likes this.
  2. The Laughing Ma

    The Laughing Ma Ancient Guru

    Messages:
    3,573
    Likes Received:
    293
    GPU:
    WC 980GTX
    Do commercial aircraft use over-the-air updates for software? I honestly don't know, but if they require physical access to the systems for any kind of software update, that limits who can do what to them, while it's clear that car makers are going for over-the-air updates. And while there may be some question about some recent air crashes and their causes, there have been quite a few incidents, as well as crashes, caused when autonomous systems on aircraft have gone wrong. Some were only recovered because there was a person in the cockpit who could take control, an option that self-driving car makers seem to want to remove entirely.
     
    tunejunky likes this.
  3. K.S.

    K.S. Maha Guru

    Messages:
    1,054
    Likes Received:
    81
    GPU:
    1080 Ti SEA HAWK
    Ironic when you consider that flight is statistically the safest way to travel - you'd think we'd mirror the control scheme at the very least.

    Yes, pilots are trained and screened under intense scrutiny, the sky has fewer planes, the ground has more vehicles, etc.

    Still, a point is a point, and "pilots are trained under more scrutiny" is an interesting one - perhaps drivers should be too! The USA does hand out licenses like candy...

    Oh god... sounds like a real gene pool winner.
     
  4. D3M1G0D

    D3M1G0D Maha Guru

    Messages:
    1,298
    Likes Received:
    663
    GPU:
    2 x GeForce 1080 Ti
    Human intervention isn't always a good thing. There was a case (Colgan Air Flight 3407) where human intervention actually led directly to a crash. The plane stalled due to low speed and the anti-stall system kicked in, lowering the plane's nose to regain speed. However, the pilot repeatedly pulled the nose back up (probably thinking the plane was diving toward the ground), fighting the automated system that was trying to save it. After repeated stalls the plane became unrecoverable and crashed.

    There is early evidence that humans are the leading cause of self-driving car accidents. The idea that humans make things safer is due to our own ego and hubris more than anything else. Although human intervention can sometimes correct errors in automation, it's more likely to cause problems instead. Even though I'm a good driver, a computer can probably drive better than me from a safety standpoint.
     

  5. K.S.

    K.S. Maha Guru

    Messages:
    1,054
    Likes Received:
    81
    GPU:
    1080 Ti SEA HAWK
    I just find that a little bit ironic - I sincerely don't intend any offense, I just mean that people can be prideful and, in their pride, incorrectly build an autonomous system. That's the irony. (My way of trying to put it delicately.)

    They may also build it perfectly and error-free, in which case the autonomous system resolves all "human error". But entirely removing the option of a human "fallback layer", as it were, is in my eyes the pure definition of hubris (just my opinion - not trying to offend or argue, I just found it ironic).
     
  6. The Laughing Ma

    The Laughing Ma Ancient Guru

    Messages:
    3,573
    Likes Received:
    293
    GPU:
    WC 980GTX
    The first part of that sentence doesn't make a huge amount of sense - how else do you correct a failure of an autonomous system? The system has failed; it isn't going to correct itself, which means the only way around it is for someone to step in and correct it themselves. If the failure is going to lead to a crash, how exactly will a human stepping in make things worse? Will it crash more?

    The link doesn't load, but at this point in time we're in a weird situation where self-driving cars and the associated tech are not good enough. They crash, and they cause crashes - that's a fact. The problem is that they instil a level of confidence in idiots that the vehicle can deal with any and all situations, so the driver stops paying the attention they would have paid had they been in full control. The result is that when the auto systems encounter a situation they cannot deal with - and they do - the driver is not in a state of awareness that allows them to take control and avoid the accident.

    To put it simply, when you encounter people so stupid they can't turn their own headlights on, parroting "the car does it for me", then we either need self-driving vehicles that are 100% perfect 100% of the time, or we need to stop putting in systems that encourage idiots to give up control and stick to passive systems that sit in the background, designed to assist fully aware drivers. I mean, if you call your half-arsed autonomous driving tech 'Autopilot', is it any wonder morons get in the car thinking it will drive itself and then get killed when it crashes into something?
     
  7. D3M1G0D

    D3M1G0D Maha Guru

    Messages:
    1,298
    Likes Received:
    663
    GPU:
    2 x GeForce 1080 Ti
    As the article I posted indicates (here's another link), the answer is yes. Human interaction was responsible for 81 of 88 accidents involving self-driving cars. The idea that human action will rescue a dangerous situation is, like I said, human hubris - we like to think we are more capable than machines.

    Not good enough? Self-driving cars have proven to be FAR safer than human drivers (if that's not good enough, then what does that make us?). Traffic accidents are still one of the leading causes of death and injury - google the topic and you'll find there are about 6 million automobile accidents a year in the US, and some 1.3 million traffic fatalities worldwide. If autonomous vehicles cut even half of those figures it would be more than worth it (even one less death would be a win).

    From what I can see, self-driving cars would not only be far safer but would also allow a lot more leisure time. That half-hour drive to and from work could be put to better use, like catching up on sleep, eating, reading, gaming, browsing or any number of other tasks. I don't understand why some people are so opposed to something that will vastly improve our lives.
     
  8. The Laughing Ma

    The Laughing Ma Ancient Guru

    Messages:
    3,573
    Likes Received:
    293
    GPU:
    WC 980GTX
    I think some clarity needs to be applied here. Level 2 AVs ARE far safer than a human alone, but only IF the human in charge of the vehicle is paying attention. As I stated, the issue with a lot of drivers and certain car makers is that they are instilling drivers with a level of confidence in their cars that is actually making them less safe. Name your AV system 'Autopilot' and you get morons thinking the car can drive them home drunk, or climbing into the passenger seat in the middle of a busy highway. The question is not how many accidents the systems have avoided, but rather how many of the accidents that HAVE happened while the systems were engaged could have been avoided if the 'idiot' behind the wheel had been paying attention.

    Until all cars are Level 4 or 5, we are going to have this weird semi-state where idiots believe their vehicle can get them out of any trouble, no matter what, so they don't pay as much attention as they should. And like I said, once you have Level 4 and Level 5, people who can't drive or are unsuitable to drive are suddenly able to get 'behind the wheel'. What happens when every man and his dog is on the road with a self-driving vehicle? As I said, does the increase in safety and traffic flow brought about by self-driving cars get erased by virtue of having far more vehicles on the road?
     
  9. D3M1G0D

    D3M1G0D Maha Guru

    Messages:
    1,298
    Likes Received:
    663
    GPU:
    2 x GeForce 1080 Ti
    I have no idea where you are getting this from. I see no mention of user attentiveness being a factor - the only thing it says is that human action was responsible for most of the accidents for autonomous vehicles. The data is very clear on this: to make AVs safer, the human element needs to be removed. We are the problem, not the solution.

    The ideal state of self-driving cars is to simply get in the car and punch in an address/location. There won't be any steering controls or a designated driver and the inside of the car will be geared towards leisure time. It'd be like riding a train or plane, just sit back and relax as the car takes you to your destination.
     
  10. RealNC

    RealNC Ancient Guru

    Messages:
    2,592
    Likes Received:
    841
    GPU:
    EVGA GTX 980 Ti FTW
    Early days still. They will most probably add failsafes that detect drivers who have fallen asleep or are otherwise incapacitated, and pull over and stop.

    Or the car could have a built-in breathalyzer that tests your alcohol level before you can drive :D
     
