New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

  • daikiki@lemmy.world
    1 year ago

    I have a lot of trouble understanding how the NTSB (or whoever’s ostensibly in charge of vetting tech like this) is allowing these not-quite self driving cars on the road. The technology doesn’t seem mature enough to be safe yet, and as far as I can tell, nobody seems to have the authority or be willing to use that authority to make manufacturers step back until they can prove their systems can be integrated safely into traffic.

    • SpaceNoodle@lemmy.world
      1 year ago

      It’s just ADAS - essentially fancy cruise control. There are a number of autonomous vehicle companies who are carefully and successfully developing real self-driving technology, and Tesla should be censured and forbidden from labeling their assistance software as “full self-driving.” It’s damaging the real industry.

    • Not_Alec_Baldwin@lemmy.world
      1 year ago

      It’s not “not-quite-self-driving” though, it’s literal garbage. It’s cruise control, lane assist and brake assist. The robot vision in use is horrible.

      There are Tesla engineers openly badmouthing the system.

      Musk is a scammer and they need to issue an apology for all of the claims around autopilot, probably pay a great deal of money, and then change the name and advertising around it.

      Oh, and also this guy should never drive again.

  • CaptainProton@lemmy.world
    1 year ago

    This is stupid. Teslas can park themselves, they’re not just on rails. It should be pulling over and putting the flashers on if a driver is unresponsive.

    That being said, the driver knew how this system behaves, acted with wanton disregard for safe driving practices, and so the incident is the driver’s fault and they should be held responsible for their actions. It’s not the court’s job to legislate.

    It’s actually the NTSB’s job to regulate car safety, so if they don’t already have it, Congress needs to grant them the authority to regulate what AI behavior is acceptable and to define safeguards against misbehaving AI.

    • chris2112@lemmy.world
      1 year ago

      The driver is responsible for this accident, but Tesla should still be liable imo for all the shady and outright misleading advertising around their so-called “self driving”. Compare Tesla’s marketing to, say, GM’s or Hyundai’s - both of which essentially have feature parity with Tesla’s system - and you’ll see a big difference.

    • socsa@lemmy.ml
      1 year ago

      There’s no way the headline is true. Zero percent. The car will literally do exactly what you stated if it goes too long without driver engagement and I’ve experienced it first hand.

      • doggle@lemmy.world
        1 year ago

        The headline doesn’t state that the warnings were consecutive.

        Perhaps the driver was just aware enough to keep squelching warnings and prevent the car from stopping altogether?

        I’ll grant you, though, 150 warnings is still a little tough to believe…

      • lapommedeterre@lemmy.world
        1 year ago

        Evidently, he was aware enough to respond to the alerts, per the logs (as stated in the WSJ video that’s in the article). It shows a good bit of the footage, too.

        Seems like they need something better for awareness checking than just gripping the wheel and checking where your eyes are pointed. And obviously better sensors for object recognition.

    • doggle@lemmy.world
      1 year ago

      Sounds like the injured officers are suing. It’s a civil case, not a criminal one, so I’m not sure how much the court would actually be asked to legislate. I’d be interested to hear their arguments, though I’m sure part of their reasoning for suing Tesla over the driver is that Tesla has more money.

  • zerbey@lemmy.world
    1 year ago

    150 more warnings than a regular car would give, ultimately it’s the driver’s fault.

      • sugartits@lemmy.world
        1 year ago

        The driver was responding. If he didn’t respond the car would have stopped.

        If this was a normal car he probably would have just crashed earlier.

      • CmdrShepard@lemmy.one
        1 year ago

        Do we have any evidence from the driver stating that he didn’t realize he was using a glorified cruise control similar to autopilot on an airplane?

        • wearling0600@lemmy.world
          1 year ago

          Where I live you can right now go to Tesla’s website and buy a car with “Full Self-Driving Capability” with a small print that includes the disclaimer that it doesn’t make the vehicle autonomous, for whatever that’s worth…

          • PR_freak@programming.dev
            1 year ago

            FSD is a paid feature that I assume was not being used during the accident; autopilot was being used.

            • wearling0600@lemmy.world
              1 year ago

              Ah I see, now that you’ve been proven wrong you’re pretending you asked a different question.

              You admit that Tesla advertises a “Full Self-Driving Capability” feature, which is basically what the person you said “source or stfu” to was claiming.

              Whether or not the feature was used in this instance is not what we’re discussing here.

              We can have this discussion if you’re feeling like you’re up for it in good-faith, I think both are true that people are overall terrible at the activity of driving so more driver aids are overall better, but also current driver aids are very limited and drivers are not necessarily great at understanding and working within those limits.

              They’re not the only ones, but Tesla is really the worst offender at overstating their cars’ capabilities and setting people up for failure - like in this case.

              • PR_freak@programming.dev
                1 year ago

                Yes you’re right

                What was used in this accident had nothing to do with my question, and yes, it looks like Tesla’s advertising is misleading.

    • Obi@sopuli.xyz
      1 year ago

      You’re completely right and I’ve never seen this for traffic stops in Europe, they’ll make you park somewhere safe, at the very worst, in the emergency lane, but even that is rare for traffic stops. The only times I see lanes blocked is when there’s been an accident/breakdown and then the first thing they do is bring massive light panels well ahead of the spot to make everyone clear the lane.

  • thatKamGuy@sh.itjust.works
    1 year ago

    Driver is definitely the one ultimately at fault here, but how is it that Tesla doesn’t perform an emergency stop in this situation - but just barrels into an obstacle?

    Even my relatively ‘dumb’ car with adaptive cruise control handles this type of situation better than Tesla?!

  • hark@lemmy.world
    1 year ago

    Setting aside the driver issue, isn’t this another case that could’ve been prevented with LIDAR?

  • Md1501@lemmy.world
    1 year ago

    You know what might work: program the car so that after the second unanswered alert, the autopilot pulls the car over, or reduces speed and turns on the hazards. On the third violation, autopilot is disabled for that car for a period of time.
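    A minimal sketch of that escalation policy (hypothetical thresholds, names, and return values - an illustration of the proposal above, not Tesla’s actual logic):

```python
# Hypothetical driver-attention escalation policy, per the suggestion above.
# Thresholds and action names are illustrative assumptions, not Tesla's design.

class AttentionEscalation:
    def __init__(self, lockout_strikes: int = 3):
        self.unanswered = 0              # consecutive unanswered alerts
        self.strikes = 0                 # violations accumulated this drive
        self.lockout_strikes = lockout_strikes
        self.autopilot_locked = False    # disabled for a period after lockout

    def on_alert(self, driver_responded: bool) -> str:
        """Return the action to take after an attention alert fires."""
        if driver_responded:
            self.unanswered = 0
            return "continue"
        self.unanswered += 1
        if self.unanswered == 1:
            return "warn_again"
        # Second unanswered alert: this counts as a violation ("strike").
        self.strikes += 1
        self.unanswered = 0
        if self.strikes >= self.lockout_strikes:
            self.autopilot_locked = True
            return "pull_over_and_lock"  # third strike: disable autopilot
        return "pull_over"               # slow down, hazards on, pull over
```

The point of the state machine is that ignoring alerts escalates to a physical response (pulling over) rather than just more beeping, and repeated abuse locks the feature out entirely.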

    • Technoguyfication@lemmy.ml
      1 year ago

      This is literally exactly how it works already. The driver must have been pulling on the steering wheel right before it gave him a strike. The system will warn you to pay attention for a few seconds before shutting down. Here’s a video: https://youtu.be/oBIKikBmdN8

        • stealin@lemmy.world
          1 year ago

          The rule with cars is that you don’t distract the driver from driving; a system that takes over driving does exactly that, so the idea is flawed to begin with.

          • Technoguyfication@lemmy.ml
            1 year ago

            I have to say this is extremely inaccurate imo. Self driving takes over the menial tasks of keeping the car in the lane, watching the speed, etc. and allows an attentive driver to focus on more high level tasks like looking at the road ahead, watching the sides of the road for potential hazards, and keeping more aware of their blind spots.

            Just because the feature can be abused does not inherently make it unsafe. A drunk driver can use cruise control to more accurately control the vehicle’s speed and avoid a ticket, does that make it a bad feature? I wouldn’t say so.

            Autopilot and other driver assist systems are good when used responsibly and cautiously. It’s frustrating to see people cause an accident after misusing the system and blame the technology instead. This is why we can’t have nice things.

            • NeoNachtwaechter@lemmy.world
              1 year ago

              It’s frustrating to see

              This is why we can’t have nice things

              It is also frustrating to see people whining about technology when they should rather be thinking about the injured policemen and rescuers.

              You should get your priorities straight if you ever hope to be taken seriously.

      • NeoNachtwaechter@lemmy.world
        1 year ago

        The system will warn you to pay attention

        … and if we have learned anything from that incident, it is that the warnings have been worthless.

        The system can be tricked even by the worst drunkards! 150 times in a row.

        for a few seconds before shutting down.

        Few seconds are not enough. The crash was already unavoidable.

        • Technoguyfication@lemmy.ml
          1 year ago

          You’re misinterpreting what I said and conflating two separate scenarios in your 2nd statement. I didn’t say anything about the system warning “for a few seconds before shutting down” in the event of an imminent collision. It warns the driver before shutting down if the driver fails to hold the steering wheel during normal driving conditions.

          The warnings were worthless because the driver kept responding to them just before they timed out and shut autopilot down. It would be even worse if the car immediately pulled off the road and stopped in traffic without warning the driver first.

          They aren’t subtle either, after failing to touch the wheel for about 5-10 seconds it starts beeping loudly and flashing an icon on the screen.

          This is not a case of autopilot causing an accident, this is a case of an impaired driver operating a vehicle when they should not have been. If the driver was using standard cruise control, would we be blaming the vehicle because their foot wasn’t touching the accelerator when the accident happened? No, we wouldn’t.

          • NeoNachtwaechter@lemmy.world
            1 year ago

            This is not a case of autopilot causing an accident, this is a case of an impaired driver

            It is both, of course. The drunkard and the autopilot each contributed their share to creating the danger.

            Driving drunk is already forbidden.

            What Tesla has brought on the road here should be forbidden as well: lane assist combined with adaptive cruise control AND such a bunch of blind sensors.

            • Iheardyoubutsowhat@lemmy.world
              1 year ago

              The driver was on autopilot. Autopilot is cruise control plus lane assist; it’s not FSD. Tesla didn’t bring that “to the road”. The driver was drunk, and as with most autopilot or FSD accidents, it’s user error.

              I’m still unaware of any proven FSD accident.

    • HalcyonReverb@midwest.social
      1 year ago

      I drive a Ford Maverick that is equipped with adaptive cruise control, and if I were to get 3 “keep your hands on the wheel” notifications, it deactivates adaptive cruise until the vehicle is completely turned off and on again. It blew my mind to learn that Tesla doesn’t do something similar.

  • EndOfLine@lemmy.world
    1 year ago

    Officers injured at the scene are blaming and suing Tesla over the incident.

    And the reality is that any vehicle on cruise control with an impaired driver behind the wheel would’ve likely hit the police car at a higher speed. Autopilot might be maligned for its name but drivers are ultimately responsible for the way they choose to pilot any car, including a Tesla.

    I hope those officers got one of those “you don’t pay if we don’t win” lawyers. The responsibility ultimately resides with the driver and I’m not seeing them getting any money from Tesla.

  • Snapz@lemmy.world
    1 year ago

    This source keeps pushing tesla propaganda. There’s always an angle trying to sell that it wasn’t the tesla’s fault

  • N3Cr0@lemmy.world
    1 year ago

    Poor drunk impaired driver falling victim to autonomous driving… Hopefully that driver lost their license.

    • Cyber Yuki@lemmy.world
      1 year ago

      That doesn’t solve the problem of autopilot not making the right choices. What if the driver wasn’t drunk, but had a heart attack? What if someone put a roofie in their drink? What if the driver was diabetic or hypoglycemic and suffered a drop in blood glucose? What if they had a stroke?

      Furthermore, what if the driver got drunk BECAUSE the car’s AI was advertised as being able to drive for you? Think of the false publicity.

      If your AI can’t handle one simple case of a driver being unresponsive, that’s negligence on the company’s part.

      • CmdrShepard@lemmy.one
        1 year ago

        How could the company be negligent if someone gets drunk or has a heart attack and crashes their car? No company has a Level 5 autonomous vehicle where no human intervention is needed. Tesla is only Level 2. Mercedes has a Level 3 option (in extremely limited conditions). Waymo claims Level 4 but is geofenced.
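        The SAE levels mentioned here can be summarized in a quick lookup (level descriptions paraphrased from SAE J3016; the vendor examples are the ones claimed in this comment):

```python
# SAE J3016 driving-automation levels, paraphrased, with the examples
# named in the comment above (vendor classifications as claimed there).
SAE_LEVELS = {
    0: "No automation - warnings and momentary assistance only",
    1: "Driver assistance - steering OR speed control",
    2: "Partial automation - steering AND speed, driver must supervise",   # Tesla
    3: "Conditional automation - driver must take over on request",        # Mercedes (limited)
    4: "High automation - no driver needed inside a geofenced domain",     # Waymo
    5: "Full automation - no human intervention needed anywhere",          # nothing shipping
}

def requires_human_supervision(level: int) -> bool:
    """At Level 2 and below, a human must actively supervise at all times."""
    return level <= 2
```

Which is the crux of the liability argument: at Level 2, responsibility never leaves the driver.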

  • Jordan Lund@lemmy.one
    1 year ago

    Don’t see how that’s a Tesla problem… Drunk/high driver operating their car incorrectly.

        • coffeebiscuit@lemmy.world
          1 year ago

          It was on autopilot, so technically the drunk wasn’t driving it. But he is the one responsible.

          • Jordan Lund@lemmy.one
            1 year ago

            Autopilot doesn’t work that way, the drunk should have known that when he wasn’t drunk and not tried to use it that way.

            It’s like the old shaggy dog story about the guy driving a camper, setting the cruise control, then going into the back to make lunch.

            That’s not the fault of the cruise control.

  • Pablo@lemmy.world
    1 year ago

    It’s also so misleading that Tesla uses the word Autopilot for what is basically adaptive cruise control and lane assist.

    • Thorny_Thicket@sopuli.xyz
      1 year ago

      A Tesla on autopilot/FSD is almost 4 times less likely to be involved in a crash than a human-driven Tesla, which even then is half as likely to end up in an accident compared to the average car. You not liking Musk fortunately doesn’t change these facts.

      In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

      Source
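      Taking those quoted figures at face value, the implied ratios are straightforward to check (they are miles per crash, so higher is better; they are Tesla’s own unaudited numbers and aren’t adjusted for demographics or road mix):

```python
# Miles per crash, as claimed in Tesla's Q2 vehicle safety report
# (quoted above; taken at face value here, not independently verified).
AUTOPILOT_MILES_PER_CRASH = 4_410_000
TESLA_NO_AP_MILES_PER_CRASH = 1_200_000
US_AVG_MILES_PER_CRASH = 484_000

ap_vs_no_ap = AUTOPILOT_MILES_PER_CRASH / TESLA_NO_AP_MILES_PER_CRASH
no_ap_vs_avg = TESLA_NO_AP_MILES_PER_CRASH / US_AVG_MILES_PER_CRASH

print(f"Autopilot vs. no Autopilot: {ap_vs_no_ap:.1f}x")    # ~3.7x, i.e. "almost 4 times"
print(f"Tesla (no AP) vs. US average: {no_ap_vs_avg:.1f}x")  # ~2.5x
```

Note that one confound is baked in: Autopilot miles skew heavily toward highway driving, where crashes per mile are lower for every car.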

      • tiny_electron@sh.itjust.works
        1 year ago

        There is a bias in these numbers. Teslas are expensive and not everyone is buying them, so the lower accident rate can be explained by the different demographic driving the vehicle rather than by Teslas being better. For example, younger people might be more likely to cause accidents because of various factors, and they are also less likely to buy a Tesla because Teslas are so expensive. I don’t have the numbers for this, but we should all be careful with Tesla’s safety claims when they compare themselves to the global average.

        • Thorny_Thicket@sopuli.xyz
          1 year ago

          Sure. There are always multiple factors in play. However I’d still be willing to bet that there’s nothing in Teslas that makes them inherently unsafe compared to other cars.

        • Thorny_Thicket@sopuli.xyz
          1 year ago

          Perhaps. I’m sure you’ll provide me with the independent data you’re basing that “Teslas are not safe” claim on

            • Thorny_Thicket@sopuli.xyz
              1 year ago

              The Tesla Model Y scored the highest possible rating in IIHS crash testing, as well as 5 stars on Euro NCAP.

              Their other models have similar results. I believe the Model X is the safest SUV ever made.

              EDIT:

              “More than just resulting in a 5-star rating, the data from NHTSA’s testing shows that Model X has the lowest probability of injury of any SUV it has ever tested,” Tesla said in a statement. “In fact, of all the cars NHTSA has ever tested, Model X’s overall probability of injury was second only to Model S.”

              Source

              Also might want to check this

              EDIT2: Imagine downvoting the guy providing hard evidence and upvoting the fanatic making baseless claims backed by nothing

                • Thorny_Thicket@sopuli.xyz
                  1 year ago

                  Or maybe you’re so blinded by the hatred towards Musk that you can’t even think straight and no evidence in the world could convince you otherwise?

                  You really should’ve checked the last link.

      • NeoNachtwaechter@lemmy.world
        1 year ago

        almost 4 times less likely to be involved in a crash than a human driven

        Not relevant at all here, when we are discussing occurences that seem so easily and obviously avoidable.

        (But it’s nice to see that the Fanboi team is awake now)

  • Peanut@sopuli.xyz
    1 year ago

    I still think Tesla did a poor job of conveying the limitations at the larger scale. They piggybacked on Waymo’s capability and practice without matching it, which is probably why so many people are over-reliant. I’ve always been against mass-producing semi-autonomous vehicles for the general public. This is why.

    And then this garbage is used to attack the general concept of autonomous vehicles, which may become a fantastic life-saver, because then it can safely drive these assholes around.