Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • @anlumo · -13 points · 11 months ago

    Everybody who has even a basic idea of what an autopilot in a plane actually does is not misled. Do people really think that commercial airline pilots just hit the “autopilot” button in their cockpit after disengaging the boarding ramp and then lean back until the boarding ramp at the destination is attached?

    • Einar · 18 points · edited · 11 months ago

      So I need to understand the autopilot of a plane first before I buy a car?

      I would be misled then, as I have no idea how such autopilots work. I also suspect that those two systems don’t really work the same way. One flies, the other drives. One has traffic lights, the other doesn’t. One is operated by well-paid professionals, the other, well, by me. Call me simple, but there seem to be some major differences.

      • @Caculon@lemmy.world · 1 point · 10 months ago

        I would have thought people would read “autopilot” and think “automatic”. At least that’s what I do. I guess “pilot” is closely associated with planes, but that certainly isn’t what I think of.

      • @CmdrShepard@lemmy.one · 0 points · 11 months ago

        This is a pretty absurd argument. You could apply this to literally any facet of driving.

        “I have to learn what each color of a traffic light means before driving?”

        “I have to learn what white and yellow paint means, and dashes versus solid lines? This is too confusing.”

        God help you when you get to 4-way stops and roundabouts.

        • Einar · 2 points · edited · 11 months ago

          Not absurd, but reality. We do that in driving school.

          I don’t know where you are from and which teaching laws apply, of course, but I definitely learned all those lessons you mentioned.

          • @CmdrShepard@lemmy.one · 0 points · 10 months ago

            That’s precisely my argument and why “learning my new car’s features is too confusing” is an absurd argument.

      • @anlumo · -2 points · 11 months ago

        Yeah, there are some major differences between the vehicles, but both disengage when anything out of the ordinary is going on. Maybe people base their understanding of autopilots on the movie “Airplane!”, where the inflatable autopilot doll gropes the stewardess afterwards.

          • @anlumo · 2 points · edited · 11 months ago

            True, good point. As far as I know, it does turn itself off if it detects something it can’t handle, though. The problem with cross traffic is that it obviously can’t detect it, otherwise turning itself off would already be a way of handling it.

            Proximity detection is far easier up in the air, especially if you’re not bound by the weird requirement to only use visible spectrum cameras.

            (To make things clear, I’m just defending the engineers there who had to work within these constraints. All of this is a pure management failure.)

          • Ocelot · 1 point · 11 months ago

            I’m sorry, what? If you set an airplane’s autopilot to maintain altitude and heading, it will 100% fly you into the side of a mountain if there’s one in front of you.

    • r00ty · 18 points · 11 months ago

      They’re not buying a plane, though. They’re buying a car with an autopilot that is labeled “full self driving”. That term does imply it will handle a complete route from A to B.

      People are wrongly buying into the marketing hype and that is causing crashes.

      I’m very concerned about some of the things I’ve seen regarding FSD on Teslas, such as sudden hard braking on highways, failing to avoid an accident (but it’s OK, it disengaged seconds before impact, so the human was in control), and of course the viral video of FSD trying to kill a cyclist.

      They should not be allowed to market the feature this way and I don’t think it should be openly available to normal users as it is now. It’s just too dangerous to put in the hands (or not) of normal drivers.

      • Ocelot · 3 points · edited · 11 months ago

        Autopilot has never been “Full Self Driving”. FSD is an additional $15,000 package on top of the car. Autopilot is the free system providing lane keeping with adaptive cruise, the same as “Pro Pilot Assist” or “Honda Sensing” or any of the other packages from other car companies. The only difference is that whenever someone gets in an accident using any of those technologies, we never get headlines about it.

      • @anlumo · -1 point · 11 months ago

        I’ve never sat in a Tesla, so I’m not really sure, but based on what I’ve read online, Autopilot and FSD are two different systems on Tesla cars that you can engage separately. There shouldn’t be any confusion about this.

        • Miqo · 4 points · 11 months ago

          I’ve never sat in a Tesla, so I’m not really sure

          There shouldn’t be any confusion about this.

          U wot m8?

        • r00ty · 1 point · 11 months ago

          Well, if it’s just the lane-assistance Autopilot that is causing this kind of crash, I’d agree it’s likely user error. The reason I say “if” is that I don’t trust journalists to know or report on the difference.

          I am still concerned the FSD beta is “out there” though. I do not trust normal users to understand what beta means, and of course no-one is going to read the agreement before clicking agree. They just want to see their car drive itself.

          • @anlumo · 2 points · 11 months ago

            If it were about the FSD implementation, things would be very different. I’m pretty sure that the FSD is designed to handle cross traffic, though.

            I do not trust normal users to understand what beta means

            Yeah, Google kinda destroyed that word in the public consciousness when they ran their search with the beta flag for more than a decade while growing into one of the biggest companies on Earth with it.

            When I first heard about it, I was very surprised that the US even allows vehicles with beta self-driving software on public roads. That’s like testing a new fire truck by randomly setting buildings in a city on fire and then trying to put them out with the truck.

          • Ocelot · 0 points · edited · 11 months ago

            Yeah, I don’t trust a machine that has been trained for millions of hours and simulated every possible traffic scenario tens of millions of times and has millisecond reaction time while seeing the world in a full 360 degrees. A system that never drives drunk, distracted or fatigued. You know who’s really good at driving though? Humans. Perfect track record, those humans.

    • @El_illuminacho@lemmy.world · 10 points · 11 months ago

      Why do you think companies need to put warnings like “Caution: contents are hot” on paper coffee cups? People are stupid.

      • @anlumo · -8 points · 11 months ago

        Those labels are there because people made a quick buck suing the companies when they messed up, not to protect stupid customers.

        If the courts applied a reasonable level of common sense, those labels wouldn’t exist.