New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times
Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

  • NeoNachtwaechter@lemmy.world · 1 year ago

    That’s not the main problem; it’s more of an excuse. The main problem is explained in the video right before that:

    Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

    Emergency vehicles just happen to be the most frequent kind of obstacle.

    The fallback to the camera is a bad excuse anyway, because the radar is needed first to detect obstacles at range: the camera will usually detect them later (i.e., at a closer distance) than the radar does.

    The even better solution (trigger warning: nerdy stuff incoming) is to mix the results of all sensor types at an early stage of the processing software. That’s what European carmakers have done right from the beginning, but Tesla is way behind in its engineering. Their sensors still work independently, each doing its own processing, so every shortcoming of one sensor creates a faulty detection result that has to be corrected later (read: seconds later, not milliseconds) by the other sensor types.
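    To make the early-fusion idea concrete, here is a minimal, purely illustrative Python sketch (the names `Measurement` and `early_fusion` are my own, not anyone's actual implementation): raw evidence from every sensor is merged into one per-cell occupancy score *before* any per-sensor object decision is made, so a weak return from one sensor can immediately reinforce a weak return from another.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    cell: int          # index of the road cell the return falls in
    confidence: float  # sensor's own confidence in this return

def early_fusion(radar: list[Measurement], camera: list[Measurement],
                 n_cells: int, threshold: float = 0.5) -> list[int]:
    """Return indices of cells considered occupied after fusing raw data."""
    score = [0.0] * n_cells
    # Fuse raw evidence from all sensors BEFORE any detection decision.
    for m in radar + camera:
        # Simple probabilistic evidence accumulation; real systems
        # would use Bayesian occupancy updates or a learned model.
        score[m.cell] = 1.0 - (1.0 - score[m.cell]) * (1.0 - m.confidence)
    return [i for i, s in enumerate(score) if s >= threshold]

# A weak radar return plus a weak camera return on the same cell
# together cross the threshold, though neither would alone.
radar_hits = [Measurement(cell=3, confidence=0.4)]
camera_hits = [Measurement(cell=3, confidence=0.4)]
print(early_fusion(radar_hits, camera_hits, n_cells=10))  # [3]
print(early_fusion(radar_hits, [], n_cells=10))           # []
```

    In a late-fusion design, each sensor would threshold its own 0.4-confidence return independently, reject it, and the obstacle would be missed by both pipelines; merging the raw evidence first is what catches it.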

    • Blaidd@lemmy.world · 1 year ago

      Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

      Teslas don’t use radar, just cameras. That’s why Teslas crash at way higher rates than real self-driving cars like Waymo.

    • r00ty@kbin.life · 1 year ago

      Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

      I feel like this is bad tech understanding in journalism (which is hardly new). There’s no reason radar couldn’t see stationary vehicles. In fact, very specifically, they’re NOT stationary relative to the radar transceiver, so radar would see them no problem.

      My actual suspicion here is that Tesla actively ignores stationary vehicles that are not in front of the car (it can tell they’re stationary by adding its own known speed to the relative speed). On normal streets this makes sense (or at least for those on the non-driver’s side). Do you pay attention to every car parked by the side of the road when driving? You’re maybe looking for signs of movement, or lights on, etc., but you’re not tracking them all, and neither will the autopilot.

      However, on a highway, more than one vehicle on the shoulder every now and then should make you wonder what else is ahead (and I’d argue even a single car on the shoulder is a risk worth watching). A long line of them should definitely make you slow down.
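      The stationary-object test described above (ego speed plus radar relative speed) is trivial arithmetic; a small illustrative sketch, with function names invented for the example:

```python
def absolute_speed(ego_speed_mps: float, relative_speed_mps: float) -> float:
    # Radar measures the target's speed relative to the moving car.
    # A parked car "closing in" at -25 m/s while we drive at 25 m/s
    # has absolute speed 0: it is stationary in the world frame.
    return ego_speed_mps + relative_speed_mps

def is_stationary(ego_speed_mps: float, relative_speed_mps: float,
                  tol_mps: float = 0.5) -> bool:
    # Tolerance absorbs sensor noise in both speed measurements.
    return abs(absolute_speed(ego_speed_mps, relative_speed_mps)) < tol_mps

print(is_stationary(25.0, -25.0))  # True: parked car on the shoulder
print(is_stationary(25.0, -5.0))   # False: slower traffic at 20 m/s
```

      The point is that the radar has everything it needs to classify a parked car as stationary; what the software then *does* with that classification (track it or discard it) is a design choice, not a sensor limitation.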

      I think human drivers do this, and an autopilot should consider what kind of road it is on and whether it should treat those scenarios differently.

      I also have another suspicion, but it’s just a thought. If this Tesla was really using radar as well as cameras, haze or not, it should have seen that stationary vehicle further ahead than it did.

      Since newer Teslas don’t have radar, and coming from a software development background, I can see a logical (in corporate-thinking terms) reason to remove the radar code: they will not want to maintain it if they have no plans to return to radar. Think of it like this: after a few versions of augmenting the camera detection logic, it is unlikely to still work with the existing radar logic. Do they spend the time making the two work together for the older vehicles, or only allow camera-based AI on newer software versions? I suspect the latter would be the business decision.
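      The business decision being speculated about here amounts to a capability gate in the shared software. A hypothetical sketch (entirely my own illustration, not Tesla's code): once the software stops supporting the radar path, a car that physically has a radar simply stops using it.

```python
def build_sensor_pipeline(has_radar_hardware: bool,
                          software_supports_radar: bool) -> list[str]:
    # Shared codebase across hardware generations: a sensor is only
    # used if the car has it AND the current software still carries
    # the code path for it.
    sensors = ["camera"]
    if has_radar_hardware and software_supports_radar:
        sensors.append("radar")
    return sensors

# Older car with radar fitted, running software that dropped radar support:
print(build_sensor_pipeline(has_radar_hardware=True,
                            software_supports_radar=False))  # ['camera']
```

      Under this reading, the physical radar in older cars becomes dead hardware the moment the maintained software branch no longer includes the radar fusion code.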