• magnetosphere
    165 points • 12 days ago (edited)

    I’ve often wondered why the FTC allows it to be marketed as “Full Self-Driving”. That’s blatant false advertising.

    • @reddig33@lemmy.world
      78 points • 12 days ago

      As is “autopilot”. There’s no automatic pilot. You’re still expected to keep your hands on the wheel and your eyes on the road.

      • @halcyoncmdr@lemmy.world
        23 points • 12 days ago

        I am so sick and tired of this belief, because it’s clear people have no idea what autopilot on a plane actually does. They seem to assume it flies the plane while the pilot does nothing. Autopilot alone does not fly the damned plane by itself.

        “Autopilot” in a plane keeps the wings level at a set heading, altitude, and speed. It’s essentially cruise control with lane-centering, plus altitude hold, which has no equivalent on a road.

        There are more advanced systems on the market, installable on smaller planes and in use on larger jets, that can do things like auto takeoff, auto land, following waypoints, etc. without pilot input, but basic plain old autopilot doesn’t do any of that.

        That expanded capability is similar to how things like “Enhanced Autopilot” on a Tesla can do extra things like change lanes, follow highway exits on a navigated route, etc. Or how “Full Self-Driving” is supposed to follow road signs and lights, etc. but those are additional functions, not part of “Autopilot” and differentiated with their own name.

        Autopilot, either on a plane or a Tesla, alone doesn’t do any of that extra shit. It is a very basic system.

        The average person misunderstanding what a word means doesn’t make it an incorrect name or description.

        • @machinin@lemmy.world
          32 points • 12 days ago (edited)

          I say let Tesla market it as Autopilot if they pass similar regulatory safety frameworks as aviation autopilot functions.

        • Captain Aggravated
          24 points • 12 days ago

          Flight instructor here.

          I’ve seen autopilot systems that have basically every level of complexity you can imagine. A lot of Cessna 172s were equipped with a single axis autopilot that can only control the ailerons and can only maintain wings level. Others have control of the elevators and can do things like altitude hold, or ascend/descend at a given rate. More modern ones have control of all three axes and integration with the attitude instruments, and can do things like climb to an altitude and level off, turn to a heading and stop, or even something like fly a holding pattern over a fix. They still often don’t have any control over the power plant, and small aircraft typically cannot land themselves, but there are autopilots installed in piston singles that can fly an approach to minimums.

          And that’s what’s available on piston singles; airline pilots seldom fly the aircraft by hand anymore.

        • @reddig33@lemmy.world
          15 points • 12 days ago

          “But one reason that pilots will opt to turn the system on much sooner after taking off is if it’s stormy out or there is bad weather. During storms and heavy fog, pilots will often turn autopilot on as soon as possible.

          This is because the autopilot system can take over much of the flying while allowing the pilot to concentrate on other things, such as avoiding the storms as much as possible. Autopilot can also be extremely helpful when there is heavy fog and it’s difficult to see, since the system does not require eyesight like humans do.”

          Does that sound like something Tesla’s autopilot can do?

          https://www.skytough.com/post/when-do-pilots-turn-on-autopilot

          • Captain Aggravated
            7 points • 12 days ago

            Flight instructor here. The flying and driving environments are quite different, and what you need an “autodriver” to do is a bit different from an “autopilot.”

            In a plane, you have to worry a lot more about your attitude, aka which way is up. This is the first thing we practice in flight school with 0-hour students: just flying straight ahead and keeping the airplane upright. This can be a challenge in low-visibility environments such as fog or clouds, or even at night in some circumstances, and your inner ears are compulsive liars the second you leave the ground, so you rely on your instruments when you can’t see, especially gyroscopic instruments such as the attitude indicator. This is largely what an autopilot takes over from the human pilot, relieving them of that constant low-level task so they can concentrate on other things.

            Cars don’t have to worry about this so much; for normal highway driving any situation other than “all four wheels in contact with the road” is likely an unrecoverable emergency.

            Navigation in a plane means keeping track of your position in 3D space relative to features on the Earth’s surface. What airspace are you in, what features on the ground are you flying over, where is the airport, where’s that really tall TV tower that’s around here? Important for finding your way back to the airport, preventing flight into terrain or obstacles, and keeping out of legal trouble. This can be accomplished in a variety of ways, many of which can integrate with an autopilot. Modern glass cockpit systems with fully integrated avionics can automate the navigation process as well: you can program in a course and, if appropriately equipped, the airplane can fly that course by itself.

            Navigation for cars is two separate problems; there’s the big picture question of “which road am I on? Do I take the next right? Where’s my exit?” which is a task that requires varying levels of precision from “you’re within this two mile stretch of road” to “you’re ten feet from the intersection.” And there’s the small picture question of “are we centered in the traffic lane?” which can have a required precision of inches. These are two different processes.

            Anticollision, aka not crashing into other planes, is largely a procedural thing. We have certain best practices such as “eastbound traffic under IFR rules flies at odd thousands, westbound traffic flies at even thousands” so that oncoming traffic should be a thousand feet above or below you, that sort of thing, plus established traffic patterns and other standard or published routes of flight for high-traffic areas. Under VFR conditions, pilots are expected to see and avoid each other. Under IFR conditions, that’s what air traffic control is for; controllers use a variety of techniques to sequence traffic and make sure no one is in the same place at the same altitude at the same time, anything from carefully keeping track of who is where to using radar systems, and increasingly a thing called ADS-B. There are also systems such as TCAS, aircraft-carried traffic-detection electronics. Airplanes are kept fairly far apart via careful sequencing. There’s also not all that much else up there: not many pedestrians or cyclists thousands of feet in the air, and while wildlife can be a hazard, it’s mostly during the departure and arrival phases of flight while relatively low. This is largely a human task; autopilots don’t respond to air traffic control and many don’t integrate with TCAS or ADS-B. This is the pilot’s job.
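            The eastbound-odd / westbound-even convention mentioned above (the FAA's hemispheric rule for IFR cruise below 18,000 ft) can be sketched roughly as follows; the starting altitudes are illustrative only, since the real minimum IFR altitude depends on terrain and route:

```python
def ifr_cruise_altitudes(magnetic_course_deg, ceiling_ft=17000):
    """Hemispheric-rule sketch: magnetic course 0-179 -> odd thousands,
    180-359 -> even thousands (IFR, below 18,000 ft MSL)."""
    eastbound = magnetic_course_deg % 360 < 180
    start = 5000 if eastbound else 4000  # first listed altitude; illustrative
    return list(range(start, ceiling_ft + 1, 2000))

print(ifr_cruise_altitudes(90)[:3])   # eastbound: [5000, 7000, 9000]
print(ifr_cruise_altitudes(270)[:3])  # westbound: [4000, 6000, 8000]
```

So two aircraft flying head-on under IFR should, by procedure alone, be at least 1,000 ft apart vertically.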

            Cars are expected to whiz along mere inches apart via see-and-avoid. There is no equivalent to ATC on the roads; cars aren’t generally equipped with communication equipment beyond a couple of blinking lights, and any kind of automated beacon for electronic detection absolutely is not the standard. Where roads cross at the same level, some traffic control method such as traffic lights is used for some semblance of sequencing, but in all conditions it requires visual see-and-avoid. Pedestrians, cyclists, wildlife, and debris are constant collision threats during all phases of driving; deer bound across interstates all the time. This is very much a visual job; hell, I’m not sure it could be done entirely with radar, it likely requires optical sensors/cameras. It’s also a lot more of the second-to-second workload of the driver. I honestly don’t see this task being fully automated with roads the way they are.

          • @FiskFisk33@startrek.website
            4 points • 12 days ago (edited)

            At SkyTough, we pride ourselves on ensuring our readers get the best, most helpful content that they’ll find anywhere on the web. To make sure we do this, our own experience and expertise is combined with the input from others in the industry. This way, we can provide as accurate of information as possible. With input from experts and pilots from all over, you’ll get the complete picture on when pilots turn autopilot on while flying!

            This is GPT.

            After that intro I don’t trust a single word of what that site has to say.

            If the writer didn’t bother to write the text, I hope they don’t expect me to bother to read it.

            • @tyler@programming.dev
              1 point • 11 days ago

              Why in the world would you think that’s GPT? That’s not GPT’s usual style, and it is definitely the style of a normal corporate site.

        • Turun
          14 points • 12 days ago

          I’d wager most people, when talking about a plane’s autopilot, mean the follow-waypoints or autoland capability.

          Also, it’s hard to argue “full self driving” means anything but the car is able to drive fully autonomously. If they were to market it as “advanced driver assist” I’d have no issue with it.

          • @halcyoncmdr@lemmy.world
            2 points • 12 days ago

            I’d wager most people, when talking about a plane’s autopilot, mean the follow-waypoints or autoland capability.

            Many people are also pretty stupid when it comes to any sort of technology more complicated than a calculator. That doesn’t mean the world revolves around a complete lack of knowledge.

            My issue is just with people expecting basic Autopilot to do more than it’s designed or intended to do, and refusing to acknowledge their expectation might actually be wrong.

            Also, it’s hard to argue “full self driving” means anything but the car is able to drive fully autonomously. If they were to market it as “advanced driver assist” I’d have no issue with it.

            Definitely won’t get an argument from me there. FSD certainly isn’t in a state to really be called that yet. Although, to be fair, when signing up for it, and when activating it there are a lot of notices that it is in testing and will not operate as expected.

            At what point do we start actually expecting and enforcing that people be responsible with potentially dangerous things in daily life, instead of just blaming a company for not putting enough warnings or barriers to entry?

            • @machinin@lemmy.world
              9 points • 12 days ago

              At what point do we start actually expecting and enforcing that people be responsible with potentially dangerous things in daily life, instead of just blaming a company for not putting enough warnings or barriers to entry?

              Volvo seeks to have zero human deaths in their cars. Some places seek zero fatality driving environments. These are cultures where safety is front and center. Most FSD enthusiasts (see comments in the other threads below) cite safety as the main impetus for these systems. Hopefully we would see similar cultural values in Tesla.

              Unfortunately, Musk tweets out jokes when responding to a video of people having sex on autopilot. That is Tesla culture. Musk is responsible for putting these dangerous things in consumers hands and has created a culture where irresponsible and possibly fatal abuse of those things is something funny for everyone to laugh at. Of course, punish the individual users who go against the rules and abuse the systems. You also have to punish the company, and the idiot at the top, who holds those same rules in contempt.

            • Turun
              8 points • 12 days ago

              Also, it’s hard to argue “full self driving” means anything but the car is able to drive fully autonomously. If they were to market it as “advanced driver assist” I’d have no issue with it.

              Definitely won’t get an argument from me there. FSD certainly isn’t in a state to really be called that yet. Although, to be fair, when signing up for it, and when activating it there are a lot of notices that it is in testing and will not operate as expected.

              At what point do we start actually expecting and enforcing that people be responsible with potentially dangerous things in daily life, instead of just blaming a company for not putting enough warnings or barriers to entry?

              Then the issue is simply what we perceive as the predominant marketing message. I know that in all legally binding material Tesla states exactly what the system is capable of and how alert the driver needs to be. But in my opinion that is vastly overshadowed by the advertising Tesla runs for their FSD capability. They show a 5-second message about how they are required by law to warn you to stay alert at all times, before showing the car driving itself for 3 minutes with the demo driver’s hands completely off the wheel.

        • Saik0
          5 points • 11 days ago

          “Autopilot” in a plane keeps the wings level at a set heading, altitude, and speed. It’s essentially cruise control with lane-centering, plus altitude hold, which has no equivalent on a road.

          Factually incorrect. There are autopilot systems on planes now that can take off, fly, and land on their own. So yes, “autopilot” is EXACTLY what people are assuming it to mean in many cases, especially on the planes they would typically be accustomed to, which is the big airliners.

          Now where you’re missing the point… There are varying degrees of autopilot. And that would be fine and dandy for Tesla’s case if you wish to invoke it. But considering the company has touted it to be the “most advanced” and “Full self driving” and “will be able to drive you from california to new york on it’s own”, they’ve set the expectation that it is the most advanced autopilot, akin to the plane that doesn’t actually need a pilot (although one is always present) for all three major parts of the flight. No Tesla product comes even close to that claim, and I’m willing to bet none ever will.

          • @halcyoncmdr@lemmy.world
            -1 point • 11 days ago (edited)

            Now where you’re missing the point… There are varying degrees of autopilot. And that would be fine and dandy for Tesla’s case if you wish to invoke it. But considering the company has touted it to be the “most advanced” and “Full self driving” and “will be able to drive you from california to new york on it’s own”.

            I have said from the beginning that there are varying levels of autopilot on planes and that needs to be taken into account when talking about the name and capabilities… that’s my entire argument, you illiterate fool.

            You are, at best, failing to acknowledge, or more likely, willfully ignoring the fact that Tesla does differentiate these capabilities with differently named products. All while claiming that a plane Autopilot must inherently be the most advanced version on the market to be compared to Tesla’s most basic offering.

            You are adding in capabilities from the more advanced offerings that Tesla has, like Enhanced Autopilot, and Full Self Driving and saying those are part of “Autopilot”. If you want to compare basic Tesla Autopilot, then compare it to a basic plane Autopilot. Tesla doesn’t claim that basic “Autopilot” can do all the extra stuff, that’s why they have the other options.

            That’s the issue I have with these conversations, people are always comparing apples and oranges, and trying to claim that they’re not to try and justify their position.

            Tesla’s website does indicate these differences between the versions, and has as each added capability was added to the overall offerings.

            • Saik0
              1 point • 11 days ago

              You are, at best, failing to acknowledge

              No. That whole statement INCLUDING what you quoted was me allowing you to invoke it.

              Literally : “And that would be fine and dandy for Tesla’s case if you wish to invoke it.” Then I stated why that’s bad to invoke.

              You can claim I’m willfully ignorant. But you’re just a moron Elon shill.

              Tesla doesn’t claim that basic “Autopilot” can do all the extra stuff, that’s why they have the other options.

              And there’s why I’m just going to call you a moron Elon shill and move on. You’re full of shit. All they do is claim that it’s amazing/perfect. Then you buy the car and you expect the function and it doesn’t do it, not even close.

              • @halcyoncmdr@lemmy.world
                -1 point • 11 days ago (edited)

                But you’re just a moron Elon shill.

                Ah yes, the classic internet response of calling anyone you disagree with a shill. Because clearly someone disagreeing with you and pointing out issues with claims means they must inherently be defending a company without any valid claims. Easy to ignore when you don’t consider them a real person having a discussion.

                No point in arguing with someone unwilling to have an actual discussion and just resorting to calling someone a shill because they refuse to accept a different point of view can even exist.

                “You’re a shill, so nothing you say matters”.

                • Saik0
                  2 points • 11 days ago

                  When you outright lie about the facts, it’s hard to have any other opinion about you. So yes, you’re a shill.

        • @machinin@lemmy.world
          30 points • 12 days ago

          But it works and it’s hands off. Tesla can’t even legally do that under any condition.

          And fuck you if you ask Tesla to pay for any mistakes their software might make. It is ALWAYS your fault.

          • @Thorny_Insight@lemm.ee
            -25 points • 12 days ago (edited)

            Might want to check your facts there. FSD works anywhere in the US, both cities and highways. Even on unmapped roads and parking lots.

            “Fuck this guy for bringing facts into our circlejerk” - The downvoters, probably

            • @machinin@lemmy.world
              23 points • 12 days ago

              Oops, you fell for the Tesla marketing BS. FSD isn’t actually full self driving like the Mercedes system. With Tesla, you have to keep your hands on the wheel at all times and pay close attention to the road. You are completely responsible for anything that happens. Mercedes takes responsibility for any accidents their software causes.

            • Turun
              15 points • 12 days ago

              What Tesla is (falsely IMO) advertising as “full self driving” is available in all new Mercedes vehicles as well and works anywhere in the US.

              Mercedes is in the news for expanding that functionality to a level where they are willing to take liability if the vehicle causes a crash during this new mode. Tesla does not do that.

              • @Thorny_Insight@lemm.ee
                -6 points • 12 days ago (edited)

                works anywhere in the US

                The system Mercedes is using is extremely limited and hardly comparable to FSD in any way.

                Drivers can activate Mercedes’s technology, called Drive Pilot, when certain conditions are met, including in heavy traffic jams, during the daytime, on specific California and Nevada freeways, and when the car is traveling less than 40 mph. Drivers can focus on other activities until the vehicle alerts them to resume control. The technology does not work on roads that haven’t been pre-approved by Mercedes, including on freeways in other states.

                Source

                • @machinin@lemmy.world
                  11 points • 12 days ago

                  If I understand that person correctly, you are confusing the two systems.

                  Mercedes has two systems. One is a driver assist system that does everything the current version of FSD can do. It is unlimited in the same way that Tesla’s FSD is unlimited.

                  They have an additional system, that you cite, that is Level 3, a true hands-off self-driving system. It is geographically limited.

                  So, the question is, does Tesla have any areas where you can legally drive hands free using their software?

                • Turun
                  8 points • 12 days ago

                  That is the new system. Tesla has no equivalent to it. Or to phrase it differently:

                  Drivers can not activate Tesla’s equivalent technology, no matter what conditions are met, including not in heavy traffic jams, not during the daytime, not on specific California and Nevada freeways, and not when the car is traveling less than 40 mph. Drivers can never focus on other activities. The technology does not exist in Tesla vehicles.

                  If you are talking about automatic lane change, auto park, etc. (what Tesla calls Autopilot or Full Self-Driving), these are all features you can find in most if not all high-end cars nowadays.

                  The new system gets press coverage, because as I understand it, if there is an accident while the system is engaged Mercedes will assume financial and legal responsibility and e.g. cover all expenses that result from said accident. Tesla doesn’t do that.

                • @BeigeAgenda@lemmy.ca
                  4 points • 12 days ago

                  I would much rather use FSD that is limited to routes and conditions where the developers and testers agree that it’s safe.

                  Compared to a company that says “everything works”, and “those drivers that got killed must have been doing something wrong”.

                • just another dev
                  -6 points • 12 days ago

                  Good luck going against the circlejerk. People hate anything touched by He-who-should-not-be-named.

            • @machinin@lemmy.world
              6 points • 12 days ago

              “Fuck this guy for bringing facts into our circlejerk” - The downvoters, probably

              Ha! Just saw this. Did someone get their facts confused?

            • @suction@lemmy.world
              2 points • 12 days ago

              When you stop using the Tesla kool-aid marketing terms and start to understand the actual state of the technology and more importantly legislation, we might start to listen to what you are trying to say. Hint: using the term “FSD” or “Autopilot” is an immediate disqualifier

        • @conciselyverbose@sh.itjust.works
          13 points • 12 days ago

          Because they’re doing shit responsibly.

          For the target audience they chose that thing is a fucking bargain. Do you know how many people making damn good money sit in hours of 4 lane bumper to bumper traffic every day? “You don’t have to drive and we assume liability if our system fucks up” is a massive value add.

          (Not enough that I’d ever consider dealing with that kind of commute no matter what you paid me. But still.)

        • @spamspeicher@feddit.de
          10 points • 12 days ago

          Level 3 in the S-Class and EQS has been available since May 2022. And the speed limit is there because it is part of the UN regulation Mercedes is certified under. The regulation has been updated since the release of Mercedes Drive Pilot to allow speeds up to 140 km/h, but Mercedes needs to recertify for that.

        • @suction@lemmy.world
          7 points • 12 days ago

          Still the most advanced system that is legal to use on public roads, worldwide. Tesla’s most advanced system is many leagues below that, so not sure why it’s so hard to believe for some people that Tesla is nothing but an also-ran.

    • @Thorny_Insight@lemm.ee
      -22 points • 12 days ago (edited)

      You can literally type in an address and the car will take you there with zero input on the driver’s part. If that’s not full self-driving then I don’t know what is. What FSD was capable of a year ago and how it performs today is completely different.

      Not only do these statistics include the way less capable older versions of it, they also include accidents caused by Autopilot, which is a different system than FSD. It also fails to mention how the accident rate compares to human drivers.

      If we replaced every single car in the US with a self-driving one that’s a 10x safer driver than your average human, you’d still get over 3000 deaths a year from traffic accidents. That’s 10 people a day. If one wants to ban these systems because they’re not perfect, that means they’d rather have 100 people die every day instead of 10.
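      The arithmetic behind that claim is easy to check. The annual-deaths figure below is an assumption (a ballpark of recent US NHTSA totals), not a number from the thread:

```python
# Rough sanity check of the "100 a day vs 10 a day" claim.
US_ANNUAL_TRAFFIC_DEATHS = 40_000   # assumed ballpark US figure
SAFETY_FACTOR = 10                  # hypothetical "10x safer than a human"

deaths_per_day_now = US_ANNUAL_TRAFFIC_DEATHS / 365
deaths_per_day_automated = US_ANNUAL_TRAFFIC_DEATHS / SAFETY_FACTOR / 365

print(round(deaths_per_day_now))        # 110 per day today
print(round(deaths_per_day_automated))  # 11 per day at 10x safer
```

So the rounded "100 vs 10" framing holds up, given the assumed baseline.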

      • Turun
        25 points • 12 days ago

        It also fails to mention how the accident rate compares to human drivers.

        That may be because Tesla refuses to publish proper data on this, lol.

        Yeah, they claim it’s ten times better than a human driver, but none of their analysis methods or data points are available to independent researchers. It’s just marketing.

        • @dgmib@lemmy.world
          5 points • 12 days ago

          This is the part that bothers me.

          I’d defend Tesla when FSD gets into accidents, even fatal ones, IF they showed that FSD caused fewer accidents than the average human driver.

          They claim that’s true, but if it is why not release data that proves it?

          • @machinin@lemmy.world
            2 points • 12 days ago

            It isn’t the average driver that’s the right comparison. Most cars are equipped with driver assist features, so it should be better than people using current driver assist features from other companies. If Tesla is behind everyone else but better than a 20-year-old car, it’s still problematic.

        • @machinin@lemmy.world
          4 points • 12 days ago

          I have a feeling that user blocks people that are critical of Tesla. They are probably oblivious to several comments in this thread. It’s really no wonder why they have no clue about how bad Tesla really is.

        • @Thorny_Insight@lemm.ee
          -4 points • 12 days ago

          I’m not claiming it is 10x safer than a human - I’m saying that even if it was there would still be daily deaths despite that.

          Tesla has published the data; people just refuse to believe it because it doesn’t show what they think it should. There’s nothing more Tesla can do about it at this point. It’s up to independent researchers now.

          • Turun
            7 points • 12 days ago (edited)

            I would love to see this data, can you link it? Either a paper by unaffiliated researchers or the raw data is fine.
            I am aware their marketing pushes the “10x better” number. But I have yet to see the actual data to back this claim.

            • @Thorny_Insight@lemm.ee
              -5 points • 12 days ago

              Either a paper by unaffiliated researchers or the raw data is fine.

              Like I said: the only data available is from Tesla itself, which any reasonable person should take with a grain of salt. If you want to see it you can just google it. There’s plenty of YouTubers independently testing it as well, but those are all obviously biased fanboys that can’t be trusted either.

              • @ForgotAboutDre@lemmy.world
                6 points • 12 days ago

                Tesla sues people who criticize them in the media, so you really can’t trust most reviews. The reviewers are also looking for money from companies like Tesla, so they’re not impartial.

          • @machinin@lemmy.world
            6 points • 12 days ago

            Comment:

            none of their analysis methods or data points are available to independent researchers.

            Your response:

            It’s up to independent researchers now.

            I think you missed an important point there. Can you show the detailed methods and data points that Tesla used for their marketing materials?

      • @machinin@lemmy.world
        11 points • 12 days ago (edited)

        You can literally type in an address and the car will take you there with zero input on the driver’s part. If that’s not full self-driving then I don’t know what is.

        Who is responsible if there is an accident, you or Tesla? That is the difference between true FSD and regular driver assistance features.

        Regarding driving regulations -

        If we had better raw data, I’m sure we could come up with better conclusions. Knowing the absolutely tremendous amount of BS that Musk spews, we can’t trust anything Tesla reports. We’re left to speculate.

        At this point, it is probably best to compare statistics for other cars with similar technologies. For example, Volvo reported that they went 16 years without a fatal accident in their XC90 model in the UK (I don’t know about other places). That was a couple of years ago; I don’t know if they have been able to keep that record up. With a record that has lasted that long, I think we have to ask why Tesla is so bad.

  • @tearsintherain@leminal.space
    link
    fedilink
    English
    7612 days ago

    Move fast, break shit. Fake it till you sell it, then move the goal posts down. Shift human casualties onto individual responsibility, a core libertarian theme. Profit off the lies because it’s too late, money already in the bank.

  • @over_clox@lemmy.world
    link
    fedilink
    English
    5112 days ago

    They just recalled all the Cybertrucks, because their ‘smort’ technology is too stupid to realize when an accelerator sensor is stuck…

    • Jesus
      link
      fedilink
      English
      24
      edit-2
      11 days ago

      The accelerator sensor doesn’t get stuck; the pedal does. The face of the accelerator pedal falls off and wedges the pedal in the down position.

      • Granite
        link
        fedilink
        2412 days ago

        Pedal, not petal.

        Not trying to be an asshole, just a nudge to avoid misunderstandings (although the context is clear in this case)

      • @over_clox@lemmy.world
        link
        fedilink
        English
        412 days ago

        I realize it’s the pedal that gets stuck, but the computer registers the state of the pedal via a sensor.

        The computer should be smart enough to realize something ain’t right when it registers that both the accelerator and brake pedals are being pressed at the same time. And in that case, the brake should always take priority.

        • Jesus
          link
          fedilink
          English
          211 days ago

          The stories I’ve heard around the recall have been saying that the brakes override the accelerator in the cyber truck.

  • @axo@feddit.de
    link
    fedilink
    English
    3911 days ago

    According to the math in this video:

    • 150,000,000 miles have been driven with Tesla’s “FSD”, which works out to
    • 375 miles per Tesla purchased with FSD capability
    • 736 known FSD crashes with 17 fatalities, which
    • equals 11.3 deaths per 100M miles of Tesla’s FSD

    Doesn’t sound too bad, until you hear that human drivers produce 1.35 deaths per 100M miles driven…

    It’s rough math, but holy moly, that is already a completely different class of deadly than a non-FSD car
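    The rough math above can be sanity-checked in a few lines of Python (a sketch; the 150M-mile and 17-fatality figures are the video’s claims, not independently verified data):

    ```python
    # Sanity check of the rough fatality-rate math (figures are the video's claims).
    fsd_miles = 150_000_000       # claimed miles driven on Tesla "FSD"
    fsd_fatalities = 17           # fatalities among the 736 known FSD crashes
    human_rate = 1.35             # human-driver deaths per 100M miles

    fsd_rate = fsd_fatalities / fsd_miles * 100_000_000
    print(round(fsd_rate, 1))               # 11.3 deaths per 100M miles
    print(round(fsd_rate / human_rate, 1))  # roughly 8.4x the human rate
    ```

    That ratio (~8.4x) is the “other class of deadly” the comment refers to, though both numerator and denominator here are disputed.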

    • @dufkm@lemmy.world
      link
      fedilink
      English
      810 days ago

      a human produces 1.35 deaths per 100M miles driven

      My car has been driven around 100k miles by a human, i.e. it has produced 0.00135 deaths. Is that like a third of a pinky toe?

    • @NotMyOldRedditName@lemmy.world
      link
      fedilink
      English
      -1
      edit-2
      10 days ago

      That number is like 1.5 billion now and rising exponentially fast.

      Also, those deaths weren’t all FSD; they were AP.

      The report says 1 FSD related (not caused by but related) death. For whatever reason the full details on that one weren’t released.

      Edit: There are billions of miles on AP. In 2020 it was 3 billion

      Edit: Got home and tried finding AP numbers through 2024, but I haven’t seen anything recent. Given 3 billion miles in 2020, 2 billion in 2019, and an accelerating rate of usage with increased car sales, 2023 is probably closer to 8 billion miles. I imagine we’d hear when they reach 10 billion.

      So 8 billion miles and 16 AP fatalities (because that 1 FSD one isn’t the same) is 1 fatality per 500,000,000 miles, or, put into the terms above, 0.2 fatalities per 100 million miles - 6.75 times fewer than human drivers produce. And nearly all of these fatal accidents involved blatant misuse of the system, like driving drunk (at least a few) or using a phone and playing games.
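      This counter-estimate can be checked the same way (a sketch; the 8-billion-mile figure is the commenter’s own extrapolation, not a published number):

      ```python
      # Sanity check of the Autopilot counter-estimate (mileage is extrapolated).
      ap_miles = 8_000_000_000   # commenter's extrapolated AP miles through 2023
      ap_fatalities = 16         # AP fatalities, excluding the 1 FSD-related one
      human_rate = 1.35          # human-driver deaths per 100M miles

      print(ap_miles // ap_fatalities)        # 500000000 miles per fatality
      ap_rate = ap_fatalities / ap_miles * 100_000_000
      print(round(ap_rate, 3))                # 0.2 deaths per 100M miles
      print(round(human_rate / ap_rate, 2))   # 6.75x fewer than human drivers
      ```

      Note that both this figure and the 11.3-per-100M figure upthread divide small fatality counts by contested mileage totals, so neither rate is robust.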

  • @curiousPJ@lemmy.world
    link
    fedilink
    English
    3812 days ago

    If Red Bull can be successfully sued for false advertising over their slogan “It gives you wings”, I think it stands to reason that Tesla should be too.

  • @froh42@lemmy.world
    link
    fedilink
    English
    29
    edit-2
    12 days ago

    “If you’ve got, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, let’s say, half the accident rate of a human-driven car, I think that’s difficult to ignore,” Musk said.

    That’s a very problematic claim - and it might only be true if you compare completely unassisted vehicles to L2 Teslas.

    Other brands also have a plethora of L2 features, but they are marketed and designed in a different way. The L2 features are active, but designed in a way that keeps the driver engaged in driving.

    So L2 features are for better safety, not for a “wow we live in the future” show effect.

    For example, lane keeping in my car - you don’t notice it when driving, it stays just below your level of attention. But when my concentration lapses for a moment, the car just stays in the lane, even on curving roads. It’s just designed to steer a bit later than I would. (Also, even before it intervenes, the wheel turns slightly more easily toward the center of the lane than away from it - it’s just below what you notice, unless you concentrate on the effect.)

    Adaptive cruise control is just sold as adaptive cruise control - I did notice once that it uses radar AND the cameras, as it considers my lane free as soon as the car in front of me clears the lane markings with its wheels (when changing lanes).

    It feels like the software in my car could do a lot more, but its features are undersold.

    The combination of a human driver and the driver assist systems in combination makes driving a lot safer than relying on the human or the machine alone.

    In fact the braking assistant has once stopped my car in tight traffic before I could even react, as the guy in front of me suddenly slammed their brakes. If the system had failed and not detected the situation then it would have been my job to react in time. (I did react, but can’t say if I might have been fast enough with reaction times)

    What Tesla does with technology is impressive, but I feel the system could be so much better if they didn’t compromise safety in the name of marketing and hyperbole.

    If Tesla’s Autopilot were designed from the ground up to keep the driver engaged, I believe it would really be the safest car on the road.

    I feel they are rather designed to be able to show off “cool stuff”.

    • @ForgotAboutDre@lemmy.world
      link
      fedilink
      English
      2112 days ago

      Tesla’s autopilot isn’t the best around. It’s just the most widely deployed and advertised. People developing autopilot systems responsibly don’t beta test them with the kind of idiots who think Tesla’s autopilot is the best approach.

      • @Thorny_Insight@lemm.ee
        link
        fedilink
        English
        -12
        edit-2
        12 days ago

        If Tesla’s self-driving isn’t the best one around, then which one is? I’m not aware of any other system capable of doing what FSD does. Manufacturers like Mercedes may have more trust in their system because it only works on a limited number of hand-picked roads and under ideal conditions. I still wouldn’t say that what is essentially a train is a better system for getting around than a car with full freedom to take you anywhere.

        • @machinin@lemmy.world
          link
          fedilink
          English
          19
          edit-2
          12 days ago

          All throughout these comments, you seem deeply, deeply confused. Let’s go over this sloooowly.

          Mercedes has two autonomous systems. Let’s call them MB FSD and MB Autodrive.

          MB FSD has similar features to Tesla’s. It isn’t geo-restricted. You have to pay attention, just like Tesla. It isn’t true autonomous driving, just like Tesla. If you have an accident, you are responsible, just like Tesla.

          MB Autodrive is another feature set. It is L3 autonomy, which means it is limited geographically and the driver should be available to take over when prompted. It also means that the driving is completely autonomous. The driver can be reading, playing on their phone, or simply laying there with their eyes closed. Mercedes will even take legal and financial responsibility for any accidents that happen on their system.

          So, to summarize:

          FSD -type systems: Mercedes and Tesla (and many other car makers)

          Level 3: Mercedes, but not Tesla

          True autonomous driving is when the manufacturer takes responsibility for the car’s actions. Anything else is assisted driving. Until Tesla takes responsibility for accidents, you can’t consider them to have certified autonomous driving.

          Is that any clearer to you? After seeing some of your other shilling for Tesla in other posts, maybe there is a reason you don’t want to recognize the advantages of other systems?

          • @suction@lemmy.world
            link
            fedilink
            English
            412 days ago

            Absolutely correct. It’s so disheartening how many guys like him out there are hurting us all with their admiration for con-men like Trump and Musk and absolute inability to fact check

        • @suction@lemmy.world
          link
          fedilink
          English
          412 days ago

          It’s Level 2 automation; a lot of other makers have that. You need to look past the juicy marketing language; there are standards and norms which Tesla cannot go beyond, because then it’d be illegal to drive the cars on public roads.

  • @set_secret@lemmy.world
    link
    fedilink
    English
    2711 days ago

    Verge articles seem to be getting worse over the years; they’ve almost reached Forbes level. Yes, this does raise some valid safety concerns. No, Tesla isn’t bad just because it’s Tesla.

    It doesn’t really give us the full picture. For starters, there’s no comparison with Level 2 systems from other car makers, which also require driver engagement and have their own methods to ensure attention. This would help us understand how Tesla’s tech actually measures up.

    Plus, the piece skips over extremely important stats that would give us a clearer idea of how safe (or not) Tesla’s systems are compared to good old human driving.

    We’re left in the dark about how Tesla compares in scenarios like drunk, distracted, or tired driving—common issues that automation aims to mitigate. (probably on purpose).

    It feels like the article is more about stirring up feelings against Tesla rather than diving deep into the data. A more genuine take would have included these comparisons and variables, giving us a broader view of what these technologies mean for road safety.

    I feel like any opportunity to jump on the Elon hate wagon is getting tiresome. (And yes, I hate Elon too.)

    • @PersnickityPenguin@lemm.ee
      link
      fedilink
      English
      2
      edit-2
      10 days ago

      A couple of my criticisms with the article, which is about “Autopilot” and not FSD:

      -Conflating Autopilot and FSD numbers; they are not interchangeable systems. They are separate code bases with different functionality.

      -The definition of “autopilot” seems to have been lifted from the aviation industry. The term describes a system that controls the vector of a vehicle, i.e. its speed and direction. That’s all. That does seem like a correct description of what the Autopilot system does. “FSD”, on the other hand, does not live up to expectations, since it is not a true Level 5 driving system.

      Merriam Webster defines autopilot thusly:

      “A device for automatically steering ships, aircraft, and spacecraft also : the automatic control provided by such a device”

    • @WormFood@lemmy.world
      link
      fedilink
      English
      -211 days ago

      a more genuine take would have included a series of scenarios (e.g. drunk/distracted/tired driving)

      I agree. they did tesla dirty. a more fair comparison would’ve been between autopilot and a driver who was fully asleep. or maybe a driver who was dead?

      and why didn’t this news article contain a full scientific meta analysis of all self driving cars??? personally, when someone tells me that my car has an obvious fault, I ask them to produce detailed statistics on the failure rates of every comparable car model

      • @mojofrododojo@lemmy.world
        link
        fedilink
        English
        -211 days ago

        a driver who was fully asleep. or maybe a driver who was dead?

        why does it need to become a specious comparison for it to be valid in your expert opinion? because those comparisons are worthless.

  • @nek0d3r@lemmy.world
    link
    fedilink
    English
    2411 days ago

    I love to hate on musky boi as much as the next guy, but how does this actually compare to vehicular accidents and deaths overall? CGP Grey had the right idea when he said they didn’t need to be perfect, just as good as or better than humans.

    • @machinin@lemmy.world
      link
      fedilink
      English
      19
      edit-2
      11 days ago

      Grey had the right idea when he said they didn’t need to be perfect, just as good as or better than humans.

      The better question - is Tesla’s FSD causing drivers to have more accidents than other driving assist technologies? It seems like a yes from this article and other data I’ve linked elsewhere in this thread.

      • @nek0d3r@lemmy.world
        link
        fedilink
        English
        211 days ago

        I appreciate this response amongst all the malding! My understanding of the difference in assistive technologies across different companies is lacking, so I’ll definitely look more into this.

    • JackbyDev
      link
      fedilink
      English
      1311 days ago

      CGP Grey also seems to believe self driving cars with the absence of traffic lights is the solution to traffic as opposed to something like trains.

      • @Skates@feddit.nl
        link
        fedilink
        English
        -7
        edit-2
        11 days ago

        /c/fuckcars is that way, thanks for stopping by

        Cars will never be dethroned. Yes, trains are cool - choo choo motherfucker. Yes, bikes are environmentally friendly. Yes, the car is a truly fucking horrible answer to the question “how to get from A to B”.

        But that’s because cars are the answer to the question “how to get from A to B comfortably”. I don’t want my baby and my in-law to get on the back of my bike when we’re going camping. I don’t want to take the train and then walk 2 miles from the station every single fucking day with 20kg of tools in my hand, because shit, the train doesn’t stop next to my house, and it doesn’t stop next to my work. I want to be able to have access to comfortable transportation.

        So the answer will still be the car. Even with everyone crying about it. Cause the cat’s out of the bag with cars, we made them efficient and cheap enough to not be considered luxury items anymore. And some countries (see: US) have their entire infrastructure built with cars in mind. You’re never putting the lid back on this, even if it’s a decent idea.

        • ghoti
          link
          fedilink
          English
          1311 days ago

          The solution to broken infrastructure isn’t to double down. Nobody wants your baby and in-law on the back of your bike or for you to walk 2 miles per day, that isn’t the criticism of cars. The criticism is that cars are more expensive and more dangerous than public transportation solutions, period.

          Ideally, we develop towards a both/and solution in the future. We have cars, bus systems, and bike infrastructure which can do last-mile transportation, then we have high-speed rail between major cities. This reduces upkeep cost and makes travel safer for everyone.

          This also isn’t saying to rip everything up to implement this system, but we already have crumbling infrastructure in the US due to lack of federal and state funding which will need to be replaced. As we expand and maintain our infrastructure, we can start to implement better, safer ideas for transportation, rather than doubling down on what is convenient yet unsustainable.

        • @nek0d3r@lemmy.world
          link
          fedilink
          English
          910 days ago

          To kind of piggyback off this, some newer cities in the US do get built with curbing cars in mind. But there’s definitely no easy fix for our systemic problem with infrastructure, and even if there was, cars are so deeply ingrained in Americana that people here would fight it. It’s an uphill battle, and self-driving cars can help mitigate existing issues while we figure the rest out.

          In smaller and mid size cities where I live, buses are the pretty decent form of public transportation, and I could absolutely see self driving sneak its way into there.

          I get that conditions aren’t ideal and that sucks, but progress comes in baby steps, and as long as the larger problems remain out of reach, these smaller ones help.

        • JackbyDev
          link
          fedilink
          English
          310 days ago

          What the fuck are you on about? Where did I ever say anything close to anything you are talking about? You clearly have some sort of beef that you need to deal with. I wish you peace.

    • kingthrillgore
      link
      fedilink
      English
      1110 days ago

      A comment above points to a fatality rate roughly 8 times higher than human-caused fatalities

  • @kava@lemmy.world
    link
    fedilink
    English
    2312 days ago

    Is the investigation exhaustive? If these are all the crashes they could find related to the driver assist / self driving features, then it is probably much safer than a human driver. 1,000 crashes out of 5M+ Teslas sold in the last 5 years is actually a very small number.

    I would want an article to try to find the rate of accidents per 100,000, group it by severity, and then compare and contrast that with human-caused accidents.

    Because while it’s clear by now Teslas aren’t the perfect self-driving machines we were promised, there is no doubt at all that humans are bad drivers.

    We lose over 40k people a year to car accidents. And fatal car accidents are rare, so multiply that by something like 100 to get the total number of car accidents.

    • @Blackmist@feddit.uk
      link
      fedilink
      English
      3012 days ago

      The question isn’t “are they safer than the average human driver?”

      The question is “who goes to prison when that self driving car has an oopsie, veers across three lanes of traffic and wipes out a family of four?”

      Because if the answer is “nobody”, they shouldn’t be on the road. There’s zero accountability, and because it’s all wibbly-wobbly AI bullshit, there’s no way to prove that the issues are actually fixed.

        • @Blackmist@feddit.uk
          link
          fedilink
          English
          912 days ago

          Accountability is important. If a human driver is dangerous, they get taken off the roads and/or sent to jail. If a self driving car kills somebody, it’s just “oops, oh well, these things happen, but shareholder make a lot of money so never mind”.

          I do not want “these things happen” on my headstone.

          • @Tja@programming.dev
            link
            fedilink
            English
            411 days ago

            So you would prefer to have higher chances of dying, just to write “Joe Smith did it” on it?

          • @ipkpjersi@lemmy.ml
            link
            fedilink
            English
            2
            edit-2
            11 days ago

            But if a human driver is dangerous, and gets put in jail or get taken off the roads, there are likely already more dangerous human drivers taking their place. Not to mention, genuine accidents, even horrific ones, do happen with human drivers. If the rate of accidents and rate of fatal accidents with self-driving vehicles is way down versus human drivers, you are actually risking your life more by trusting in human drivers and taking way more risks that way. Having someone be accountable for your death doesn’t matter if you’ve already died because of them.

            Is it any better if you have “Killed by Bill Johnson’s SUV” on your headstone?

      • dream_weasel
        link
        fedilink
        English
        1212 days ago

        The answer is the person behind the wheel.

        Tesla makes it very clear to the driver that you still have to pay attention and be ready to take over at any time. Full Self-Driving engages the in-cabin nanny cam to enforce that you pay attention, above and beyond the frequent reminders to apply turning force to the steering wheel.

        Now, once Tesla goes Mercedes and says you don’t have to pay attention, it’s gonna be the company that should step in. I know that’s a big old SHOULD, but right now that’s not the situation anyway.

        • @exanime@lemmy.today
          link
          fedilink
          English
          612 days ago

          Now, once Tesla goes Mercedes and says you don’t have to pay attention, it’s gonna be the company that should step in

          That doesn’t give me warm and fuzzies either… Imagine a poor dude having to fight Mercedes or Tesla because he was crippled by a sleeping driver and bad AI… Not even counting the lobbying that would certainly happen to reduce and then eliminate their liability.

          • dream_weasel
            link
            fedilink
            English
            612 days ago

            There will be legal battles for sure. I don’t know how you can argue for anything besides the manufacturer taking responsibility. I don’t know how that doesn’t end up with auto pilot fatalities treated as a class where there’s a lookup table of payouts though. This is the intersection of liability and money/power, so it’s functionally uncharted territory at least in the US.

      • @ipkpjersi@lemmy.ml
        link
        fedilink
        English
        7
        edit-2
        12 days ago

        The question isn’t “are they safer than the average human driver?”

        How is that not the question? That absolutely is the question. Just because someone is accountable for your death doesn’t mean you aren’t already dead; it doesn’t bring you back to life. If a human driver is actively dangerous and gets taken off the road or put in jail, there are very likely already plenty more taking that driver’s place. Plus genuine accidents, even horrific ones, do happen with human drivers. If the death rate for self-driving vehicles is really that much lower, you are risking your life that much more by trusting in human drivers.

        • @ShepherdPie@midwest.social
          link
          fedilink
          English
          512 days ago

          Yeah that person’s take seems a little unhinged as throwing people in prison after a car accident only happens if they’re intoxicated or driving recklessly. These systems don’t have to be perfect to save lives. They just have to be better than the average driver.

          • @Tja@programming.dev
            link
            fedilink
            English
            211 days ago

            Hell, let’s put the threshold at “better than 99% of drivers”, because every driver I know thinks they are better than average.

        • @sugar_in_your_tea@sh.itjust.works
          link
          fedilink
          English
          412 days ago

          Exactly.

          We should solve the accountability problem, but the metric should be lives and accidents. If the self-driving system proves it causes fewer accidents and kills fewer people, it should be preferred. Full stop.

          Throwing someone in jail may be cathartic, but the goal is fewer issues on the road, not more people in jail.

      • @Maddier1993@programming.dev
        link
        fedilink
        English
        311 days ago

        I don’t agree with your argument.

        Making a human go to prison for wiping out a family of 4 isn’t going to bring back the family of 4. So you’re just using deterrence to hopefully make drivers more cautious.

        Yet, year after year… humans cause more deaths by negligence than tools can cause by failing.

        The question is definitely “How much safer are they compared to human drivers”

        It’s also much easier to prove that the system has those issues fixed, compared to training a human and hoping their critical faculties are intact. Rigorous software testing and mechanical testing are within legislative reach and can be made strict requirements.

      • @kava@lemmy.world
        link
        fedilink
        English
        212 days ago

        Because if the answer is “nobody”, they shouldn’t be on the road

        Do you understand how absurd this is? Let’s say AI driving results in 50% less deaths. That’s 20,000 people every year that isn’t going to die.

        And you reject that for what? Accountability? You said in another comment that you don’t want “shit happens sometimes” on your headstone.

        You do realize that’s exactly what’s going on the headstones of those 40,000 people that die annually right now? Car accidents happen. We all know they happen and we accept them as a necessary evil. “Shit happens”

        By not changing it, ironically, you’re advocating for exactly what you claim you’re against.

        • @exanime@lemmy.today
          link
          fedilink
          English
          -112 days ago

          Hmmm, I get your point, but you seem to be taking the cavalier position of one who’d never be affected.

          Let’s propose this alternative scenario: AI is 50% safer and would reduce deaths from 40k to 20k a year if adopted. However, the 20k left will include your family, and, unfortunately, there is no accountability; therefore, nobody will pay to help raise your orphaned nephew or help grandma now that your grandpa was run over and killed by a Tesla… Would you approve AI driving going forward?

          • @kava@lemmy.world
            link
            fedilink
            English
            511 days ago

            A) you do realize cars have insurance and when someone hits you, that insurance pays out the damages, right? That is how the current system works, AI driver or not.

            Accidents happen. Humans make mistakes and kill people and are not held criminally liable. It happens.

            If some guy orphaned your nephew and the justice system determined he was not negligent - then your nephew would still be an orphan and would get a payout from the insurance company.

            Exact same thing that happens in the case of an AI driven car hitting someone

            B) if I had a button to save 100k people but it killed my mother, I wouldn’t do it. What is your point?

            Using your logic, if your entire family was in the 20,000 who would be saved - you would prefer them dead? You’d rather them dead with “accountability” rather than alive?

              • @kava@lemmy.world
                link
                fedilink
                English
                311 days ago

                Your thought experiment doesn’t work. I wouldn’t accept any position where my family members die and beyond that, it’s immaterial to the scope of discussion.

                Let’s examine various different scenarios under which someone dies in a car accident.

                1. human driver was negligent and causes a fatal car accident.

                Human gets criminal charges. Insurance pays out depending on policy.

                1. human driver was not negligent and causes a fatal car accident.

                Human does not get criminally charged. Insurance pays out depending on policy.

                1. AI driver causes a fatal accident.

                Nobody gets criminal charges. Insurance pays out depending on policy.


                You claim that you would rather have 20,000 people die every year because of “accountability”.

                Tell me, what is the functional difference for a family member of a fatal car accident victim in those 3 scenarios? The only difference is that under scenario 1 someone receives criminal charges.

                They receive the same amount of insurance money. Scenario 2 already happens right now; you don’t mention that lack of accountability.

                You claim that being able to pin some accidents (remember, some qualify under scenario 2) on an individual is worth 20,000 lives a year.

                Anybody who has ever lost someone in a car accident would rather have their family member back instead.

                • @exanime@lemmy.today
                  link
                  fedilink
                  English
                  0
                  edit-2
                  11 days ago

                  Your thought experiment doesn’t work

                  The point of a thought experiment is to think through the proposition as given, not to replace it with whatever you think makes sense.

                  1. AI driver causes a fatal accident.

                  Nobody gets criminal charges. Insurance pays out depending on policy.

                  Now here is my concern… You are reducing a human life to a dollar amount just like Ford did with the Pinto. If Mercedes (who is apparently liable), decides they are making more money selling their cars than paying out to people injured or killed by their cars, what’s left to force them to recall/change/fix their algorithm?

                  PS: I also never claimed I’d rather have 20,000 more people die for accountability… So I guess you’ll have to argue that with the part of your brain that made it up.

          • @sugar_in_your_tea@sh.itjust.works
            link
            fedilink
            English
            412 days ago

            Yes, unless you mean I need to literally sacrifice my family. But if my family was randomly part of the 20k, I’d defend self-driving cars if they are proven to be safer.

            I’m very much a statistics-based person, so I’ll defend the statistically better option. In fact, me being part of that 20k gives me a larger than usual platform to discuss it.

            • @exanime@lemmy.today
              link
              fedilink
              English
              -212 days ago

              No, I do mean literally your family. Not because I’m trying to be mean to you, I’m just trying to highlight you’d agree with a contract when you think the price does not apply to you… But in reality the price will apply to someone, whether they agree with the contract and enjoy the benefits or not

              It’s the exact same situation in real life with the plane manufacturers. They lobby the government to allow recalls to be done not immediately but during the planes’ regular maintenance. This is to save money, but it literally means that some planes are out there with known defects that will not be addressed for months (or years, depending on the maintenance schedule).

              Literally, people who’d never have a loved one in one of those flights decided that was acceptable to save money. They agreed, it’s ok to put your life at risk, statistically, because they want more money

              • @Tja@programming.dev
                link
                fedilink
                English
                411 days ago

                If there are 20k deaths vs 40k, my family is literally twice as safe on the road, why wouldn’t I take that deal?

              • @sugar_in_your_tea@sh.itjust.works
                link
                fedilink
                English
                3
                edit-2
                12 days ago

                Then it’s not a fair question. You’re not comparing 40k vs 20k, you’re comparing 40k vs literally my family dying (like the hypothetical train diversion thing), that’s fear mongering and not a valid argument.

                The risk does not go up for my family because of self-driving cars. That’s innate to the 40k vs 20k numbers.

                So the proper question is: if your family was killed in an accident, what would be your reaction if it was a human driver vs AI? For me:

                • human driver - incredibly mad because it was probably preventable
                • AI - still mad, but supportive of self-driving improvements because it probably can be patched

                The first would make me bitter and probably anti-driving, whereas the second would make me constructive and want to help people understand the truth of how it works. I’m still mad in both cases, but the second is more constructive.

                Seeing someone go to jail doesn’t fix anything.

                • @exanime@lemmy.today
                  link
                  fedilink
                  English
                  -1
                  edit-2
                  11 days ago

                  Yes, it’s a thought experiment… Not a fair question, just trying to put it in perspective

                  Anyone who understands stats would agree 40k deaths is worse than 20k, but it also depends on other factors. All things being equal to today, the 20k proposition is pure benefit

                  But if we look into the nuance and details emerge, the formula changes. For example, it’s been discussed here that there may be nobody liable. If that’s the case, we win by halving deaths (absolutely a win), but now the remaining 20k may be left with no justice… Worse, it absolutely creates a perverse incentive for these companies, without liability exposure, to do whatever maximizes profit

                  So, not trying to be a contrarian here… I just want to avoid the polarization that is now the rule online… Nothing is just black and white

      • @slumberlust@lemmy.world
        link
        fedilink
        English
        112 days ago

        The question for me is not what margins the feature is performing on, as they will likely be better than human error rates, but how irresponsibly they market the product.

    • @machinin@lemmy.world
      link
      fedilink
      English
      19
      edit-2
      12 days ago

      I was looking up info for another comment and found this site. It’s from 2021, but the information seems solid.

      https://www.flyingpenguin.com/?p=35819

      This table was probably most interesting, unfortunately the formatting doesn’t work on mobile, but I think you can make sense of it.

      Car | 2021 Sales So Far | Total Deaths
      Tesla Model S | 5,155 | 40
      Porsche Taycan | 5,367 | ZERO
      Tesla Model X | 6,206 | 14
      Volkswagen ID | 6,230 | ZERO
      Audi e-tron | 6,884 | ZERO
      Nissan Leaf | 7,729 | 2
      Ford Mustang Mach-e | 12,975 | ZERO
      Chevrolet Bolt | 20,288 | 1
      Tesla Model 3 | 51,510 | 87

      So many cars with zero deaths compared to Tesla.

      The question isn’t whether Tesla’s FSD is safer than humans; it’s whether it’s keeping up with the automotive industry in terms of safety features. It seems like they are falling behind (despite what their marketing team claims).
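The raw counts in that table can be turned into a crude deaths-per-10,000-vehicles figure. A minimal sketch (with big caveats: the 2021 sales are partial-year, the death counts are cumulative, and nothing here normalizes for miles driven):

```python
# Crude deaths per 10,000 vehicles sold, using the numbers from the table.
# Caveat: partial-year 2021 sales vs cumulative deaths, and no adjustment
# for miles driven, so this is only a rough comparison.
cars = {
    "Tesla Model S": (5_155, 40),
    "Porsche Taycan": (5_367, 0),
    "Tesla Model X": (6_206, 14),
    "Volkswagen ID": (6_230, 0),
    "Audi e-tron": (6_884, 0),
    "Nissan Leaf": (7_729, 2),
    "Ford Mustang Mach-e": (12_975, 0),
    "Chevrolet Bolt": (20_288, 1),
    "Tesla Model 3": (51_510, 87),
}

for name, (sales, deaths) in sorted(
    cars.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True
):
    rate = deaths / sales * 10_000
    print(f"{name:20s} {rate:6.1f} deaths per 10k sold")
```

Even on this crude measure the three Tesla models top the list by a wide margin.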

        • @petrol_sniff_king@lemmy.blahaj.zone
          link
          fedilink
          English
          311 days ago

          If not, that would indicate that this newfangled self-driving is more dangerous than a little ol’ “caught in the stone-age” Nissan Leaf, wouldn’t it?

      • dream_weasel
        link
        fedilink
        English
        -412 days ago

        That’s kind of a tough article to trust if I’m being honest. It may in fact be true, but it’s an opinion piece.

        I find it a little weird to look only within sales for the year and also not to discuss the forms of autopilot or car use cases.

        For example, are we talking about highway-only driving, lane-keeping assist, end-to-end residential urban, rural unmarked roads? Some of these are harder problems than others. How about total mileage as well? I’m not sure what the range is on a Nissan Leaf, but I think comparing it to a Taycan or Mach-e seems disingenuous.

        All that being said, yeah, Tesla has a lot of deaths comparatively, but still way fewer than regular human drivers. I worry that a truly autonomous experience will not be available until and unless a manufacturer like Tesla pushes the limits on training data and the feds respond by making better laws. Considering Elon’s douchiness, I’m also kinda happy Tesla is doing that and catching flak, but paving the way for more established manufacturers.

        We were early adopters of Tesla, and trust me the cars are made cheap and the “autopilot” drives like shit even now, but it’s amazing the progress that has been made in the last 6 years.

        • @machinin@lemmy.world
          link
          fedilink
          English
          612 days ago

          You’re happy that a racist, misogynist billionaire whose companies have some of the worst employee safety data in the industries he’s involved in is pushing these cars onto public roads? Musk doesn’t care about our safety. Like everything else, he lies about it to make money.

          We have no clue if Teslas are safer than human drivers in any other car. Tesla publishes those charts, but the underlying data is nowhere to be found.

          Musk lies to make money. You can’t trust anything Tesla publishes.

          I don’t want Tesla testing their shit on the public roads and putting me at risk so that Musk can make more money. I don’t opt in to be one of his beta testers.

          • dream_weasel
            link
            fedilink
            English
            -712 days ago

            We get it, you hate Elon Musk. That’s a fine position to take.

            You are beta testing for anyone and everyone doing anything on the road. You can say “look at this LendingTree report” and see accident rates, or look at the article you posted, and compare them to human drivers to know which is safer. Or you can say it’s all unknowable lies, in which case why are we citing anything beyond you saying “I hate Musk”? Again, valid.

            He’s making money regardless, so yeah I’m glad that spaceX lands reusable boosters and Tesla pushes the limits of what is possible with an EV so at least we get something back. Considering how many other people hate the shit out of Tesla, I’m sure every time someone hits a raccoon in a Tesla we will get to read about it.

            • @machinin@lemmy.world
              link
              fedilink
              English
              2
              edit-2
              12 days ago

              It’s not just hatred for Musk. Yes, he is a racist that had a place in his factory called “the plantation” for black workers. He swatted the wife and children of a whistleblower. There is so much shit he does, but that isn’t what makes Teslas dangerous.

              Teslas are dangerous because he creates a culture that despises safety engineering practices. When someone has sex on autopilot and endangers everyone on the road around them, does Musk rebuke them? No, he makes a joke. Now, good followers think that the silly little warning that pops up every time probably doesn’t mean much. If a worker says that something probably needs more testing before release, do you think he pauses to consider the safety implications? I can guarantee he doesn’t care.

              So, you get someone who runs into a fireman on the road and kills them because they were using Autopilot while distracted. Or you rear-end a motorcyclist and kill them, or plow into a firetruck and kill some more people.

              Musk, and sycophants like you who think it’s okay to have a cavalier attitude about safety because people just have to be sacrificed for technology, are menaces. We don’t have to sacrifice passengers to make airlines safer. We have proper testing and systems in place to integrate better technology at very little risk. In the same way, we don’t have to sacrifice motorcyclists, first responders, other drivers or pedestrians just because you think your technology is worth it. Other car manufacturers have implemented those safety test systems. Tesla just doesn’t want to spend the money so Musk can get his payout.

              • @vaultdweller013@sh.itjust.works
                link
                fedilink
                English
                111 days ago

                You also forgot to mention that the damned things are rolling death traps since the doors aren’t properly mechanical. Why the fuck should I trust something that requires power to work in an emergency? Any number of things can knock out power and disable the doors. If I backed my 20+ year old Jeep into a fucken river I could still open the door; the seals are all shot as well, so there’d be reduced pressure issues.

              • dream_weasel
                link
                fedilink
                English
                -112 days ago

                There is no obligation to sacrifice anybody. This is a question of risk vs law vs driver requirement which has got to be sorted out. Sure, point out that musk is shit and his factories are shit, it’s true. He’s also a liar. All true. What I take issue with is saying that the cars are 4 wheeled death machines killing everyone in their path. That is not true. It is also not true that other companies are solving the same problem without risk. They are solving a different problem of highly mapped cities and solutions for specific scenarios.

                It’s a people problem, and drivers (people) are irresponsible. I bet lift kits with unadjusted headlights have killed more people than Tesla has had Autopilot accidents. People are gonna fuck up. It has to happen, then laws have to be implemented and revised. There’s no hop, skip and jump that solves autopilot on a closed course and has zero incidents in practice. Conditions on the whole are just too varied. Of course, machine learning is my job, so maybe I’m just a pessimist in this regard.

                • @machinin@lemmy.world
                  link
                  fedilink
                  English
                  312 days ago

                  What I take issue with is saying that the cars are 4 wheeled death machines killing everyone in their path. That is not true. It is also not true that other companies are solving the same problem without risk.

                  I never said that. It isn’t black or white. I said musk creates a culture that despises safety engineering. Other companies like Volvo embrace it. Different companies embrace it to different degrees. As a result, you have wildly different fatality rates. Teslas happen to be the worst (although, like you said, it’s impossible to get good data that accounts for all the factors).

                  Yes, it is a people problem, but it is also a systems problem. Volvo has aimed for zero fatalities in their cars. They engineer for problematic people. They went 16 years without a fatality in the UK in one of their models. Tesla simply doesn’t care about problematic people. In fact, problematic people may even get a boost from a Musk re-tweet.

                  I agree, zero incidents may be impossible and people are problematic. But attitudes, practices, cultures and systems can either amplify those problems or dampen their effects. Musk and Tesla amplify the negative effects. It doesn’t have to be that way.

    • @ipkpjersi@lemmy.ml
      link
      fedilink
      English
      18
      edit-2
      12 days ago

      I know this is going to sound bad, but bear with me and read my entire post. I think in this case people might be hating on Tesla because it’s Elon (and fair enough) rather than on self-driving itself. Self-driving vehicles are already very likely safer than human-driven ones and have lower rates of accidents, but people expect there to be zero accidents whatsoever with self-driving, which is why I think self-driving may never actually take off and become mainstream. Then again, there’s the lack of accountability: people prefer being able to place the blame and liability on something concrete, like an actual human. It’s possible I’m wrong, but I don’t think I am wrong about this.

      edit: I looked further into this, and it seems I am partially wrong. Tesla is not keeping up with average automotive-industry safety statistics; the self-driving in their vehicles seems less safe than their competitors’.

    • Chaotic Entropy
      link
      fedilink
      English
      1212 days ago

      I would highlight that not all Teslas are driven in this mode on a regular basis, if ever.

      • NιƙƙιDιɱҽʂ
        link
        fedilink
        English
        611 days ago

        For example, I don’t really trust mine and mostly use it in slow bumper-to-bumper traffic, or so I can adjust my AC on the touchscreen without swerving around in my lane.

    • @suction@lemmy.world
      link
      fedilink
      English
      4
      edit-2
      12 days ago

      Only Elon calls his level 2 automation “FSD” or even “Autopilot”. That alone proves that Tesla is more guilty of these deaths than other makers are who choose less evil marketing terms. The dummies who buy Elon’s crap take those terms at face value and the Nazi CEO knows that, he doesn’t care though because just like Trump he thinks of his fans as little more than maggots. Can’t say I blame him.

  • @unreasonabro@lemmy.world
    link
    fedilink
    English
    19
    edit-2
    10 days ago

    Obviously the time to react to the problem was before the system told you about it, that’s the whole point, THE SYSTEM IS NOT READY. Cars are not ready to drive themselves, and obviously the legal system is too slow and backwards to deal with it so it’s not ready either. But fuck it let’s do it anyway, sure, and while we’re at it we can do away with the concept of the driver’s license in the first place because nothing matters any more and who gives a shit we’re all obviously fucking retarded.

    • @dependencyinjection@discuss.tchncs.de
      link
      fedilink
      English
      512 days ago

      When I see this comment it makes me wonder, how do you feel when you see someone driving a car?

      Should I feel guilty for owning a car. I’m 41 and I got my first car when I was 40, because I changed careers and it was 50 miles away.

      I rarely used it outside of work and it was a means to get me there. I now work remote 3 days so only drive 2.

      I don’t have social media or shop with companies like Amazon. I have just been to my first pro-Palestine protest.

      Am I to be judged for using a car?

      • @hydration9806@lemmy.ml
        link
        fedilink
        English
        812 days ago

        I believe what they mean is “fuck car-centric societal design”. No reasonable person should be mad that someone is using the current system to live their life (i.e. driving to work). The real goal is spreading awareness that a car-centric society is inherently isolating and stressful, and that one more lane does absolutely nothing to lessen traffic (except for like a month ish)

      • @machinin@lemmy.world
        link
        fedilink
        English
        512 days ago

        Probably not you personally, but the system, oil companies, and people like Musk and his followers that want to prioritize private driving over public transportation.

        I say fuck cars, and I have one too. I try to avoid using it, but it’s easy to be lazy. I’m also fortunate to live someplace with great public transportation.

        Don’t take it personally, just realize life can be better if we could learn to live without these huge power-hungry cargo containers taking us everywhere.

      • @PlexSheep@infosec.pub
        link
        fedilink
        English
        211 days ago

        That’s a good question!

        The short answer is no. Cars suck for many reasons, but it’s a fact in many parts of the world that you cannot be a functioning member of a society without one, especially if your government doesn’t get that cars suck or you live somewhere remote.

        How do I feel when I see someone driving a car? Mostly my feelings don’t change, because it is so normalized. But I get somewhat angry when I see uselessly huge cars that are obviously just a waste of resources. I have fun ridiculing car centric road and city design, but it’s the bad kind of fun.

        I am also very careful around cars, both while I’m in and outside of them. Cars are very heavy and drivers are infamous for being bad at controlling them. This isn’t their fault; it’s super easy to make mistakes while driving. You just have to move your feet a little too fast or move your hand a little too far and boom, someone is dead.

        Think about driving on a highway. If the guy next to you accidentally moves the wheel a little more than usual, that car will crash into you, creating a horrendous scene. It’s just too prone to failure, and failure will probably mean personal injury. For this reason, cars legitimately scare me, even if I have to deal with them.

        Sorry if that does not make sense to you. I’m still trying to figure all this out for myself and I’m not always rational about these topics, because seeing the potential of our cities being wasted by car centric design makes me angry.

  • @TypicalHog@lemm.ee
    link
    fedilink
    English
    1111 days ago

    It only matters whether the autopilot causes more deaths than an average human driver over the same distance traveled.

    • @NIB@lemmy.world
      link
      fedilink
      English
      4811 days ago

      If the cars run over people while going 30 km/h because they use cameras and a bug crashed into the camera and that caused the car to go crazy, that is not acceptable, even if the cars crash “less than humans”.

      Self-driving needs to be highly regulated by law, with a mandated bare minimum of sensors, including radar, lidar, etc. Camera-only self-driving is beyond stupid. Cameras can’t see in snow or dark or whatever. Anyone who has a phone knows how fucky the camera can get under specific light exposures, etc.

      No one but Tesla is doing camera-only “self-driving”, and they are only doing it to cut costs. Their older cars had more sensors than their newer cars. But Musk is living in his BioShock uber-capitalistic dream. Who cares if a few people die in the process of developing vision-based self-driving.

      https://www.youtube.com/watch?v=Gm2x6CVIXiE

      • @TypicalHog@lemm.ee
        link
        fedilink
        English
        -4711 days ago

        What are you? Some kind of lidar shill? Camera only should obviously be the endgame goal for all robots. Also, this article is not even about camera only.

          • @TypicalHog@lemm.ee
            link
            fedilink
            English
            -3411 days ago

            Because that’s expensive and can be done with a camera. And once you figure the camera stuff out - you gucci. Now you can do all kinds of shit without needing a lidar on every single robot.

            • @AdrianTheFrog@lemmy.world
              link
              fedilink
              English
              1011 days ago

              Because that’s expensive and can be done with a camera.

              Expensive, as in probably less than $600? Compared to the $35000 cost of a tesla?

              (comparing the cost of the iPhone 12 (without lidar) and iPhone 12 pro (with lidar), we can guess that the sensor probably costs less than $200, so 3 of them (for left, right, and front) would cost probably less than $600)

              lidar can actually be very cheap and small. Unfortunately, Apple bought the only company that seems to make sensors like that (besides some other super high end models)

              There have been a lot of promising research papers on the technology lately though, so I expect more, higher resolution and cheaper lidar sensors to be available relatively soon (next couple years probably).
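The back-of-envelope math above is easy to make concrete. A quick sketch (the per-sensor price is the guessed iPhone-delta figure from this thread, not a real automotive parts cost):

```python
# Rough lidar cost estimate from the thread: three sensors (left, right,
# front) at a guessed <= $200 each, versus a roughly $35,000 Tesla.
# Both prices are assumptions from the discussion, not real BOM figures.
unit_cost = 200       # guessed per-sensor cost (iPhone 12 vs 12 Pro delta)
num_sensors = 3       # left, right, and front coverage
car_price = 35_000    # approximate base price of the car

total = unit_cost * num_sensors
share = total / car_price
print(f"~${total} of lidar, about {share:.1%} of the car's price")
```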

              • @Grippler@feddit.dk
                link
                fedilink
                English
                2
                edit-2
                11 days ago

                Yeah, that’s not even remotely the same type of sensor used in robotics and autonomous cars. Yes, lidar is getting cheaper, but for high-detail, long-range detection the sensors are much more expensive than your iPhone example suggests. The iPhone “lidar” is less than useless in an automotive context.

              • @TypicalHog@lemm.ee
                link
                fedilink
                English
                -411 days ago

                Perhaps. Idk, maybe I’m wrong. But it sure seems it would be so much better if we achieved the same shit with a cheaper, simpler sensor.

                • @BURN@lemmy.world
                  link
                  fedilink
                  English
                  310 days ago

                  To get the same resolution and quality of image in all lighting scenarios, cameras are actually going to be more expensive than lidar. Cameras suffer in low-light, low-contrast situations due to the physical limitations of bending light. More light = bigger lenses = higher cost, whereas lidar works better and is cheaper.

            • @Zink@programming.dev
              link
              fedilink
              English
              610 days ago

              My eyes are decent, but if I had a sixth sense that gave me full accurate 3D 360 spatial awareness regardless of visibility, I would probably not turn it off just to use my eyes. I’d use both.

        • @howrar@lemmy.ca
          link
          fedilink
          English
          610 days ago

          I’ve heard Elon Musk (or was it Karpathy?) talking about how camera should be sufficient for all scenarios because humans can do it on vision alone, but that’s poor reasoning IMO. Cars are not humans, so there’s no reason to confine them to the same limitations. If we want them to be safer and more capable than human drivers, one way to do that is by providing them with more information.

          • kingthrillgore
            link
            fedilink
            English
            510 days ago

            We built things like lidar and ultrasound because we wanted something better than our eyes at depth perception and sight.

        • @mojofrododojo@lemmy.world
          link
          fedilink
          English
          611 days ago

          Camera only should obviously be the endgame goal for all robots.

          I can’t tell if you’re a moron or attempting sarcasm but this is the least informed opinion I’ve seen in ages.

          • @TypicalHog@lemm.ee
            link
            fedilink
            English
            -311 days ago

            I wasn’t attempting sarcasm, so maybe I’m a moron, idk. Fair, it’s likely I’m uninformed. I just know my daddy Elon said something about how solving shit with cameras only is probably the best path and will pay off.

    • @Geobloke@lemm.ee
      link
      fedilink
      English
      2311 days ago

      No it doesn’t. Every life lost matters, and if it could be found that Tesla could have replicated industry best practice and saved more lives, but didn’t so that they could sell more cars, then that is on them

    • @mojofrododojo@lemmy.world
      link
      fedilink
      English
      2111 days ago

      this is bullshit.

      A human can be held accountable for their failure, bet you a fucking emerald mine Musk won’t be held accountable for these and all the other fool self drive fuckups.

      • @sabin@lemmy.world
        link
        fedilink
        English
        -211 days ago

        So you’d rather live in a world where people die more often, just so you can punish the people who do the killing?

        • @mojofrododojo@lemmy.world
          link
          fedilink
          English
          311 days ago

          That’s a terrifically misguided interpretation of what I said, wow.

          LISTEN UP BRIGHT LIGHTS, ACCOUNTABILITY ISN’T A LUXURY. It’s not some ‘nice to have add-on’.

          Musk’s gonna find out. Gonna break all his fanboys’ hearts too.

          • @sabin@lemmy.world
            link
            fedilink
            English
            -111 days ago

            Nothing was misguided and if anything your tone deaf attempt to double down only proves the point I’m making.

            This stopped being about human deaths for you a long time ago.

            Let’s not even bother to ask the question of whether or not this guy could ultimately be saving lives. All that matters to you is that you have a target to take your anger out on in the event that a loved one dies in an accident or something.

            You are shallow beyond belief.

            • @mojofrododojo@lemmy.world
              link
              fedilink
              English
              411 days ago

              This stopped being about human deaths for you a long time ago.

              Nope, it’s about accountability. The fact that you can’t see how important accountability is just says you’re a musk fan boy. If Musk would shut the fuck up and do the work, he’d be better off - instead he’s cheaping out left and right on literal life dependent tech, so tesla’s stock gets a bump. It’s ridiculous, like your entire argument.

              • @sabin@lemmy.world
                link
                fedilink
                English
                111 days ago

                I don’t give a fuck about Musk. I think his Hyperloop is beyond idiotic and nothing he makes fucking works. In fact, I never even said I think the current state of Tesla Autopilot is acceptable. All I said was that categorically rejecting autopilot (even for future generations where the tech can be much better) for the express purpose of being able to prosecute people is beyond empty and shallow.

                If you need to make up lies about me and strawman me to disagree you only prove my point. You stopped being a rational agent who weighs the good and bad of things a long time ago. You don’t care about how good the autopilot is or can be. All you care about is your mental fixation against the CEO of the company in question.

                Your political opinions should be based on principles, not whatever feels convenient in the moment.

                • @mojofrododojo@lemmy.world
                  link
                  fedilink
                  English
                  110 days ago

                  You stopped being a rational agent who weighs the good and bad of things a long time ago.

                  sure thing, you stan musk for no reason, and call me irrational. pfft. gonna block you now, tired of your bullshit

    • @SirEDCaLot@lemmy.today
      link
      fedilink
      English
      211 days ago

      This is 100% correct. Look at the average rate of crashes per mile driven with autopilot versus a human. If the autopilot number is lower, they’re doing it right and should be rewarded and NHTSA should leave them be. If the autopilot number is higher, then yes by all means bring in the regulation or whatever.
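That per-mile comparison is straightforward to state as a computation. A minimal sketch, with both input figures as hypothetical placeholders rather than real NHTSA or Tesla data:

```python
# Compare crash rates per million miles driven. The numbers below are
# hypothetical placeholders, not real NHTSA or Tesla statistics.
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    return crashes / (miles / 1_000_000)

human = crashes_per_million_miles(crashes=2_000_000, miles=3_000_000_000_000)
autopilot = crashes_per_million_miles(crashes=150, miles=300_000_000)

print(f"human: {human:.2f}, autopilot: {autopilot:.2f} per million miles")
print("autopilot lower" if autopilot < human else "autopilot higher")
```

One caveat even for this sketch: autopilot miles skew toward highways, so a raw per-mile comparison flatters the system unless the human baseline is restricted to comparable roads.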

      • @flerp@lemm.ee
        link
        fedilink
        English
        -111 days ago

        Humans are extremely flawed beings and if your standard for leaving companies alone to make as much money as possible is that they are at least minimally better than extremely flawed, I don’t want to live in the same world as you want to live in.

        • @AdrianTheFrog@lemmy.world
          link
          fedilink
          English
          411 days ago

          Having anything that can save lives over an alternative is an improvement. In general. Yes, we should be pushing for safer self driving, and regulating that. But if we can start saving lives now, then sooner is better than later.

          • @flerp@lemm.ee
            link
            fedilink
            English
            110 days ago

            I’m not sure if that was supposed to be in agreement or countering what I said.

            Over the past few decades, some people have noticed and commented on the enormous death toll that our reliance on driving and the vast amount of driving hours spent on our roads and said that that amount of death is unacceptable. Nothing has ever been able to come of it because of that aforementioned reliance on driving that our society has. Human nature cannot be the thing that changes, we can’t expect humans to behave differently all of a sudden nor change their ability to focus and drive safely.

            But this moment, when the shift from human to machine drivers is happening, when we move from beings incapable of performing better on a global scale to machines able to avoid the current death toll through vastly greater precision, is the time to reduce that death toll.

            If we allow companies to get away with removing sensors from their cars, which results in lower safety, just so that said company can increase their bottom line, I consider that unacceptable, even if the death toll is slightly lower than with human-driven cars, when it could be greatly lower.

            • @SirEDCaLot@lemmy.today
              link
              fedilink
              English
              18 days ago

              One company says they can build FSD with 15 sensors and sensor fusion. Another company says they can build FSD with just cameras. As I see it, the development path doesn’t matter, it’s the end result that matters.

        • @SirEDCaLot@lemmy.today
          link
          fedilink
          English
          18 days ago

          It is not my place, or yours, or the government’s to tell people how to spend their money. It IS our place to ensure that companies aren’t producing products that kill people.

          Thus money doesn’t matter here. What matters is whether or not FSD is more dangerous than a human. If it is, it should be prohibited or only used under very monitored conditions. If it is equal to or better than a human, i.e. the same or fewer accidents/fatalities per mile driven, then Tesla should be allowed to sell it, even if it is imperfect.

          In the US we have a free market. Nobody is obligated to pay for FSD or use it. People can vote with their wallet whether they think it’s worth the money or not, THAT is what determines if Tesla makes more money or not. It’s up to each individual customer to decide if it’s worth it. That’s their choice not mine or yours.

          As I see it, in a free market what Tesla has to prove is that their system doesn’t make things worse. If they can, if they can prove they’re not making roads more dangerous IE no need to ban it, then it’s a matter between them and their customer.

    • @PresidentCamacho@lemm.ee
      link
      fedilink
      English
      -711 days ago

      This is the actual logical way to think about self driving cars. Stop down voting him because “Tesla bad” you fuckin goons.

      • @gallopingsnail@lemmy.sdf.org
        link
        fedilink
        English
        1311 days ago

        Tesla’s self driving appears to be less safe and causes more accidents than their competitors.

        NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” which turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”

        Tesla bad.

        • @TypicalHog@lemm.ee
          link
          fedilink
          English
          511 days ago

          Can you link me the data that says Tesla’s competitors’ self-driving is safer and causes fewer accidents, and WHICH ONES? I would really like to know who else has this level of self-driving while also having fewer accidents.

        • @Socsa@sh.itjust.works
          link
          fedilink
          English
          211 days ago

          I don’t quite understand what they mean by this. It tracks drivers with a camera and the steering wheel sensor and literally turns itself off if you stop paying attention. What more can they do?

          • @nxdefiant@startrek.website
            link
            fedilink
            English
            2
            edit-2
            10 days ago

            The NHTSA hasn’t issued rules for these things either.

            The U.S. government has issued general guidelines for the technology/industry here:

            https://www.transportation.gov/av/4

            They have an article on it discussing levels of automation here:

            https://www.nhtsa.gov/vehicle-safety/automated-vehicles-safety

            By all definitions laid out in that article:

            BlueCruise, Super Cruise, and Mercedes’ system are lvl 3 systems (you must be alert to reengage when the conditions for their operation no longer apply)

            Tesla’s FSD is a lvl 3 system (the system will warn you when you must reengage for any reason)

            Waymo and Cruise are lvl 4 systems (geolocked)

            Lvl 5 systems don’t exist.

            What we don’t have is any kind of federal laws:

            https://www.ncsl.org/transportation/autonomous-vehicles

            Separated into two sections – voluntary guidance and technical assistance to states – the new guidance focuses on SAE international levels of automation 3-5, clarifies that entities do not need to wait to test or deploy their ADS, revises design elements from the safety self-assessment, aligns federal guidance with the latest developments and terminology, and clarifies the role of federal and state governments.

            The guidance reinforces the voluntary nature of the guidelines and does not come with a compliance requirement or enforcement mechanism.

            (emphasis mine)

            The U.S. has operated on a “states are laboratories for laws” principle since its founding. The current situation is in line with that principle.

            These are not my opinions, these are all facts.

        • @nxdefiant@startrek.website
          link
          fedilink
          English
          -411 days ago

          No one else has the same capability in as wide a geographic range. Waymo, Cruise, Blue Cruise, Mercedes, etc are all geolocked to certain areas or certain stretches of road.

          • @GiveMemes@jlai.lu
            link
            fedilink
            English
            311 days ago

            Ok? Nobody else is being as wildly irresponsible, therefore tesla should be… rewarded?

            • @nxdefiant@startrek.website
              link
              fedilink
              English
              110 days ago

              I’m saying larger sample size == larger numbers.

              Tesla announced 300 million miles on FSD v12 in just the last month.

              https://www.notateslaapp.com/news/2001/tesla-on-fsd-close-to-license-deal-with-major-automaker-announces-miles-driven-on-fsd-v12

              Geographically, that’s all over the U.S., not just in hyper-specific metro areas or stretches of road.

              The sample size is orders of magnitude bigger than everyone else, by almost every metric.

              If you include the most basic autopilot, Tesla surpassed 1 billion miles in 2018.

              These are not opinions, just facts. Take them into account when you decide to interpret the opinion of others.

              • @GiveMemes@jlai.lu
                link
                fedilink
                English
                0
                edit-2
                9 days ago

                That’s not how rates work, though. A larger sample size doesn’t imply a higher rate of accidents, and a rate, not a raw count, is what any such study implies. Your bullshit rationalization is funny. In fact, a larger sample size tends to produce a more stable estimate, since there is less chance that a single error/fault makes an outsized impact on the data.

                • @nxdefiant@startrek.website
                  link
                  fedilink
                  English
                  1
                  edit-2
                  9 days ago

                  No one’s talking about rates. The article itself, all the articles linked in these comments are talking about counts. Numbers of incidents. I’m not justifying anything because I’m not injecting my opinion here. I’m only pointing out that without context, counts don’t give you enough information to draw a conclusion, that’s just math. You can’t even derive a rate without that context!
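
                  The counts-vs-rates point can be made concrete with a toy sketch. The numbers below are purely illustrative, not real incident data, and `incidents_per_million_miles` is a made-up helper, not anyone's published methodology:

                  ```python
                  # Toy numbers only: why raw incident counts can't be compared
                  # without fleet mileage for context.

                  def incidents_per_million_miles(incidents: int, miles: float) -> float:
                      """Normalize a raw incident count into a per-mile rate."""
                      return incidents / (miles / 1_000_000)

                  # Hypothetical large fleet: many miles, many incidents.
                  big_fleet = incidents_per_million_miles(incidents=300, miles=300_000_000)
                  # Hypothetical small fleet: few miles, few incidents.
                  small_fleet = incidents_per_million_miles(incidents=10, miles=5_000_000)

                  print(big_fleet)    # 1.0 incident per million miles
                  print(small_fleet)  # 2.0 -> fewer incidents overall, but a worse rate
                  ```

                  The fleet with 30x more raw incidents still has the lower rate, which is why a count alone supports no conclusion either way.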

      • @doubtingtammy@lemmy.ml
        link
        fedilink
        English
        1111 days ago

        It’s not logical, it’s ideological. It’s the ideology that allows corporations to run a dangerous experiment on the public without their consent.

        And where’s the LIDAR again?

        • @PresidentCamacho@lemm.ee
          link
          fedilink
          English
          1
          edit-2
          11 days ago

          My argument is that self-driving car fatalities have to be compared against human-driven car fatalities. If self-driving cars kill 500 people a year, but humans kill 1000 people a year, which one is better? Logic clearly isn’t your strong suit, maybe sit this one out…

        • @TypicalHog@lemm.ee
          link
          fedilink
          English
          -711 days ago

          I SAID: “IT ONLY MATTERS IF AUTOPILOT CAUSES MORE NET DEATHS PER MILE TRAVELED RATHER THAN LESS, WHEN COMPARED TO HUMAN DRIVERS!”

  • @Numberone@startrek.website
    link
    fedilink
    English
    10
    edit-2
    11 days ago

    Is it linked to excess deaths? Technically it could be saving lives at a population scale. I doubt that’s the case, but it could be. I’ll read the article now and find out.

    Edit: it doesn’t seem to say anything regarding “normal” auto-related deaths. They’re focusing on the bullshit designation of an unfinished product as “autopilot”, and on a (small) subset of specific cases that are particularly egregious, where there were 5-10 seconds of lead time before an incident. In these cases a person who was paying attention wouldn’t have been in the accident.

    Also some clarity edits.