Transportation

Tesla Crash Won't Stop Driverless Car Progress: Renault-Nissan CEO (cnbc.com) 96

Problems with Tesla's self-driving software that were linked to the death of a driver this year would not block the development of autonomous vehicles, Carlos Ghosn, the chief executive of Renault-Nissan, said on Tuesday. From a report on CNBC: In September, Tesla revealed the death of a man in one of its cars in a crash in the Netherlands and said that the "autopilot" software's role in the accident was being investigated. "In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle's position in lane and adjusts the vehicle's speed to match surrounding traffic," Tesla said in a blog post at the time. The incident shone a spotlight on the autonomous driving features already in cars as automakers race to put fully driverless cars on the road. During an interview at the Web Summit technology conference in Lisbon, Ghosn said that the teething problems with Tesla's autonomous software would not derail the industry's push.
  • oh... good (Score:5, Funny)

    by The-Ixian ( 168184 ) on Tuesday November 08, 2016 @01:07PM (#53239155)

    I thought for sure that we had seen the last of this push for self driving cars.... I sure am glad that this guy was here to tell us that a possible mistake at a different company won't derail their plans.... whew...

    • Self-driving cars in the purest sense are a long way away. The country will need to redo its road and traffic-light systems.
  • by Anonymous Coward

    Get your enjoyable cars now before the pleasure of driving becomes a thing of the past.

    • But they may make the road far more fun. All the driverless cars will get out of the way of people actually driving. No more hypermilers doing 45 on the highway, or grannies, etc. Speeding by them all in a '53 Vette will be fun.

      Longer term it will lead to a push to raise speed limits to what people are actually comfortable at (80-ish on most highways).

      • by Anonymous Coward

        Suppose there are a lot of them. Every one of them is doing 53. Because there are a lot of them, they will be in all the lanes. Now you can't pass any of them. You will be stuck in their matrix doing 53 as well. For you to be able to pass, one lane needs to be going faster than the other. That's not going to happen with self-driving cars.

        • Roads today have more than enough capacity to fit everyone into (N-1) lanes. The reason they can't and don't is because human drivers are bad and inefficient.

    • Get your enjoyable horse breeds now before the pleasure of horseback riding becomes a thing of the past.

    • You'll always be able to drive your own car.
      Just not on any tax-supported road.
  • The concept of autonomous cars is attractive, but the Tesla development model is all wrong. These are not products to test through the cheapest possible constructions, using untrained drivers and public roads. The NHTSA needs to ban all use of the autopilot features beyond a simple cruise control until they are proven reliable in a real statistical sense, with adequate power to identify rare events.
    • It has been. It's roughly an order of magnitude safer than humans, statistically.

      If, today, EVERY car in the US were swapped out for a Tesla self-driving car, roughly 30,000 lives would be saved within a year.

      But, yeah, it's not perfect. It's only a lot better than humans.
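
      Back-of-the-envelope, assuming roughly 35,000 US road deaths a year and taking the "order of magnitude" claim at face value (both are round numbers for illustration, not Tesla data):

      # Rough arithmetic only, not a real safety analysis.
      us_road_deaths_per_year = 35_000   # assumed ballpark figure
      claimed_safety_factor = 10         # the "order of magnitude safer" claim

      deaths_with_autonomy = us_road_deaths_per_year / claimed_safety_factor
      lives_saved = us_road_deaths_per_year - deaths_with_autonomy
      print(f"Estimated lives saved per year: {lives_saved:,.0f}")   # ~31,500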

      • by Jzanu ( 668651 )
        You're wrong. Claiming it and demonstrating it with power are different things. Marketing is not statistics. Read this [rand.org] and the report [rand.org] and understand.
          • Ummm, to use your own article:

          Autonomous vehicles would have to be driven hundreds of millions of miles ... to demonstrate their reliability in terms of fatalities and injuries.

          Tesla: We've driven over a hundred million miles. We're still an order of magnitude safer than humans. And we're simulating 3 million miles/day for continued testing.

          So, where am I wrong?

          • by Jzanu ( 668651 )
            They haven't driven enough. If you read the report you would understand that the problem is that consumers CAN'T "test-drive" on public roads enough to even demonstrate parity in vehicle safety. There is zero evidence because there is no actual support for Musk's claims. Read the report.
            • I did. You don't seem to know what Tesla has done. Probably because it's not covered in your RAND reports.

              They had driven that many miles as of six months ago. And simulated 10x more. And they have 100,000 cars on the road doing this every day.

              http://www.theverge.com/2016/5... [theverge.com]

              Also, " the problem is that consumers CAN'T "test-drive" on public roads enough to even demonstrate parity in vehicle safety" is a lot of horseshit.

              • by HuguesT ( 84078 )

                I was not aware that Tesla owners had given express or implied permission for Tesla to spy on their driving, to upload the data to Tesla's servers, and to use this data for Tesla's profit without compensation of any kind.

                I also thought that, until recently, Tesla was not developing their own autopilot; I thought they had subcontracted this to Mobileye [wccftech.com]. Now that Mobileye and Tesla have parted ways, to whom does this data belong?

            • To support its claim that consumers CAN'T test-drive enough to demonstrate vehicle safety, the report you linked says this:

              To demonstrate that fully autonomous vehicles have a fatality rate of 1.09 fatalities per 100 million miles (R=99.9999989%) with a C=95% confidence level, the vehicles would have to be driven 275 million failure-free miles. With a fleet of 100 autonomous vehicles being test-driven 24 hours a day, 365 days a year at an average speed of 25 miles per hour, this would take about 12.5 years.

              Except Tesla doesn't have a fleet of only 100 vehicles. It has around 1,000 times that, and those cars log roughly 50 million miles of autonomous driving every two months (the May report was 100 million miles driven on Autopilot, and the July report was 150 million miles driven on Autopilot; see: https://electrek.co/2016/07/11... [electrek.co]). Given those rates, and considering that they've already amassed 150 million miles...
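
              For what it's worth, the arithmetic in that quote reproduces if you treat it as a zero-failure Poisson/exponential confidence bound, required miles = -ln(1 - C) / rate. A quick sketch, taking the fleet-rate figure above as an assumption:

              import math

              # Reproduce the RAND-style numbers quoted above, assuming a zero-failure
              # exponential (Poisson) bound: required_miles = -ln(1 - C) / rate.
              target_rate = 1.09 / 100e6          # fatalities per mile (1.09 per 100M miles)
              confidence = 0.95

              required_miles = -math.log(1 - confidence) / target_rate
              print(f"Failure-free miles needed: {required_miles / 1e6:.0f} million")   # ~275 million

              # RAND's scenario: 100 cars, 24 hours a day, 365 days a year, at 25 mph.
              rand_fleet_miles_per_year = 100 * 24 * 365 * 25
              print(f"Years for that 100-car fleet: {required_miles / rand_fleet_miles_per_year:.1f}")   # ~12.5

              # At the ~50 million Autopilot miles per two months implied above (an
              # assumption based on the May/July figures), the same mileage takes:
              assumed_fleet_miles_per_year = 50e6 * 6
              print(f"Years at that fleet rate: {required_miles / assumed_fleet_miles_per_year:.2f}")   # under a year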

              • Most of that autonomous driving is mere duplication in the safest driving conditions you will find, such as on a divided highway. This is why it doesn't prove anything.
                • Mere duplication is exactly what the paper requires, to decrease the odds that outside factors play a role in the end results. And since Autopilot is only supposed to be used on divided highways, it's entirely on point that you would prove that the system is safe in only those conditions.

                  Does it prove that it's safe in all driving conditions? Absolutely not. But it's not meant to be used in all driving conditions, so that's beside the point.
          • If I tie any car to a pole with a long rope, lock down the steering wheel and put a brick
            on the accelerator, it would "autodrive" around and around the pole for thousands
            of miles without incident, as long as you had some way to continuously pump fuel into it.

            Does that make it a safe car? No, because that isn't a realistic driving scenario.
            Why should we believe that Tesla's tests are any more realistic?
            When the NHTSA certifies a car as safe to drive automatically, then I will believe it.
      • Autopilot is not full self-driving. It has one forward camera, radar, and ultrasound. They have updated the radar to make accidents such as the collision with the truck unlikely, even with the old Autopilot. The newest system has eight cameras, including three forward cameras. The radar is now capable of seeing the car in front of the car in front of you, and it will react if that car begins to slow. Human drivers cannot always do this. Human drivers cannot constantly monitor the surroundings of the car...

      • Re:Too bad (Score:5, Insightful)

        by DarkOx ( 621550 ) on Tuesday November 08, 2016 @02:35PM (#53239947) Journal

        Come on, you know that isn't the least bit true. Tesla publishes the number of miles Autopilot has safely driven. It's impressive, but those are largely the 'easy' miles.

        People, unlike Autopilot, don't get to hand off responsibility for controlling the vehicle to someone/something else when the conditions get hard. I wonder in what situations human drivers experience more accidents: conditions where you can use Autopilot today, or situations where you can't?

        • > I wonder in what situations human drivers experience more accidents,

          A little googling gives more questions than answers on that one. Most fatal accidents happen at night or at intersections. It seems most minor accidents happen close to home or in parking lots. It seems like the drowsy driver and the missed traffic signal would be covered today by a Tesla-type system (with broadly equivalent variants from Ford and GM). The automated system will likely have issues with detecting slick roads, construction, and pedestrian interactions.

      • They have to guarantee it is statistically safer than *any* human. Otherwise there will always be someone they introduce death and injury to. I don't care about some dream world where everyone has one and they're perfect. Killing people now is killing people now.
  • They can't take my Red Barchetta from me. I hide it in the barn on my uncle's farm. Good thing these autonomous cars can't cross single-lane bridges.
  • Unfortunately (Score:2, Insightful)

    by kackle ( 910159 )
    Self-driving cars will come for one dumb reason or another ("Ooo, shiny tech!" "Ooo, a tiny bit safer!"). And they will be a blight on our roads. They probably won't kill many people, but they will slow traffic everywhere except on wide-open highways: they will never be able to instantly read road signs written in $YOUR_LANG (partially obstructed by snow), they will forever be baffled (call slowCarDown()) by non-standard roadway situations and conditions (which occur frequently), and millions will accept...
    • by Nemyst ( 1383049 )
      Or none of those things will happen because technology is moving at a much more rapid pace than you seem to think. You sound like a horse carriage driver looking at cars and going "Pah! Those things will never work out!"
      • I have to say I agree with the poster. In religion we are expected to believe in a god because a book exists; the results anyone has seen from autonomous driving so far are no more encouraging.
    • Re:Unfortunately (Score:4, Insightful)

      by eheldreth ( 751767 ) on Tuesday November 08, 2016 @02:21PM (#53239837) Homepage
      This is a bit short-sighted. The number of problems caused by autonomous cars will be inversely proportional to the number on the road. There will be a critical mass beyond which insurance companies will begin charging extravagant fees for a manually operated vehicle.

      Autonomous vehicles will communicate with each other. They will know miles in advance when there is an accident, construction, or another hazard and be capable of responding accordingly (including re-routing if possible). Imagine a network of cars alerting the vehicles behind them about road conditions, say an icy spot; your car would then essentially have a map of areas in which to apply more caution. They will be capable of monitoring for wildlife with heat and infrared sensors. Gridlock will be virtually eliminated because cars will be able to tell each other what they are about to do before they do it. Issues with reading signs are a non-starter: once adoption begins to pick up, you will quickly see digital information systems added to existing road signs.

      All of this tech exists right now, and most of it is mature. It just hasn't been put together yet. In about 20 years people will be complaining about how manual drivers are always causing accidents and issues with traffic flow.
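
      As a rough illustration of the hazard-sharing idea, here is a minimal sketch; the message format, class names, and coordinates are made up for illustration, not any real vehicle-to-vehicle standard:

      from dataclasses import dataclass

      # Toy sketch of cars broadcasting road hazards to the cars behind them.
      @dataclass(frozen=True)
      class HazardReport:
          kind: str        # e.g. "ice", "accident", "construction"
          lat: float
          lon: float
          severity: float  # 0.0 (minor) .. 1.0 (impassable)

      class CautionMap:
          """Collects reports received from cars ahead and answers
          'how careful should I be near this spot?' queries."""

          def __init__(self):
              self._reports = []   # received HazardReport objects

          def receive(self, report):
              self._reports.append(report)

          def caution_level(self, lat, lon, radius_deg=0.01):
              nearby = [r.severity for r in self._reports
                        if abs(r.lat - lat) < radius_deg and abs(r.lon - lon) < radius_deg]
              return max(nearby, default=0.0)

      # A car ahead hits an icy patch and broadcasts it; cars behind it now know
      # to apply more caution as they approach the same spot.
      shared_map = CautionMap()
      shared_map.receive(HazardReport("ice", 42.3601, -71.0589, severity=0.7))
      print(shared_map.caution_level(42.3605, -71.0590))   # 0.7 -> slow down here
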
      • by Kyont ( 145761 )

        You are absolutely spot-on, on every point. This IS how it's all going to shake out; it's only a question of how much resistance people will put up along the way because driving is supposed to be fun. I for one look forward to a world of smoothly running, quiet roads, in which I summon my car (or maybe a clean auto-Uber) and then simply zone out/read books/catch up on my Twitter feed via the chip in my head/nap until I arrive at my destination. We already have the expensive part of the infrastructure here (n...

      • by kackle ( 910159 )

        The number of problems caused by autonomous cars will be inversely proportional to the number on the road.

        Respectfully, I don't think you pay close attention to all the little "hiccups" that occur during daily driving - no one does, because our brains handle them with ease. As an old firmware guy, I know digital computers won't be able to do this because there are too many variables, forever changing. When you drive from now on, imagine you're blind but have perfectly memorized the road and could drive it with no sight. In the future, look to see what alters your path during your commute (or what has changed...

    • Maybe you don't live in an area that has old people, but there are already vehicles on the road whose drivers can't do that.

      The self-braking cars are already better than most of them.

    • Can you honestly say that 100% of human drivers can read partially obstructed signs and respond perfectly to non-standard roadway conditions?

      Self-driving cars don't have to be perfect, they just have to be better than the average human driver. And the average human driver sucks.

        Self-driving cars don't have to be perfect, they just have to be better than the average human driver.

        Maybe for you, but you don't get to choose.
        The NHTSA and the insurance companies will decide this,
        and will only license a robocar when it is proven to be safer than any human driver.

  • I hate it when there are Problems Issues with things. Like the grammar in unedited Slashdot article summaries.
  • At an early public run of George Stephenson's locomotive, a Member of Parliament was killed, but that didn't stop the railway from becoming the future of transport.

    "William Huskisson PC (11 March 1770 – 15 September 1830) was a British statesman, financier, and Member of Parliament for several constituencies, including Liverpool.[1]

    He is best known as the world's first widely reported railway casualty as he was run over and fatally wounded by George Stephenson's pioneering locomotive engine Rocket."

    https [wikipedia.org]

  • I live in the Netherlands, and I have never heard about the death of the Tesla driver. It would have been major news, yet none of our tech sites or news sites have reported on it...
    • by pvk113 ( 4734087 )
      The part about the death is indeed correct. It was a Dutch IT entrepreneur who crashed into a tree on the Hilversumsestraatweg (N415). Tesla Netherlands was asked to investigate the logs and declared that the car was driving at a speed of around 160 km/h (on an 80 km/h road) and that Autopilot was not activated when the crash occurred. Here's one news item about the crash: https://www.rtlnieuws.nl/neder... [rtlnieuws.nl]
      • Thanks, I really hadn't read about it, which is strange. I found the part about the firemen not cutting the car open, for fear of being electrocuted, an interesting part of the story. It does mean it will be time for firefighters to learn how to handle a crashed electric car, as electric cars will become more and more popular.
  • The comments from Tesla in the CNBC article were not about the accident in The Netherlands (where autopilot was not engaged) but about the earlier accident in Florida that caused Josh Brown's unfortunate death. Why are these journalists so sloppy?
