Transportation Technology

Nvidia Suspends Self-Driving Car Tests in Wake of Uber Crash (theverge.com) 113

Nvidia said on Tuesday it will suspend its autonomous vehicle testing on public roads in the aftermath of Uber's fatal crash in Arizona. Uber is a customer of Nvidia's, using the chipmaker's computing platform in its fleet of self-driving cars. From a report: Nvidia had been testing its self-driving cars in New Jersey, California, Japan, and Germany. The company is hosting its annual GPU Technology Conference in San Jose this week, where it is expected to make several announcements regarding its automotive products. "Ultimately AVs will be far safer than human drivers, so this important work needs to continue," an Nvidia spokesperson said in an email. "We are temporarily suspending the testing of our self-driving cars on public roads to learn from the Uber incident. Our global fleet of manually driven data collection vehicles continue to operate."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Nvidia? (Score:3, Insightful)

    by Anonymous Coward on Tuesday March 27, 2018 @12:32PM (#56335077)

    Why is every tech company thinking they have the domain expertise to get into the car industry?

    Tesla is proving they have no idea how to scale manufacturing. This seems like the kind of thing where you partner with an actual car maker instead of just grafting it on later.

    Because at this rate we're going to end up with dozens of different self-driving cars, all of which have their own quirks and warts.

    What could possibly go wrong?

    • Re: (Score:2, Insightful)

      by Calydor ( 739835 )

      I suspect a company like Nvidia isn't in it to create actual autonomous vehicles. They want to get one or two really great ideas, patent them, and earn money from that particular piece of tech or software.

    • by Anonymous Coward

      Sheesh.... if Elon listened to a loser like you, Tesla would never amount to anything.

      Can't dream... no vision....

    • by Anonymous Coward

      Nvidia makes "the" processing unit for this application. It's only natural that they would want to understand the user's needs and maybe supply some low-level software.

    • Because at this rate we're going to end up with dozens of different self-driving cars, all of which have their own quirks and warts.

      I brought that up in another thread about this. Apparently that's a good thing, and a situation where every car runs the same software is a bad thing.

    • Re:Nvidia? (Score:4, Insightful)

      by darkain ( 749283 ) on Tuesday March 27, 2018 @01:56PM (#56335797) Homepage

      nVidia is one of the top companies in the world doing artificial intelligence research and development. GPUs are no longer GPUs. They're now GPGPUs, allowing for massive parallel processing of data on their hardware. nVidia has been at the forefront of non-graphics usage of GPUs for quite some time. Obviously here on Slashdot, we're all aware of GPUs being used for crypto mining, but they're also used for a wide array of other technologies.

      Or, if you want a TL;DR: who better to do image processing from a car's cameras than a company whose entire core business is built around image processing?
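(A minimal sketch of the kind of non-graphics GPU work described above, using CuPy purely as an illustrative stand-in for Nvidia's actual automotive stack. It assumes CuPy and a CUDA-capable GPU are installed, and uses random data in place of a real camera frame.)

```python
# Sketch of "GPGPU" image math on an Nvidia GPU via CuPy.
# Assumptions: CuPy is installed and a CUDA-capable GPU is present;
# the "camera frame" is random data standing in for a real image.
import numpy as np
import cupy as cp

# Fake 1080p RGB camera frame on the CPU.
frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

# Copy the frame to GPU memory and convert to float for arithmetic.
gpu = cp.asarray(frame, dtype=cp.float32)

# Per-pixel luminance, computed in parallel across ~2 million pixels.
gray = 0.299 * gpu[..., 0] + 0.587 * gpu[..., 1] + 0.114 * gpu[..., 2]

# Crude edge strength from neighbouring-pixel differences.
dx = gray[:, 1:] - gray[:, :-1]
dy = gray[1:, :] - gray[:-1, :]
edges = cp.sqrt(dx[:-1, :] ** 2 + dy[:, :-1] ** 2)

# Bring a summary statistic back to the CPU.
print("mean edge strength:", float(edges.mean()))
```

The same data-parallel pattern is what makes GPUs attractive for the neural-network inference a self-driving stack actually runs.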

    • by be951 ( 772934 )

      Why is every tech company thinking they have the domain expertise to get into the car industry?

      First of all, they're not getting into the car industry. They're trying to get into the self-driving systems industry, which is in large part a decision-making-systems industry.

      Because at this rate we're going to end up with dozens of different self-driving cars

      Seems like there is a word for that..... competition? Sounds like that might be a good thing.

    • Mobileye is not a car company. They are, however, the leader in computer vision for vehicles. Does that answer your question?
    • by Nemyst ( 1383049 )
      Nvidia has a huge amount of expertise in machine learning, makes some of the best machine learning hardware out there, and designs the hardware for many higher-end car entertainment/smart systems. Their research division has been branching out into machine learning for years now, putting them at the forefront of the wave; they're just less outspoken about it outside the industry. Self-driving cars are an obvious application of machine learning and thus prime territory for Nvidia to investigate.
    • Because the car industry is dragging its feet on this. I grew up in Detroit; it's old and slow and hates change. It's better to have dozens of independent self-driving systems instead of one monolithic one. I can't believe I have to explain this on Slashdot.
    • Wait... what? Are you seriously arguing that "competition is bad" and that we should only allow existing car companies to make cars?

      I'm... I don't even understand how you can say that. Are you also mad that IBM didn't stay dominant in the home PC market in the early 80s? Are you upset that Linus Torvalds made his own OS in the early 90s rather than going to work at Sun or Microsoft? What about that fucking Apple company, thinking they could create a phone that was better than Nokia's? I mean, obviously they

  • I think the weaknesses in the driving system are more than likely Uber's fault, but Nvidia is probably suspending testing just in case there's an issue with their hardware. It's the responsible thing to do when lives are on the line. After all, no one wants the negative publicity associated with accidentally killing someone in testing. Note how Google's cars haven't run anyone over and at worst have been involved in minor fender dings that still made news.

    • Nvidia management has finally woken up to the criminal and civil liability that comes with nonexistent internal engineering and product safety standards...
  • Why though? (Score:2, Informative)

    by wardrich86 ( 4092007 )
    1. The pedestrian was jaywalking.
    2. The driver was paying no attention to the road.
    3. The sensor wasn't able to respond in time.

    I think the nVidia chip is the last thing that should be faulted here. We have two clear cases of human stupidity to blame before the chip comes in. They should just re-brand it as "Computer Assisted Driving" instead of "Self-Driving".
    • are you a troll, or just so stupid, ignorant, and illiterate that you have not read or understood any of the comments of the past week+?
      • If the person could handle the simple task of legally crossing the street, this wouldn't have happened. As a pedestrian myself it absolutely blows my mind just how stupid people are... Even if I have the right of way, in the event of Car VS Human, I'll probably be the one in worse shape. People are impatient as fuck, and are willing to risk their entire life just to shave a few minutes off of their trip.

        I get that the sensor fucked up, but if the company had 100% trust in the hardware, they wouldn't ne
    • 1. The pedestrian was jaywalking.

      A slow-moving person, dragging a metal reflector, crossing 3-4 lanes of road at a 90° angle, directly under two street lamps, on an empty road with good visibility (the actual visibility, not the lol dashcam footage) - a PERFECT scenario for self-driving technology.

      2. The driver was paying no attention to the road.

      that's what the self-driving part is for

      3. The sensor wasn't able to respond in time.

      the car didn't respond AT ALL

      I think the nVidia chip is the last thing that should be faulted here.

      "NVIDIA Titan V Reportedly Producing Errors in Scientific Simulations" https://wccftech.com/nvidia-ti... [wccftech.com]

      • 1. The person was STILL breaking the law, and risking their life to shave a few minutes off of their trip.

        2. The vehicle may have been branded as "self-driving," but it was also branded as being tested, hence the person who was supposed to take control in the event of a hardware failure.

        Yeah, it's a tragedy that somebody died, but the hardware was still being tested - it was expected that there would still be some hiccups and failures along the way. The true cause of the fatality is 45% negligence
  • by Bobrick ( 5220289 ) on Tuesday March 27, 2018 @01:06PM (#56335335)
    At this point I'm just waiting on Nestlé, Pfizer and Coca-Cola to also announce they're pausing their autonomous vehicle development. Is Burger King also working on AV tech?
  • How many people have to die before we start to observe common sense and decide, once and for all, that putting self-driving cars on the same roads with non-self-driving cars is a bad idea? Every time we go here, someone observes that "well, we just have to make the roads more hospitable to self-driving cars; we need built in signaling, dedicated paths for them, reflective whatever on all other objects on the road..." Well yeah, we already have roads that are fully built out for a specific type of vehicle
    • How many people have to die before we start to observe common sense and decide, once and for all, that putting self-driving cars on the same roads with non-self-driving cars is a bad idea?

      50,000

      Every time we go here, someone observes that "well, we just have to make the roads more hospitable to self-driving cars; we need built in signaling, dedicated paths for them, reflective whatever on all other objects on the road..."

      Nonsense. You just need to stop walking out in front of moving vehicles.

    • How many people have to die before we start to observe common sense and decide, once and for all, that putting self-driving cars on the same roads with non-self-driving cars is a bad idea?

      26. The number of people killed by non-self-driving cars in 1899 (the first year the USA kept records on such things). IOW, about three orders of magnitude lower than were killed by non-self-driving cars last year.

      Note that if we'd used the same sort of "common sense" in 1899, we'd be using horse and buggy today....

      • by dgatwood ( 11270 )

        Or we'd have someone walking out ahead of the self-driving cars carrying a flag.

      • by amorsen ( 7485 )

        Note that if we'd used the same sort of "common sense" in 1899, we'd be using horse and buggy today....

        Also note that it is almost certain that more than 26 people were killed in the US by accidents involving horses in 1899.

      • 46. That was life expectancy back then. You're saying you want to go back to those days?
  • I find it kind of odd, seeing how the Uber car had both LiDAR and sonar sensors that should have detected the pedestrian despite the darkness, yet the car still didn't stop. The obvious suggestion as to why it didn't stop was that the control system simply couldn't react fast enough to these sensor inputs, but I have a feeling it may be something else.

    The thing I don't like about Tesla's Autopilot system is that at even slightly higher speeds it relies completely on the visible light camera
    • by amorsen ( 7485 )

      It wasn't the back of the trailer. If only it had been the back, the radar would have mitigated the accident, quite likely saving the driver. US lorries do not have side impact protection, which means you can drive right under them if your car is low enough. Alas, the Tesla was only almost low enough.

      In Denmark, the number of cyclists killed by lorries went down quite a lot when side impact protection was added to the lorries, in many cases stopping the bike from going beneath the lorry wheels. It seems lik

    • > The obvious suggestion as to why it didn't stop was that the control system simply couldn't react fast enough to these sensor inputs, but I have a feeling it may be something else.

      There is no way (in my opinion) that this system got off the test track unable to detect this type of object entering the roadway and stop in time. I would bet money on a failure in regression testing somewhere. Either they updated the software and there was an undetected incomp

  • Humans drove 3.22 trillion miles in 2017 in the US and there are 32,000 deaths a year. This means all self driving cars need to drive 100,625,000 miles without a death to be as safe as a human. I wonder how close the Uber car got?
    • Not to mention, the human fatality rate is on all roads, all vehicles, all weather, all circumstances.

      Also (I wasn't done), there are around 16,000 accidents per day, or 5,840,000 per year, meaning a self-driving company needs to achieve a rate of 551,370 miles per incident, in all conditions, in order to be as safe as a human.
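(For anyone who wants to check the arithmetic in this sub-thread, here is a quick back-of-the-envelope script. The inputs are the commenters' own figures, taken at face value rather than independently verified.)

```python
# Sanity check of the figures quoted above; inputs are the commenters'
# numbers, taken at face value rather than independently verified.
vehicle_miles_2017 = 3.22e12   # total US miles driven in 2017 (per the comment)
deaths_per_year = 32_000       # traffic fatalities per year (per the comment)
accidents_per_day = 16_000     # reported accidents per day (per the comment)

miles_per_fatality = vehicle_miles_2017 / deaths_per_year
accidents_per_year = accidents_per_day * 365
miles_per_accident = vehicle_miles_2017 / accidents_per_year

print(f"miles per fatality: {miles_per_fatality:,.0f}")   # 100,625,000
print(f"accidents per year: {accidents_per_year:,}")      # 5,840,000
print(f"miles per accident: {miles_per_accident:,.0f}")   # 551,370
```

On those numbers, an autonomous fleet would need to average better than roughly one fatality per 100 million miles and one incident per ~550,000 miles to match the human baseline the commenters describe.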
  • Private companies should get out of technology that enhances our lives. It should be reserved for car companies that turn wrenches, and for government agencies. What can go wrong?!?! 🤦‍♂️
  • What the fuck kind of nonsense is that?

    Either it needs to continue, or it needs to be suspended.

    Or are they saying that the work exists in some kind of superposition of both states simultaneously?

  • I'm just going to leave this here and say that a really likely outcome of this investigation is that Uber is a shit company. Just my two cents, though.

    • The hell? Link is here. [reuters.com] Apparently Slashdot commenting is an art I'm incapable of. That aside, Uber, more than likely, is a shit company.
