

Xiaomi EV Involved in First Fatal Autopilot Crash (yahoo.com)
An anonymous reader quotes a report from Reuters: China's Xiaomi said on Tuesday that it was actively cooperating with police after a fatal accident involving an SU7 electric vehicle on March 29, and that it had handed over driving and system data. The incident marks the first major accident involving the SU7 sedan, which Xiaomi launched in March last year and which since December has outsold Tesla's Model 3 on a monthly basis. Xiaomi's shares, which had risen 34.8% year to date, closed down 5.5% on Wednesday, underperforming a 0.2% gain in the Hang Seng Tech index. Xiaomi did not disclose the number of casualties but said initial information showed the car was in the Navigate on Autopilot intelligent-assisted driving mode before the accident and was moving at 116 kph (72 mph).
A driver inside the car took over and tried to slow it down, but it then collided with a cement pole at a speed of 97 kph, Xiaomi said. The accident, in Tongling in the eastern Chinese province of Anhui, killed the driver and two passengers, Chinese financial publication Caixin reported on Tuesday, citing friends of the victims. In a rundown of the data submitted to local police, posted on the company's Weibo account, Xiaomi said NOA issued a risk warning of obstacles ahead, and the subsequent takeover by the driver happened only seconds before the collision. Local media reported that the car caught fire after the collision; Xiaomi did not mention the fire in its statement. The report notes that the car was a "so-called standard version of the SU7, which has the less-advanced smart driving technology without LiDAR."
Refreshing (Score:4, Interesting)
It's refreshing for a company to be this open and honest about what happened. When Tesla autopilot kills someone they will only ever say that it disengaged, probably milliseconds before the accident, and not give any further details.
Re: (Score:2, Insightful)
It's refreshing for a company to be this open and honest about what happened. When Tesla autopilot kills someone they will only ever say that it disengaged, probably milliseconds before the accident, and not give any further details.
The report notes that the car was a "so-called standard version of the SU7, which has the less-advanced smart driving technology without LiDAR."
Sorry to burst your biased bubble here, but that isn't exactly a "refreshing" corporate response. It is a disgusting excuse, suggesting customers might have been saved if only they hadn't been so "cheap" and had upgraded beyond the "standard" version.
Re: (Score:2)
The summary seems to have misled you. Xiaomi did not blame the customer; that was the journalist noting that the model of car they had was one without LiDAR.
Probably because Tesla is a vision-only system and has similar issues.
Re: (Score:2)
I share the assessment contained in your comment, but isn't this in fact a reasonable prediction? They're definitely not going to want to admit fault, as they want to sell these systems. The corporation also has a history of blaming users [gizchina.com].
Re: (Score:2)
According to my sources, the driver was talking with his mother over the phone when the accident occurred, and his mother told my sources that the last sentence he said was: "Look Ma, no hands!"
Re: (Score:2)
The accident occurred in China, which has a very different legal system.
Re: (Score:2)
A lot of the "cheaper" versions are pretty much just "advanced" cruise control and so should not be used for automated driving. (Of course it's not marketed that way, but it's an important distinction, because it's the computer version of "I can't find my glasses" while driving down the highway.) In fact, I wish cars would just say that (like my mother used to), since that tells YOU to be alert and watch the road.
Re: (Score:2)
Actually it does make sense. A lot of the "cheaper" versions are pretty much just "advanced" cruise control and so should not be used for automated driving. (Of course it's not marketed that way, but it's an important distinction, because it's the computer version of "I can't find my glasses" while driving down the highway.) In fact, I wish cars would just say that (like my mother used to), since that tells YOU to be alert and watch the road.
We will ultimately find (through litigation) that humans are far too stupid to understand ANY marketing behind "auto" drive/pilot/cruise-anything, and companies will be forced to stop offering assisted solutions of ANY kind until autonomous solutions are good enough not to require a licensed driver behind the wheel at all.
Naturally, when that happens, human override won't even be an available option anymore. Humans get what they deserve.
Re: (Score:3)
My Honda's optional lane-keeping feature is pretty good. So are the on-by-default road- and lane-departure warnings, along with automatic emergency braking. Adaptive cruise control is a godsend. Parking proximity warnings and backup cross-traffic alerts are very useful.
Are you saying these features should all be deleted?
Re: (Score:2)
My Honda's optional lane-keeping feature is pretty good. So are the on-by-default road- and lane-departure warnings, along with automatic emergency braking. Adaptive cruise control is a godsend. Parking proximity warnings and backup cross-traffic alerts are very useful.
Are you saying these features should all be deleted?
No, I’m more saying that an ignorant licensed driver who refuses to read the instruction manual should not have the ability to sue anyone and everyone because they were too stupid to read the manual and decided on their own that “autopilot” means “take a nap”.
It’s not just the ignorant, stupid drivers. It’s their ability to litigate their ignorance and stupidity today, and somehow win. Make it “idiot-proof”, they say? You couldn’t achieve that goal today even
Re: (Score:2)
What does it mean?
I may be biased, having worn glasses since before puberty: I got used to putting them on in the same set of actions as turning off the alarm clock and disentangling the other arm from under the wife, and almost never see the world except through a set of glasses.
(To forestall some comments: yes, I have tried contact lenses - see "almost never", not "never", above. Yes,
Re: (Score:3)
I've seen further details out of Tesla a number of times, in more detail than this even. But it does take time for them to do the analysis, and by then the news stations normally don't pick it up because it's no longer fresh news.
Re: (Score:2)
It's refreshing for a company to be this open and honest about what happened.
We are talking about a Chinese state-owned company here, right? And you really believe they are being open and honest? Exactly what would a communist-owned company gain by being open and honest?
How do you know there haven't been thousands of other people killed due to this autopilot failing in China so far?
Re: (Score:2)
We are talking about a Chinese State-owned company here, right?
No. Xiaomi is not a state-owned enterprise.
Re: (Score:2)
Re: (Score:3)
When Tesla autopilot kills someone they will only ever say that it disengaged, probably milliseconds before the accident, and not give any further details.
Come on, finish the story. When Tesla autopilot kills someone, they say it disengaged, refuse to cooperate with the investigation, do the legal bare minimum when it looks like they will face penalties, donate millions to help win an election, take over the government with the blessing of a bought president, and attempt to shut down the department that was investigating them.
Xiaomi are n00bs at this!
Re: (Score:2)
"and then when the country collapses and a furious mob surrounds them, escape to Mars."
Humans can't take over (Score:5, Interesting)
The whole point of having a car "drive itself" is that you aren't doing it.
Expecting someone to go from not doing it at all to doing it at full highway speed immediately is bananas.
The correct action is usually to lean back and nail the brakes, and let ABS and crumple zones do what they will. In "a few seconds" the vehicle could have done that for them, and the outcome would have been better. The vehicle could have shed far more speed in three seconds, had it chosen to brake instead of shutting off and leaving the human in charge.
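A back-of-the-envelope check supports this. The ~8 m/s² full-braking deceleration below is an assumed typical dry-pavement figure, not anything from Xiaomi's report:

```python
# Rough check: how much speed constant hard braking sheds in a few seconds.
# The 8 m/s^2 deceleration is an assumed dry-road figure, not from the report.

KPH_PER_MPS = 3.6

def speed_after_braking(v0_kph: float, decel_mps2: float, seconds: float) -> float:
    """Speed (kph) remaining after braking at a constant deceleration."""
    v0 = v0_kph / KPH_PER_MPS
    v = max(0.0, v0 - decel_mps2 * seconds)
    return v * KPH_PER_MPS

# The SU7 was at 116 kph when NOA warned; impact was at 97 kph.
for t in (1, 2, 3):
    print(f"after {t}s of hard braking: {speed_after_braking(116, 8.0, t):.0f} kph")
```

Even one second of hard braking would have cut the speed below the reported 97 kph impact speed; three seconds brings it down to roughly 30 kph.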
Re:Humans can't take over (Score:4, Interesting)
I have to agree with you there. So long as they're expecting the human to take control "if needed", these systems will be impractical. I can accept that they might not be 100% accurate all the time and that accidents, even fatal ones, are inevitable. The incidence just needs to be lower than that of human drivers to be acceptable. But if they can't drive autonomously without human input, then to me it's pointless.
I'll buy one when they get to a point where they don't even have manual controls anymore.
Re: (Score:2)
That is, they put it that way so they don't have to pay out anything.
Re: Humans can't take over (Score:2, Funny)
If it can't drop the kids off at piano lessons and pick up the groceries from the hypermarket on its own, what's the point?
Re: (Score:2)
Picking up groceries leads to an interesting thought.
What if, instead of you sending your car to do it, the hypermarket had a van, perhaps something like a Schwan's truck with refrigerated and freezer sections, that delivered to your whole neighborhood that day?
Heck, consider a self-driving pizza delivery vehicle, with pizza ovens inside.
Re: Humans can't take over (Score:2)
The future just ain't what it's supposed to be.
Personally, I'd rather stay old school and ride my bike to the market and use it as a shopping cart / bags than
Re: (Score:2)
You said :
If they can't drive autonomously without human input, then to me it's pointless.
I'll buy one when they get to a point where they don't even have manual controls anymore.
Down below, RockDoctor said :
There's not enough information in the report to determine how long the driver spent thinking about what to do, including the decision to "take over" and to "floor the brakes".
And there are other similar comments.
They made me realize that today, most people driving cars actually know how to drive (good vs bad drivers is not the point at the moment). So, if they need to take over, they can.
You can argue whether the vehicle engineering was good or not, if the drivers have appropriate warning and response times, and similar points that many here have already pointed out. But, if needs be, a driver who knows what to do can take over.
But, at so
Re: (Score:2)
I have to agree with you there. So long as they're expecting the human to take control "if needed", these systems will be impractical. I can accept that they might not be 100% accurate all the time and that accidents, even fatal ones, are inevitable. The incidence just needs to be lower than that of human drivers to be acceptable. But if they can't drive autonomously without human input, then to me it's pointless.
Which is why I don't think I'll see one in my lifetime (which is potentially another 40-60 years, maybe more if medical science gets significantly better). Any autonomous system that requires a human to be able to take control within a minute's notice, let alone seconds, is not an autonomous system. It's a human-monitored and -controlled system.
Re: (Score:3)
Re:Humans can't take over (Score:4, Insightful)
Like aeronautical autopilot systems, these are driver aids. They are great in situations like a long cruise down a highway, where they take care of making tiny adjustments to keep the car in the centre of the lane and following the curves, and in stop-start traffic. The same sort of thing pilots use them for - hands off the flight controls, let the autopilot maintain altitude and heading, or follow the glide slope in to land.
The idea with the aircraft systems is that not only is the autopilot very good at those things, it reduces the workload on the pilots so that they can concentrate on monitoring other things, or be more rested and alert when they need to take over.
You make a fair point about the automatic braking though. It depends why it disconnected, it could have been because the driver took over, rather than because it detected a situation that it couldn't cope with.
Re: (Score:2)
That is a bad comparison. An aeronautical autopilot system is far more than a driver aid; the plane is actually perfectly capable of doing a myriad of things by itself, cruising for virtually the entire flight unattended. Additionally, an airliner's control scheme makes small movements to a massive vehicle, which does not result in quick reactive movements. It doesn't need to deal with traffic, changing lanes, pedestrians, etc.
An autopilot is an autopilot. A driver aid is a driver aid. A pilot may be able to
Re: (Score:2)
>"Like aeronautical autopilot systems, these are driver aids"
Agreed. But I hate comparisons to airplane autopilot systems. They are radically different. In most cases with planes, they are used at cruising altitude. And in an airplane, minor changes in any direction or speed are 99.9999% irrelevant (and even just keeping things exactly the same is 99.9999% irrelevant), because they are in a 3-dimensional area of almost always nothing. In a car, a small change in direction can result in a crash almost immedia
Re: Humans can't take over (Score:2)
The problem isn't that it wouldn't have been right to do as you say this time, it's that it's not a good response every time.
Sometimes there really is nothing wrong; it's just the perception of the robot/car that's off. And the car "knows" that something's off, it just can't judge what exactly. Suddenly braking might turn a perfectly safe situation into a dangerous one (e.g. when driving in dense traffic).
Damned if it stops, damned if it doesn't... well, that's why it's called "not fully self-driving capable"
Re: (Score:2)
Suddenly braking might turn a perfectly safe situation into a dangerous one (e.g. if driving in dense traffic).
One answer to this is that the AEB system should be a separate system from the self-driving system, and it should win. And if someone stacks you from behind because they were following too closely, that's why you have rear crumple zones.
We have to get to the point where we no longer expect to drive unsafely because our transportation systems are bullshit and we otherwise won't all fit on the road, because that's what's causing the accidents which aren't caused by serious drug use or medical emergency. (The
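The "separate AEB that wins" idea can be sketched as a tiny arbitration rule. This is a hypothetical design sketch (the `arbitrate` function and its commands are made up for illustration), not how any shipping car implements it: each subsystem proposes a longitudinal acceleration, and whenever AEB is active, the harder-braking command takes priority.

```python
from typing import Optional

# Hypothetical arbitration sketch: an independent AEB overrides the driving policy.
# Commands are longitudinal accelerations in m/s^2 (negative = braking).

def arbitrate(policy_cmd: float, aeb_cmd: Optional[float]) -> float:
    """Return the acceleration to actuate; AEB, when active, always wins."""
    if aeb_cmd is None:                # AEB inactive: driving policy is in charge
        return policy_cmd
    return min(policy_cmd, aeb_cmd)   # take the harder-braking command

print(arbitrate(0.5, None))    # cruising, no AEB demand -> 0.5
print(arbitrate(0.5, -9.0))    # AEB demands full braking -> -9.0
```

Keeping the two on separate compute paths means a confused driving policy can't suppress emergency braking, at the cost of the occasional unnecessary hard stop.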
Re: (Score:2)
The whole point of having a car "drive itself" is that you aren't doing it.
Expecting someone to go from not doing it at all to doing it at full highway speed immediately is bananas.
ADAS and Level 2 require constant eyes on the road, ready to take over immediately. Level 3 does not have an immediate-takeover requirement. The only commercially offered Level 3 cars work only at low speeds on limited-access highways and allow several seconds for human takeover. That's also why all Tesla cars are Level 2.
Re: (Score:2)
That's the law in this country. No "if"s, no "but"s, no uncertainty. The driver must be prepared to take over from any driver-assistance device at any time. (The government's driver-instruction documentation emphasises the "must" and distinguishes it from the less commanding "should", also emphasised, to make it brutally clear that must means must, not should.)
I gather there are cars marketed h
Re: (Score:2)
How often do you have to re-sit the driving tests and exams in your country?
Practically never. Also, the exams are pathetic, and so are the driving tests. Only commercial drivers have anything like a reasonable requirement. In fact there are parts of it which are excessive, like having to pass a drug test including cannabis. By all means, test drivers while they are driving, but not permitting them weed practically guarantees they'll be using something that leaves the system quicker and is also more dangerous.
Anyway here in the USA you can drive most vehicles you'll be interested i
Su-7 (Score:2)
https://en.wikipedia.org/wiki/... [wikipedia.org]
Autopilot, yes, but probably dangerous to operate?
What's most important here: (Score:3)
What's most important here is whether they actually learn why this accident happened in the first place and how to prevent it from happening again. If nothing was learned, then the people who died have died in vain. If the data from the collision can be used to improve the autopilot so that it will avoid similar situations, then this may have been the unfortunate cost of progress. This may seem cold, but the unfortunate reality is that people die in car accidents every day. The problem we've faced is that all the experience gained by the driver, or any experience that could have been gained from being involved in an accident, is lost when the driver dies. The obvious difference is that with autopilot systems, there is the potential to learn from every collision, fatal or otherwise. It's really up to humans to decide if it's worth improving the autopilot or not.
TL;DR: It's tragic when innocent people die but what matters most is how we move forward: sacrificing more innocent lives for profit or saving more people from a similar fate in the future.
At least they call it driver assist (Score:2)
At least they are calling it a form of driver assist, which is what it is, just like cruise control or lane-keeping: it still requires the driver to be actively in charge of the car's operation, at least in a supervisory capacity. Musk's refusal to stop calling Tesla's driver-assist system "Full Self-Driving" has always been disingenuous. China has banned Tesla's "Full Self-Driving" feature because of this.
As far as the car itself goes, Ford's CEO has been driving an SU7 for six months as his daily driver in the
Re: (Score:1)
Wait a minute: Tesla has two options, FSD and Autopilot, which are completely different. Tesla's Autopilot is what's comparable to the system mentioned in the article.
So before bitching about Tesla again, make sure you are talking about the same option.
FSD is not banned; it just isn't certified yet, and there is no country in the world where FSD is certified.
Driverless Vehicles-Solving Paying Drivers Problem (Score:2)
Scared? Humans are far worse. (Score:2)
This is scary, but do remember that 260,000 people die in Chinese traffic accidents every year (reference: https://www.scmp.com/news/chin... [scmp.com]; the US number is 40,000, btw), almost always due to an error a human made. The vast majority of those dying are not the at-fault driver but some other human: a passenger, another driver, or a pedestrian. And let's not get into debilitating injuries. We really need computers to take over driving ASAP.
Re: (Score:2)
"We really need computers to take over driving ASAP."
No we don't. Life has risks; get over it. You might as well say we should all be in AI-driven wheelchairs instead of walking, just in case we do something stupid like step out in front of a vehicle.
Generally it tends to be people who don't or can't drive who shill the most for fully automated vehicles, which, once they do become common, will be a signal for insurance companies to jack up the price of human-driven vehicles to levels people can't afford or for
Re: Scared? Humans are far worse. (Score:2)
Go hide under your duvet then, snowflake. I don't want to be treated like some overgrown toddler incapable of being trusted to operate machines.
Re: (Score:2)
Umm, ok, you want to drive; that's fine. People liked riding horses too, and they didn't protest against cars (well, sort of: https://www.saturdayeveningpos... [saturdayeveningpost.com]). And by the way, nobody banned horse riding. I guess you don't have to commute daily into a city in stop-and-go traffic. I guess you have nothing better to do with your productive time, so you don't mind sitting in a car driving it instead of doing work. But don't go around calling people snowflake just because you're scared of automated cars.
Re: (Score:2)
"you don't mind sitting in a car driving it instead of doing work"
Ever heard of metro systems?
"just because you're scared of automated cars."
Visit Europe sometime and take a guess as to how well an automated car is going to cope with heavy traffic on narrow roads. It's all very easy on wide, straight US roads with 90-degree junctions.
LIDAR should be required by law (Score:1)
LIDAR should be required by law for "autonomously" driving cars.
The Tesla crashing into the Wile E. Coyote-style painted tunnel is even more proof of this.
First? (Score:1)
How is this the "first?" I can think of 3 prior autonomous driving fatal crashes off the top of my head:
- An Uber test car hit a woman pushing a bicycle across a road at night [wikipedia.org]
- Tesla crashed into the side of a trailer while the driver was watching a Harry Potter movie [theguardian.com]
- Tesla crashed into a highway barrier it had previously shown strange behavior around [theguardian.com]
Maybe the Tesla ones don't count because Teslas tend to conveniently disable self-driving functions a fraction of a second before impact, but that doesn't expla
Re: (Score:2)
How is this the "first?" I can think of 3 prior autonomous driving fatal crashes off the top of my head:
"The incident marks the first major accident involving the SU7 sedan, which Xiaomi launched in March last year"
Re: (Score:2)
I hope per-model stats aren't considered newsworthy or this will get tiresome quickly...
China (Score:2)