Musk Says Tesla Is 'Very Close' To Developing Fully Autonomous Vehicles (bloomberg.com)
Tesla's Elon Musk said the carmaker is on the verge of developing technology to render its vehicles fully capable of driving themselves, repeating a promise he has made for years but has yet to fulfill. From a report: The chief executive officer has long offered exuberant takes on the capabilities of Tesla cars, even going so far as to start charging customers thousands of dollars for a "Full Self Driving" feature in 2016. Years later, Tesla still requires users of its Autopilot system to be fully attentive and ready to take over the task of driving at any time. Tesla's mixed messages have drawn controversy and regulatory scrutiny. In 2018, the company blamed a driver who died after crashing a Model X while using Autopilot for not paying attention to the road. Documents made public last year showed the National Highway Traffic Safety Administration had issued multiple subpoenas for information about crashes involving Tesla vehicles, suggesting the agency may have been preparing a formal investigation of Autopilot.
and will he set up an auto payout for any deaths? (Score:2, Insightful)
and will he set up an auto payout for any deaths?
Re:and will he set up an auto payout for any deaths (Score:5, Insightful)
and will he set up an auto payout for any deaths?
It's called "insurance".
Re: and will he set up an auto payout for any death (Score:2)
Tesla carries insurance? Why should my insurance company pay out when Tesla's level 5 autonomous vehicle does something stupid? I'm just a passenger. No controls. Not my fault.
Re: and will he set up an auto payout for any deat (Score:2)
Because if you die but 3 other people didn't, then the insurance company made a profit.
Re: (Score:2)
Re: (Score:2)
That's actually a good question. Who will carry the liability for a self-driving car? There are a lot of unanswered regulatory questions concerning self-driving cars, actually. Such as: will you still need a driver's license to operate one? Will self-driving cars still need manual controls? What happens when the car has determined that service needs to be performed? Will it drive itself off to a service center without the owner's consent? Will it refuse to go anywhere except to an approved
Re: (Score:3)
Misconception... It's not just drivers. In a collision, the OWNER of the at-fault vehicle is liable... in this case the OWNER made the decision to purchase the vehicle, put it on the road, and use the features, and they bear responsibility for the consequences of that decision. Tesla is responsible only for the product's performance as represented to buyers (If you ignore the safety manuals and use a manufacturer's product in a manner that goes against their instructions, then you
Re: (Score:2)
But as a property owner I can make a judgement call about someone borrowing my car: are they responsible, are they attentive, are they a trained and licensed driver? It's a lot harder to make those calls about an AI you had no hand in creating or training.
While there is certainly a legal framework for any accidents caused by an AI to be the responsibility of the owner (and certainly you would still need insurance if there is ever a time the AI is not controlling the car) I think if a company is not will
Re: and will he set up an auto payout for any death (Score:4, Insightful)
Tesla carries insurance? Why should my insurance company pay out when Tesla's level 5 autonomous vehicle does something stupid? I'm just a passenger. No controls. Not my fault.
Sure. I don't know why people think this is a difficult area. If Ford sell a car with faulty brakes, they're liable and they're insured. What magic ingredient makes people think Tesla would/should be different?
Re: (Score:3)
You're full of shit [wikipedia.org]. The Falcon 9 has the best reliability of any heavy lift rocket system, at 97.8% full success over its lifetime.
Rockets from the Falcon 9 family have been launched 91 times over 10 years, resulting in 89 full mission successes (97.8%)
The Soyuz-U [wikipedia.org], the comparable heavy lift system to the Falcon 9, has only a 97.3% full success rate over its lifetime.
Over its operational lifetime, the Soyuz-U variant flew a total of 786 missions, another world record. Soyuz-U has also been one of the most reliable launchers, with a success rate of 97.3%.
Re: (Score:2)
Re: (Score:2)
Re:and will he setup an auto payout for any deaths (Score:4, Insightful)
Is it more or less Russian Roulette than driving the car yourself, though?
Re: (Score:3)
Re: (Score:2)
The difference is that when you're the one driving, your finger is the one on the trigger. When the car is driving itself, some (possibly buggy) code's finger is on the trigger.
When you're driving, some (possibly buggy) code's finger is on the trigger as well; granted it's wetware instead of software, but it's demonstrably much less than 100% reliable.
Re: (Score:2)
There's hardware, wetware, and software involved in the operation of every modern car. They have ABS, throttle by wire, ESP, blah blah blah. Any of that stuff could kill you if it went wrong enough.
Re: (Score:2)
Re: (Score:3)
Indeed.
I'm holding onto my gas burning, manual shifting, non-connected to the internet cars for as long as I possibly can.
I don't want them connected to a company. Once I buy a car, they don't need to know fuck all about where it is, where I drive it, or how I drive it, and neither does the government.
They do not need to update anything on it, unless "I" give full consent EACH time.
And I guess....I"m one of a dying breed that has most always bought fun cars, 2
Re: (Score:2)
and it's Death Race 2000 for people on street / sidewalk
Re: (Score:2)
He will do it just like the police do it when they kill civilians...
Call it collateral damage, and God praise America!
It's the American way of life (and death).
Re:and will he setup an auto payout for any deaths (Score:5, Insightful)
Human drivers, mostly through negligence and distraction, murdered 500,000 people in the world (36,000 in the US) just last year. You aren't interested in saving a large percent of those lives? Solutions like "better driver license testing" are nonsensical; you can't test for one-time negligence.
Re: (Score:2)
However, as of right now self-driving is a theoretical improvement over humans. Teslas are currently the safest cars on the road, with a combination of active safety measures and a passive, bottom-heavy design. The question is whether it will get to the point where you can safely remove the steering wheel, or whether there should always be a driver ready to take control. Even requiring a driver to be at the ready, I think it would be much safer than the driver driving all the time.
I know when
Re: and will he setup an auto payout for any death (Score:3, Insightful)
Anything below level 5 requires a driver present, awake, aware, and ready to take over with essentially no warning.
You want level 5. You can't rest your eyes at 4 or less. Not safely.
I know how you feel about the long boring drives but we're not there yet.
Re: and will he setup an auto payout for any death (Score:5, Informative)
Anything below level 5 requires a driver present, awake, aware, and ready to take over with essentially no warning.
You mean anything below level 3.
Level 3 requires that a driver be present and available to take over if the car requests it, but the driver does not have to be paying attention; for example, they can be watching a movie or reading a book. This means the system has to be good enough that it can give the driver warning, and even enough time to figure out how to handle the situation, so the driver doesn't have to maintain readiness.
Level 4 requires that a driver be in the vehicle, but not necessarily awake or even in the driver's seat. The driver still has to be around because level 4 is fully automated only in certain areas or circumstances. For example a "freeways only" level 4 mode would make sense, where the car can handle everything from entrance ramp to exit ramp, without any human oversight or even attention.
Level 5, of course, is so completely automated that the controls could simply be removed from the vehicle.
https://en.wikipedia.org/wiki/Self-driving_car#Classification
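For anyone who wants the distinctions above in one place, here's a minimal sketch summarizing the levels as described in this thread (my own illustrative paraphrase in Python, not official SAE wording):

```python
# Illustrative summary of the SAE automation levels discussed above.
# Descriptions are paraphrased for this sketch, not quoted from the SAE standard.
SAE_LEVELS = {
    0: "No automation: the human does everything.",
    1: "Driver assistance: steering OR speed is assisted (e.g., adaptive cruise).",
    2: "Partial automation: steering AND speed assisted; driver must stay attentive.",
    3: "Conditional automation: driver need not pay attention, but must take over "
       "when the car requests it, with warning time.",
    4: "High automation: no human attention needed within a defined domain "
       "(e.g., mapped freeways); the car can reach a safe state on its own.",
    5: "Full automation: works anywhere a human could drive; controls could be removed.",
}

if __name__ == "__main__":
    for level, description in sorted(SAE_LEVELS.items()):
        print(f"Level {level}: {description}")
```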
Re: (Score:3)
To emphasize: Volvo, whose safety promises I trust, is working on Level 4. Their goal is a car that can pull over safely if the computer enters conditions it can't handle (e.g. snow) and the driver doesn't take over. To me that's optimal: if you're actually paying attention, there's no disruption in driving when the auto-drive gets out of its depth, but if you're not then it's still safe.
pull over safely in death valley run till power di (Score:2)
Pull over safely in Death Valley, run until the power dies, and put someone in 110-degree heat to die?
Re: (Score:2)
As the driver of a level-4 self-driving vehicle, operating the vehicle is still your responsibility. If you sleep too deeply for the "driver intervention required" alert to wake you up, then you shouldn't sleep while your car is driving through Death Valley, or any other area where sleeping on the side of the road is potentially lethal.
Re: (Score:3)
And what happens if you have a regular gas-burner with no automation at all, and have a problem in death valley? The exact same thing. At least with the electric car, you've got a good shot at the AC still working even if something has happened to disable drivability - with a traditional ICE-powered vehicle you're probably fucked on that score too because the engine needs to run to turn the AC compressor pulley.
The good news is that mobile phones still exist in a world where autonomous vehicles of any lev
Re: and will he setup an auto payout for any deat (Score:2)
Mmmm, yes. However, would you really put your life in the "hands" of a level 4 system and take a nap? By definition there are situations it will fuck up and can't handle and the -hope- is it will get you into a safe spot and idle until you're there. However, the fact that it's already in a situation it doesn't understand makes it just as likely it won't be able to find a safe spot, too. Then what happens while you're asleep? Level 4 is the most dangerous of all 5 levels due to the sense of false safety
Re: (Score:2)
Who said anything about pulling over? Obviously you'd hope that it would try to pull over if it's sure it's safe to do so, but stopping in the middle of the road is still a huge improvement over continuing when confused or out of its league, in almost all circumstances.
Narrow winding roads with poor visibility are one of the very few exceptions, but speed limits generally factor in visibility - as a driver you should always be prepared to stop for a child sitting in the middle of the road. (admittedly driv
Re: (Score:3)
If someone is killed by a vehicle through negligence and distraction that is not murder. It's called manslaughter. That is not to say people can't be murdered by vehicle, only that your usage isn't correct.
You aren't interested in saving a large percent of those lives?
We have 7.5 billion people on the planet. Do we need to protect everyone from possible mishap?
you can't test for one-time neglig
Re: (Score:2)
"We have 7.5 billion people on the planet. Do we need to protect everyone from possible mishap?"
I didn't say we should protect them from "every possible mishap"... trying to expand the scope of the argument? If we can prevent some deaths, we should. Is that too much inconvenience for you?
"Like that one time an autonomous vehicle's software decides not to work, or the sensors don't work, or something in how both interact with each other doesn't work?"
It's about which one happens more often. If software errors kil
Re: (Score:2)
you can't test for one-time negligence.
Let's treat it like drugs, then. You have to list all possible side-effects, and the chance of a fatal side effect happening.
Believe it when you see it (Score:4, Insightful)
Re:Believe it when you see it (Score:5, Insightful)
...laughably poorly
Okay, sure, it's far from perfect and definitely funny sometimes, but don't think of it as not working as well as a human driver. Instead ask how it works at all. I mean, the car freaking drives itself! And sure, Musk definitely over-sells and under-delivers tech as much as he does with timelines, but his desire and determination to succeed is advancing this field as much as Google's efforts are. I believe his vision and efforts are laudable, even if sometimes clumsy and self-serving. These are the baby steps that will take us to fully autonomous vehicles.
Re: Believe it when you see it (Score:5, Funny)
My roomba also drives itself around my living room and kitchen.
Sometimes it doesn't even get stuck.
Re: (Score:2)
his desire and determination to succeed is advancing this field as much as Google's efforts are.
Are they? Last time I interviewed with an AI Vision company, the employees there had 0% trust in self-driving cars.
Re: (Score:2)
Are you familiar with the 80/20 rule? We're approaching self-driving being 80% complete after 5 years. So the 80/20 rule suggests self-driving is two decades out.
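For what it's worth, here's the back-of-envelope arithmetic behind that extrapolation, taking the parent's premise at face value (my own illustrative numbers, not anything from Tesla):

```python
# Back-of-envelope for the 80/20 extrapolation above (illustrative only).
# Premise: the "easy" 80% of the problem took 5 years and represents
# only 20% of the total effort, per the Pareto rule of thumb.
years_for_first_80_percent = 5
fraction_of_total_effort = 0.20          # the Pareto assumption
total_years = years_for_first_80_percent / fraction_of_total_effort
remaining_years = total_years - years_for_first_80_percent
print(remaining_years)  # -> 20.0, i.e. roughly two decades more
```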
Re: (Score:2)
The summon feature is much harder than 95% of full self driving if you think about it.
Re: (Score:2)
Because clearly each and every Tesla on the road is using the latest nightly builds of software that the development team has cooking, right? And in no way could there be big improvements under test that haven't been released which improve performance, right?
They could release a new version tomorrow that is dramatically better than what you are talking about. As it turns out, the CEO has better inside information about the development of this software than some random slashtard does.
Re: (Score:3, Informative)
The CEO also has a vested interest in getting people to take the upsell to full self-driving. I think it's $7k extra on your car. So he would want to make sure people buy that feature for the future rather than skip it because it currently isn't worth it to them.
Re: (Score:2)
He never claimed it was full self driving. For the most part it has been advertised the way it is.
Phase 1. Cruise control that will speed up and slow down to match the car in front of you.
Phase 2. We add the ability for it to self-steer and keep your car in its lane.
Phase 3. On multi-lane roads, the ability to pass other cars
Phase 4. Be able to drive on and off the ramp for a highway all by itself
Phase 5. Able to stop at stop signs and traffic lights (This part is in Live Beta)
Phase 6. Able to turn onto intersectio
Re: (Score:3, Insightful)
Re: (Score:3, Informative)
Claiming and selling what? When was it ever billed as "fully autonomous?"
Tesla currently sells what they call the "full self-driving package" for $8000.
Yes, somehow they're getting people to pay thousands of dollars for a feature that not only does not actually exist, it almost certainly will not exist within the lifetime of the vehicle that they're buying. Claims that "it's almost there, honest" are just to try to scam more money out of suckers.
Re: (Score:2, Redundant)
Claiming and selling what? When was it ever billed as "fully autonomous?"
Tesla currently sells what they call the "full self-driving package" for $8000.
Yes, somehow they're getting people to pay thousands of dollars for a feature that not only does not actually exist, it almost certainly will not exist within the lifetime of the vehicle that they're buying. Claims that "it's almost there, honest" are just to try to scam more money out of suckers.
Conjures up "Early Access" on Steam in all the worst ways.
Re: (Score:2)
Re: (Score:2)
Even the term "autopilot" is taken from aircraft systems that are only tasked with keeping the vehicle straight and level.
The problem is that while that is *technically* correct, the lay person is not necessarily so well versed in aviation. That's why, when cars got the equivalent, they called it 'cruise control' instead of 'autopilot'. Cruise control is the counterpart to autopilot, and despite that equivalence Tesla still opted for 'autopilot' to imply more than cruise control, which means they are explicitly relying on the lay person's understanding of autopilot to differentiate it from cruise control.
Re: (Score:2, Troll)
Autopilot and Full Self-Driving Capability [tesla.com].
I don't think "full self-driving" is at all ambiguous.
That's stronger language than I realized they use, but it's also clear from the material
the currently enabled features do not make the vehicle autonomous.
Progressing (Score:5, Insightful)
I have a Model 3, only with autopilot (not FSD), and the object detection and identification keeps getting better and better. Yesterday, I noticed that an orange cone lying on its side was rendered as a cone, actually lying on its side. Impressive.
Re:Progressing (Score:5, Insightful)
I have a Model 3, only with autopilot (not FSD), and the object detection and identification keeps getting better and better. Yesterday, I noticed that an orange cone lying on its side was rendered as a cone, actually lying on its side. Impressive.
My new Model Y is the same configuration, where you can do FSD "preview" so you can see on the screen what it identifies.
The things that it is good at: unobscured cars and trucks, traffic lights, stop signs, road paint, roadside "cones" and "trashcans." It can't seem to distinguish a trash can from something like a roadside equipment box.
The things that seem good but not perfect: pedestrians, trash in the road, bicyclists. It does have a curiously high-fidelity ability to see the difference between a bicyclist and a motorcycle.
Things that never show: Lamp posts, signs other than stop signs, fences and barriers/walls, curbs and overpasses. Nearby birds seem to be invisible.
So if anyone is going to use this for FSD, I can only conclude that the display absolutely does not show everything that the dynamic 3D model includes. If what was on the display were all it knew, it would be totally unreliable.
That would mean that Tesla is holding a lot of cards close to the chest, so anyone making remarks about its efficacy without inside knowledge doesn't know what they are talking about.
At the end of the day the decision to deploy is a financial decision and not an engineering one. The CFO doesn't really care how exciting a new toy is; he or she is going to look at what the effect on the balance sheet will be. I am pretty sure that CFO sign-off isn't going to be capricious on this.
Re: (Score:2)
It should go without saying on a tech site that the "visualization" on the display is not the entire dataset that the autopilot hardware is working with. It's far more important to have spare compute cycles for processing the camera images to see if you're about to run into stuff than rendering out every single thing you're going past on the side of the road. I'm sure you know that, but some shrub around here will reply to what you said as "SEE!! It only sees a few things, you're going to run straight in
Re: (Score:3)
You've got it backwards. Animal brains start by assuming everything is a dangerous hazard, and the learning process is how to stop flinching at things that don't warrant it. Watch kids starting to drive - or even better watch young kids learning to, say, catch something. You have to work to get rid of that flinch reflex.
We're doing it backwards with computers - we're starting with not flinching at anything and trying to learn when to flinch.
The not being able to distinguish between truck/road/sky is not an
Re:Progressing (Score:5, Interesting)
So, I have a Model 3 as well. Every few months I try out enhanced autopilot or whatever and TBH it tries to kill me every time. I live in a location where the roads are very winding, have no shoulders, and are poorly marked. There is no way that FSD is anywhere near ready for life outside of cities, towns and interstates. In fact I'd say we are still at least 5-10 years away and I have no confidence we'll get there without AI technology that hasn't been developed yet.
It's going to be dealing with that last 20% that is going to cause all the headaches. That said, I appreciate Musk's enthusiasm and optimism (it's necessary to get anything extraordinary done).
Re: (Score:2)
Yep. It used to be worse. Then, it got better. Then, a little later, even better.
The problem is Autopilot is about 0.09% reliable for a trip without intervention, even within the domain it's supposedly able to handle (Interstates).
If it's going to achieve human-level safety (and therefore be fully autonomous, even within that domain, let alone ALL domains, aka L5), it needs to be around 99.99% reliable. So the "problem" the engineers are facing is improving a system from 0.09% to 99.99% reliable, a factor of roughly 10,000x better.
If your engineering problem were that you need a processor 10,000x better than what you have, that's easy
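Taking the parent's numbers at face value, here's roughly where a 10,000x figure comes from (a sketch of the arithmetic only, not measured Tesla data):

```python
# Sketch of the parent's arithmetic (their numbers, not measured data).
# "0.09% reliable" = 0.09% of trips finish without an intervention,
# so 99.91% of trips need one; the target is 99.99% (only 0.01% needing one).
current_success = 0.0009
target_success = 0.9999
current_failure_rate = 1 - current_success   # ~0.9991
target_failure_rate = 1 - target_success     # ~0.0001
print(current_failure_rate / target_failure_rate)  # ~9991, i.e. roughly 10,000x fewer failures
```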
Re: (Score:2)
(Hint *whispering*: that's why the subject of my post was "Progressing").
Yeah... Elon says a lot of thing.... (Score:5, Insightful)
Like an overly optimistic engineer.... take what he says, double, and then double it again.
But it doesn't need to be perfect. This is what a lot of people and media seem to miss. Yes, the system will break; yes, someone will die in an autonomous vehicle accident. It only needs to happen less frequently than with meat bag drivers. Last time I checked the public statistics, it's actually already doing that.
But anyhow, yeah, we're still years away from the highway safety bureau allowing it.
"less frequently" == "never" in people's minds (Score:2)
ZipprHead wrote:
...Yes, the system will break; yes, someone will die in an autonomous vehicle accident. It only needs to happen less frequently than with meat bag drivers.
From everything I've seen about Teslas, the bar is set much higher than for other vehicles. When three caught fire, it was a massive failure of the technology; when somebody crashed while watching a Harry Potter movie on a laptop, it was Autopilot's fault. If you look at the rate at which these incidents happen in regular, gas-powered, "meat bag"-driven vehicles and compare it to the rate at which Teslas have problems, you'll see that Teslas are a lot safer and more reliable.
Tesla and Musk
Re: (Score:2)
getting human buyers (not to mention regulators who are political entities) out there to accept any kind of failure that is greater than zero is going to be "difficult".
Clearly not, otherwise there wouldn't be videos on youtube of people reading with autopilot on.
Re: (Score:2)
Re:Yeah... Elon says a lot of thing.... (Score:4, Insightful)
Re: (Score:2)
Re: (Score:2, Funny)
Plenty of Tesla drivers can claim the same thing: they use AutoPilot and they are still alive, ergo AutoPilot is proven to be 100% safe. But since AutoPilot has a lot more mileage (3+ billion miles) under its belt than you (probably less than 300k, 1/10000th of AutoPilot), it's probably 1 million % safer than you.
Isn't flawed logic fun?
Re:Yeah... Elon says a lot of thing.... (Score:5, Insightful)
So I guess you don't carry auto insurance then, because past performance dictates future performance with a 100% degree of certainty, right?
By your logic, walking tightropes between skyscrapers is also 100% safe unless you fall.
That might be the single stupidest thing I've read all week, and there's a lot of stupid shit going on in the world right now.
Re: (Score:2)
Yeah, I agree. I'm not sure how comfortable I would be with an AI controlling my car either.
But here's the thing: I've been in accidents. Both were my fault, and a car with auto breaking would have prevented them. Both accidents occurred on perfect weather days. They happened because I momentarily looked away from what was in front of me and subsequently bumped the cars ahead of me.
Re: (Score:2)
a car with auto breaking would have prevented them
Do not want.
Re: (Score:2)
Additionally, not only is Tesla's dataset based exclusively on newer cars versus newer and older cars (collision avoidance, adaptive cruise control, backup cameras, and all sorts of other widely available advances are not controlled for), but thanks to regenerative braking, an electric car will take much longer to develop bad brakes. Similarly, the drivetrain is less likely to sputter out randomly on the road, even for comparably aged vehicles.
Further, autopilot will disable if it finds it cannot function
Re: (Score:2)
Firstly, the only place where Tesla drivers engage autopilot is on highways.
I use it on city streets, quite often. In the rural area where I live it doesn't work, but as soon as I get to "real" city streets where it does, I typically use it.
Re: (Score:2)
Is it end of 2019 yet? (Score:3)
At last check, over a year ago we were promised FSD by the end of 2019.
We will have real FSD with no human control required right after we get cold fusion powering our flying cars. Then at least autopilot might make sense.
Re:Is it end of 2019 yet? (Score:5, Funny)
Re: (Score:2)
I hope it isn't the end of 2019 yet, I've heard bad things about 2020.
Maybe this is what it feels like to live in the Singularity? So far this year has felt REALLY long, and we're just barely halfway through it.
Time flies when you're having fun. NOT now.
They're close now (Score:3)
They're close now, and probably will be close for the next 10-15 years.
Re:They're close now (Score:4, Insightful)
Re: (Score:2)
Aren't they still convinced that the hardware in their original vehicles is going to be eligible for this as just a software update? They're averse to saying otherwise because they'd have to admit they overpromised.
Re: (Score:2)
I thought Lidar basically doesn't work when it is raining or snowing.
But will they use it? Probably not (Score:2)
How many Teslas are actually driven around using even the existing assist features? None as far as I can tell. I get cut off and tailgated by Teslas just as much as by Dodge pickups.
Re: (Score:2)
Considering that Tesla has literally hundreds of millions of autopilot miles driven worth of data, yeah it gets used. Your personal anecdote of shitty drivers doesn't discount that they've sold hundreds of thousands of vehicles. Also, all it takes is moving the steering wheel a bit and autopilot disengages, so if they want to be shitty drivers, there is nothing preventing a shitty driver from driving shitty.
Security much? (Score:2)
Unfortunately (Score:3)
No, no he's not. (Score:2, Insightful)
Stuff and nonsense. More marketing hype. Empty promises. Don't fall for it, lads.
Re: (Score:2)
Re: No, no he's not. (Score:2)
And flamethrowers and money and electricity and sex and Mars.
Edge cases are a bitch (Score:3)
Re:Edge cases are a bitch (Score:5, Funny)
To be fair, neither are his cars.
Re: Edge cases are a bitch (Score:2)
Lol, ouch!
Re: (Score:2)
typical media false exaggeration (Score:2)
A lying Cheetah (Score:2)
Can't change its spots.
In other news (Score:2, Informative)
In other news, Elon Musk endorses Kanye West's Presidential bid [marketwatch.com] after a night of fat blunts and margaritas with Kim K.
Should be a lot easier now (Score:2)
Calling it here (Score:2)
A decade ahead... (Score:2)
What a lot of observations miss is that - once there's a critical threshold of vehicles with autonomous capabilities on a road, they should wirelessly mesh and no longer just rely on environmental input. This would dramatically improve the safety of this technology, enabling a "full route" image of every vehicle's perspective being sent to all within the immediate network. Any changes ahead are sent to upcoming cars long before the sensors detect them locally.
This is also where driving becomes more trai
Re: (Score:3)
What a lot of observations miss is that - once there's a critical threshold of vehicles with autonomous capabilities on a road, they should wirelessly mesh and no longer just rely on environmental input.
... assuming they can afford to trust the data provided to them by the other cars, of course. It only takes a few malfunctioning (or hacked) data-transmitters telling everyone else that driving off the side of a bridge is a good shortcut, to ruin the whole party.
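One hypothetical way to blunt that risk, purely as a sketch (no real V2V protocol or shipping Tesla feature is implied): treat peer reports as advisory, so they can only make the car more cautious, never less cautious than its own sensors would.

```python
# Hypothetical sketch: peer-reported hazards are advisory only.
# A peer report can add caution (ease off sooner), but it can never override
# a hazard the car's own sensors detect or raise the planned speed.
def plan_speed(local_speed_limit_mps, locally_sensed_hazard, peer_reported_hazard):
    """Return a target speed in m/s; peer data may only lower it, never raise it."""
    speed = local_speed_limit_mps
    if locally_sensed_hazard:
        speed = min(speed, 5.0)   # our own sensors always win
    if peer_reported_hazard:
        speed = min(speed, 15.0)  # advisory: slow down until we can confirm locally
    return speed

print(plan_speed(30.0, locally_sensed_hazard=False, peer_reported_hazard=True))  # 15.0
```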
Yeah, he also said he could land ROCKETS on BARGES (Score:2)
Well his timelines are crap, his social skills are as you'd expect an old-time Slashdotian to be (brought up in a cellar with minimal human contact). ;)
BUT he seems to always pull the rabbit out of the hat, eventually... Elon Time. Personally I've given up on seeing autonomous driving in my lifetime, but I could yet be surprised... I can never forget the "impossible" first successful barge landing... or land-ing
How will Tesla validate when full AV is ready? (Score:2)
So far, Tesla's Autopilot is Level 2+. If they are claiming to be "close" to Level 4/5, they must have a quantitative metric to represent how close they really are. That metric is a really tricky challenge, and if Tesla has a reliable one, that itself is a huge advance. In fact, such an advance would go a long way toward solving the historically tricky challenge of validating complex software systems, which have many inputs along with sequential state over a long period of time.
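For context, one commonly cited public proxy for "how close" such a system is (not necessarily the metric Tesla uses internally) is miles per safety-relevant disengagement, tracked per software release. A minimal sketch:

```python
# Minimal sketch of a disengagement-rate metric (a common public proxy,
# not a claim about Tesla's internal validation process).
def miles_per_disengagement(miles_driven: float, disengagements: int) -> float:
    """Higher is better; infinite if no disengagements were recorded."""
    if disengagements == 0:
        return float("inf")
    return miles_driven / disengagements

# Hypothetical example numbers:
print(miles_per_disengagement(120_000.0, 40))  # 3000.0 miles between interventions
```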
Re: Technically (Score:2)
Yes, but the cement barriers and parked vehicles tend not to get out of the way.
Re: (Score:2)
Re: Technically (Score:2)
No Tesla has EVER run into a Boeing 737 MAX during normal operations.
Re: (Score:2)
Do they work in rain
Yes.
much less icy roads with four inch ice ruts AFTER the plow passes by?
No, but it will also tell you that there is inclement weather and not engage autopilot. So you drive it like a normal car in those conditions, except with the best traction control available due to infinitely variable torque, adjusting thousands of times a second.
What about unmarked roads on properties?
Roads with no markings cause autosteer to become unavailable. Drive like a normal car.
It's almost like the team of engineers that have been working on this shit for years may have thought of this without your input, what with "weather" and "
Re: (Score:2)