by stopads on 1/18/20, 5:44 PM with 897 comments
by colordrops on 1/18/20, 9:00 PM
by wwweston on 1/18/20, 8:00 PM
Most of the value in being able to travel by auto without attention is in trips longer than 15 minutes. Commutes, tourism, vacations. Shipping. Most of which is certainly highway driving.
If it's possible to automate highway driving under most conditions (and safely transition either to a stop or human-piloting when those conditions aren't met), then at least 80% of the value is there.
Wrestling with the harder edges of the problem is still the right thing to do for tolerance reasons, but I hope we don't have to see last-mile problems solved before we start reaping the benefits.
by baybal2 on 1/18/20, 8:35 PM
People making most of the AI hype claims barely know what the "AI" they mention actually is.
And people who go as far as drawing rosy pictures of human-like general AI being your personal chauffeur are beyond ridiculous.
The entire idea of human-like general AI for practical applications is like trying to get people to use horses for transport in the 21st century by breeding a horse that is better than a car.
by Booktrope on 1/18/20, 8:36 PM
This is incidentally one reason for Tesla's huge market value. The company actually has a plan to transition from individual ownership to fleet, so when this happens it will be prepared to deal with a new manufacturing reality.
Just try to imagine VW without all those ads that sell you a positive self-image because you drive a sexy, cool car they make.
by jasonhansel on 1/19/20, 2:56 AM
by robbrown451 on 1/18/20, 7:26 PM
> "This is one of the hardest problems we have. This is like we are going to Mars," Hitzinger said in a comment. "Maybe it will never happen."
First of all, it seems obvious that we are going to go to Mars, eventually. Maybe not any time soon, but never? Seriously?
But the bigger thing is that there is about 1,000 times more economic benefit in self-driving cars than in going to Mars, at least in the near term. To think we'd just give up on it seems absurd.
by akrymski on 1/18/20, 11:04 PM
by odnes on 1/18/20, 11:57 PM
Highway driving is the most constrained normal driving problem, and it is solvable in 99% of cases. But so many things can happen in most other driving situations that I think model-based approaches (Tesla) are doomed to fail... Go ahead, train a classifier for every situation you can think of - I guarantee you that you've missed many things.
Elon tweeted the other day that FSD is "coming soon". Either I'm totally wrong about this, and of course I hope I am, or Karpathy + the dev team should be tempering his expectations.
That isn't to say that there isn't immense value in L2/L3; there totally is. But I think that solving driving (being able to handle any situation a human can) is pretty much the same thing as solving intelligence generally.
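A toy sketch (Python, purely illustrative and not any vendor's actual code) of the long-tail problem described above: a perception model labels the scene, and anything it cannot label with high confidence is handed back to the human driver, which is exactly where L2/L3 stops short of full autonomy. The scenario names and the 0.9 threshold are assumptions for the example.

    # Hypothetical hand-off logic: handle only scenarios the model was
    # trained on and is confident about; escalate everything else.
    KNOWN_SCENARIOS = {"highway_cruise", "lane_merge", "stop_and_go_traffic"}

    def plan(scene_label: str, confidence: float, threshold: float = 0.9) -> str:
        """Return an action for the driving stack, or escalate to the human."""
        if scene_label in KNOWN_SCENARIOS and confidence >= threshold:
            return f"autonomous: execute policy for '{scene_label}'"
        # The long tail: unrecognised or low-confidence scenes fall back
        # to the driver instead of being handled end to end.
        return "handoff: alert driver and begin safe slow-down"

    print(plan("highway_cruise", 0.97))    # handled autonomously
    print(plan("mattress_on_road", 0.41))  # unseen situation -> human takes over

No enumeration of KNOWN_SCENARIOS can ever be complete, which is the commenter's point.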
by sschueller on 1/18/20, 9:18 PM
by listenallyall on 1/18/20, 7:51 PM
We've had autopilot for decades, yet we're certainly not flying planes without pilots, or even getting them from runway to gate without humans. Nor is anyone claiming that pilotless planes are coming soon.
by rgbrenner on 1/18/20, 7:48 PM
This just reflects VW's ideas for accomplishing the goal... i.e., they have none.
Let me know when Google says it's not possible.
by crusso on 1/18/20, 7:37 PM
He's pooh-poohing level 5 autonomous driving while saying they just about have a level 4 autonomous vehicle?
This article makes no sense.
Honestly, the CEO of my company has very little idea of the details of the technology that we produce. If you picked some cutting-edge technology that isn't key to our market-share yet, he'd have even less of a clue.
by KKKKkkkk1 on 1/19/20, 12:29 AM
by emmelaich on 1/19/20, 1:52 AM
Why are "admit", "confess", "announce", "reveal" etc used instead of the usually more accurate "says"?
(rhetorical question)
by LastZactionHero on 1/19/20, 12:12 AM
This is an off-the-cuff remark turned into a clickbait headline for a fairly bland article.
by waynecochran on 1/18/20, 8:05 PM
by yalogin on 1/19/20, 1:18 AM
by aplummer on 1/18/20, 7:28 PM
by jeromebaek on 1/19/20, 10:13 PM
by MarkMc on 1/19/20, 1:01 AM
The article describes Level 5 as 'full computer control of the vehicle with zero limitations' but what exactly does that mean?
(a) Does it mean a vehicle available for sale to the general public that can drive at night when it is snowing on a new road that has not first been mapped by a human?
or
(b) Does it mean that there is at least 1 city of a million residents where at least 50% of vehicle journeys did not require any occupant to have a driver's licence?
To me (and Volkswagen) it's far more useful to forecast when the second criterion will be met than the first.
by anonytrary on 1/19/20, 1:47 AM
by speedplane on 1/19/20, 10:08 AM
by dlkf on 1/19/20, 10:11 AM
1. First, it begs the question. By saying "X admits Y" we are assuming that Y is true. This is very different than writing "X says Y", or "X believes Y" etc etc
2. Then, they carefully chose Y to include the word "may" so that they can claim it is true regardless of its content. As long as the rest of Y isn't literally a tautology, you're good to go. "A teacup may be orbiting Jupiter" etc etc
3. But step 2 is sneaky. The way that the word "may" makes the overall statement true is different from how it functions in the quoted speaker's sentence. In his sentence, he means something like "there is a good chance it will never happen." When they use "may" to make Y true, it is invoked in a much weaker sense.
Basically they are reusing the word "may" in two different sentences to achieve a misleading headline. They obviously have skin in the game and I don't expect this to be a fair article.
by antirez on 1/18/20, 10:12 PM
by temporaryvector on 1/18/20, 8:05 PM
I see the future diverging into two paths, fully autonomous commercial vehicles, like taxis, delivery vehicles, semi-trucks, etc. that work within urban areas or other designated, mapped and specially prepared areas. This will possibly involve a centralized system of control and communication, something like an ATC but for cars. These will be owned by corporations and only used by people. It makes no sense for a person to buy one of these fully autonomous vehicles, although I imagine some people would pay extra to have priority access so they always have one available.
The other side of the coin will be privately owned cars that have autonomous capability, or autonomous cars with override. These will be able to go anywhere the driver wants, including unmapped villages, small towns, off road tracks, etc. I suspect these will be the domain of enthusiasts, people who really need them for work (ranchers and farmers, for example) and people who choose to live away from urban centers. They will be more expensive than cars now, but the need for these vehicles will never go away. Even if we get true AI capable of driving anywhere with only the sensors aboard the vehicle, there will still be the need for a human to override it, even if that involves just authorizing a risky maneuver or putting the AI into "unsafe driving" mode.
I think the movie I, Robot (with Will Smith) got the future of autonomous cars surprisingly right, autonomous inside cities and on highways, and using the manual override comes with penalties (higher insurance, being at fault in an accident, etc.).
On a personal note, and this may sound bad, but I would never buy a car which I cannot use to break the law. Even if I never plan to do it, being able to speed, jump the curb, intentionally crash into a wall (or another car) or even run over a person (for example, in self defense) may be at some point required or the least bad of many bad options. In this case any consequences should fall on me, but I don't think a thing I own should be designed to prevent me from breaking the law or doing something stupid if I really want or need to, although providing warnings or an optional safe-mode is fine. I suspect many people feel the same way, even if they don't put it in such an extreme way. This can be seen by the fact that a lot of cars, particularly those focused on performance or off-roading, come with switches to turn traction control off, and if they don't, it will get mentioned as a negative in any review done by publications focused on those audiences.
by jnurmine on 1/19/20, 9:19 AM
Can one take an "autonomous car" up a barely visible, car-wide path (not a real road) that squiggles through a forest up to a cottage?
Can one make an autonomous car understand a free-form textual sign when there's roadwork or an accident?
Drive in a place without marked roads?
There are plenty of edge cases and difficult situations.
Hitzinger says that level 4 might be achievable. I agree; it is conceivable that some day a lane of a motorway could be reserved for semi-autonomous vehicles. Those vehicles could communicate with each other and be allowed to drive much faster (say, 250 km/h), since the road is mostly straight, computers have faster reaction times, cars get early warnings from the vehicles ahead in the chain, and so on.
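A rough, purely illustrative calculation (Python; all numbers are assumptions, not from the article) of why early warnings from cars ahead matter so much at 250 km/h: a vehicle-to-vehicle message a few hops back arrives in tens of milliseconds, while a human reacting to brake lights needs on the order of a second and a half.

    # Assumed figures: 250 km/h platoon speed, 20 m gaps, 1.5 s human
    # perception-reaction time, 20 ms of radio + processing delay per hop.
    SPEED_MS = 250 / 3.6        # platoon speed in metres per second
    GAP_M = 20.0                # spacing between platooned vehicles
    HUMAN_REACTION_S = 1.5
    V2V_LATENCY_S = 0.02

    def distance_before_braking(delay_s: float) -> float:
        """Metres covered at platoon speed before braking even starts."""
        return SPEED_MS * delay_s

    for label, delay in [("human, watching the car ahead", HUMAN_REACTION_S),
                         ("follower 3 hops back, via V2V", 3 * V2V_LATENCY_S)]:
        d = distance_before_braking(delay)
        print(f"{label}: {d:.1f} m travelled ({d / GAP_M:.1f} gaps)")

Under these assumed numbers the human has already covered about five vehicle gaps before touching the brakes, while the networked follower has covered a fraction of one.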
by hinkley on 1/18/20, 7:33 PM
If you go up or down, the number and kinds of obstacles are reduced. The locations where vehicles interact are constrained, and interactions with other classes of vehicle drop to zero, so the vehicles can negotiate with each other.
Solve a simpler problem, if you can.
by shrubble on 1/18/20, 7:34 PM
However that would mean that everyone would have access to it.
The idea of self-driving cars now is a winner-take-all situation, where whichever funded effort succeeds generates outsized profits from licensing or from going public at a high valuation.
by rhizome on 1/19/20, 1:34 AM
Sure, technological advances will boost vision resolution or interpretive rigor to insane levels (compared to the present day), and this will be used to win increased confidence from the public, but is that enough for you to sit your child in the street and trust the driverless car to steer around them? This is the question I would ask of any proponent of any model-interpreting public technology.
by martythemaniak on 1/18/20, 10:21 PM
For example, there's a snowstorm out here today. Unless they really need to, people aren't going out to display their incredible skill at navigating through snow squalls with centimeters of snow on the ground. They just stay home.
What will determine the success of self-driving cars is not philosophical musings but their usefulness in day-to-day life. And if you can spend 10k on a system that'll work most of the time but refuses to go out in snow squalls, it'll sell very well. I'd buy it.
by nojvek on 1/19/20, 2:54 PM
The space is incrementally improving. Lots of new ideas.
I’m very bullish on systems that improve human driving and slowly move to more and more autonomy, rather than systems that try to “replace humans”.
Anyone claiming full self-driving is around the corner is probably lying. I have no idea when it will be here. The edge cases are enormous.
But self driving on selected city routes and highways in good weather is here.
I would say highway lane following and distance keeping are already better than human.
We aren’t great at holding monotonous focus. Machines on the other hand excel at that. We do excel at edge cases though. Marrying the two seems like a good bet.
by aetherspawn on 1/19/20, 1:54 AM
When a work colleague drove home completely wasted and his Mazda CR-V nearly drove itself, you could not tell that his driver input was erroneous. Actually, it felt very safe. It dawned on me that these systems will mitigate a great many preventable accidents caused by human stupidity.
by dboreham on 1/18/20, 7:21 PM
by holoduke on 1/18/20, 10:53 PM
by ecpottinger on 1/18/20, 7:27 PM
by ec109685 on 1/18/20, 9:05 PM
by stretchwithme on 1/19/20, 1:06 AM
But I am pretty sure it will happen within 20 years. 5 years? Unlikely.
In any case, it will require much better testing than just letting it learn on the street. A vehicle should be able to deal with a snow covered road in the dark going down a hill with babies crawling across the road.
And EMP-induced total electrical failure. No sense in a million people getting injured at once, should that ever happen. Emergency brakes should quickly engage in a predictable way when power is lost.
by trasneoir on 1/20/20, 9:49 AM
I'd expect level 5 autonomy to be slow, because level 4 delivers 80% of the benefits for 20% of the cost.
BUT "never" seems crazy unless you've got a very pessimistic outlook on humanity's medium-term future. Other than extinction or the collapse of civilization, what could cause it to "never" happen?
by linuxhansl on 1/18/20, 10:44 PM
I hope that by the time I'm too old to drive - a few decades from now - self-driving cars are available, but I'm not betting on it.
by dyeje on 1/18/20, 8:45 PM
by rmason on 1/18/20, 10:51 PM
There's a video on YouTube from three years ago of George Hotz predicting that Level 4 or 5 would never be reached without artificial intelligence. It made sense to me then, and it still does today.
by rhacker on 1/19/20, 5:03 PM
Volkswagen can speak for its own company. It can't speak for the industry. If it were to speak for the industry, the word "admit" couldn't be used; it would be a different word, like "believe".
Volkswagen can certainly admit that "they" won't have full self-driving capability, but it can't "admit" that for other people.
by oblib on 1/19/20, 5:08 PM
The cost to upgrade roads appears to be a significant hurdle. A Google search says:
"There are approximately 4,071,000 miles (6,552,000 km) of roads in the United States, 2,678,000 miles (4,310,000 km) paved and 1,394,000 miles (2,243,000 km) unpaved."
And there's this:
>> einrealist: I am more fearful of trolls tricking the technology.
That's even more difficult to address.
We have to evaluate the cost/benefit of implementing this once we get the tech to the point of near total awareness of real world conditions. It might make sense to implement it on major highways, but probably not on rural roads and neighborhood streets because it's not something we can skimp on.
To make it cost effective the roadway tech has to be for the most part "dumb". We cannot rely on "smart" tech that requires complex communication systems or dumb tech that's easy to hack, like lines painted on the road or stop signs and traffic lights that AR can "see".
I think we'd be better off working on implementing assistive safety technologies for automobiles and mass transportation infrastructure like high speed railways.
by Causality1 on 1/18/20, 8:40 PM
by acolumb on 1/19/20, 4:32 PM
What won't happen is cars without the ability to have a human take over. There are too many fringe cases to allow cars without steering wheels.
by joshuaheard on 1/18/20, 9:37 PM
by je42 on 1/19/20, 9:56 AM
If I were a VW engineer, why would I continue working in the autonomous driving division if my CEO says the team will never succeed at the end game?
I'd rather work at a company where the leadership is actually leading instead of putting on the brakes.
by 8bitsrule on 1/18/20, 8:56 PM
by wavegeek on 1/18/20, 11:31 PM
https://en.wikipedia.org/wiki/Hype_cycle
The big challenge for self driving cars is that they are held to a standard of perfection. No human driver can meet that standard either, but they are exempted due to incumbency bias.
The benefits of self-driving cars are so phenomenally huge that if they are possible they will happen. Apart from the cost of paying people to drive, there are the time benefits of not wasting time driving - is there anything more boring and tedious? - and the fact that many people (the young, the old, those with poor eyesight) cannot drive.
I don't see any evidence it is not possible within 10-20 years. Cars will get smarter and easier to drive and the final step will not seem large. For the edge cases, there is always the possibility of having the vehicle temporarily being taken over by someone remote. Remote driving is already done in some mines.
by classified on 1/22/20, 1:49 PM
by hownottowrite on 1/18/20, 8:00 PM
This is definitely not consistent with statements Alex has made in the past. It seems more like an off-the-cuff remark a German engineer would make while confident that the fully realized result is right around the corner.
by peterwwillis on 1/18/20, 10:09 PM
by hotz on 1/19/20, 10:08 AM
by owens99 on 1/18/20, 9:23 PM
If you want to know about the future, don’t ask the incumbents. They will only tell you about the status quo.
by keanzu on 1/19/20, 12:39 AM
admit (v): confess to be true or to be the case.
Correct use in a sentence: "VW admits guilt and pays $4.3bn diesel emissions scandal penalty"
The word admit doesn't apply to predictions of the future.
by j45 on 1/18/20, 11:31 PM
Compare driving somewhere vs taking a ride share and how much more prepared/present you are when you arrive.
by pcarolan on 1/18/20, 11:57 PM
by ogre_codes on 1/18/20, 7:50 PM
[1] Yes, I know about the accidents
by sakopov on 1/19/20, 2:00 AM
by steveharman on 1/19/20, 8:25 AM
Electric, sure, bring it on.
by zhoujianfu on 1/19/20, 6:41 AM
And flying would be superior to driving (faster, no traffic, no need to maintain infrastructure, etc..) so there’d be little incentive to continue even trying to get to level 5 for self-driving.
by yzh on 1/19/20, 2:54 AM
by FpUser on 1/18/20, 9:23 PM
Let's suppose it will happen. Do you have proof that such "good" tech won't one day develop a will to run over every human it can?
by m0zg on 1/18/20, 11:10 PM
by mbostleman on 1/18/20, 9:21 PM
by mymythisisthis on 1/18/20, 7:45 PM
by gok on 1/18/20, 8:39 PM
by wojciii on 1/18/20, 8:35 PM
by interdrift on 1/19/20, 2:01 PM
by edisonjoao on 1/22/20, 1:48 AM
by OrgNet on 1/18/20, 11:52 PM
by RedComet on 1/18/20, 10:23 PM
by 0xff00ffee on 1/18/20, 8:54 PM
by growlist on 1/19/20, 6:20 PM
by zweep on 1/19/20, 4:08 AM
by jay_kyburz on 1/19/20, 4:49 AM
Updated to clarify message a little.
by euske on 1/19/20, 3:52 AM
by jakeogh on 1/18/20, 9:33 PM
by Geee on 1/18/20, 8:32 PM
Level 5 means that the car can drive autonomously more than 95% of the time, and it will absolutely be possible. The Level 4/5 distinction doesn't really make sense once the autonomy gets beyond a certain percentage; it doesn't mean it's level 4 until it hits 100% (which is impossible).
A Level 5 car is designed to drive in all conditions, but there are always statistically unlikely corner cases or situations requiring high-level decision making that the car can't handle by itself. A single driver may never hit such a case, and for them the experience is full self-driving.
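Some back-of-the-envelope arithmetic (Python; the ~10,000 miles per year figure and the success rates are illustrative assumptions, treating "percent of the time" loosely as percent of miles) for the claim that a single driver may never hit an unhandled case:

    MILES_PER_YEAR = 10_000  # assumed annual mileage for one driver

    for handled in (0.95, 0.999, 0.99999):
        unhandled_miles = MILES_PER_YEAR * (1 - handled)
        if unhandled_miles >= 1:
            print(f"{handled:.3%} handled -> ~{unhandled_miles:.0f} problem miles per year")
        else:
            print(f"{handled:.3%} handled -> one problem mile every ~{1 / unhandled_miles:.0f} years")

Under these assumed numbers, at 95% the driver is intervening hundreds of times a year; only well past 99.999% does a given owner plausibly never see an unhandled case.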
by axguscbklp on 1/19/20, 1:23 AM
It's likely that if humans ever developed the sort of AI necessary for full self-driving, this AI technology would radically transform the world. Using it for cars would be one of the least important applications.