Self-driving cars? No thanks.

posted at 11:31 am on April 20, 2013 by Jazz Shaw

I’d heard rumors about this, but hadn’t paid too much attention to it previously. It seems that the idea of “self-driving cars” – made more famous by Google – is picking up steam and may be commonplace in little more than a decade.

The consensus among auto industry technologists, gathered in Detroit this week for the Society of Automotive Engineers World Congress, is that by the middle of this decade, cars that can largely pilot themselves through traffic jams will be offered for sale. By 2020, cars capable of taking over most of the work of high-speed driving could debut, and by 2025, fully autonomous vehicles might hit the streets in meaningful numbers…

[A]uto makers – and safety regulators in the U.S. and Europe – say they’re serious about pushing more autonomous braking and steering systems into cars and trucks, for one overriding reason: Most humans are depressingly bad drivers.

Really? Did none of you people watch Will Smith’s seminal work on the subject, I, Robot? As soon as those cars figure out what a bunch of defective, inferior rejects we all are, they’ll be running down pedestrians and flinging themselves into walls at maximum speed in no time. Still, for some reason, Dr. James Joyner seems willing to kneel down before our new automotive overlords.

It’ll be a while before this trickles down. While I’d be willing to pay some reasonable surcharge for this technology, I doubt I’ll ever buy a brand new car again. I’ve only done it for myself twice and not since 2001. (My late wife, on the other hand, insisted on having all the latest gadgets, so we bought two brand new vehicles for her.) Still, it tends to take a decade or so for the gee-whiz gear to migrate from an upcharge on a BMW to standard on a Kia.

Regardless, once these technologies are perfected, there’s going to be a heavy impetus to make them mandatory. Once robo-cars become standard, it’s going to be difficult to justify letting humans drive themselves in traffic.

I have an immediate and visceral response to this prospect and it’s completely negative. (And it has nothing to do with the issues others are already having with the Google car ruining their lives.) I probably inherited it from my father. Our younger readers probably don’t remember this, but there was a time when power steering and power brakes were newfangled, optional features on cars. My dad hated them, and resisted getting a car with either for some time.

I think it’s something to do with the basics of human nature and the desire to keep direct control of our fate as much as possible. Dad liked the idea of a pedal you pushed that directly moved a linkage which applied pressure to the breaks. He liked turning a wheel which was directly, mechanically connected to the axle. The idea that some higher tech gizmo was standing between him and the mechanism – a gizmo which could fail at any time and leave him without control of the vehicle – was disconcerting. I tended to agree, though I gave up and adopted them quickly enough.

But a robot car that essentially does all of the driving and makes all of the decisions in quickly evolving, potentially life and death situations? When you combine that with Joyner’s prediction that the government would steadily move to mandate this technology because we simply can’t be trusted with the responsibility, I get a screaming case of the heebie jeebies. No thanks, Google. You and Skynet can take your fiendish plots elsewhere.

EDIT: (Jazz) Yes, I’m leaving “breaks” in just to remind me that I’m far from perfect. Thanks.



Comments


That’d be it – a self-driving flying Combi van! lol. Ta mate.

s_dog on April 20, 2013 at 4:50 PM

This is one of those tricky questions that strongly depends on how you ask it.

Do you think you can drive better than an automated car with more safety for you?

Well of course I do, I’m awesome.

Do you think new drivers/teenagers texting all the time should have a computer drive for them or do you think they’re safer than a computer?

Yeah, someone else should be handling their driving; even if the computer isn’t perfect it’ll be better than those yahoos.

Ask the question the “correct” way to get the “correct” answer. Easy-peasy.

gekkobear on April 20, 2013 at 4:53 PM

Episode 37, “I, Mudd.”

Axe on April 20, 2013 at 4:58 PM

Don’t we already fly in auto piloted airplanes?

I’ll take the autopilot car so long as the driver module has a Mario Andretti setting.

And I’m going to have a fleet of coaches to rent that will be the nouveau auto-piloted sleeper car. Wake up at your destination.

TexasDan on April 20, 2013 at 5:32 PM

Self-driving cars should be able to travel at higher speeds per vehicle density, and park at a much higher density as well, since they can drop you at your doorstep and go cram into a narrow slot somewhere. They could theoretically even park themselves several rows deep and then sort things out among themselves when you hail your vehicle.

TexasDan on April 20, 2013 at 5:37 PM

Jazz,
Resistance is futile.

Robot cars will come with an Earned Car Credit where the government will give you $10,000 of other people’s money if you own one.

The government will make everyone else pay for your Robot car as an incentive to buy since it won’t cost you a dime except in increased taxes.

The IRS will impose a $50,000 surtax if you don’t own one.

It will come with 30 days of free Obamacare, estimated to be worth the paper it’s printed on.

The government will require you to drive illegal aliens around in your conventional car. You will have to take them wherever they want to go and buy them lunch.

TSA will force you to sign confessions praising Robot cars.

If you resist, you will be sent to recently re-activated secret CIA interrogation facilities (after all, this is a matter of national security because liberals say so) and subjected to enhanced interrogation procedures until you sign a contract to buy a Robot car.

You may as well sign an agreement to pre-order your 2025 new Robot car because you will have no alternative.

BMF on April 20, 2013 at 5:38 PM

without having to grouse about supposed statism.

Stoic Patriot on April 20, 2013 at 12:22 PM

Really? ‘Supposed’? Have you looked around you lately?

GWB on April 20, 2013 at 5:41 PM

I personally CANNOT WAIT for delivery of my first self-driving car.

How great will it be to be able to get into the car, not even have to hit a button and just say …

“Google, take me to Kyle’s house in Georgia”.

It will then look up my contacts in the Google cloud, find the only person I know in Georgia with the name Kyle… get his address from his contact info and then it just… goes.

8 hours later, after a nice nap, reading hotair.com, playing a game or two perhaps, joining a Google+ “Hangout” video conference — I’m there.

Even better is that it will know the current locations of all my friends around town (if they allow it). So I can just get in the car and speak “Google, take me to Suzanne’s location in town.” and bam, it starts navigating its way automatically to where she is.

We conservatives automatically assume that new technology is BAAAAD. It’s not. It’s meant to make our lives better. Self-driving cars are NOT Artificial Intelligence. AI implies that the computer would be able to “learn,” and while it might have very complex algorithms to compute the best route, mileage required to reach the destination, etc., it will never exceed the boundaries of its programming.

That being said, driving is a task that computers would be IMMEASURABLY better at doing than humans. Humans are great at many things; driving isn’t one of them.

Last but not least – this SHOULD be something that you can say “Google, I want to drive the car myself” and it will relinquish command of the vehicle to you.

Defenestratus on April 20, 2013 at 6:11 PM

Don’t we already fly in auto piloted airplanes?

TexasDan on April 20, 2013 at 5:32 PM

If airplane pilots were held to the same testing standards as car drivers, the sky would practically rain 747s.

MelonCollie on April 20, 2013 at 6:19 PM

Yeah, and your great-grandparents thought “machines” were dangerous and smelly.

Within 20 years most cars will be self-driving. Within 50 years they all will. You’ll have the option to drive yourself on some roads, but e.g. the interstate will be robotic only.

Imagine no red lights, then imagine yourself not liking that.

Intersections where the traffic in both directions is doing 45 mph, with no traffic light, because the cars are all hooked up to the grid and can’t hit one another.

I-95 will never come to a standstill due to volume or accidents.

Car insurance will be mostly comprehensive coverage, because it will be much more likely that your car gets hit by a falling tree limb than by another car. Wouldn’t you like $50-a-year coverage?

Traffic deaths will be largely unheard of.

Get used to it. It’s coming.

Akzed on April 20, 2013 at 6:27 PM

Akzed on April 20, 2013 at 6:27 PM

The cars don’t even have to be hooked up to a grid. In fact, it’s better if they aren’t — they will just track the cars around them, communicate with them, and work out the best travel solutions.
Also, traffic lights will still be around, if only to give pedestrians a chance to cross the street.

Count to 10 on April 20, 2013 at 6:57 PM

s_dog on April 20, 2013 at 4:34 PM

Doesn’t it seem odd to see an electro-mechanical process that disproportionately afflicts cars driven by old people?

LOL.

Yes. You are a sane and smart person.

However, if Toyota is making cars for old people to drive, old people better be as safe as possible in them. More importantly, a manufacturer should act fast if people are being injured or killed. THAT was my point. Issues about cause are not relevant.

We just had the whole country hysterical over the horrors in Boston. How do 250+ deaths and 3,000 serious injuries strike you? Look up Ford and Firestone.

Then kindly read about one case involving Toyota:

In 2010, Toyota settled a previous wrongful death lawsuit for $10 million before the current cases were consolidated in U.S. District Court in Santa Ana.

In the earlier case, a California Highway Patrol officer and three of his family members were killed in suburban San Diego in 2009 after their car, a Toyota-built Lexus, reached speeds of more than 120 mph (193 kph), hit an SUV, launched off an embankment, rolled several times and burst into flames.

Investigators determined that a wrong-size floor mat trapped the accelerator and caused the crash.

Read more: http://www.foxnews.com/leisure/2013/01/18/toyota-settles-first-hundreds-wrongful-death-suits-involving-unintended/

Then look at your own source:

NHTSA officials said the causes were the ones they suspected all along — bulky floormats, sticking gas pedals and driver mistakes. “We found that when a complaint alleged the brakes didn’t work, what most likely happened was pedal misapplication,” said deputy NHTSA administrator Ron Medford.

Floormats, old folks and delay in acting suddenly aren’t funny when you have young men, wives and even kids being dismembered or burning alive. I got in trouble some time ago as I observed that such tragedies are too often ignored by Asian firms.

IlikedAUH2O on April 20, 2013 at 7:45 PM

Then we have the people on threads like this cursing President Obama and gov’t motors (GM) while salivating over manufacturers and cars I know are questionable.

One of the writers on this moniker has some small expertise in engineering and invention and a conscience.

IlikedAUH2O on April 20, 2013 at 7:49 PM

A car smart enough to drive itself can be smart enough to have a chemical sensor unit and detect if it is loaded with explosives. It can then simply call 911 and report it. It would also be smart enough to detect if its chemical sensor was disconnected — and report that.

And anyone technically advanced enough to get around those factors doesn’t need a self driving car to cause mayhem anyway. But. Notice that the root problem of these terrorists is Islam. Almost all terrorism is jihadist related. And as a group, their sophistication level is low. Islam is not an ideology that works with intelligence and creativity.

SunSword on April 20, 2013 at 8:03 PM

That being said, driving is a task that computers would be IMMEASURABLY better at doing than humans. Humans are great at many things; driving isn’t one of them.

Defenestratus on April 20, 2013 at 6:11 PM

No, it isn’t! A computer cannot do the processing that I do merely in my hindbrain when driving that tells me when people are going to do stupid things, that tells me (because I look much further forward than a car’s sensors can) when things are going to require changes to my driving. A computer can do a better job of driving than people who only look as far forward as the car in front of them and don’t pay enough attention behind them to see the car moving up from a half-mile back. But, those people aren’t even the problem. The system that allows them to continue to drive when not in actual control of their vehicle is the problem. Replacing those people with computer-controlled cars might be an improvement (at least *something* would be in control of their car besides inertia), but it would be better in many ways to remove them from the road entirely (except as passengers in a bus or taxi).

If airplane pilots were held to the same testing standards as car drivers, the sky would practically rain 747s.

MelonCollie on April 20, 2013 at 6:19 PM

No kidding. And autopilots are not used in places that are as crowded as a roadway at rush hour. And, when they are used where precision is important, there is a large infrastructure to ensure that precision is available.

I’ll just point out how often I have to sit at a traffic light when there is absolutely no one coming for a mile because the software says so. I don’t want that happening because of my car, too.

GWB on April 20, 2013 at 8:15 PM

I am shocked – shocked – that conservatives are deathly afraid of new things that might benefit society as a whole but would change things in the process.

triple on April 20, 2013 at 11:38 AM

Nope. Some of us are actual computer programmers who understand how software is developed and tested. I suppose if your soul is prepared, then feel free to be a beta tester. But before you do, I should let you in on a little inside secret of software development… it’s all beta, all the time.

Besides, I don’t really see why this would benefit society as a whole. Yes, I’m sure you’ll cite accidents and bodily injuries… but I just see another instance where people willfully give up taking personal responsibility because it’s just so much easier if everything is done for them so they don’t have to take risks…

Where does that type of thinking lead? …certainly not toward more freedom.

So your sarcastic remark has a tiny mustard seed of truth in it… conservatives are not for new things that turn humans into timid little drones. And I am happy to resemble that remark.

dominigan on April 20, 2013 at 8:31 PM

That being said, driving is a task that computers would be IMMEASURABLY better at doing than humans. Humans are great at many things; driving isn’t one of them.

Defenestratus on April 20, 2013 at 6:11 PM

Says the person who seems to be unaware that software is written by HUMANS, usually in corporate cubicles, who are unable to predict all the circumstances under which that software will attempt to navigate a vehicle.

If you think it is so great… describe to me in simple pseudo-code flow what the decision tree would be to route the vehicle when arriving at a destination with a one-way brick street lined with vehicles, with the destination parking lot closed for construction with a chain across it, and all spots in the surrounding streets filled with parked cars, located near a college campus where kids are moving in? I just described my son’s college area, which is extremely difficult to navigate through. I would like to hear your predictive logic for determining the route, especially if the “passenger” has fallen asleep… and explain to me why I should prefer this logic to making my own decision.

I’ll await your expert response to my comment.

Btw, do you know how many “bugs” are identified in most software? It’s because the software is being used in ways unimaginable during the original programming. What you observe as a bug is simply the user using it in ways that do not match with the original requirements (also developed by humans).

dominigan on April 20, 2013 at 8:42 PM

That being said, driving is a task that computers would be IMMEASURABLY better at doing than humans. Humans are great at many things; driving isn’t one of them.

Defenestratus on April 20, 2013 at 6:11 PM

Maybe I shouldn’t keep harping on this comment, but this really bugs me.

Computers are hardware… glorified calculators. They don’t do anything but flip micro switches from 0 to 1 and back.

It is the SOFTWARE that makes the decisions that you seem so blind to. Software is written by humans, usually in a corporate cubicle setting, to meet requirements written by other humans, and tested against those requirements. But key to all this is understanding that those requirements… how a user will use the system… must first be envisioned before it can be programmed.

Your comment seems to violate my spidey (common) sense because you are willing to trust other people thousands of miles away, that you don’t even know, to make a decision that is happening to you right now. You are trusting that they somehow envisioned the conditions through which your vehicle is navigating, and trusting that they got everything right.

Sorry, but my software development experience tells me that the more complex you create software, the more fragile it becomes.

dominigan on April 20, 2013 at 8:58 PM

I should let you in on a little inside secret of software development… it’s all beta, all the time.

and

Sorry, but my software development experience tells me that the more complex you create software, the more fragile it becomes.

dominigan on April 20, 2013 at 8:58 PM

Ditto and then some.

AesopFan on April 20, 2013 at 9:16 PM

So Jazz Shaw is opposed to automated cars for the extremely dispassionate and logical reasons that (1) he saw a purely fictional movie about robots and (2) way back when his father was against newfangled power brakes and power steering (and probably electric starters instead of hand-cranking as well).

Even a robot could write posts that were less moronic – especially since there will come a time in the next decade or two when millions of aging but healthy baby boomers who live in suburbs will still need cars but no longer be able to drive safely.

bgoldman on April 20, 2013 at 9:51 PM

bgoldman on April 20, 2013 at 9:51 PM

Congratulations on completely ignoring the responses from people who are actual experts in the field and who have forgotten more than you (and I!) have ever learned about software.

Everyone is so bloody gung-ho for the convenience of turning all vehicles into glorified taxis.

MelonCollie on April 20, 2013 at 11:12 PM

A car smart enough to drive itself can be smart enough to have a chemical sensor unit and detect if it is loaded with explosives. It can then simply call 911 and report it. It would also be smart enough to detect if its chemical sensor was disconnected — and report that.

SunSword on April 20, 2013 at 8:03 PM

So there are no legitimate uses for explosives that would require transportation in a vehicle? What about ammonium nitrate fertilizer? Gunpowder?

James on April 20, 2013 at 11:14 PM

MelonCollie on April 20, 2013 at 11:12 PM

So a part-time blogger who draws supposedly factual conclusions from a science-fiction movie and his father’s 60-year-old prejudices about automotive improvements is “people who are actual experts in the field and have forgotten more than you (and I!) have ever learned about software”?

bgoldman on April 21, 2013 at 2:47 AM

I have never driven a car in which you could not turn off the engine while in drive. The interlock merely prevents the key from turning far enough to lock the steering wheel.

Slowburn on April 21, 2013 at 5:31 AM

I don’t like power steering and power brakes because if the engine quits it suddenly becomes much harder to steer and stop the car.

Toyota outsold Government Motors in the disastrous destruction of wealth called Cash for Clunkers, and then their cars suddenly became unstoppable runaways. Except that stepping on the brake stopped them, turning them off stopped them, and taking them out of gear stopped them.

Slowburn on April 21, 2013 at 5:51 AM

bgoldman on April 21, 2013 at 2:47 AM

Shut up before you make an even BIGGER idiot of yourself. Here’s ONE example of what I was talking about:

Sorry, but my software development experience tells me that the more complex you create software, the more fragile it becomes.

dominigan on April 20, 2013 at 8:58 PM

MelonCollie on April 21, 2013 at 9:31 AM


MelonCollie on April 21, 2013 at 9:31 AM

Does your “software development experience” lead you to base conclusions on science-fiction movies and a father’s old biases against newfangled automobile options the way Shaw did?

bgoldman on April 21, 2013 at 11:45 AM

I doubt this will happen. Too many lawsuits.

virgo on April 21, 2013 at 12:44 PM

I’m with Jazz on this–no thanks!

How many recalls of cars have we heard about due to sticking gas pedals, runaway acceleration, or cruise control that gets stuck at high speed with no manual override?

While there may be some interest in a car that would automatically brake if an obstacle crossed in front of it at close range, a “self-driving” car which applied its own gas pedal is inherently dangerous. Would we then have drivers in accidents claiming, “I tried to stop but the car wouldn’t let me”?

Even automatic braking for obstacles or red lights could have its pitfalls. If a light turns yellow at fairly close range, a driver on a slippery road will probably continue through the intersection to avoid skidding, or possibly being hit by another car from behind. A “robo-car” that automatically braked might cause a skid, or an accident from a tailgating driver that could be avoided.

Also, how many accidents could be caused by a car that braked for a squirrel, and then a tailgating driver hits it from behind? Computers can’t make squirrels any smarter, and they’re rather stupid!

Steve Z on April 21, 2013 at 9:44 PM

Does your “software development experience” lead you to base conclusions on science-fiction movies and a father’s old biases against newfangled automobile options the way Shaw did?

bgoldman on April 21, 2013 at 11:45 AM

Does your IQ being at the level of a lobotomized liberal prevent you from having a rational thought?

MelonCollie on April 22, 2013 at 10:54 AM

That a network of network-controlled cars could operate more safely is probably quite true. Eventually the network and algorithms would be robust enough to handle the overwhelming majority of all system and control failures in a fail-safe mode.

But this would likely be true only for travel on a fixed network of roads. Certainly, going off-road, even to drive up to your house on a gravel road, would be a problem, since the operator would have to be able to turn OFF the automatic control systems and assume manual control, or be limited to whatever the system will allow. The presence of a manual override would also imply a real danger of malicious intent on the part of an operator.

The real problem initially will be that there will be a hybrid system, automatic control and manual control, and the relative lack of a fail-safe would mean that the occupant of an automatic car must always be instantly ready to take manual control. This could be a very nerve-wracking situation to be in, akin to flying a plane on autopilot when, without warning, the autopilot shuts down and you, the pilot, have to take immediate action (fortunately the plane would be in a stable situation, so you have a bit of time to adjust).

Another problem would be if a car dropped out of the network while still in an area where the network was otherwise functioning. If the car’s fail-safe is to stop immediately, the network would have to be fast enough to stop everyone else who might collide with the stationary car, or there would be a collision.

Let us not forget that if you leave the “civilized” area with its automatic driving for a less “civilized” area that lacks it, then you have very inexperienced drivers at the wheel of old-fashioned cars, with all that implies.

So the transition would be the problem area, since the new tech would tend to be expensive, the networks over the roadways would have to be extensive and reliable (to some extent at least), and if you allowed for manual override then, well, you can see. It might happen, but right now it is a potentially frightening prospect.

Russ808 on April 22, 2013 at 2:23 PM

Not only do I insist on driving my own car, but it must have three pedals on the floor.

J Baustian on June 3, 2013 at 1:11 AM
