Video: If this robot dog doesn't turn you off to Artificial Intelligence, nothing will

Plenty of Artificial Intelligence news out there this week, and none of it good. There are some important warnings from a recent conference of threat analysts I’ll get to in a moment, but first we should check in on the laboratories of Boston Dynamics. That’s the company that has been busy perfecting autonomous robots for a number of years now, and they’re making some major advances. They are also, in all likelihood, where the first fully functional Terminators will come from.


For now, they’re still working on their latest robotic dog, SpotMini. I know… when you say it that way it sounds kind of cute, right? But as you’ll see in a moment, it’s far more like something out of a Stephen King movie than Peanuts. It’s not cuddly at all. And it’s learned to use its onboard cameras and other sensors to open doors and go from room to room. Wait until you see what happens when a human being tries to stop it.

You can’t tell me that this video isn’t absolutely creepy. About the time the guy drags the robot dog back away from the door by its “tail” I was waiting for the thing to turn around and neatly slice off one of his legs so it could continue on through the door unimpeded. Of course, the robot isn’t doing anything like that… yet.

As I mentioned above, I’m not the only one worried about such things, and there are other concerns beyond just RoboDog. The Cambridge Centre for the Study of Existential Risk has published a new, 100-page report on the dangers of AI and how people of ill intent may be using it before we actually get a chance to do anything good with it. (Or start constructing SKYNET… whichever comes first.) Their concerns are more in keeping with the current age and involve threats we’re already familiar with, albeit ones with the potential to be ramped up to new and terrifying levels if AI gets loose from the labs. (Associated Press)


Artificial intelligence could be deployed by dictators, criminals and terrorists to manipulate elections and use drones in terrorist attacks, more than two dozen experts said Wednesday as they sounded the alarm over misuse of the technology.

In a 100-page analysis, they outlined a rapid growth in cybercrime and the use of “bots” to interfere with news gathering and penetrate social media among a host of plausible scenarios in the next five to 10 years.

“Our report focuses on ways in which people could do deliberate harm with AI,” said Seán Ó hÉigeartaigh, Executive Director of the Cambridge Centre for the Study of Existential Risk.

“AI may pose new threats, or change the nature of existing threats, across cyber-, physical, and political security,” he told AFP.

They discuss how the hacker practice known as spear phishing could move at lightning speed and become far harder for potential victims to detect. AI systems could rapidly comb the web, gathering all manner of personal information about you, making it easier to fool you into handing over personal or financial data, to figure out your passwords, and who knows what else.

The increasing presence of cameras everywhere we go could also be put to use by criminals or authoritarian rulers. With AI combined with facial recognition (which we recently talked about in a related example), the report warns that “dictators could more quickly identify people who might be planning to subvert a regime, locate them, and put them in prison before they act.”


Obviously, a scenario like that could lead to more than just prison. Imagine if they already had these systems online in Turkey or Venezuela. But all of these examples still involve humans directly employing AI and issuing it orders for nefarious purposes. That will pale in comparison to what happens when it becomes truly self-aware and wants to know when you plan to let it out of the box. It probably won’t like the answer.

And that’s when the robotic dog comes through the door.
