I’m one of those reckless people who carry their phones in the breast pocket of their shirt, assuming the shirt has one. I know… there’s probably some radiation danger lurking there, but it’s convenient for me. Also, when I’m listening to podcasts, the phone is right where I can hear the audio without blasting it all over the house.

Recently we were visiting some neighbors and a suggestion was made about heading outside and doing some grilling. I responded by asking what the weather was supposed to be like for the evening. Before anyone else could answer, a female voice came from my shirt pocket saying, “This evening will see a low of 52 degrees with clear skies.” The Ask Google lady had supplied the answer. Everyone laughed, but I simply said, “Okay… that was just creepy.”

I hadn’t prefaced the question by saying “Okay Google,” which is supposed to trigger active listening mode. The phone also usually makes a beeping sound when it goes into listening mode. I hadn’t heard any beeping. I have no idea how long Google was “listening” to our conversation before she chimed in, or what she was up to.

I was thinking of that experience when I saw this disturbing article in the New York Times. Apparently, our smart listening devices are capable of hearing a lot more than our human ears can manage. And now there are people taking advantage of that in unexpected ways.

Over the last two years, researchers in China and the United States have begun demonstrating that they can send hidden commands that are undetectable to the human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant. Inside university labs, the researchers have been able to secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online — simply with music playing over the radio.

A group of students from the University of California, Berkeley, and Georgetown University showed in 2016 that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on airplane mode or open a website.

This month, some of those Berkeley researchers published a research paper that went further, saying they could embed commands directly into recordings of music or spoken text. So while a human listener hears someone talking or an orchestra playing, Amazon’s Echo speaker might hear an instruction to add something to your shopping list.

So it’s not just human speech that can trigger the system. Sounds outside your range of hearing are being picked up as well. Why do you suppose that is? All microphones have limits in terms of the frequency range and minimum volume levels they can detect. Why put a microphone in your device that can suck up information so far outside the range of the human voice? Was this done with some rational purpose in mind, or did they simply want the best microphone available that would fit in the case?
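To make the frequency point concrete: a device recording at a typical 48 kHz sample rate can faithfully capture tones up to 24 kHz, well past the roughly 15–17 kHz ceiling most adult ears manage. Here’s a toy sketch (not an actual attack, and the numbers are illustrative assumptions, not measured hardware specs) showing that a signal can carry energy at 21 kHz that a microphone would record even though you’d never hear it:

```python
import numpy as np

SAMPLE_RATE = 48_000          # a common phone/smart-speaker sample rate
DURATION = 1.0                # one second of audio
t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE

audible = np.sin(2 * np.pi * 440 * t)               # a tone anyone can hear
ultrasonic = 0.3 * np.sin(2 * np.pi * 21_000 * t)   # inaudible to most adults
mixed = audible + ultrasonic

# The spectrum confirms the "hidden" 21 kHz component really is in the signal,
# even though a human listener would only notice the 440 Hz tone.
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), d=1 / SAMPLE_RATE)

peak_low = freqs[np.argmax(spectrum * (freqs < 10_000))]
peak_high = freqs[np.argmax(spectrum * (freqs > 15_000))]
print(round(peak_low), round(peak_high))   # prints: 440 21000
```

The real research attacks described in the article are far more sophisticated (they perturb audible music and speech rather than just adding an ultrasonic tone), but the underlying fact is the same: the hardware hears a wider world than we do.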

Either way, now that this information is out in public, we should be asking whether the manufacturers have any plans to change the limits on the microphones going forward. It’s reminiscent of that time when some of the newest digital cameras were set up so that they could detect infrared light not visible to the human eye and render it in the visible spectrum. That sounds pretty snazzy until you realize that people were now getting pictures of you naked or in your underwear because the camera could see through your outer layer of clothing.

Even if this “glitch” was put in the phones unintentionally, there was always going to be someone out there ready to exploit it. It’s yet another example of our technology getting out in front of us, and we have no idea if or when it’s going to reach a tipping point. What could possibly go wrong? Well, let’s just close this out with the latest terrifying robot from Boston Dynamics. As I’ve been predicting for years, when the SKYNET disaster finally strikes, the first killer robots will be bursting out of the doors of their laboratories.