The biases baked into artificial intelligence

Why it matters: One study from the MIT Media Lab found that leading facial recognition systems correctly identified male faces 99 percent of the time, but erred up to 35 percent of the time on dark-skinned female faces. The consequences are significant, because algorithms now help drive decisions ranging from who can travel freely to who gets arrested and how long they spend in jail.
