
5 Ways You Already Rely On Artificial Intelligence

January 29, 2016 | blog-post, government, business, education

Artificial intelligence is typically represented in pop culture as a robot with human characteristics, from HAL in Stanley Kubrick’s classic “2001: A Space Odyssey” and Arnold Schwarzenegger’s “Terminator” to Samantha and Ava in “Her” and “Ex Machina.”

These depictions limit us to thinking of artificial intelligence as talking, human-ish robots. But the truth is, many of the technologies we use today have roots in the field of artificial intelligence, or AI. Just yesterday, Google announced that it had created an AI that beat a top human player at Go, one of the last games that people still play better than machines.

But what is AI? The Stanford Encyclopedia of Philosophy defines the field as “the subfield of Computer Science devoted to developing programs that enable computers to display behavior that can broadly be characterized as intelligent.” This encompasses immediate applications such as speech-to-text, computer vision, and classification algorithms, all of which contribute to the long-term goal of developing intelligent, autonomous agents.

It’s important to understand how applications in artificial intelligence are used today so we can better prepare for how they are already changing our world. The forecasts are mixed. While the World Economic Forum warns that AI could kill off 5 million jobs by 2020, tech leaders like Mark Zuckerberg continue to pursue AI research “for the amazing amount of good it will do in the world. It will save lives by diagnosing diseases and driving us around more safely. It will enable breakthroughs by helping us find new planets and understand Earth's climate. It will help in areas we haven't even thought of today.”

Either way, AI is here to stay. Here are five ways that artificial intelligence has been integrated into technology that we use every day:

Apple’s Siri

AI Sub-fields Used: Natural Language Processing

Apple’s Speech Interpretation and Recognition Interface, or Siri, made her debut on the iPhone 4S in 2011 and has been an integral feature of Apple’s smartphones ever since. Siri can interpret voice commands to make phone calls, find directions, suggest nearby restaurants, create reminders, and more. Artificial intelligence is at the heart of Siri: it takes spoken language, converts it to text, parses and analyzes the text for keywords that signify certain commands, and then executes and responds to those commands. With continued use, Siri even learns to recognize a user’s speech patterns and preferences. Siri can also beatbox, snarkily respond to inane math questions, and coyly deflect marriage proposals.
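
To give a feel for that “parse the text for keywords” step, here is a minimal sketch in Python of keyword-based intent matching on already-transcribed speech. It is purely illustrative; the intent names, patterns, and functions are assumptions for this example, not how Siri actually works under the hood.

```python
import re

# Toy intent parser: a rough sketch of the "look for command keywords" step.
# The intents and patterns below are invented for illustration only.
INTENT_PATTERNS = {
    "call":       re.compile(r"\b(call|dial|phone)\s+(?P<target>\w+)", re.I),
    "directions": re.compile(r"\b(directions to|navigate to)\s+(?P<target>.+)", re.I),
    "reminder":   re.compile(r"\bremind me to\s+(?P<target>.+)", re.I),
}

def parse_command(transcribed_text):
    """Return (intent, argument) for the first matching pattern, else (None, None)."""
    for intent, pattern in INTENT_PATTERNS.items():
        match = pattern.search(transcribed_text)
        if match:
            return intent, match.group("target")
    return None, None

print(parse_command("Hey, remind me to buy milk tonight"))
# ('reminder', 'buy milk tonight')
```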

That said, anybody with an iPhone can tell you that Siri isn’t always accurate at interpreting commands. But Apple’s acquisition of the speech-related artificial intelligence company called VocalIQ in late 2015 may help nudge Siri a little closer to becoming Scarlett Johansson.

Facebook’s facial recognition technology

AI Sub-fields Used: Computer Vision

Have you ever uploaded a photo to Facebook, only for it to automatically recognize your friends and correctly suggest tagging them? Facebook’s facial recognition algorithm uses your profile pictures and any photos in which you’ve been tagged to calculate a unique number, or “template,” based on your features, like the distance between your eyes, nose, and ears. The software learns more about what you look like with every photo of you that gets posted and tagged.
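
The “template” idea boils down to turning a face into a list of numbers and comparing lists. Here is a toy sketch, assuming made-up features and a made-up matching threshold; Facebook’s real system learns its templates with deep neural networks rather than hand-picked measurements.

```python
import math

# Each face is reduced to a numeric feature vector ("template"); two photos are
# judged to show the same person when their vectors are close enough.
# Features, values, and the threshold are invented for illustration.
def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

known_templates = {
    "alice": [0.42, 1.10, 0.73],   # e.g. normalized eye spacing, nose-to-ear ratios, ...
    "bob":   [0.91, 0.35, 1.52],
}

def suggest_tag(new_template, threshold=0.25):
    """Return the closest known person if within the threshold, else None."""
    name, dist = min(
        ((n, distance(t, new_template)) for n, t in known_templates.items()),
        key=lambda pair: pair[1],
    )
    return name if dist < threshold else None

print(suggest_tag([0.40, 1.08, 0.75]))  # 'alice'
```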

As a result, Facebook now has a huge facial database covering its 1.6 billion monthly users. And it’s only going to get more sophisticated, and creepier. Last year, Facebook’s head of AI, Yann LeCun, announced that the company had developed an experimental algorithm that could correctly identify people in photos even when their faces were partially hidden or couldn’t be seen at all. “There are a lot of cues we use. People have characteristic aspects, even if you look at them from the back,” LeCun said.

Self-Parking Cars

AI Sub-fields Used: Computer Vision, Robotics

Parallel parking in a tight spot doesn’t come easily to most people. Thankfully, self-parking cars have been around for years. For example, one luxury car brand’s “intelligent parking assist system” lets drivers simply pull up beside a parking space and let the system do the rest. Sensors measure the length of the space and the distance of your front and rear bumpers from obstacles or other cars. Press a button, and the car’s computer calculates the best angles at which to steer into position.
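
In spirit, the first step is simple geometry: is the measured gap long enough, and how sharply should the car cut in? The sketch below is a back-of-the-envelope illustration with invented numbers and an invented steering heuristic, not any manufacturer’s actual control logic.

```python
import math

# Toy parking-assist check: decide whether the sensed gap fits the car and
# pick a rough initial steering angle. All constants are assumptions.
CAR_LENGTH_M = 4.5
MIN_MARGIN_M = 0.8  # clearance needed to maneuver in

def can_park(space_length_m):
    return space_length_m >= CAR_LENGTH_M + MIN_MARGIN_M

def initial_steering_angle(space_length_m, lateral_offset_m):
    """Crude heuristic: tighter spaces and bigger offsets need a sharper cut."""
    slack = space_length_m - CAR_LENGTH_M
    return math.degrees(math.atan2(lateral_offset_m, slack))

if can_park(5.6):
    print(f"steer in at ~{initial_steering_angle(5.6, 1.2):.0f} degrees")
else:
    print("space too small, keep looking")
```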

The next frontier is self-driving cars, on which a number of companies like Google and Tesla are working furiously. Some predict that there will be 10 million self-driving cars on U.S. roads by 2020, which is a heartbeat away.

Google Translate

AI Sub-fields Used: Natural Language Processing

Nuanced translation remains one of the things machines struggle with. Google, with its unprecedented access to vast amounts of human-translated text on the web, has been a pioneer in machine translation. Google Translate uses a method called statistical machine translation: it searches for patterns in billions of words of text already translated by humans, such as books and United Nations records, and applies those patterns to new translations. As of 2013, Google Translate was providing a billion translations a day for over 200 million users. Today it supports translation of text, speech, and even images between any two of its 90 supported languages.
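
At its simplest, the statistical approach amounts to counting how often phrases have been paired in existing translations and picking the most frequent pairing for new input. The toy phrase table below is invented for illustration and ignores everything that makes real translation hard, like word order, context, and phrases it has never seen.

```python
from collections import Counter, defaultdict

# Build a tiny phrase table from already-translated pairs, then translate by
# picking the most frequent target phrase. The data here is made up.
aligned_pairs = [
    ("good morning", "buenos días"),
    ("good morning", "buenos días"),
    ("good morning", "buen día"),
    ("thank you", "gracias"),
]

phrase_table = defaultdict(Counter)
for source, target in aligned_pairs:
    phrase_table[source][target] += 1

def translate(phrase):
    candidates = phrase_table.get(phrase)
    return candidates.most_common(1)[0][0] if candidates else phrase

print(translate("good morning"))  # 'buenos días'
```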

Netflix Content Recommendations (& Others)

AI Sub-fields Used: Machine Learning

Before the digital age, people turned to librarians, radio DJs, and film critics to discover new entertainment. Today, sophisticated algorithms predict what we might like by spotting patterns in the choices made by millions of other people similar to ourselves. Recommendation engines like those used by Netflix (as well as Amazon, Spotify, and others) are built on the fact that our taste is rarely as unique as we like to believe. About 70% of the content watched on the video platform comes from recommendations. Another advantage machines have over people is that they’re impervious to pretentiousness. “A lot of people tell us they often watch foreign movies or documentaries. But in practice, that doesn’t happen very much,” says Netflix’s VP of product innovation and personalization algorithms, Carlos Gomez-Uribe.
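
Stripped to its core, the idea is: find people whose history overlaps with yours, then suggest what they watched that you haven’t. The sketch below does exactly that with invented users and titles; production recommenders at Netflix and elsewhere use far richer models than this simple overlap count.

```python
# Minimal overlap-based recommender: users and titles are invented for illustration.
watch_history = {
    "you":    {"Chef's Table", "Narcos"},
    "user_a": {"Chef's Table", "Narcos", "Making a Murderer"},
    "user_b": {"Friends", "The Office"},
}

def recommend(target="you"):
    mine = watch_history[target]
    # Score the other users by how many titles they share with the target.
    best_match = max(
        (u for u in watch_history if u != target),
        key=lambda u: len(watch_history[u] & mine),
    )
    # Suggest whatever the most similar user watched that the target hasn't.
    return watch_history[best_match] - mine

print(recommend())  # {'Making a Murderer'}
```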

What now?

To quote William Gibson: “The future is already here — it's just not evenly distributed.” The language processing technology behind Siri, the facial recognition technology behind Facebook’s photo tagging, and the recommendation engines backing Netflix represent serious advances in the field. These technologies are currently locked into the companies that developed them, but soon a tech company could create a product that combines all these existing pieces to develop a talking house/butler like Jarvis.

Want to learn more? Join us in Manila this Feb 18 & 19 for a conference on how artificial intelligence applications are impacting the world around us. We’re flying in data scientists with a global view of the field. Get inspired, get thinking, and get ready for the decade ahead.

Photo Credits:
- Cover image from Ex Machina
- Star Wars
