What do you think of when someone says “AI” or “Artificial Intelligence”? For most of us, it conjures up an image of the future. Of movies and robots and technological magic. It doesn’t much evoke the here and now.
But that’s not so. Artificial intelligence is already out of the box. And while it might not be as slick as the movies, it has vast applications in almost every field, from business to medicine, traffic jams to Facebook photos. Most of us use or benefit from artificial intelligence every day. And if you’re running an online business, you’re definitely benefiting from it every day.
Before we dive too deep into how artificial intelligence is shaping your life, let’s set up some context. Because while artificial intelligence is already happening, that’s not the term most computer scientists use for it. The better term is “machine learning”.
Machine learning is really the only kind of AI we have. And in computer science circles, “AI” is a bit of a squishy term. It’s got too many popular sci-fi connotations, so programmers prefer “machine learning”. It’s better defined. And while “machine learning” might not sound quite as sexy as “artificial intelligence”, it is basically the same thing. Machine learning is real-world AI.
To best understand how machine learning works in the here and now, we’ll need a brief history lesson.
Machine learning started out as simple pattern recognition. It was focused on tasks as simple as identifying the numeral three in a photo, for instance.
That took quite a lot of time to figure out. We understand a three as soon as we see it, but a computer program has to measure dark and light shapes, then sort them into a pattern it’s been told is a three.
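To make that concrete, here’s a toy sketch of the idea – purely hypothetical, and far simpler than real OCR: compare a grid of dark and light pixels against a stored template for the digit three and count how many pixels line up.

```python
# Hypothetical template matching: 1 = dark pixel, 0 = light pixel.
TEMPLATE_THREE = [
    [1, 1, 1],
    [0, 0, 1],
    [1, 1, 1],
    [0, 0, 1],
    [1, 1, 1],
]

def similarity(image, template):
    """Fraction of pixels that match the template."""
    matches = sum(
        1
        for img_row, tpl_row in zip(image, template)
        for img_px, tpl_px in zip(img_row, tpl_row)
        if img_px == tpl_px
    )
    total = len(template) * len(template[0])
    return matches / total

# A slightly noisy "scan" of a three: one pixel differs from the template.
scan = [
    [1, 1, 1],
    [0, 0, 1],
    [1, 1, 1],
    [0, 1, 1],  # stray dark pixel
    [1, 1, 1],
]

score = similarity(scan, TEMPLATE_THREE)
print(f"looks like a 3: {score:.0%} match")  # 93% match (14 of 15 pixels)
```

Real systems learn their "templates" from thousands of examples rather than hard-coding them, but the underlying task – turning shapes into a recognized pattern – is the same.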
This somewhat “simple” task of turning images into machine-readable information has big implications, and applications. One everyday application is OCR – optical character recognition – software. That’s used to take a scan of a printed piece of paper and interpret that scan into words, sentences and paragraphs. It’s thanks to OCR software that we were able to digitize the world’s libraries far faster than if someone had to retype all those pages.
So that’s basic pattern recognition – or at least one form of pattern recognition. There are other types of patterns to interpret. Many machine learning programs can be distilled down to these tasks:
- Signal processing
- Voice recognition
- Text recognition
- Image recognition
Several things separate machine learning from simple pattern recognition. They include:
- Much larger data sets – like your company’s entire order history, or customer interactions from emails, social media, retail stores, and your website.
- Pattern matching and data sorting – the ability to take those data sets and inputs and line up the data to reveal relationships and patterns that a human operator might not have seen, or might not have known to ask for.
- The ability to “learn” as it is given more information – the more data machine learning applications have access to, the more accurate they become.
An example of this would be how Google’s search engine shows us search results based on what we’ve clicked before and other preferences or behavior. So what you see when you search for “red coat” is different than what I’d see.
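Here’s a toy illustration of that kind of personalization – hypothetical, and nothing like Google’s real system: the same results get re-ranked per user, based on which categories that user has clicked before.

```python
from collections import Counter

# The same three results for the query "red coat".
results = ["red coat (fashion)", "red coat (history)", "red coat (movie)"]

def category(result):
    """Pull the category label out of a result string."""
    return result.split("(")[1].rstrip(")")

def rerank(results, click_history):
    """More past clicks on a category pushes its results up the page."""
    prefs = Counter(click_history)
    return sorted(results, key=lambda r: -prefs[category(r)])

# Two users, same query, different orderings.
print(rerank(results, ["fashion", "fashion", "movie"]))
print(rerank(results, ["history", "history"]))
```

The more clicks the system records, the more confident its per-user ordering becomes – which is the “learning” part in miniature.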
Those are already some pretty powerful capabilities, but one new evolution of machine learning takes all this even further. It’s called “deep learning”.
Deep learning is still fundamentally a pattern recognizer, but with deep learning, software can pull in vast data sets and identify patterns that a human might not even know to request. These patterns are called “indicators”.
Deep learning machines also employ “neural nets” – a series of filters loosely modeled on how neurons in the brain are connected. Information and metrics pass through those neural nets, which surface similarities and patterns too sophisticated for a human mind to grasp. They then use those similarities and patterns to execute their tasks with “spooky” precision – whether that’s delivering one perfect page for a search query from 100,000,000,000 possible pages, or pinpointing a fraudulent charge out of 50 million transactions processed in the last hour.
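Stripped to its bare bones, one of those “filters” is just a layer of simple neurons: each one weighs its inputs, adds a bias, and squashes the result. Here’s a minimal sketch with made-up numbers (real networks stack many such layers and learn the weights from data):

```python
import math

def sigmoid(x):
    """Squash any number into the range (0, 1)."""
    return 1 / (1 + math.exp(-x))

def layer(inputs, weights, biases):
    """One layer: each neuron takes a weighted sum of all inputs plus a bias."""
    return [
        sigmoid(sum(w * x for w, x in zip(neuron_weights, inputs)) + b)
        for neuron_weights, b in zip(weights, biases)
    ]

# Hypothetical numbers: 3 input signals flowing through 2 neurons.
signals = [0.5, 0.1, 0.9]
weights = [[0.4, -0.2, 0.6], [-0.3, 0.8, 0.1]]
biases = [0.0, 0.1]
print(layer(signals, weights, biases))
```

A “nine-layer” network like the ones described later is just this idea repeated: each layer’s outputs become the next layer’s inputs.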
While deep learning is being used in some applications right now, it is truly cutting edge. It is the closest we’ve come to the “artificial intelligence” of science fiction. That’s exciting stuff, but so is what we can already do with “plain old” machine learning.
Here are just a few of the real world applications for machine learning that are being used right now. You’ll almost certainly use a few of them just today.
1) Text recognition: Google’s RankBrain for better search results
One of the most widely talked about applications of machine learning in the digital marketing world comes from Google. It’s called RankBrain, and Google has been testing it in the search results since October 2015.
Google’s use of machine learning, and deep learning, is largely focused on understanding text. It’s trying to understand “search intent”, i.e., what people actually want when they type in a search phrase. For instance, if someone types in “pizza”, are they looking for a recipe, a delivery service, or the history of Italian food?
Understanding search intent is considerably more sophisticated than what Google used to do even a few years ago. At the beginning, keywords were static things – so a search for “red coat” would bring up pages that
a) had the term “red coat” in them and
b) had lots of links pointing to that red coat page
As Google’s algorithm got smarter, it started to weigh some links more heavily than others, and then throw in factors like average visitor’s time on page, how many words were on the page – the works. At current count, there are about 200 different signals Google uses to rank a page.
But RankBrain is different: it interprets what a user’s search query actually means. RankBrain does not currently affect search engine rankings directly. It’s used exclusively to understand user queries and then feed those pre-processed queries into the Google search algorithm. In other words, RankBrain helps the Google algorithm interpret what people want when they enter a given search phrase.
As one Google engineer put it:
Lemme try one last time: Rankbrain lets us understand queries better. No affect on crawling nor indexing or replace anything in ranking
— Gary Illyes (@methode) March 18, 2016
If you want a more animated and in-depth discussion of what RankBrain is and where machine learning for search engines is going, watch this video from the folks at Moz:
By the way – Google isn’t the only company that applies machine learning to its search function. Shutterstock also uses machine learning to help its users sift through the millions of photographs on its site.
2) In “recommendation engines” like content or product recommendations.
Machine learning is particularly good at showing related information. So it would be great at suggesting related news stories. That’s one of its applications now, but as you can guess, this could also be applied to showing related products on an ecommerce site. That could be taken a step further by customizing those product recommendations based on who’s viewing the page.
At Respondr, we’re set up to take that even further – to recommend not just which product someone might also like to see, but to deliver that recommendation through whichever channel (email, text, web page, social) the user is most likely to respond to.
And that’s not all. We’re also set up to time that recommendation. So you’re showing the right product to the right person via the right channel at the right time.
3) Face recognition: Facebook’s “DeepFace”.
Ever uploaded a photo of a bunch of people at a party, only to have Facebook recognize most of them, even though you barely recognize a couple of them from the photo? At its simplest, that’s image recognition. But Facebook has brought it to such a level that they’ve named their machine learning face recognition program. It’s called “DeepFace”.
DeepFace isn’t just machine learning – it’s a true deep learning application. DeepFace uses a nine-layer deep neural network that draws from 120 million parameters. It was (and is?) trained “on the largest facial dataset to-date, an identity labeled dataset of four million facial images belonging to more than 4,000 identities.”
It’s stuff like this that keeps international spies off Facebook. And might even keep some of us regular people off it, too.
Of course, Google can recognize you in photographs, too.
4) Detecting security breaches and online fraud.
Ever got a call from your bank about a suspicious transaction? That’s probably the result of machine learning. The “machine” has been set up to read and analyze billions of transactions. It tracks the transactions on your card and compares them to your typical purchasing behavior. If there’s an input (i.e., a purchase) that the algorithm senses is out of line, you get a call.
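The simplest version of that “out of line” check is an outlier test. Here’s a hedged sketch of the idea – real fraud systems use far richer features (merchant, location, timing) and learned models, not just a single threshold:

```python
import statistics

def is_suspicious(history, new_charge, threshold=3.0):
    """Flag a charge that sits far outside the cardholder's usual spending."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return abs(new_charge - mean) > threshold * stdev

# Hypothetical purchase history for one cardholder, in dollars.
typical = [12.50, 40.00, 8.75, 22.10, 35.00, 15.60, 27.30]

print(is_suspicious(typical, 31.00))    # in line with past purchases
print(is_suspicious(typical, 2400.00))  # this one triggers the call
```

The machine learning part comes in when the system tunes what counts as “out of line” for each cardholder automatically, from the data itself.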
This is one application complex enough to sometimes warrant the use of deep learning. There are simply so many transactions occurring, and it’s so important to not miss a fraudulent charge, that large transaction processors are willing to invest in anything that gives them an edge.
5) To detect computer viruses and other malware.
There’s a crazy number of malware programs lurking on the Internet. Fortunately, most of them aren’t very original – only about 2% of the code is unique from one malware program to the next. That’s good news for us. It allows a program driven by machine learning to recognize and deal with these nasty programs fast.
DeepInstinct is one of the companies offering this type of cybersecurity.
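Here’s a toy sketch of the shared-code idea – hypothetical, and not how DeepInstinct or any real antivirus engine actually works: since malware variants share most of their code, measuring overlap with known samples can catch lookalikes quickly.

```python
def ngrams(data, n=4):
    """All overlapping n-character chunks of the input."""
    return {data[i:i + n] for i in range(len(data) - n + 1)}

def similarity(sample_a, sample_b):
    """Jaccard similarity of the two samples' n-gram sets."""
    a, b = ngrams(sample_a), ngrams(sample_b)
    return len(a & b) / len(a | b)

# Made-up code snippets standing in for binary samples.
known_malware = "mov eax, 1; xor ebx, ebx; int 0x80; jmp start; ret"
new_variant   = "mov eax, 1; xor ecx, ecx; int 0x80; jmp start; ret"

score = similarity(known_malware, new_variant)
print(f"overlap with known sample: {score:.0%}")
```

A high overlap score is a cheap, fast signal that a “new” file is really an old threat with minor edits.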
6) To scour legal contracts
Hate reading legalese? You’re not alone. Some of the people at Legal Robot dislike it so much that they’re trying to automate the work of reading and assessing contracts.
How do they do it? They offer access to a machine learning-backed program that can sift through legal documents. The program can translate contracts into plain English, but it can also identify basic sections of a legal contract, like whether terms for royalties have been included. It’s even sophisticated enough to assess if one party is being given an advantage over the other. Maybe in later iterations, they’ll teach it to tell lawyer jokes.
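At its most basic, "identifying sections" is a matter of checking a contract against a list of expected clauses. Here’s a toy illustration – hypothetical, and nothing like Legal Robot’s actual system, which uses far more sophisticated language analysis:

```python
# Clauses we expect every contract of this type to contain (made-up list).
REQUIRED_SECTIONS = ["royalties", "termination", "liability", "governing law"]

def missing_sections(contract_text):
    """Return the expected clauses that never appear in the contract."""
    text = contract_text.lower()
    return [section for section in REQUIRED_SECTIONS if section not in text]

contract = """
This agreement covers Royalties payable quarterly.
Termination requires 30 days written notice.
Governing Law: the State of Delaware.
"""

print(missing_sections(contract))  # ['liability']
```

A real system would go beyond keyword checks and recognize a liability clause however it’s worded – that’s where the machine learning comes in.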
7) Customer service
If Google is already interpreting search queries with machine learning, customer service departments would surely like to leverage the technology too. It would let them recognize and respond to customer help requests more intelligently. Machine learning applications are especially attractive for “automated” or “self-serve” customer service requests. And they’re particularly needed because of all the different inputs customer service agents now handle: phone, email, social media, website support forums, online chat – it’s a lot to keep up with.
These applications would basically be a recommendation engine. But instead of recommending products or news stories, a machine learning customer service application would recommend the right answers to questions. Ideally, it would be smart enough to know which model device you had or which operating system you were using that device with. And it would know if you’d had issues with your device before.
Want to see an example of this in action? Check out Wise.io. It uses machine learning to create a predictive customer service experience.
Got big data? Machine learning to the rescue
That’s not the limit of what machine learning can do. In fact, there are so many applications it would take days to list them. Just think of any huge data set that needs sorting, and there’s a machine learning application for it already, or it’s being built. Some of the most noteworthy are:
Weather prediction.
Even short-term weather predictions rely on massive computer processors and inputs. Predicting sunshine or rain up to two weeks in advance might be a great prize for some young computer scientist.
Stock market predictions.
I don’t have to tell you how much money could potentially be in this. Bet on it that people are trying to write this code.
Drug interactions, medical trends and more.
Maybe it will be an application, not a human, that finally figures out why some people get cancer and some don’t.
Mobile devices are making this technology more and more widespread. Google is investing a lot in voice search, and you know how much Apple is dedicated to Siri. Even Amazon is getting in on voice recognition – their new Echo device uses machine learning to better understand a user’s requests.
Coming soon to a processor near you
These machine learning applications aren’t quite loose on the street yet. But they will be here before you know it.
Take traffic management: it’s not happening yet, but we all sure wish it was. A traffic system operated by machine learning might orchestrate traffic lights to optimize traffic flow, rather than forcing people to wait in line while an intersection sits empty until the traffic light’s timer runs out. Even now, machine learning-driven programs are analyzing traffic accidents and transportation networks. It’s only a matter of time until they’re rolled out.
If I’m going to mention traffic control, I can’t leave out autonomous cars. As you surely know, Google is making a big push on this front. They’re not the only company racing toward this, either. It’s expected that by 2020, there will be 10 million autonomous cars on the road. They’ll all use some type of machine learning, and probably use it in several different systems.
Anyone who wants to manage and mine a huge data set should be interested in machine learning. So it’s no surprise marketers are especially interested in its applications – we’re awash in big data, but many of us struggle to sort out actionable insights from it. Machine learning was tailor-made to help with that.