Machine Learning

The capability of a machine to improve its own performance

What does machine learning mean?

The term machine learning (abbreviated ML) refers to the capability of a machine to improve its own performance. It does so by using a statistical model to make decisions and incorporating the result of each new trial into that model. In essence, the machine is programmed to learn through trial and error.
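To make the "trial and error" idea concrete, here is a minimal sketch in Python (an illustration only, not any particular product's algorithm): the program repeatedly chooses between two hypothetical actions whose reward probabilities are invented for this example, observes the outcome of each trial, and folds that result back into its running estimates, which serve as the statistical model.

    import random

    # Invented reward probabilities for two hypothetical actions.
    REWARD_PROBABILITY = {"action_a": 0.3, "action_b": 0.7}

    estimates = {"action_a": 0.0, "action_b": 0.0}  # the "statistical model"
    counts = {"action_a": 0, "action_b": 0}

    random.seed(0)
    for trial in range(1000):
        # Decide: usually exploit the best current estimate, sometimes explore.
        if random.random() < 0.1:
            action = random.choice(list(estimates))
        else:
            action = max(estimates, key=estimates.get)

        # Observe the result of this trial.
        reward = 1.0 if random.random() < REWARD_PROBABILITY[action] else 0.0

        # Incorporate the new result into the model (running average).
        counts[action] += 1
        estimates[action] += (reward - estimates[action]) / counts[action]

    print(estimates)  # the estimate for action_b should approach 0.7

After enough trials, the program's estimates converge toward the true payoffs, so it increasingly picks the better action: its performance improves purely from the results of its own attempts.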

Where did machine learning come from?

The term was conceived in the 1950s, about the same time scientists began to use artificial intelligence (AI) for the simulation of intelligent behavior in computers. Although contemporaneous, the two technologies are notably different: whereas AI generally refers to the capability of a machine to carry out tasks as a human would, ML specifically denotes a computer application used to process and learn from data, as exhibited in a self-driving car's ability to detect and process nearby objects. Other common applications of machine learning involve internet search personalization, fraud protection, and identity security, all of which require a machine to learn particular behaviors.

How is machine learning used?

A more agile variety of machine learning, which identifies complex, nonlinear patterns in large data sets and makes it possible to create more accurate risk models nearly in real time, is beginning to be used in these types of applications. You've already run across this type of machine learning in other environments, such as your email application's spam recognition algorithm, Amazon's product recommendations and the suggestions you get on Netflix. Now very similar technology is being deployed to combat card fraud.
— Jamie Swedberg, Credit Union Management, 1 June 2018
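The spam recognition mentioned in the quotation works on the same learn-from-data principle. The sketch below is a toy illustration, not how any real filter is built, and the training messages and labels are invented: it tallies word statistics from a handful of labeled examples and then scores a new message with a simple naive Bayes rule.

    from collections import Counter

    # Invented labeled examples for illustration.
    training_data = [
        ("win money now", "spam"),
        ("limited offer win prize", "spam"),
        ("meeting agenda attached", "ham"),
        ("lunch tomorrow with the team", "ham"),
    ]

    word_counts = {"spam": Counter(), "ham": Counter()}
    label_counts = Counter()
    for text, label in training_data:
        label_counts[label] += 1
        word_counts[label].update(text.split())

    def score(text, label):
        """Naive Bayes style score: prior times smoothed word likelihoods."""
        total_words = sum(word_counts[label].values())
        vocab = set(word_counts["spam"]) | set(word_counts["ham"])
        prob = label_counts[label] / sum(label_counts.values())
        for word in text.split():
            prob *= (word_counts[label][word] + 1) / (total_words + len(vocab))
        return prob

    message = "win a free prize"
    prediction = max(("spam", "ham"), key=lambda lbl: score(message, lbl))
    print(prediction)  # expected: spam

The program was never given a rule such as "messages containing 'win' are spam"; it inferred the pattern from the labeled data, which is what distinguishes machine learning from conventional, explicitly programmed behavior.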

Update: This word was added in January 2019.

Words We're Watching talks about words we are increasingly seeing in use but that have not yet met our criteria for entry.