What is artificial intelligence? A glossary
What does a machine need to “think like a human being”?
Artificial intelligence, machine learning, deep learning... Everybody has heard these terms. They have high marketing potential, they are what will distinguish tomorrow’s technology from today’s, and in many business sectors they will make the difference between being in and being out. But what do they mean in practice? Are these terms used correctly or inappropriately? Let’s try to draw up a short glossary to bring some clarity.
Let’s begin with artificial intelligence, the macro set that contains all the others. The notion of AI has evolved over time and, for non-professionals, is inevitably conditioned by science fiction: HAL in Stanley Kubrick's 2001: A Space Odyssey, or more recently Data in Star Trek: The Next Generation. The Hollywood idea of a sentient machine, intelligent to the point of being a threat to man, is not entirely out of place, but it is out of time, and it does not help to clarify matters. If we have to give an initial definition - one that fits both science and science fiction - we could say that artificial intelligence is the idea of building machines that are able to think like humans.
The key word in this definition is 'think'. When we refer to machines that replace human activity in industrial or intellectual fields by performing tasks that are very complex but do not require the simulation of thought, we are talking not about artificial intelligence but about simple automation.
Where is the boundary? In order to 'think like humans', a machine must be able to perceive and interpret the reality that surrounds it and must be able to evolve based on experience. This, however, is a definition that could just as well describe plain 'intelligence'. Going further into detail, those working on artificial intelligence want to simulate human beings’ capacity for abstract, creative and deductive thinking by allowing a computer - where every process must ultimately be traced back to binary code - to learn independently.
This is an overall definition, but artificial intelligence contains many distinct areas of work. For the purpose of developing an operating glossary, and excluding the field of automation, we must draw a first line of demarcation between two large subsets of artificial intelligence: machine learning and data mining. Then we will try to explain what an algorithm is and what relationship it has with artificial intelligence. Finally we will provide definitions for some other hyper-inflated terms such as deep learning and chatbot.
Artificial intelligence is the study of systems able to perceive the world around them, plan behaviors and make decisions aimed at achieving a goal. It has its foundations in mathematics, logic, philosophy, probability, linguistics, neuroscience and decision theory. Many other fields can be traced back to artificial intelligence, including computer vision, robotics, machine learning and natural language processing.
Machine learning is a subset of artificial intelligence. It aims to analyze large amounts of data and then make predictions, which become more and more precise as additional data samples are analyzed. The goal is to teach computers to learn independently. Machine-learning algorithms make it possible to identify patterns in a data sample, to build models that explain the world, and to make predictions based on experience, without having preconfigured rules and models available.
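As a minimal sketch of this idea, here is a k-nearest-neighbor classifier in pure Python: it has no preconfigured rules, only labeled examples, and it predicts the label of a new point from the examples closest to it. The data and labels below are purely illustrative.

```python
import math
from collections import Counter

def knn_predict(training_data, new_point, k=3):
    """Classify new_point by majority vote among its k nearest neighbors."""
    # Sort the labeled examples by their distance to the new point.
    neighbors = sorted(
        training_data,
        key=lambda item: math.dist(item[0], new_point),
    )[:k]
    # The predicted label is the most common label among those neighbors.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Labeled samples: (features, label). More samples would sharpen predictions.
samples = [
    ((1.0, 1.0), "small"), ((1.2, 0.8), "small"), ((0.9, 1.1), "small"),
    ((5.0, 5.0), "large"), ((5.5, 4.8), "large"), ((4.9, 5.2), "large"),
]

print(knn_predict(samples, (1.1, 0.9)))  # → small
print(knn_predict(samples, (5.2, 5.1)))  # → large
```

The program was never told what "small" or "large" means; it generalizes from the examples alone, which is the essence of learning from data.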
When we talk about data mining, another subset of artificial intelligence, we refer to the processes by which computers recognize patterns in large amounts of unstructured data. The results of data mining can then serve as a basis for marketing or other decision-making in various sectors; but those decisions are not taken by machines - they are left to human beings.
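To make the notion concrete, here is a sketch of one of the simplest data-mining tasks: finding items that frequently occur together in transaction records. The basket data below is invented for illustration; real data mining works on far larger datasets, and a human would still decide what to do with the result.

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support=2):
    """Count how often each pair of items appears together, keeping
    only pairs that occur at least min_support times."""
    counts = Counter()
    for items in transactions:
        for pair in combinations(sorted(set(items)), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

# Hypothetical purchase records, one list of items per customer basket.
baskets = [
    ["bread", "butter", "milk"],
    ["bread", "butter"],
    ["milk", "coffee"],
    ["bread", "butter", "coffee"],
]

print(frequent_pairs(baskets))  # → {('bread', 'butter'): 3}
```

The machine surfaces the pattern (bread and butter are bought together); whether to act on it - a promotion, a shelf rearrangement - remains a human call.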
An algorithm is a mathematical formula or a series of programming commands that allows a computer to solve a problem. Algorithms - the term derives from the medieval Latin algorismus, itself from the name of the ninth-century Arab mathematician Muḥammad ibn Mūsā al-Khwārizmī - are mathematical calculation procedures and are not a prerogative of information technology, let alone of artificial intelligence. When talking of algorithms in programming languages we refer to a series of instructions - mathematical or logical computations - aimed at solving a problem. Intelligence, whether artificial or not, is needed to create an algorithm, but it is not necessary to apply one. So, if we are using an algorithm to program a machine, we are the ones supplying the intelligence. If, on the other hand, we give the machine a problem without suggesting an algorithm and the machine solves it, then we have obtained artificial intelligence.
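To see what 'a series of instructions aimed at solving a problem' looks like in practice, here is one of the oldest algorithms on record, Euclid's method for the greatest common divisor. Note that the machine applying it needs no intelligence at all: it simply follows the steps.

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace the pair (a, b) with
    (b, a mod b) until the remainder is zero; a is then the answer."""
    while b:
        a, b = b, a % b
    return a

print(gcd(48, 36))  # → 12
print(gcd(17, 5))   # → 1 (17 and 5 share no common divisor)
```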
When we talk about deep learning we refer to a more advanced subset of machine learning. It aims to recognize complex patterns in data samples using multiple levels of correlations. Basically, deep learning tries to imitate the way neurons are organized in our brain: this is why applications of this type are referred to as 'neural networks'.
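As an illustrative sketch of those 'multiple levels', here is the forward pass of a tiny two-layer network in pure Python. The weights below are hand-picked for the example rather than learned from data - and learning them automatically is exactly the part deep learning takes care of.

```python
import math

def sigmoid(x):
    """Squash any value into the range (0, 1), mimicking a neuron firing."""
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """One layer of neurons: each neuron takes a weighted sum of all
    inputs, adds its bias, and applies the activation function."""
    return [
        sigmoid(sum(w * x for w, x in zip(neuron_weights, inputs)) + b)
        for neuron_weights, b in zip(weights, biases)
    ]

# Hand-picked weights for illustration; real networks learn them from data.
hidden_w = [[0.5, -0.6], [0.3, 0.8]]   # 2 hidden neurons, 2 inputs each
hidden_b = [0.1, -0.2]
output_w = [[1.2, -0.7]]               # 1 output neuron, 2 inputs
output_b = [0.05]

x = [0.9, 0.4]
hidden = layer(x, hidden_w, hidden_b)       # first level of correlations
output = layer(hidden, output_w, output_b)  # second level, built on the first
print(output)  # a single value between 0 and 1
```

Each layer transforms the output of the previous one, so deeper stacks can capture increasingly abstract patterns - the defining trait of deep learning.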
When talking about bots or chatbots we refer to software designed to communicate with human beings in natural language, with the aim of automating particular tasks or retrieving information from databases. A bot can live inside another application, such as Facebook Messenger or WhatsApp; it can manage first-level call-center or help-desk operations and be integrated into sites and applications; or it can automate dialogue via email and text messages for a company or for product assistance. Software that independently manages a profile on Twitter or Facebook, effectively leading a life of its own on social media, is also called a bot. According to some, bots should not be fully included in the field of artificial intelligence, given that their answers are largely prepackaged rather than open-ended; but the understanding of natural language is one of the basic applications of artificial intelligence.