10 Characteristics of Artificial Intelligence

 What is artificial intelligence?

Artificial intelligence, or computational intelligence, is the intelligence that machines or computers can display. The term is controversial in the sense that it is difficult to define exactly what intelligence is, although it is usually understood as the ability of a computer system to perceive its environment rationally and adapt its strategies to achieve its objectives.

Commonly, artificial intelligence is limited to an imitation of human intelligence by machines, giving the user the impression of being in front of another being endowed with individuality.

However, as technology and computing advance, the emergence of truly intelligent computational beings, commonly referred to as AIs (Artificial Intelligences), is expected.

Characteristics of artificial intelligence:

  1. Origin of the term

Although the idea of the intelligent machine has been with us for a long time, with examples such as the Golem or the robots of science fiction, the term "Artificial Intelligence" was used for the first time in 1956 by John McCarthy, an eminent American computer scientist who contributed enormously to this field of study.

  2. Concept

AI usually encompasses the rational and logical aspects of thinking.

The concept of Artificial Intelligence is still diffuse. In general terms, it refers to the attempt to build a computer system that reproduces, and even transcends, the thinking tasks of the human brain, with the same margin of autonomy, individuality and creativity, but taking advantage of the rapid and massive computation that computers offer.

This concept usually encompasses the rational and logical aspects of thought, but it struggles when faced with concepts of another nature, such as love, commitment or morality.

  3. Schools of thought

The study of Artificial Intelligence covers two different schools:

  • Symbolic-deductive Artificial Intelligence. Also known as conventional Artificial Intelligence, it attempts to understand and replicate human behavior from a perspective of formal and statistical analysis.
  • Subsymbolic-inductive Artificial Intelligence. Also called Computational Artificial Intelligence, it pursues iterative development and learning based on empirical data and on modifications of connection parameters.
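The contrast between the two schools can be sketched with a toy problem (all names here are illustrative, not from any library). For the logical AND of two binary inputs, a symbolic system encodes the rule explicitly, while a subsymbolic system learns the same behavior from examples by adjusting connection parameters, as a single perceptron does:

```python
# Symbolic-deductive: the behavior is written down as an explicit rule.
def and_symbolic(x1, x2):
    return 1 if (x1 == 1 and x2 == 1) else 0

# Subsymbolic-inductive: the behavior is induced from empirical data by
# repeatedly nudging connection weights toward fewer errors (a perceptron).
def train_perceptron(samples, epochs=20, lr=0.1):
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for x1, x2, target in samples:
            out = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
            err = target - out
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

data = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]  # truth table of AND
w1, w2, b = train_perceptron(data)
learned = lambda x1, x2: 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0

# Both approaches end up computing the same function,
# one by deduction from a rule, the other by induction from data.
for x1, x2, target in data:
    assert and_symbolic(x1, x2) == target
    assert learned(x1, x2) == target
```

The symbolic version is transparent but rigid; the subsymbolic version stores its "knowledge" only in the numeric weights it learned.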
  4. Pillars

AI has artificial neural networks that mimic the functioning of a brain. 

The study and development of Artificial Intelligence is considered to rest on four pillars:

  • The search for a desired state among the possible ones, according to the actions available at a given moment; that is, free choice.
  • Genetic algorithms, inspired by the genetic code of living beings (DNA).
  • Artificial neural networks, which mimic the functioning of organic brains.
  • Formal logic reasoning, similar to the abstract thinking of humans.
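The second pillar can be illustrated with a minimal genetic algorithm, here applied to the toy "OneMax" problem of evolving a bit string of all 1s (population size, mutation rate and other parameters are illustrative choices, not prescribed values):

```python
import random

random.seed(0)

GENOME_LEN = 12  # toy goal: evolve a genome of GENOME_LEN ones

def fitness(genome):
    return sum(genome)  # number of 1-bits

def mutate(genome, rate=0.05):
    # Randomly flip each bit with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    # Splice two parent genomes at a random cut point.
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

# Start from a random population, then repeat selection -> crossover -> mutation.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(20)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == GENOME_LEN:
        break
    parents = population[:10]  # selection: keep the fittest half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children

best = max(population, key=fitness)
```

The same loop structure (evaluate, select, recombine, mutate) carries over to far harder problems; only the genome encoding and the fitness function change.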
  5. Applications

Contemporary applications of Artificial Intelligence in its different prototypes and stages of development can be summarized as:

  • Video games and smart entertainment software.
  • Digital support for online services and computer programs.
  • Massive data and information processing systems.
  • Robotics and complex automation systems.
  6. Turing test

The Turing Test is considered the litmus test of AI.

One problem with AI is the real difficulty of distinguishing between a genuinely intelligent artificial system and one merely programmed to give the user that impression, that is, to fake it.

To address this, the English mathematician and computer scientist Alan Turing designed a test, later named in his honor, which consisted of having a person read a conversation between another individual and a computer programmed to imitate human intelligence in its responses.

If after 5 minutes the observer was unable to distinguish the machine from the person, the system would have passed the test.
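The logic of the test can be sketched as a small simulation (the respondents, their replies and the judge below are entirely hypothetical): when the machine's answers are indistinguishable from the person's, a judge can identify it only at chance level, which is exactly the situation in which the machine passes.

```python
import random

random.seed(1)

# Hypothetical respondents: a person, and a machine scripted to imitate one.
def human_reply(question):
    return "I think so, yes."

def machine_reply(question):
    return "I think so, yes."  # a perfect imitation, for this sketch

def imitation_game(judge, rounds=5):
    """Hide the two respondents behind labels A and B at random, show the
    judge the transcript, and return True if the judge spots the machine."""
    a_is_machine = random.random() < 0.5
    a = machine_reply if a_is_machine else human_reply
    b = human_reply if a_is_machine else machine_reply
    transcript = [(q, a(q), b(q)) for q in ("Are you human?",) * rounds]
    guess_a_is_machine = judge(transcript)
    return guess_a_is_machine == a_is_machine

# Facing identical answers, a judge can only guess; over many trials the
# machine is identified roughly half the time, i.e. at chance level.
wins = sum(imitation_game(lambda t: random.random() < 0.5)
           for _ in range(1000))
```

Running many trials, `wins` hovers around 500 of 1000, the chance-level outcome that defines a pass.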

  7. History

The first proper work in the field of Artificial Intelligence is the model of artificial neurons by Warren McCulloch and Walter Pitts in 1943, although the term had not yet been coined.
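The McCulloch-Pitts model can be sketched in a few lines (a simplified threshold unit; the weights and thresholds below are illustrative): the neuron fires when the weighted sum of its binary inputs reaches a fixed threshold, which is already enough to compute logic gates.

```python
# A McCulloch-Pitts-style unit: output 1 when the weighted sum of the
# binary inputs reaches the threshold, 0 otherwise.
def threshold_neuron(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With unit weights, threshold 2 yields logical AND and threshold 1
# yields logical OR on two inputs.
AND = lambda x1, x2: threshold_neuron([x1, x2], [1, 1], threshold=2)
OR = lambda x1, x2: threshold_neuron([x1, x2], [1, 1], threshold=1)

cases = [(0, 0), (0, 1), (1, 0), (1, 1)]
assert [AND(a, b) for a, b in cases] == [0, 0, 0, 1]
assert [OR(a, b) for a, b in cases] == [0, 1, 1, 1]
```

Networks of such units were the first hint that brain-like components could, in principle, carry out computation.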

The arrival of Turing and his work in the area from 1950 onward inaugurated a computational branch that would grow by leaps and bounds during the 1960s and 1970s, first with expert systems that supported the solving of mathematical equations and, later, with computer scripts.

An important milestone in Artificial Intelligence occurred in 1997: the chess player Garry Kasparov lost to Deep Blue, a computer specialized in the game. Many saw in this the announcement of the intelligent computers to come.

  8. Fears

AI may be able to replace people in their jobs. 

Artificial Intelligence is not always greeted with enthusiasm. Many see in it a technological tool capable of displacing many individuals from their jobs, since an AI could do the same work in less time, without taking breaks or claiming labor rights. Robotization thus represents both a hope for industrialists and a threat for workers.

On the other hand, much has been warned about the dangers of a rational logic devoid of empathy , emotionality, and affective commitments, capable of making cold and painful decisions for humanity in pursuit of some abstract goal.

  9. Fiction

Intelligent computers have been a constant throughout science fiction and fantasy, both in film and in literature, sometimes acting as helpers and at other times as antagonists of the story.

The American writer Isaac Asimov was prolific on the subject through his robot stories. The idea of Artificial Intelligence is often accompanied by dystopian scenarios and technological nightmares such as those depicted in the films The Terminator (1984) or The Matrix (1999).

  10. Future Advances

AI is expected to be able to assist man in various everyday settings. 

The many possible applications of a truly intelligent computer system are endless, but they point to the full automation of vehicle driving systems and industrial work, and to assistance for people in various fields: at home, in research, in the workplace or in the handling of telecommunications, for example.
