The history of artificial intelligence

The term artificial intelligence (AI) brings together all the theories and techniques used to produce machines capable of simulating intelligence. The practice allows a human being to set a computer system to solving complex problems involving logic. More commonly, when we speak of artificial intelligence, we also mean machines that mimic certain human traits.

Artificial intelligence before the year 2000

Proof that it is not a science born yesterday: the first traces of artificial intelligence date back to 1950, in an article by Alan Turing entitled “Computing Machinery and Intelligence,” in which the mathematician explores the problem of determining whether a machine is conscious or not. From this article came what is now called the Turing test, which assesses a machine’s ability to hold a human conversation.

Another possible origin can be traced back to 1949, in a memorandum by Warren Weaver on the machine translation of languages, in which he argued that a machine could very well perform a task related to human intelligence.
The formalization of artificial intelligence as a true scientific field dates back to 1956, at a conference held at Dartmouth College in the United States. The field subsequently reached other prestigious universities such as Stanford, MIT, and Edinburgh.

From the mid-1960s, research into artificial intelligence on American soil was primarily funded by the Department of Defense. At the same time, laboratories opened here and there around the world. Some experts of the era predicted that “machines will be capable, within twenty years, of doing any work a man can do.” Visionary as the idea was, even in 2018 artificial intelligence had not yet taken on such importance in our lives.

In 1974 came a period known as the “AI winter.” Many experts failed to see their projects through, and the British and American governments cut their funding to academia, preferring to support ideas more likely to lead to something tangible.

In the 1980s, the success of expert systems helped relaunch research projects on artificial intelligence. An expert system was a computer capable of behaving like a (human) expert, but within a specific field. Thanks to this success, the AI market reached a billion dollars, motivating various governments once again to back academic projects financially.

The exponential growth of computing performance, notably in line with Moore’s Law, made it possible between 1990 and 2000 to apply artificial intelligence to previously unexplored terrain: data mining, medical diagnosis, and more. It was not until 1997 that a truly headline-making event occurred, when IBM’s famous Deep Blue beat Garry Kasparov, then world chess champion.

2000-2010: Artificial intelligence is a social issue

In the early 2000s, artificial intelligence was woven into a large number of science-fiction films presenting more or less realistic scenarios. Arguably the highlight of the new millennium was The Matrix, the first part of the saga, released in theaters on June 23, 1999. Then came Steven Spielberg’s A.I. Artificial Intelligence, released in 2001 and inspired by Stanley Kubrick, followed by I, Robot (2004). Metropolis (1927), Blade Runner (1982), Tron (1982), and Terminator (1984) had already paved the way, but too little was then known about AI and its applications to imagine truly plausible scenarios.

Between 2000 and 2010, our society experienced a genuine computing boom. Not only did Moore’s Law continue on its merry way, but usage followed suit. Personal computers became ever more widely available, the Internet spread, smartphones emerged… Connectivity and mobility launched the era of Homo numericus.

As 2010 approached, questions also arose about the ethics of integrating AI into many industries. Thus, in 2007, South Korea unveiled a robot ethics charter with the goal of setting limits and standards for users as well as manufacturers. In 2009, the Massachusetts Institute of Technology launched a project bringing together leading AI scientists to reflect on the main lines of research in this field.

From 2010: Artificial Intelligence Without Borders

Since the beginning of the 2010s, artificial intelligence has shown what it can do thanks to the ingenuity of IBM’s Watson. In 2011, this super brain defeated the two greatest champions of Jeopardy!, an exercise that is far from simple for a computer. After Deep Blue, it marked another turning point in the media coverage of AI research.

Moore’s Law continues to drive advances in artificial intelligence, but data processing now underpins everything. To perform a task, a system needs only rules; to go further and give the most accurate answer possible, the system must learn. This is how researchers developed new methods of machine learning, and then of deep learning. These data-driven methods quickly broke record after record, prompting many other projects to follow the same path. The maturing of AI technologies also made it possible to launch very diverse projects, no longer confined to raw number-crunching but extending, for example, to image processing.
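The contrast drawn above, between a system that merely follows hand-written rules and one that learns its behavior from data, can be sketched in a few lines of Python. This is an illustrative example of ours, not taken from the article; the spam keywords and the tiny training set are invented for the demonstration.

```python
# Rule-based approach: behavior is fixed by rules written by a human.
def rule_based_spam_filter(message: str) -> bool:
    """Flag a message as spam if it contains a hard-coded keyword."""
    return any(word in message.lower() for word in ("lottery", "winner"))

# Learning-based approach: behavior is fitted to examples. Here, a
# one-feature linear model y ~ w*x + b trained by gradient descent.
def train_linear_model(xs, ys, lr=0.01, epochs=1000):
    """Fit w and b to the example pairs (xs, ys)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            error = (w * x + b) - y  # prediction minus target
            w -= lr * error * x      # nudge parameters to reduce error
            b -= lr * error
    return w, b

# The rules above never change; the model's parameters come from data.
w, b = train_linear_model([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```

The rule-based filter does exactly what its author wrote, no more; the trained model recovers the underlying pattern (here, roughly y = 2x) from examples alone, which is the shift that machine learning introduced.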

From that moment on, a handful of companies took the lead. Indeed, the challenge in AI was no longer having the brains to develop systems, but having the data to feed them. This is why Google quickly became a leader: in 2012, Mountain View had only a few projects making use of AI, compared with 2,700 three years later. In 2013, Facebook opened the Facebook Artificial Intelligence Research (FAIR) laboratory, headed by Yann LeCun, a turning point that carried the giant beyond its social mission and toward science. Amazon, Microsoft, Apple, Netflix, and Tesla are not left out either, nor are many Chinese companies.

Mastery of data makes it possible to apply AI to reading doctors’ X-rays, driving cars, translating, playing complex video games, composing music, seeing through a wall, filling in the missing part of an image… In these areas AI performs so well that it raises many questions about the place of human professionals in the years to come.

The media space now occupied by artificial intelligence means that the questions this field raises are no longer left to researchers alone, but have entered the public debate. Logically, this creates as much tension as excitement. Yet we are only at the beginning of the massive integration of these technologies: the coming decades hold many surprises in store for us.

