Artificial intelligence has been around for longer than we might think; its roots stretch far back into history, all the way to Greek mythology. Our ancestors have fantasized about mechanical men and artificial beings since antiquity, and Chinese and Egyptian engineers went as far as building automatons. In 1206 AD, the Arab inventor Al-Jazari created what is considered the first programmable humanoid robot: a boat carrying four “mechanical musicians”, automatic musical instruments powered by water. Later, in the 15th century, Leonardo da Vinci designed a robotic knight that could move in a human-like way using a system of pulleys and cables.
From the early 1700s up until the 20th century, ideas of machines that could possess consciousness, along with references to modern technology, surfaced throughout literature. These efforts and research culminated in the invention of the first programmable digital computer in the 1940s, and the idea of building an artificial, electronic mind became more and more realistic. Artificial intelligence became an academic discipline in 1956, when its name was officially adopted at the Dartmouth Conference.
In the following 18 years, AI flourished; a telling example is WABOT-1, built in Japan in 1972, the world’s first full-scale intelligent humanoid robot.
Artificial intelligence also had to face obstacles, as the high expectations placed on it were not met despite the notable progress it had made. The years between 1974 and 1980 marked the first ‘AI winter’, a name coined to describe the standstill AI faced: funding was cut as governments and large companies lost faith in the development of artificial intelligence. But it wasn’t all bad, since some of the most important programming languages were born around that period, languages such as C, Prolog, SQL and Pascal.
AI made a brief comeback in what came to be known as ‘The Second Summer of Artificial Intelligence’. Progress was made in the area of neural networks, and foundations were laid for a fifth-generation computer system with the help of new funding from Japan, the United States and the United Kingdom. The AI market reached one billion dollars, but things took a turn for the worse as artificial intelligence entered its second winter, which lasted for six years, until 1993. Machines were no longer able to adapt to new requirements, and the fifth-generation computers proved underwhelming and were surpassed by the new desktop computers.
After the second winter came to an end and up until 2011, artificial intelligence didn’t make any dramatic leaps, but there were some victories worth mentioning, such as the chess-playing computer that defeated a reigning world chess champion, as well as artificially intelligent robots and chatbots.
Things have really started to take shape for AI since 2011, as its area of application has broadened to several domains. In medicine, AI is used to diagnose multiple conditions, establish personalized treatments, analyse data, documents and scientific publications, and even perform digital consultations. The military industry also benefits from artificial intelligence: from surveillance drones and target recognition to combat simulation and training, militaries apply AI to reduce personnel costs and increase effectiveness. Artificial intelligence is shaping the future of education as well, making it more accessible. Administrative tasks are being simplified, and the learning process will be personalized with greater regard for each pupil’s needs, thus improving both student and teacher performance.
The history of artificial intelligence speaks for its achievements. Nowadays, we probably couldn’t even imagine our lives without AI: smartphones, personal assistants like Alexa or Siri, cars with self-driving features and smart home robots are simply a normal part of our everyday lives. The future of artificial intelligence looks promising; from smart homes to smart cities, things continue to evolve, improving living standards. And although the automated future is a challenge, the more companies learn to adapt and get on board with the digital transformation process, of which AI is a crucial part, the more value they will bring to the world we live in.
Here at Arnia, through Apsisware, our AI division, we offer state-of-the-art AI services across various industries. With hundreds of successfully completed projects for clients ranging from Fortune 500 to Forbes 50 companies, spanning 3 continents and more than 10 countries, we can offer excellent software development opportunities for your business. Our services cover web and mobile applications, web design, big data solutions, database management systems, e-commerce solutions, cloud-enabled solutions, content management solutions, business intelligence and R&D.