Growing up, a lot of films were shown with a very simple storyline: the ongoing advancement of technology and robotics, coupled with humanity's reliance on robots, would be the downfall of us all. Ergo, the created would rule over the creator, the prey would become the predator and the hunter, the hunted. Maybe you weren't born in that era, or if you were, you never got the opportunity to see them: there was Star Wars, Terminator, I, Robot, RoboCop, The Matrix, Resident Evil, Iron Man, Tron: Legacy, Transformers and the list goes on.
In a lot of these movies, producers dream up robots with capabilities far ahead of the timeline we humans find ourselves in, robots which somehow develop independence and rebel against their own creators. The interesting thing is that much of what we saw in these movies and dreamt about is becoming our everyday reality. A laptop that now works at the mere touch of a finger? An automated machine that works with no human supervision? How about an intelligent system designed to answer your every need? Have you sat down to really analyse all the functions you use your phone to perform, and how many of those you could manage without it?
They are taking over.
AI is closer than you think
It doesn't sound funny anymore, does it? No, not when your whole life is stored in a place called the Cloud, not when everything you use is linked to Dropbox, and especially not when you cannot go an hour or two without your phone. Even though it sounds like a conspiracy theory, it's not so far-fetched when you think about it. The realization comes subtly, and it will sink in as we delve into how far technology has come in the area of artificial intelligence.
Artificial Intelligence, also called machine intelligence and popularly known as 'AI', is a term for simulated intelligence in machines. What it means is that, in a sense, the machines are programmed to think like a human being and mimic the way a person acts or behaves; hence the 'simulation' aspect, since it is not actually a human.
There is so much to pique one's interest when it comes to understanding AI and the journey it has undergone to reach its current heights. According to Wikipedia, AI began in antiquity amidst myths, stories and fables of artificial beings endowed with intelligence and consciousness by master craftsmen; as Pamela McCorduck writes, 'AI began with an ancient wish to forge the gods.' The famed story Frankenstein sees Victor Frankenstein create a creature and bring it to life to be like a human, which turns out to be a great abomination as the creature turns on its creator.
The creation is quoted at a point in time saying ‘I ought to be thy Adam, but I am rather a fallen angel.’
After World War II
There is also the interesting story of Pinocchio, carved by Geppetto the woodcarver and brought to life, when all he ever wanted was to be a real boy. Articles and various narratives have pointed to these stories as fragments of ideas men had about creating, or having a hand in creating, some sort of artificial intelligence: a creation that could emulate human intellect to some degree.
On the science front, it has been well over half a century since talk of artificial intelligence turned into action. It's funny how it took the Second World War to trigger fresh thinking about AI and the systems it could be applied to. In Britain, Grey Walter built some of the first ever robots, while Alan Turing devised the Turing Test, which has undoubtedly set the bar for intelligent machines: a machine passes if it can trick an individual into believing they are talking to an actual person. Along came Asimov with the Three Laws of Robotics, designed to stop our creations from turning on us. His imaginings included a computer capable of storing all possible human knowledge that anyone could ask or inquire of.
A Young Scientist
The term 'Artificial Intelligence' was coined for a 1956 summer conference at Dartmouth College organized by a young computer scientist, John McCarthy. Top scientists gathered from all over and debated how to tackle AI. One school of thought supported a top-down approach: pre-programming a computer with the rules that govern human behaviour. Amongst its proponents were influential academics like Marvin Minsky. Others, however, opted for a bottom-up approach, such as neural networks that simulated brain cells and learned new behaviours. It was Minsky's views that dominated, and together with McCarthy he won substantial funding from the US government, who hoped AI might give them the upper hand in the Cold War.
What made the whole AI concept doubtful to many was mainly that so much had been invested and very little in returns had been seen. Nothing interesting had resulted so far, and those who were on board had substantially nothing to show for it. This came to a head in 1973, when funding for further work and research was refused; this season in the journey is referred to as the AI winter. Even though they took away the funds, they couldn't take away the passion, and passion drives men beyond anything. And so, just a little over eight years later, the first sign of a breakthrough appeared with the creation of expert systems, which were streamlined to focus on narrow tasks instead of general intelligence as a whole.
The first commercially successful one was known as the R1, which helped configure orders for new computer systems, and by 1986 it had saved the company an estimated $40m a year. The highlight of how much belief people had in the growing accomplishments of AI was the famous 'Man versus Machine' fight of the 20th century. Garry Kasparov, the then world chess champion, was challenged by Deep Blue, an IBM supercomputer, to a game of chess, and in 1997 the machine won resoundingly. After that history-making victory, it seemed the world's runway was ready for AI to unveil itself as a success and not just a dream of the fathers who had sacrificed for it.
AI has evolved
In 2008, a small feature appeared on the new Apple iPhone: a Google app with speech recognition. Looking at what we have around now, that sounds simple, right? But it wasn't; it was a serious breakthrough, because decades of investment and research had never raised the accuracy of speech recognition above 80%. Now Google claims that its accuracy is a little over 90%. AI has so evolved, giving birth to so many children who have become more than friends: our personal assistants, our reminders and our search assistants. Apple adopted Siri, Microsoft brought Cortana, Amazon brought Alexa and Google's Assistant is ever ready to help out. And just recently, advanced as all of this already is, Google introduced its Duplex to the world. The technology featured a human-sounding robot having a conversation with a person who couldn't even tell that they were talking to a robot. Yeah. Mind completely blown.
If all of that doesn't paint the picture clearly for you, then look at it this way: a scene from a film where David Hasselhoff is sitting in his car, the Knight Industries Three Thousand, aka KITT, and says, 'KITT, are you ready?' and the car responds, 'Never been a better time to be ready, Michael.' Those are the kinds of territories we are about to break into.
Submitted by Kweku Diaw (CBT Level 1 Writer)
Photo Credit: Unsplash