I have always displayed what I regard as a healthy scepticism towards buzz phrases that spread like influenza viruses through the ICT world. The industry has the disturbing habit of producing them like rabbits out of a hat to dazzle the consumers of technology with the latest innovation. In less than 18 months, the term “Artificial Intelligence” went from nowhere to number seven on the Gartner website’s search list. Is it real?
In very recent times, we have been asked to apply our minds to the potential value of “cloud computing”, “big data”, “machine learning” and “the Internet of Things”. Ask any lay person to describe what these terms mean and you will be subjected to puzzlement and some interesting word pictures that are far removed from the reality. Ask any of the more mature members of the IT profession and they are likely to reminisce about how the underlying technology of these innovations has been around for decades and there is nothing particularly new about them, other than scale.
When I realised that “Artificial Intelligence” was joining this lexicon, I asked myself what validity it might have beyond boosting the sales volumes of technology vendors. What makes these two words the right ones? Separately, artificial means not real, not natural, a copy made by humans. We are familiar with artificial flowers, artificial respiration – even artificial insemination. Intelligence suggests high mental capacity, cognitive abilities and (importantly) imagination. Socrates claimed to know he was intelligent because he knew that he knew nothing. Einstein said, “The true sign of intelligence is not knowledge but imagination.” Can a rational process without human influence be intelligent?
Is it possible to combine these two words and arrive at a valid phrase that accurately defines a technology? Are we leading ourselves into the trap of believing that products of technology operated autonomously by algorithms are therefore “intelligent”? Does that belief extend to whether they are more, or less, intelligent than some human counterpart? That belief is likely to have unfortunate consequences, because artificial (unreal) intelligence lacks the key ingredient of imagination. Artificial things lack some ingredient of the original, and that lack marks them as lesser – artificial limbs, artificial sweeteners, artificial horizons: they all have their uses, but they are not real.
The risks of what is termed Artificial Intelligence are real. (Notice that I have not abbreviated it to AI, which further embeds the unreality of the term, as we no longer associate the acronym with the words.) The Institute of Electrical and Electronics Engineers is investigating standards as part of a drive to ensure that artificial intelligence will act ethically. Elon Musk expressed his concern about unregulated Artificial Intelligence being a risk to human civilisation. Is he just a modern-day Luddite, or do we need to take heed of his warning?
I am not advocating that we should not employ sensor-informed, algorithm-driven technology. Far from it. There are myriad environments where it makes absolute sense to provide faster and more reliable management of functions than can be achieved with human hands on the controls. But we must accept that it is inhuman. It is rational, and it is efficient, but it is not intelligent.
If the risks are real, how do we mitigate them? Are regulation and legislation the answer? Can we ensure that the employment of this technology creates socio-economic value for humans rather than depriving them of it? As is often the case, rules and laws are formulated long after the risks have impacted the community. One school of thought suggests that algorithm-driven decision-making is inherently objective, but this overlooks the human bias that can be, and is, built into the algorithms themselves. Algorithms are built from history, but life springs surprises, demanding a “fail-safe” mode to prevent the technology from performing an unpredictable action.
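That “fail-safe” requirement can be sketched in a few lines of code: a decision rule built purely from historical data, wrapped in a guard that refuses to act autonomously on any input falling outside the range that history ever covered, and refers it to a human instead. The names, thresholds and loan-approval scenario below are entirely hypothetical, chosen only to illustrate the principle.

```python
# Hypothetical range of inputs actually covered by the historical data
# the decision rule was built from.
HISTORICAL_MIN, HISTORICAL_MAX = 0.0, 100.0

def approve_loan(score: float) -> str:
    """An algorithmic decision derived from history (illustrative only)."""
    return "approve" if score >= 60.0 else "decline"

def fail_safe_decide(score: float) -> str:
    """Fail-safe wrapper: life springs surprises, so anything outside the
    historical envelope is escalated to a human rather than acted on
    automatically."""
    if not (HISTORICAL_MIN <= score <= HISTORICAL_MAX):
        return "refer-to-human"
    return approve_loan(score)
```

The point is not the particular rule but the shape of the design: the autonomous path only operates inside the territory its history describes, and everything else defaults to human judgement.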
I don’t believe for a minute the suggestion that Facebook’s Artificial Intelligence project invented its own language. It may have surprised the developers that it selected codewords that were not “English”, but I would have thought the purpose of codewords was that they were not readily deduced. Just like passwords. A failure in human design, rather than any intelligence in the technology, would be the cause of the gibberish.
So, what if we stop calling it “Artificial Intelligence” and look for a phrase that more accurately describes the reality of the technology, such as “Augmented Implementation”? The acronym buffs can still talk about AI, but we can move away from ascribing properties and abilities that the technology does not possess. Augmented = made greater in size or extent, enhanced. Implementation = putting into effect, fulfilment, using tools to perform activity. For me, this makes the technology real, it removes the emotion (which isn’t there anyway) and identifies that any risks inherent in the technology arise from the quality of the human design, construction and testing processes that create it, and from the humans who apply the technology in real life.
Back on planet Earth (or, more accurately, on the African continent), how can we engage with the realities of Augmented Implementation to realise the benefits of the technology while avoiding the pitfalls? There are already significant contributions to the quality of life, from education to healthcare. A combination of data collection (Internet of Things), data analytics and Augmented Implementation can streamline processes, improve employee performance and enhance decision support in any private or public enterprise.
The absolutely essential outcome is that we use the opportunity to create economic value-add opportunities for the people of Africa, to have humans from 17 to 70 actively engaged in creating real wealth for their families. It is our responsibility as decision- and policy-makers to focus on promoting the ethical development of innovations to support societal growth and close the gap between those “in the know” and those looking to learn.