ai caramba: What exactly is AI? | Part I: Why we need to focus on the I rather than the A

Wednesday June 13, 2018, 4 min read

Is the hype around AI leading us astray from what the discussion should really be about? Part 1 of a three-part series that explores the often fantastical world of Artificial Intelligence to answer the most basic question of them all – what exactly is AI?

To paraphrase the inimitable – and often incomprehensible – Sir Humphrey Appleby, one must always dispose of the difficult bit in the title. It does less damage there than in the statute books. Given that our best definition of “intelligence” is Socrates’s famous quip “scio me nescire”, meaning “I know that I am intelligent, because I know that I know nothing”, any article attempting to tackle the definition of AI should carry Humphrey’s quote as a caveat. Yet most articles that explore AI and what it is talk about the applications of AI, the hype around AI, whether AI is a boon or a bane (depending on your echo chamber), or express incredulity at what AI can do (and were most probably written by AI).

In short, there’s enough hype around this for philosophers to start abandoning the ancient existential query of “What is the self?”, i.e. “What is I?” in favour of the more contemporary “What is AI?”.

Just as fools rush in where angels fear to tread, so has the internet rushed to answer this new-age age-old question of what Artificial Intelligence is. My two cents to this chorus isn’t an attempt to cover the question from inception to date, but to approach it like any good Tarantino movie – by beginning in the middle.

The current frenzy around AI stems from a single fact – we now live in an era where children are educated, but machines learn. To understand this statement better, one need only read about the basics of machine learning (or watch this helpful CGP Grey video) and compare it to one’s school days. The insight is stark – machines today learn exactly how you and I were educated as kids: pattern recognition and repeated testing. From grade one, where we were taught “A for Apple”, to grade twelve, where it was “A for Aminoethylpiperazine”, the core system of pedagogy hasn’t changed since the Industrial Revolution. It is still a gavage of as much information as possible into a teenager, in the hope that he/she retains it until a pre-determined point at which he/she is expected to replicate it verbatim in a controlled setting (read: exam) to obtain a piece of paper attesting to the same.
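To make the analogy concrete, here is a minimal, hypothetical sketch (not from the original article) of “pattern recognition and repeated testing”: a toy learner in Python that is never told the rule behind its examples, but is drilled on them over and over, adjusting after every mistake until it can reproduce the pattern on demand.

```python
# Illustrative sketch only: a toy "student" drilled on examples of an unknown rule.
# The rule here happens to be y = 2x + 1, but the learner is never told this;
# it only ever sees example question/answer pairs.

examples = [(x, 2 * x + 1) for x in range(10)]

w, b = 0.0, 0.0          # the learner's current guess at the rule
learning_rate = 0.01

for epoch in range(1000):              # repeated testing, over and over
    for x, y_true in examples:
        y_guess = w * x + b            # attempt to reproduce the pattern
        error = y_guess - y_true       # how far off the answer was
        w -= learning_rate * error * x # adjust the guess based on the mistake
        b -= learning_rate * error

print(f"learned rule: y = {w:.2f}x + {b:.2f}")            # roughly y = 2.00x + 1.00
print("answer for unseen input 25:", round(w * 25 + b))   # roughly 51
```

The loop never “understands” the rule; it simply adjusts until its answers match the drill – which is precisely the point being made here about rote schooling.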

We call the ability to game this system “intelligence”. A testament to its failure is that we’re surprised when an artificial construct, whose sole purpose is to complete these steps iteratively and endlessly, ends up more intelligent than us.

Imitation may be the sincerest form of flattery, but not of intelligence.

So, if we are to judge Artificial Intelligence with the same yardstick by which we gauge our own intelligence, we need to decide if we really deserve the appellation.

“Intelligence” ought to demand more rigour than pattern recognition. If the only differentiation between human intelligence and Artificial Intelligence is that our scope of pattern recognition is larger, thanks to exposure to a larger universe of data, then we’re mistaking the inevitability of AI superiority for mere possibility. All current debates over “General vs Narrow AI” or “Strong vs Weak AI” aren’t based on principle but on the paucity of data that keeps machines from matching us intellectually.

Thus, in due time, the basis of existence – and thus intelligence – will shift from cogito ergo sum (“I think therefore I am”) to calculo ergo sum – “I calculate therefore I am” – and none shall be the wiser.

(Disclaimer: The views and opinions expressed in this article are those of the author and do not necessarily reflect the views of YourStory.)