I liked the book ‘What to Think about Machines that Think’ immediately. Along with the jaunty title, it has a snappy structure – approximately 185 mini essays, brain bytes, by sage people about AIs (artificial intelligences). Each contribution is three or four pages long, which is, apparently, how long a thought is when written down.
The essayists responded to the question ‘What do you think about machines that think?’ I’m making my way through and have so far read mostly entries from engineers and physicists. This book is the most fertile source of thought stimulation I’ve encountered in a long time. Each contribution is wonderful, and I’m riffing off of most of them.
‘Contemplation of artificial intelligence makes us ask who we humans are,’¹ Murray Shanahan writes. One of the book’s themes is ‘who are we?’, although in this case it’s the desire to set ourselves apart from AIs that triggers the existential question.
How are we different from thinking machines? Steven Pinker suggests the way AIs think is nothing special²; it’s a series of logical conclusions. A simple example is the hierarchy of suggestions you get when you start to enter a URL into your search engine. It may seem like the interface ‘knows you’ and can anticipate your interests, but really, the suggested sites are based on simple statistics about your previous behaviour. Similarly, your wise grandmother might have seemed to know things about you when you were a child that you didn’t know yourself. And she’s smarter than a rudimentary AI. She watched your reactions in a number of situations and recognized the trends, like the search engine does, but unlike the software, she understands human nature and what motivates you. When it comes to human nature, we’re often very predictable. Shakespeare provides good evidence of this. Although he wrote centuries ago, his portrayals of young lovers (Romeo and Juliet), corrupt yet ambitious leaders (Macbeth), and crafty business people (The Merchant of Venice) ring as true today as they did when the plays debuted.
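Pinker’s point about ‘simple statistics’ can be made literal. Here’s a toy sketch – my own illustration, not anything from the book – of how a suggestion list might rank previously visited sites that match what you’ve typed, purely by visit frequency:

```python
from collections import Counter

def suggest(history, prefix, k=3):
    """Rank previously visited URLs matching the typed prefix by how
    often they were visited. No 'knowing you' -- just counting."""
    counts = Counter(history)
    matches = [url for url in counts if url.startswith(prefix)]
    return sorted(matches, key=lambda url: counts[url], reverse=True)[:k]

history = ["news.example.com", "news.example.com", "mail.example.com",
           "news.example.com", "nature.example.org"]
print(suggest(history, "n"))  # → ['news.example.com', 'nature.example.org']
```

A few lines of counting, and the software appears to ‘anticipate your interests’. Real browsers weight recency and other signals too, but the principle is the same.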
Emotion could be our defining feature. An interesting observation by Steven Pinker, ‘Being smart is not the same as wanting something,’² could suggest our primal ancestry will set us apart. Was that the author’s intent? The idea of motivation – of driving force, ambition, compulsion – fills my heart with pride for humankind. Machines don’t strive to excel or make heroic efforts. They do what they’re programmed to do. They achieve goals. If the goal is to maintain a temperature of 22 degrees in a room, they trigger the heating elements and cooling vents of the HVAC system to warm or chill the air whenever the temperature deviates from the desired one. Machines don’t care that the three-year-old twins have a fever and are malnourished because their father is unemployed; the AI still keeps the temperature at 22 degrees. A human superintendent knows the fragility of toddlers and the added stresses of poverty and secretly tweaks the heating system to divert more heat to protect the young, even if their mother can’t afford it.
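To see how little ‘wanting’ is involved, the machine’s entire 22-degree ‘goal’ reduces to a few lines of logic. This is a sketch of simple bang-bang thermostat control – my own illustration, not real HVAC firmware:

```python
def thermostat_action(current_temp, setpoint=22.0, band=0.5):
    """Bang-bang control: heat when too cold, cool when too warm,
    otherwise do nothing. The 'goal' is just a comparison."""
    if current_temp < setpoint - band:
        return "heat"
    if current_temp > setpoint + band:
        return "cool"
    return "idle"
```

Call it with the room temperature and it returns ‘heat’, ‘cool’, or ‘idle’. Nowhere in there is a place for caring about feverish toddlers.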
Humans have survival instincts – very strong ones – which may set them apart from AIs. Does an AI even care whether it’ll be turned off tomorrow? I suspect that depends on what it believes it needs to do the next day, but I’m sure it wouldn’t fight to the death to protect itself, unlike most people, who would sacrifice everything to be sure they get out of bed tomorrow, even if it’s to face the same old dripping tap, sour milk, and demonically possessed boss.
Is it instincts that set us apart from AIs? We still have a primitive area in the brain responsible for instinctive, or involuntary, actions. My own opinion, based on observing people, is that this primitive brain controls more of our behaviour than we’re aware of. If that’s the case, it could distinguish us from AIs.
We honour and hold in high esteem leaders who are intuitive – those who make the logical leaps most of us are afraid to pursue. Are these intuitive leaps instances of higher thought – processing so fast that only the outcome is apparent? That would be AI-ish.
I consider instincts and intuition closely related, although many would not³. Instincts are subconscious – leading us to perform acts without deciding to do so. We act instinctively to pull our hand out of a flame or to veer the car clear of an oncoming truck on the highway. When the adrenaline wears off, we’re proud of our quick thinking. Intuition is generally considered more conscious, related to thought. However, an intuitive action or decision is one that ‘comes from the gut’ or ‘feels right’. Whether it’s to take a different route home or to hire the kid with no experience, when we realize the benefits of the choice, we learn to ‘trust our intuition’. So, is intuition higher thinking than instinct? Some explain intuition as a subconscious compilation of knowledge gathered in the brain. Could it be that intuition is the instinct of thought?
This is my premise: humans are different from AIs because we evolved from earlier species, and we do things that don’t reduce to a series of logic equations. AIs are cool. We made them, so they have the potential to be okay. Or at least as okay as runaway trucks, fires, demonically possessed bosses, and new hires from hell. But don’t worry. We know how to disconnect the power supply, at least on the AIs.
¹ Murray Shanahan, in Brockman, J. (ed.) (2015) What to Think about Machines that Think, Harper Perennial, NY, pp. 1–4
² Steven Pinker, in Brockman, J. (ed.) (2015) What to Think about Machines that Think, Harper Perennial, NY, pp. 5–8
³ I have to giggle. One site I found that explained the difference between instinct and intuition used human mate choice as an example of something decided intuitively, because it was the culmination of too many thought processes to be reduced to explanation. If ever there was a decision that biologists could explain at an instinctive level, it’s mate selection. Ha-ha. Geek moment.
originally posted March 3, 2016