Artificial intelligence started with the calendar and abacus

I wrote earlier this week about why artificial intelligence (AI) is such a big deal. Using Kevin Kelly’s example, I claimed artificial intelligence will have a bigger impact than electricity.

My argument was that electricity helps humans perform manual tasks. Artificial intelligence helps us perform cognitive tasks. Electricity makes us stronger. Artificial intelligence makes us smarter.

It makes sense that we’ve made more progress on the manual side. It’s easier. It’s easier to define. It’s easier to understand. Solving cognitive challenges will make an even larger impact. We’ve already freed our bodies to take on more interesting tasks. Imagine what can happen when we increasingly free our minds.

Artificial intelligence has been around a long, long time

Artificial intelligence sure feels like a fad. It’s a bona fide buzzword. Boring people with boring products try to use “AI” to capture undeserved attention.

Part of the reason artificial intelligence seems trendy is because we don’t have the proper perspective. AI isn’t new. It didn’t start with machine learning algorithms. It didn’t even start with the computer. Artificial intelligence, when properly defined, stretches back thousands of years.

What do I mean by “properly defined”? Artificial intelligence is any technology that helps a human being perform a cognitive task. In this light, a calendar is a piece of artificial intelligence. It supplements or replaces our memory. We don’t have to remember the timing of past or future events. We can outsource that effort to the calendar.

Likewise, an abacus is a piece of artificial intelligence. We can represent numbers in powers of ten and use the abacus to perform stepwise calculations. We have no need to perform complex arithmetic in our head. The abacus allows us to step through the calculation in a much less cognitively demanding way.
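
To make the stepwise idea concrete, here is a minimal sketch of a decimal “abacus” in Python. It stores a number one power-of-ten column at a time and carries between columns, roughly the way a person works the beads. The Abacus class and its methods are my own illustration, not a standard library or anything from the history books.

    # A toy decimal abacus: one list entry per power-of-ten column.
    # Purely illustrative; the names Abacus, add, and value are my own.
    class Abacus:
        def __init__(self, columns=8):
            # columns[i] holds the digit sitting in the 10**i place
            self.columns = [0] * columns

        def add(self, n):
            # Step 1: drop each digit of n into its matching column.
            i = 0
            while n > 0:
                self.columns[i] += n % 10
                n //= 10
                i += 1
            # Step 2: sweep through the columns, carrying any overflow
            # into the next higher column, like pushing beads up a rod.
            for i in range(len(self.columns) - 1):
                carry, self.columns[i] = divmod(self.columns[i], 10)
                self.columns[i + 1] += carry

        def value(self):
            # Read the stored number back out of the columns.
            return sum(d * 10 ** i for i, d in enumerate(self.columns))

    a = Abacus()
    a.add(387)
    a.add(256)
    print(a.value())  # 643

None of the arithmetic happens “in our head” here. Each step is a small, mechanical move, which is exactly the cognitive offloading the abacus provides.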

Artificial intelligence is so much more than today’s fancy algorithms

Again, AI is anything that takes a cognitive load off human beings. It’s much, much bigger than machine learning algorithms or the language processing used in chatbots. Focusing too much on today’s tech makes AI seem like a fad.

Think about all the cognitive effort you exert in a day. Now think of all the ways that machines could eventually take over those kinds of tasks.

We already have working examples. Computers recognize faces in photographs. Computers can even write captions for photographs. Computers assess and recommend investment strategies. Computers can design buildings and prosthetics.

Each is an example of artificial intelligence. Humans used to spend mental energy accomplishing these tasks. Today, we can outsource more and more of this work to computers. As a result, we have newly available cognitive resources to spend in any number of ways.

We will increasingly hand our cognitive tasks over to computers

What did you “have” to think about today? I put “have” in quotation marks because I’m talking about a certain type of thought. What decisions or judgments did you make that brought you little to no joy?

You probably had to decide what to wear. What to eat. Maybe you thought about what route to take to work. Or what exercise to do, and when.

At work, maybe you had to design a new process. Or build a new model or slide deck. Or decide which material to use in a new product. Or compare service offerings from two different vendors.

All of this stuff will go away. Eventually, humans won’t have to think about any of it.

Humans might impose some constraints on the artificial intelligence. I don’t want to wear this kind of shirt. Or eat this kind of food. Or use this kind of material in my new product. The computer will work within the remaining degrees of freedom. Our cognitive effort will be minimal.

The analogy to manual effort is clear

Artificial intelligence is a huge deal. We already know the scale of its impact. Just look at how much manual effort we’ve freed ourselves from.

Machines have driven much of the manual effort out of farming, manufacturing, and transportation. Humans still expend physical energy in these areas. But the volume of such energy has diminished considerably over time.

The same thing will happen cognitively. We use our brains to process so many things today that we simply won’t think about tomorrow. We’re already seeing the very beginnings of it.

With emerging autopilot systems, we don’t have to think as much about parallel parking a car or driving on a freeway. We have to think very little about what music to listen to, or what television show or movie to watch. Online services remove many of the day-to-day decisions about clothing and food.

We’re moving along the same path that we did when we reduced our physical burdens. Now we’re tackling our mental burdens. And as we free mental resources, we’ll think much more deeply about all kinds of things we haven’t explored much before.

To really appreciate AI, maintain the proper perspective

Don’t get all wrapped up in the particulars of today’s flavor of artificial intelligence. It’ll change. It always has.

What hasn’t changed is our desire to push more and more tasks onto machines. Until now, the vast majority of these tasks have been manual. Today we’re having more success handing over some of our most tedious cognitive tasks.

Think about what you think about. Think about what your colleagues think about. Or what your suppliers think about. Or what your customers think about.

Machines will eventually take over all these tasks. It won’t happen tomorrow. But it will happen. Don’t let the fad-like hysteria around AI fool you. It’s not going away. We’re just getting started. And fortunately for us, the timing couldn’t be any better.
