Some Thoughts on Automation

Since reading a couple of books and a lot of articles on automation, a few thoughts have been floating around my mind. Such as: what if losing some skills is not all that bad? Or, what if the average IQ will not be sufficient in the future? Or what if the evolutionary basis of our survival was not just our brains or our hands and thumbs, but our collaborative nature in combination with all of those things? What if, what if, what if.

First, the skills

There is a strain of thought that posits that once tasks are automated, we will lose the capability to do those tasks. An example that comes to mind is the ability to read maps to find our way around town. It is said that we have become so reliant on GPS that we are losing our ability to navigate or even to use maps (if I remember correctly, this came from one of my numerous books - maybe The Shallows). But stop and think about this. We used to know how to ride horses, but now most of us depend on cars. Only a small subset of us know how to ride and care for horses; the rest of us are doing fine without them. Likewise, most of us don't know how to hunt or farm. We have big companies, small farms, or hobbyists that do that, and we certainly aren't hurting. So maybe we won't miss the loss of a few skills. Certainly, future generations will not miss them, as they won't have known anything different.


Only a small subset of people will retain these skills, and let's face it: are you really going to miss knowing how to do trigonometry?

Survival of the fittest

Maybe humankind came to dominate the world because we collaborated to get things done. Yes, we have larger brains and, more specifically, the executive function that allows us to do higher-order thinking. We also have hands and thumbs that enable us to create and use tools. And we have a voice box that allows us to communicate. All of those things together allowed us to dominate the world. But I doubt our ancestors succeeded as individuals. It probably took a crew to bring down a mastodon. We probably needed to band together in order to thrive in a hostile world.

So how does this relate to automation? As we start to automate jobs, the capital owners are going to be very tempted to do away with labor in order to reduce costs and pocket the increased profits. But if our survival depended on collaborating with others in the past, I think that will still hold true in the future. The owner alone, with only his machines, will not come up with new ideas to solve new problems. It will be a group of humans collaborating that comes up with solutions. So I'm saying: you probably don't want to "throw away" your labor, because they are going to come up with solutions to problems we haven't yet thought of or encountered. We all solve problems based upon the knowledge built by those before us.

Making, Doing, Prototyping

This one is going to contradict the first point: in order to come up with new ideas and innovations, we have to know how to do things. Some of the mathematical thinking we are handing off to our computers, or the making of things we are automating - we still need to know how to do those things. Only by doing them ourselves do we start to understand how they work and their potential for change. Experts become masters by practicing, and making, doing, and prototyping are forms of practice. And it is at the master level, not the expert level, where innovations come into being - at least, that is how I understand the current theory of mastery and innovation. There is a curse where you know so much about something that you can't move beyond the current knowledge or break open the box to innovate. Sometimes it is the ignorant who come up with the new. But other times it is the masters who generate the innovative solutions or ideas that move mankind forward - though only those masters who can move beyond the box. So along with automation, we must ensure that some of us continue to understand what the computers and software are doing. We can't rely on the machines alone.

Whither Average IQ?

What if the current average IQ is not enough to survive in the future world of automation? A couple of months ago I did some preliminary research on IQ, and from what I could tell, you need an IQ of at least 115 to make it in college. And only roughly 30% of the population makes it through college. With the way technology is moving, it looks like the only jobs left will be those requiring an extremely high level of cognition, so we will have an even smaller pool of people with sufficient mental capacity.

Furthermore, what if an average IQ is actually dumb? We have a lot of people who are unable to make a distinction between facts and alternate reality. Some of the things I read in the news about conspiracy theories are mind-boggling. Some people are just incurious. Recently, there was a lot of news about protests around the country after the Trump inauguration, and yet there were some women in Wisconsin who had not heard about the protests. How in the world do you get so insulated from the news? So I wonder: what if average IQ is just not smart enough?
