The Economist’s Special Article on AI
The Economist ran a special section on AI in last week's issue. Over the last few years I've been tracking articles that discuss AI: what it can do, who is likely to be affected, and what we should do about upgrading our skills. From those readings, it seems we have moved steadily deeper into automation. The section had plenty of examples of how automation, artificial intelligence and machine learning are being used. The first areas to be affected will be customer service, HR and finance – no surprises there. Then the disruption will move into whole industries. The article says there are both positives and negatives.
You can try to reach the article here, but I suspect you will either have to buy the magazine (or find a library that carries it) or pay to get past the paywall.
From my reading of the article, most of the positives will accrue to companies rather than to employees, hence my question of who is going to enjoy the “economic value” being created. The negatives will be things such as disruption, loss of employee privacy, and Orwellian surveillance.
“Firms can use AI to sift through not just employees’ professional communications but their social media profiles, too.” The Economist, March 31st – April 6th, 2018, “AI-Spy”, p. 13.
These spying capabilities extend to governments as well, and authoritarian governments such as China will deploy these technologies as rapidly as they can. So it is critical that we hold on to our democracy, because once it falls, technology will make it supremely difficult, if not impossible, to bring democracy back. Once IoT and AI are embedded everywhere, it will be impossible to hide, to plan, to do anything.
The other thing the article was upfront about was why companies are rushing to deploy AI, aside from the fear of being crushed by the frontrunners:
“In private, many bosses are more interested in the potential cost and labor savings than in the broader opportunities AI might bring…That is certainly not good for workers, but nor, ultimately, is it good for business…Some companies may not actually eliminate existing jobs but use technology to avoid creating new ones. And workers who keep their jobs are more likely to feel spied on by their employers”. The Economist, March 31st – April 6th, 2018, “GrAIt Expectations”, p. 5.
So all of that talk about using technology as a complement to human skills is just “a whole lotta wishful thinking.” As long as shareholder value remains the primary goal of business, there will be greater emphasis on cost cutting than on complementing human skills. We would first have to change our value system to demand a broader perspective from businesses than just the profit-making bottom line.
The article makes for very interesting reading, especially the many ways AI can be put to use and perform so much better than humans. It seems there is nothing it cannot do…except:
“they can easily do jobs that humans find mind-boggling, such as detecting tiny flaws in manufactured goods or quickly categorizing millions of photos of faces, but have trouble with things that people find easy, such as basic reasoning.” Ibid., p. 4.