Overall, this excellent collection has two interesting articles on the automation of knowledge work. One emphasizes the imperfections of algorithms; the other suggests strategies for humans when AI can perform some of their tasks.
(Based on my experience in marketing, marketing automation did not reduce the demand for knowledge workers but rather expanded the “knowledge” required to be a marketer. Although companies initially assumed they would need fewer marketers with the advent of automation, that assumption proved incorrect. Automation allowed more knowledgeable people to achieve better results, but the minimum number of people, and the minimum amount of knowledge, needed to realize any ROI on automation actually increased.)
Algorithms Need Managers, Too
The article suggests that, however sophisticated, algorithms are literal: they require very precise instructions and an understanding of their limitations. The typical example is instructing an AI to “save the Earth,” whereupon it attempts to eliminate humans as the most efficient way to achieve the objective.
Example: a predictive algorithm selected products that could be purchased in China and resold in the US. The program worked well until customers started returning the products; long-term product satisfaction was not built into the process.
Example: an algorithm can predict clicks on an ad, but the required result is a sale; optimizing for clicks will generate more activity but may not generate revenue.
Example: Netflix’s predictive algorithm for DVD rentals did not carry over to video streaming.
Also remember that correlation still doesn’t mean causation. Suppose an algorithm predicts that short tweets will get retweeted more often than longer ones. This does not in any way suggest that you should shorten your tweets. It is a prediction, not advice. It works as a prediction because many other factors that correlate with short tweets make them effective. This is also why it fails as advice: shortening your tweets will not necessarily change those other factors.
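The prediction-versus-advice distinction can be illustrated with a toy simulation (entirely hypothetical numbers, not from the article): a hidden factor drives both tweet length and retweets, so length predicts retweets without causing them.

```python
import random

random.seed(0)

# Hypothetical model: a hidden confounder ("skilled author") drives both
# tweet length and retweet count. Length itself has no causal effect.
def make_tweet():
    skilled = random.random() < 0.5                    # hidden confounder
    length = random.gauss(80 if skilled else 180, 20)  # skilled authors write shorter
    retweets = random.gauss(50 if skilled else 5, 2)   # retweets driven by skill only
    return length, retweets

data = [make_tweet() for _ in range(10_000)]

short = [rt for ln, rt in data if ln < 130]
long_ = [rt for ln, rt in data if ln >= 130]
print(f"avg retweets, short tweets: {sum(short)/len(short):.1f}")
print(f"avg retweets, long tweets:  {sum(long_)/len(long_):.1f}")

# Length is a good *predictor*: short tweets average far more retweets.
# But mechanically shortening a long tweet only changes `length`, not the
# hidden `skilled` factor, so its expected retweets would stay the same.
```

In this sketch the short-tweet group averages roughly ten times the retweets of the long group, yet the "intervention" of trimming characters would change nothing, because the model's only causal input is the unobserved skill factor.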
The article ponders a future in which AI takes over some of a knowledge worker’s tasks, and what knowledge workers could do in response:
- Step up (strategy)
- Step aside (area that requires human interaction)
- Step in (work with algorithms – probably the default “augmentation” approach)
- Step narrowly (area within profession that is unlikely to be automated)
- Step forward (create the next generation of AI)
How Indra Nooyi Turned Design Thinking into Strategy (Pepsi)
The article explains “design thinking” very well, using easily understandable examples from Pepsi.
Interesting: Pepsi also uses a variation of “reverse innovation” – launching a product in a smaller market (outside its home US market), where the cost of failure is acceptable.
Interesting: Pepsi calls healthy products “good for you,” and products that do not fall into this category “fun for you.”
“Every morning you’ve got to wake up with a healthy fear that the world is changing, and a conviction that, to win, you have to change faster and be more agile than anyone else.”
People Before Strategy
Discussion of people should come before discussion of strategy. What are employees’ capabilities, what help might they need, and are they the very best?