The explosion of information is accelerating. We see it in our everyday use of email, online search, text messages, blog posts, and postings on Facebook and YouTube. The amount of data being created and captured is staggering. It is flooding over corporate walls, and it will only intensify as the next big explosion arrives: the Internet of Things, in which our machines talk to each other. At that point, the rate of information growth may become exponential. In his article for Industry Tap, David Russell Schilling explained the theory behind futurist Buckminster Fuller’s “Knowledge Doubling Curve”:
… until 1900 human knowledge doubled approximately every century. By the end of World War II knowledge was doubling every 25 years. Today … human knowledge is doubling every 13 months. According to IBM, the buildout of the “internet of things” will lead to the doubling of knowledge every 12 hours.
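To make the arithmetic of those doubling periods concrete, here is a minimal back-of-the-envelope sketch in Python (our illustration, not Schilling’s or Fuller’s) comparing how much a quantity grows in a single year under each doubling period quoted above.

```python
# Back-of-the-envelope comparison of annual growth under the
# doubling periods quoted above (illustrative only).

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Factor by which a quantity grows in `years` if it doubles
    every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

HOURS_PER_YEAR = 365 * 24

for label, period_years in [
    ("every 100 years (pre-1900)", 100.0),
    ("every 25 years (post-WWII)", 25.0),
    ("every 13 months (today)", 13 / 12),
    ("every 12 hours (IoT projection)", 12 / HOURS_PER_YEAR),
]:
    print(f"Doubling {label}: grows {growth_factor(1, period_years):.3g}x per year")
```

Under a 13-month doubling period, knowledge nearly doubles within a single year; under the 12-hour projection, the one-year growth factor is an astronomically large number, which is what “exponential” really means here.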
According to Gartner, as many as 25 billion things will be connected by 2020. As we try to make sense of this information, of what Tom Davenport calls the “analytics of things,” we will need methods and tools to assimilate and distill it into actionable insights that drive revenue. Troves of information are of little value if they are not used to give our companies a competitive edge. How are companies approaching the problem of monetizing this information today?
One approach that gets inconsistent results, for instance, is simple data mining. Corralling huge data sets allows companies to run dozens of statistical tests to identify submerged patterns, but that provides little benefit if managers can’t effectively use the correlations to enhance business performance. A pure data-mining approach often leads to an endless search for what the data really say.
This is a quote from the Harvard Business Review article “Making Advanced Analytics Work for You,” by Dominic Barton and David Court. The idea is reinforced by Jason Reiling, Group Director of Trade Capability at The Coca-Cola Company, who commented, “If we don’t link the business use of the data with the hypothesis and overall objective, we find situations where the data is guiding the analysis, versus the business guiding the data.” This sums up one of the biggest challenges in analytics today: organizations are throwing data at the problem hoping to find a solution, rather than understanding the business problem and aligning the right data and methods to it.
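Barton and Court’s point about pure data mining can be seen in a few lines of code. The following sketch (a hypothetical illustration of the statistical trap, not taken from their article) generates a data set that is pure random noise and “mines” it with a hundred correlation tests; by chance alone, several features will look statistically significant.

```python
# Why running dozens of statistical tests surfaces patterns that
# mean nothing: chance alone produces "significant" correlations.

import numpy as np

rng = np.random.default_rng(42)
n_rows, n_features = 500, 100

X = rng.normal(size=(n_rows, n_features))   # 100 candidate "drivers"
y = rng.normal(size=n_rows)                 # an outcome with no real driver

# Pearson correlation of each candidate feature with the outcome
corrs = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_features)])

# Under the null hypothesis, |r| > 1.96/sqrt(n) is "significant" at p < 0.05
spurious = int(np.sum(np.abs(corrs) > 1.96 / np.sqrt(n_rows)))
print(f"'Significant' correlations found by chance: {spurious} of {n_features}")
# Roughly 5 of the 100 features will look significant despite pure noise.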
What matters more at this point is not necessarily the amount of data, but the ability to codify and distill it into meaningful insights. Companies struggle with this due to a lack of integrated methods, tools, techniques, and resources. Those that solve the challenge will have a clear competitive advantage. However, this addresses only part of the problem; even with the most relevant information in hand, companies remain mired in poor decision making.
Decisions
The ultimate goal of collecting and synthesizing this information is to give executives and managers the insights to make better decisions. Decisions are at the heart of your business and the most powerful tool most managers have for achieving results. The quality of those decisions directly impacts the success of your organization. It is no longer acceptable to equip organizational leaders, managers, and analysts with one-off training courses and conferences and expect them to make quality decisions based on limited knowledge and gut feel. They have more information coming at them than ever before. Distilling that flood of information into actionable decisions your organization can monetize is the new challenge.
Unfortunately, simply distilling the information is not enough. There are various ways we undermine our own ability to make quality decisions, from decision fatigue to cognitive bias. One way to improve decision making is to use best practices and the collective wisdom of the organization, yet this is not widely done. In a study of more than 500 managers and executives, Erik Larson found that only 2 percent apply these best practices when making decisions. Fewer companies still have solutions in place to improve decision making.
When executives are not applying best practices or data to make a decision, more often than not they are relying on their intuition, or “gut.” This type of decision making is riddled with flaws, chief among them the cognitive biases that influence choice. A cognitive bias is a systematic deviation from sound judgment rooted in one’s preferences and beliefs. Confirmation bias, for example, is the tendency to look for information that confirms our existing opinions and thoughts. These biases distort our judgment and lead to errors in choice.
Another culprit of poor decisions is the set of hidden influences, such as mood, that can sway our choices. Consider the staffing decisions made by two field managers in two different locations. Whom to hire, when to hire, and when to let someone go are all decisions they make with little data and not much coaching. Their choices can vary widely based on years and type of experience, mood on a particular day, and whatever else is happening in their lives at that moment. The two are likely to make different staffing decisions even when presented with identical circumstances. This kind of discrepancy is what the authors of “Noise: How to Overcome the High, Hidden Cost of Inconsistent Decision Making” call noise.
The problem is that humans are unreliable decision makers; their judgments are strongly influenced by irrelevant factors, such as their current mood, the time since their last meal, and the weather. We call the chance variability of judgments noise. It is an invisible tax on the bottom line of many companies.
The prevalence of noise has been demonstrated in several studies. Academic researchers have repeatedly confirmed that professionals often contradict their own prior judgments when given the same data on different occasions. For instance, when software developers were asked on two separate days to estimate the completion time for a given task, the hours they projected differed by 71%, on average. When pathologists made two assessments of the severity of biopsy results, the correlation between their ratings was only .61 (out of a perfect 1.0), indicating that they made inconsistent diagnoses quite frequently.
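The staffing example above can be turned into a toy simulation. The sketch below (our illustration; the hire threshold, candidate quality score, and mood term are all assumed for the example) models two managers who score the same borderline candidate against the same bar, with a random “mood of the day” added to each judgment, and measures how often they reach the same hire/no-hire decision.

```python
# A toy simulation of "noise": identical candidate, identical
# criteria, but an irrelevant factor -- mood -- shifts each judgment.

import random

HIRE_THRESHOLD = 5.0        # assumed scoring bar
CANDIDATE_QUALITY = 5.2     # same borderline candidate every time

def manager_score(quality: float, mood_noise_sd: float = 1.0) -> float:
    """True candidate quality plus a random mood-of-the-day component."""
    return quality + random.gauss(0, mood_noise_sd)

random.seed(7)
agreements = 0
trials = 1000
for _ in range(trials):
    a_hires = manager_score(CANDIDATE_QUALITY) >= HIRE_THRESHOLD
    b_hires = manager_score(CANDIDATE_QUALITY) >= HIRE_THRESHOLD
    agreements += (a_hires == b_hires)

print(f"Managers agree on the same candidate {agreements / trials:.0%} of the time")
```

Run as written, the two managers agree only about half the time on an identical candidate; that invisible variability is exactly what the authors mean by noise.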
Along with noise, another impediment to decision making is decision fatigue: the deterioration in the quality of our decisions over the course of a day of decision making. For example, scientists Shai Danziger, Jonathan Levav, and Liora Avnaim-Pesso studied 1,112 bench rulings in a parole court and tracked the rate of favorable rulings throughout the day. Rulings started out around 65 percent favorable at the beginning of the day and fell to nearly zero by the end. The judges’ internal resources for making quality decisions had been depleted as the day wore on, resulting in less favorable rulings.
Company size presents yet another challenge. “Internal challenges of large organizations are big barriers to decision making,” according to an executive who runs analytics for a Fortune 50 company. She commented that it can take a year and a half to get an insight to market, given the effort of disseminating information through a large, matrixed environment. Every hop in the decision process slows speed to market and erodes the original intent of the decision.
How do we counter these factors that undermine the quality of our decisions? One way is to automate all or part of the decision process. Later in their article “Noise,” the authors state:
It has long been known that predictions and decisions generated by simple statistical algorithms are often more accurate than those made by experts, even when the experts have access to more information than the formulas use. It is less well known that the key advantage of algorithms is that they are noise-free: Unlike humans, a formula will always return the same output for any given input. Superior consistency allows even simple and imperfect algorithms to achieve greater accuracy than human professionals.
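The consistency point is easy to demonstrate. Below is a deliberately crude scoring rule (a hypothetical formula of our own, with made-up weights and threshold, not one from the article): whatever its accuracy, it returns the same answer for the same input every single time, which no human judge can promise.

```python
# A noise-free decision rule: identical inputs always yield
# identical outputs, regardless of mood, time of day, or fatigue.

def simple_hiring_formula(years_experience: float, test_score: float) -> bool:
    """A deliberately simple linear scoring rule (illustrative weights)."""
    score = 0.4 * years_experience + 0.6 * test_score
    return score >= 6.0

# Called today, tomorrow, before lunch, or after: same input, same answer.
print(simple_hiring_formula(5, 7))   # True, every time
print(simple_hiring_formula(5, 7))   # True, every time
```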
Our approach to raising the quality of decisions in your organization is to create embedded analytical solutions that help managers make data-driven decisions of monetary value that generate action. There is an abundance of evidence to support this approach. In one study, Andrew McAfee and Erik Brynjolfsson found that “companies in the top third of their industry in the use of data-driven decision making were, on average, 5% more productive and 6% more profitable than their competitors.”
About the Authors
Andrew Roman Wells is the CEO of Aspirent, a management-consulting firm focused on analytics. Kathy Williams Chiang is VP, Business Insights, at Wunderman Data Management. They are the co-authors of Monetizing Your Data: A Guide to Turning Data into Profit-Driving Strategies and Solutions. For more information, please visit www.monetizingyourdata.com.
Reprinted with permission of the publisher, Wiley, from Monetizing Your Data: A Guide to Turning Data into Profit-Driving Strategies and Solutions by Andrew Roman Wells and Kathy Williams Chiang. Copyright 2017 by Andrew Wells and Kathy Chiang. Available wherever fine books are sold.