No Easy Button: The Real Story of Big Data Analytics

Now that Big Data Analytics has become a hot topic, Big Data expert Prof. Mariann Jelinek observes, “the ‘hype cycle’ has advanced to include both puff and backlash.”

Big Data enthusiasts often write about such operational advantages as shorter customer wait times or queue lengths, with claims of greater effectiveness. Other commentators tout Big Data as a panacea that will solve all problems, but offer little specificity as to how. And in the other corner, backlash attacks on Big Data Analytics often set up a straw man by envisioning an “easy button” version, untouched by human intervention, where machines do all the thinking!

“Not so,” insists Prof. Jelinek: neither side has it right. Here’s her explanation of what users can realistically expect:

Big Data does offer promise, but it demands context, interpretation and relationships among data points. For the past six years, CIMS has used Big Data in Directed Big Data Analytics (DBDA), but only with extensive up-front discussion, domain definition, identification of relevant URLs, and so on, to create the needed context.

The first challenge to using Big Data effectively is — it’s Big! The Internet is a huge universe of potential data, some worthwhile and some worthless. Finding what’s useful in any given application requires careful thinking about where relevant, trustworthy data might reside.

Where innovation is concerned, for example, the answer may not be immediately obvious, because new developments have not yet reached the mainstream. Then, once the relevant data has been captured, it’s apt to be a huge and unwieldy corpus; millions of pages are not unusual. Finding the right needle in this haystack is a non-trivial problem. Browsing such a large corpus is neither easy nor fast, and tools like statistical sampling are generally not helpful, since the information needed for innovation may be atypical.

What DBDA Does

DBDA, in contrast, enables rapid, timely capture of huge amounts of information, which then becomes instantly accessible for targeted review. For disease information, for instance, directed web crawls target high-value websites like those of the CDC, WHO, and relevant research labs. The crawlers capture, filter, and index what they find, making accessible vastly more information than any individual could possibly address.

The breadth of data capture helps to ensure effective coverage of the targeted information field. Filtering and indexing the captured data ensures that you can actually target the topics of interest within the data in small enough bites to work with. So, from a million web page images or article pages (for instance), the software allows you to create a search string that presents you with the 4 or 10 articles directly relevant to your topic of interest.
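To make the capture-filter-index idea concrete, here is a minimal Python sketch, not the CIMS software itself: it visits a small hand-picked list of seed pages, keeps only those that mention chosen topic terms, and builds a simple inverted index so that a search string can pull out the handful of relevant documents. The seed URLs, topic terms, and helper functions are illustrative assumptions.

# Minimal sketch of a directed capture-filter-index cycle; not the CIMS DBDA software.
# Seed URLs, topic terms, and function names are illustrative assumptions.
import re
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

SEED_URLS = [  # hand-picked, high-value sources (assumed examples)
    "https://www.cdc.gov/tb/",
    "https://www.who.int/health-topics/tuberculosis",
]
TOPIC_TERMS = {"tuberculosis", "tb", "drug-resistant"}  # filter vocabulary

def fetch_text(url: str) -> str:
    """Download a page and strip it to plain text."""
    html = requests.get(url, timeout=30).text
    return BeautifulSoup(html, "html.parser").get_text(" ", strip=True)

def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z0-9-]+", text.lower())

# Capture and filter: keep only pages that mention the topic terms at all.
corpus = {}
for url in SEED_URLS:
    try:
        text = fetch_text(url)
    except requests.RequestException:
        continue  # skip unreachable sources
    if TOPIC_TERMS & set(tokenize(text)):
        corpus[url] = text

# Index: a simple inverted index from term -> set of page URLs.
index = defaultdict(set)
for url, text in corpus.items():
    for term in set(tokenize(text)):
        index[term].add(url)

def search(query: str) -> set[str]:
    """Return pages containing every term in the query string."""
    terms = tokenize(query)
    hits = [index[t] for t in terms if t in index]
    return set.intersection(*hits) if hits else set()

print(search("drug-resistant tuberculosis treatment"))

A real directed crawl would follow links outward from such seeds and index far richer metadata, but the sequence is the same: capture, filter, index, then search.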

The Big Point I am making is that DBDA is no replacement for human Subject Matter Experts; instead, it’s an extension of SME capability. The SME still has to read these articles and decide what they imply — no easy button here. But the SME will be able to read 4 or 10 articles with confidence that they are indeed everything in (say) the field of tuberculosis that is currently reflected in the literature, on researchers’ websites, and in recent conference proceedings.

Try Doing This with Google!

Data becomes useful in proportion to its ability to affect decisions — to drive action. CIMS researchers use a well-documented method, AHP (the Analytic Hierarchy Process), to create a decision model that places the data into relationships and transforms it into evidence by which you can guide actual business decisions. This implies identifying and overcoming the decision biases that erode decision quality, the cultural barriers to change, and the inertia and incrementalism that keep firms stuck in “business as usual.”
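For readers unfamiliar with AHP, the short Python sketch below shows its core calculation under assumed, hypothetical judgments: pairwise comparisons among criteria are collected on Saaty’s 1-to-9 scale, and the principal eigenvector of the comparison matrix yields the priority weights. The criteria and judgment values are placeholders, not CIMS data.

# Sketch of the core AHP calculation: pairwise comparisons -> criterion weights.
# The criteria and judgment values are hypothetical, for illustration only.
import numpy as np

criteria = ["partner credibility", "technology fit", "cost"]

# Saaty-scale pairwise judgments: A[i, j] = how strongly criterion i
# is preferred over criterion j (1 = equal, 9 = extreme preference).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal eigenvector of A gives the priority weights.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")

Those weights are what let disparate pieces of evidence be placed into relationships and compared on a common footing.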

Finding Credible Partners

The accompanying schematic illustrates decision stages as they would apply to strategic alliances with other organizations. This is an increasingly important aspect of innovation, particularly for firms “stuck in incrementalism” because they have previously made drastic reductions in R&D, marketing, or other critical innovation functions. Outsourcing these functions is, in essence, a decision to depend upon strategic alliances going forward.

DBDA is tailor-made to manage a wide search for the credible partners central to success, and to evaluate potential partners once they are found. Inherent human cognitive biases can unduly constrain the search, truncate consideration of alternatives, or bias the evaluation. Well-crafted procedures built into the CIMS DBDA approach mitigate those hazards, as the schematic indicates: it identifies the decision traps at each likely stage in the decision process and provides procedural remedies for them. Properly framing the problem, identifying the needed information, then getting it, and evaluating alternatives all improve decision quality.

Well-crafted procedures built into the CIMS DBDA approach mitigate the hazards—decision traps—shown here.


The heart of the DBDA computer search process resides in Natural Language Processing (NLP) algorithms that enable effective search of unstructured web materials — for instance, text in websites, professional journal articles, and conference presentations. However, while these search and filter capabilities dramatically expand the breadth and reach of the search, note that this “heart” is surrounded by multiple elements of a process targeting the cultural resistance that typically undermines a stuck firm’s ability to “unstick” itself.
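As one illustration of how such retrieval can work, the sketch below ranks a few stand-in documents against a query using TF-IDF weighting and cosine similarity, a common NLP technique; it is offered only as an example, and the CIMS algorithms may well differ. The documents and query are invented for the illustration.

# Sketch of one common NLP retrieval technique (TF-IDF ranking), used here only
# to illustrate searching unstructured text; the CIMS algorithms may differ.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Stand-ins for crawled, unstructured documents (web pages, article abstracts).
documents = [
    "New assay shortens detection of drug-resistant tuberculosis strains.",
    "Conference notes on supply-chain analytics for consumer electronics.",
    "WHO guidance updates treatment regimens for multidrug-resistant TB.",
]

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(documents)

query = "drug-resistant tuberculosis treatment"
query_vec = vectorizer.transform([query])

# Rank documents by cosine similarity to the query and show the best matches first.
scores = cosine_similarity(query_vec, doc_matrix).ravel()
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.2f}  {documents[idx]}")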

Credibility of outside information is always an issue: a strong joint project team of credible organizational actors helps reassure the culture that recommendations are believable.

In other words, bringing resources to bear to reframe the firm’s understanding and recontextualize its strategic position is as essential as the information itself. Indeed, without this reframing and recontextualization, a firm may be quite unable to understand the nature of its difficulties or the potential resources available to solve them.

It Starts with a Joint Project Team

When CIMS researchers apply DBDA to a problem, they begin by convening a heavyweight Joint Project Team (JPT) including firm executives with decision responsibility, data scientists to manage the software, and relevant outside experts to ensure appropriate depth and breadth of knowledge for the discussion.

We organize the discussion using well-documented strategic and decision models: PESTEL (for the Political, Economic, Social, Technological, Environmental and Legal factors that affect a business now, or may in the future) and AHP (the Analytic Hierarchy Process, a mathematically based framework for weighting and scaling factors relevant to a business decision) (1). These models help to expand thinking and to induce the all-important discussions that help decision makers understand the context of their choices and thus have confidence in the recommendations.

Other similarly broad-scope models could be used; these are well-documented and generally well understood, and they facilitate the exchanges that eventually result in consensus understanding of the strategic context.
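As a toy illustration of how the two models fit together, the sketch below groups hypothetical factors under the six PESTEL categories and combines assumed AHP-derived category weights with evidence scores into a single comparable number for one alternative. Every factor, weight, and score shown is a placeholder, not CIMS output.

# Sketch of how PESTEL categories and AHP-style weights might structure a
# discussion. All factors, weights, and scores are hypothetical placeholders.
PESTEL_FACTORS = {
    "Political":     ["regulatory approval timelines"],
    "Economic":      ["partner R&D funding stability"],
    "Social":        ["clinician adoption of new diagnostics"],
    "Technological": ["maturity of rapid-assay platforms"],
    "Environmental": ["cold-chain requirements"],
    "Legal":         ["IP-sharing terms in alliance contracts"],
}

# AHP-derived category weights (summing to 1) and evidence-based scores (0-1)
# assigned by the Joint Project Team for one candidate alternative.
weights = {"Political": 0.10, "Economic": 0.25, "Social": 0.15,
           "Technological": 0.30, "Environmental": 0.05, "Legal": 0.15}
scores  = {"Political": 0.6,  "Economic": 0.7,  "Social": 0.5,
           "Technological": 0.8, "Environmental": 0.9, "Legal": 0.4}

overall = sum(weights[c] * scores[c] for c in PESTEL_FACTORS)
print(f"Weighted score for this alternative: {overall:.2f}")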

Initial PESTEL and AHP analyses are, essentially, informed hypotheses to be tested against evidence from the web search and revised as necessary; the aim is to let the evidence speak.

The iterative process of identifying relevant factors and likely information sources, specifying domains of interest and suggesting keywords and phrases hones team members’ understanding as the evidence comes in. Revised models, rooted in evidence, may well be substantially different from those initially proposed. They are certainly more robustly supported, with much broader evidence, after web crawls and analysis.

Overcoming Human Weaknesses

Such evidence-based discussion directly targets the normal biases and heuristics of human decision-making, as the schematic shows. Nobody holds “all the evidence” in mind, but cognitive limits can be overcome with models like PESTEL and AHP.

It’s easy for logical errors and contradictions to creep into complex problems; AHP checks the judgments for consistency and enables the JPT to rapidly evaluate alternatives, so that decision makers have a good sense of what might be done, how the alternatives compare, and (most importantly) why — in their opinions — one option is better than another. Moreover, credible JPT members help ensure organizational acceptance of the resulting decisions.
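The consistency check mentioned above is itself a simple calculation. The sketch below computes Saaty’s consistency ratio for a hypothetical judgment matrix (the same one used in the earlier AHP sketch); a ratio below roughly 0.10 is conventionally taken to mean the judgments are acceptably consistent.

# Sketch of AHP's consistency check (Saaty's consistency ratio) for a
# pairwise-comparison matrix; the judgments below are hypothetical.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
n = A.shape[0]

lambda_max = np.max(np.linalg.eigvals(A).real)   # principal eigenvalue
CI = (lambda_max - n) / (n - 1)                  # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]              # Saaty's random index
CR = CI / RI                                     # consistency ratio

# Rule of thumb: CR below about 0.10 means the judgments are acceptably
# consistent; otherwise the pairwise comparisons should be revisited.
print(f"lambda_max={lambda_max:.3f}, CI={CI:.3f}, CR={CR:.3f}")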

In short, the CIMS Directed Big Data Analytics process is far from an “easy button” — and it’s no panacea. It is a far more rigorous and robust approach to properly contextualizing relevant Big Data so that it can drive evidence-based decisions, and it is clearly more powerful than gut feel, biased decisions, or inadequately sourced strategies.

Reference

1. Saaty, T. L. Decision Making for Leaders: The Analytic Hierarchy Process for Decisions in a Complex World. Pittsburgh: RWS Publications, 2012.

Mariann Jelinek, Ph.D. is The Richard C. Kraemer Professor of Strategy, Emerita at the College of William & Mary; samantine@icloud.com
