The Blind Men and an Elephant

Distinguishing digital technologies for clear-eyed transformation

As enterprises craft digital transformation strategies while climbing a difficult learning curve on cutting-edge solutions, they are often presented with graphic depictions of digital transformation technologies that treat highly diverse tools and assets as if all were of equal utility, value, and adaptability. Part of that learning curve is distinguishing the interdependencies among these technologies in order to adopt them to best advantage.

At its simplest, the parable of the blind men and an elephant describes a group of blind men who have never encountered an elephant and are asked to conceptualize what it is like by touching it. Each blind man feels a different part of the elephant’s body, but only one part, such as the tail, an ear, or a tusk. Each then describes the elephant based on his partial experience. “It’s a rope” (the tail). “It’s a spear” (the tusk). “It’s a fan” (the ear). And so forth.

Nearly every time the subject of digital transformation is raised, we are faced with a version of this parable. Are we being presented with a holistic view, or merely with fragments that beg for elaboration?

In brief, digital transformation is any substantive change enabled through the use of digital assets. While digital transformation can occur with or without cutting-edge technologies, a myriad of software vendors and consulting firms tend to tout ‘flat’ graphics of the panoply of recent (or recently rebranded) technologies that appear to constitute ‘digital transformation’.

Among them:

  • Data Intelligence
  • Blockchain
  • Cloud
  • Internet of Things (IoT)
  • Machine Learning
  • Big Data
  • Analytics
  • Artificial Intelligence
  • Mobility

It can safely be said that no enterprise deploys all of these technologies (which, at base, are tools rather than solutions) in an integrated fashion. So, no, your company is not the only one.

Even more important is the recognition that there are key interrelationships among several of these technologies. Artificial Intelligence (AI), the simulation of human intelligence processes by machines, especially computer systems, is a prime example. AI solutions generally depend upon a collection of preceding technologies, as the following example illustrates.

As a first step, the Machine Learning process usually requires a massive amount of data, thus ushering in a dependency upon Big Data. While any of the data may have intrinsic value, the exercise is to extract specific ‘needles of intelligence’ from the data haystack. Whatever is not specific to the pattern being sought is treated as ‘hay’ and discarded.
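
As a rough illustration of this step, the sketch below shows what ‘separating needles from hay’ can look like in practice; the file name, column names, and threshold are hypothetical assumptions, not a prescription.

```python
# A minimal, hypothetical sketch of separating 'needles' from 'hay'.
# The file name, column names, and threshold are illustrative assumptions.
import pandas as pd

# The 'haystack': a large volume of raw records.
haystack = pd.read_csv("raw_events.csv")

# Keep only the fields that bear on the pattern being sought ...
relevant = haystack[["customer_id", "amount", "event_count"]]

# ... and only the rows that could plausibly match it; the rest is 'hay'.
candidates = relevant[relevant["amount"] > 10_000]

print(f"Kept {len(candidates)} candidate rows out of {len(haystack)} raw rows")
```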

Big Data is obviously volume-centric, but larger ‘haystacks’ are not guaranteed to yield more ‘needles of intelligence’.

Oftentimes, the needles are not directly extractable. If the machine learning is supervised, extraction may instead come from inferring their presence through a prescribed ‘search pattern’ or ‘predictive analytics’.
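
A minimal sketch of that supervised step follows; the dataset, feature columns, and model choice are illustrative assumptions. The labels encode the prescribed ‘search pattern’, and the trained model infers the presence of that pattern in records it has not yet seen.

```python
# A minimal sketch of supervised learning against a prescribed 'search pattern'.
# The dataset, feature columns, and model choice are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

data = pd.read_csv("labeled_events.csv")          # hypothetical labeled extract
features = ["amount", "event_count", "account_age_days"]
label = "matches_pattern"                          # 1 = needle, 0 = hay

X_train, X_test, y_train, y_test = train_test_split(
    data[features], data[label], test_size=0.2, random_state=42
)

# The labels define the pattern; the model learns to infer its presence.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# How reliably are the 'needles' flagged in data the model has not seen?
print(classification_report(y_test, model.predict(X_test)))
```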

And, finally, the needles of intelligence found within those patterns in the data are culled and scrutinized with the aid of Analytics to generate the foundational knowledge and rules for Artificial Intelligence.
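
One hedged illustration of that Analytics step follows, with synthetic data and invented feature names standing in for the flagged ‘needles’: ranking which signals actually drive the detected pattern turns model behavior into explicit, reviewable knowledge and candidate rules.

```python
# A rough sketch of the Analytics step: turning model behavior into explicit,
# reviewable knowledge. Feature names and data are illustrative assumptions.
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for the labeled 'needles vs. hay' data from the ML step.
X, y = make_classification(n_samples=5_000, n_features=4, random_state=42)
features = ["amount", "event_count", "account_age_days", "channel_score"]

model = RandomForestClassifier(n_estimators=100, random_state=42).fit(X, y)

# Which signals actually drive the pattern? These rankings become the
# foundational knowledge and candidate rules referred to above.
importances = pd.Series(model.feature_importances_, index=features)
print(importances.sort_values(ascending=False))
```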

Analytics can also come before machine learning, to determine the contours of the ‘learning’, with follow-up analysis informing the contours and ‘rules’ of subsequent machine learning runs.

Given the volume of data involved, as well as the need to share processes and results across a variety of stakeholders, it is nearly certain that all of this activity would occur in the Cloud.

The bottom line is that (a) acquiring any of these tools on its own will yield limited benefit to your organization, and (b) without serious planning and design, even acquiring the necessary suite of such tools will not necessarily lead to tangible business benefit.

More importantly, if your organization does not already have a well-established enterprise applications plant and its consequent SoR, or ‘System of Record’*, you will find that your legacy systems hinder any digital transformation initiative. Following the above example, your Big Data step would be sabotaged by the lack of sufficient, credible data needed to enable a subsequent Machine Learning initiative.

A January 2018 thought leadership study, Beneath the Surface of Digital Transformation – Why Leaders Modernize Enterprise Applications, commissioned by IBM and conducted by Forrester Consulting, reports that 23 percent of those who have completed a major phase of their digital transformation strategy would, in hindsight, have moved modernizing their System of Record to an earlier phase.

Just as a house is not built with only a hammer, digital transformation will not be built with a handful of cherry-picked technologies. Addressing digital transformation in the manner described herein requires business stakeholders and technologists alike to scale a steep learning curve. Deploying these technologies is every bit as difficult as an organization-wide, integrated enterprise applications implementation. Transformative organizations must not only scale the learning curve; they must also scale the imagination curve and face often gut-wrenching organizational changes in strategy and behavior. Only then will the elephant appear in all its majesty.

*SoR (System of Record) refers to a database management system, commonly associated with integrated enterprise applications, that serves as the authoritative data source for a given data element. In practice, ‘authoritative’ can become vague when data is derived from multiple source systems.