Technology trends come and go, but the need for information is constant. Fifty-plus years into the digital revolution, ‘information’ remains a strategic concern for IT leaders.
Designing IT with analytics in mind
Indeed, analyst company Gartner lists “information” among the ‘Nexus of Forces’ currently shaping the IT industry, alongside newcomers cloud, mobile and social.
“Enterprises are having to deal with a huge volume of data, much higher velocity and increasing complexity,” Richard Gordon, managing vice president of the analyst company, told Information Age in March.
As strategic IT spending returns this year, Gordon believes that much of it will be directed towards “business intelligence, content management and database software” as a result.
Unfortunately, despite a wealth of information technologies, it seems that most organisations are still quite bad at giving their employees the information that they need.
In a survey of 8,300 workers from businesses around the world by executive advisory firm CEB, two thirds said they do not have access to the information they need to do their jobs.
Meanwhile, fewer than half believe the information contained within corporate data sources is “in a usable format”, the survey found.
CEB says these figures reflect a fundamental weakness in the way in which organisations assess the value of information.
Invoking the ‘big data’ buzzword to mean data from external and ‘unstructured’ data sources, CEB says that “organisations struggle to identify the most valuable use cases for investments in big data”.
In other words, after five decades of information technology deployments, organisations still do not know how to identify the information requirements of their employees, or how to plan their technology investments accordingly.
It is a conclusion shared by Joe Peppard, professor of information systems at Cranfield University’s School of Management.
In a recent collaboration with Donald Marchand, a professor at the International Institute for Management Development (IMD), Peppard looked back over many years’ worth of IT project case studies, to see if he could spot any predictors of success that went beyond the obvious.
The pair found that there are two paradigms for IT projects – and that applying the wrong paradigm to a project is a good predictor of failure.
“These two paradigms are based on the view that the organisation holds about information,” Peppard explains.
“The first paradigm is based on a view of information as a corporate resource. If you see information as a corporate resource, you objectify it. You see it as something that resides in databases, and dashboards, and on reports.”
“Crucially, you view it as something that can be manipulated.”
“The second view is that information doesn’t exist, except in the minds of people, and that it’s people that give meaning to information,” Peppard says.
The former view, which Peppard and Marchand call ‘design to build’, is the dominant paradigm. It has served businesses well in the past, he says, when the purpose of most IT projects has been to automate business processes.
“It works very well for traditional ERP projects,” he says, “because what you are doing is finding ways to deliver the same information at less cost.
“For example, a customer order number is a fact. It doesn’t require any analysis by an employee – it’s just true,” Peppard explains. “So if you can find a way to deliver that information faster and cheaper, there’s a benefit.”
The ‘design to build’ paradigm falls down, however, when the desired end result is a capability, such as the ability to understand customers, or identify new market opportunities.
“In a typical analytics project,” Peppard explains, “we don’t know what outcome we’re looking for, other than some high level vision of giving us better insight into our customers.”
Because the precise output of these projects cannot be reliably predicted, conventional techniques for building a business case are not appropriate, Peppard says. Nor is the techno-centric attitude of many IT professionals. “I think there’s a magic bullet thesis, that somehow we can fire some technology or other at the problem and it will go away.”
This is evident in many big data deployments, Peppard says, and the result is predictable. “We are seeing dismal results in big data projects.”
In Peppard and Marchand’s view, the alternative to ‘design to build’ is ‘design for use’. This means designing a system that encourages the desired capability, such as data analysis, rather than one that encapsulates a desired business process.
What does that mean?
In the context of analytics, important considerations include whether the style of data visualisation suits the task at hand, or what cognitive biases and assumptions users are likely to exhibit and how they can be counteracted.
As for building a business case for this kind of project, Peppard reports that some organisations are exploring such esoteric models as the Black-Scholes formula for pricing stock options as a way to cost-justify analytics projects.
“The idea is that these tools give you the option, not the obligation, to do something, like discover new knowledge, so you can work out a value for those options,” he says.
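To make the real-options idea concrete: a minimal sketch of the standard Black-Scholes call-option formula, here framed (hypothetically) as valuing the ‘option’ an analytics investment creates. The mapping of parameters to an IT project — current value of the insight opportunity as the ‘spot price’, the cost of acting on it as the ‘strike’, uncertainty as volatility — is an illustrative assumption, not something the article or the researchers prescribe.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(spot: float, strike: float, rate: float,
                       sigma: float, years: float) -> float:
    """Black-Scholes value of a European call option.

    In a real-options reading (an assumption for illustration):
      spot   - present value of the payoff if the opportunity is exploited
      strike - cost of exercising the option (e.g. acting on the insight)
      rate   - risk-free discount rate
      sigma  - volatility (uncertainty) of the opportunity's value
      years  - time over which the option can be exercised
    """
    d1 = (log(spot / strike) + (rate + 0.5 * sigma ** 2) * years) / (sigma * sqrt(years))
    d2 = d1 - sigma * sqrt(years)
    return spot * norm_cdf(d1) - strike * exp(-rate * years) * norm_cdf(d2)

# Illustrative numbers only: an opportunity worth 100 today, costing 100 to
# exploit, 20% volatility, 5% discount rate, one-year window.
value = black_scholes_call(spot=100.0, strike=100.0, rate=0.05,
                           sigma=0.2, years=1.0)
```

The point of the exercise is the one Peppard makes: the formula prices the right, not the obligation, to act, so uncertainty adds value rather than only risk.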
Clearly, this kind of thinking is a world away from the conventional school of IT project management.
“We’re just at the beginning of thinking about this stuff,” says Peppard.
He argues, though, that organisations are stuck in an information management paradigm that is not appropriate for the task at hand. Peppard and Marchand’s forthcoming paper on their thesis is called ‘Paradigm Paralysis’.
Between the notoriously low success rates of analytics projects, and the fact that employees simply do not have access to the kinds of information they need, maybe it is time for a radically different approach.