Believe it or not, the term business intelligence (BI) is turning 150 this year. The term was first used in 1865 by Richard Devens to describe how a banker was able to gain a competitive advantage by gathering and acting on business information before his competitors.
Over the last few decades, new terms such as ‘competitive intelligence’ and ‘business analytics’ have emerged. Regardless of what you call it, the fundamentals are the same: companies need to use the data inside and outside their organisations to create efficiency and competitive advantage.
The analytic supply chain everybody is familiar with starts with someone from the business having a question or hypothesis. The request goes to a business analyst or shared services analytics group to investigate. The analyst – call him Aaron – not only understands the business context of the inquiry but also knows the IT systems well enough to ask for the right data, and he asks an IT analyst – Ian – to procure the requisite data set.
After lots of Excel and Access work, Aaron provides a spreadsheet, BI report or visualisation that shows the results. The original business user ruminates for a bit and says, “What if we looked at it this way instead – and maybe add this metric over here?” Lather, rinse, repeat.
This cycle is fundamentally fraught with bottlenecks and impediments to business value. It can also cause rifts between data producers and consumers, who may not trust the results enough to make key business decisions on them. Businesses need to be agile and gain insights from their data before the information goes stale or the business evolves further.
Organisations should embrace the iteration. The ‘what-ifs’ that arise are natural and are key to innovation and creative problem solving. Instead of attempting to dumb down results into generic reports or trying to throw more bodies at the problem, companies need a more efficient analytic supply chain.
Technology has risen to the challenge with the advent of new database paradigms and better ETL tools to benefit Ian in IT – and powerful BI platforms and sleek visualisation software to benefit those in the business unit. Yet analysts remain underserved, and their problems keep getting harder.
There is more data out there than ever. The hunger for analytics is sharper than ever. How do businesses not only fix – but also change – the analytic supply chain to better serve the analysts and the needs of the business?
How can organisations engage more members of this supply chain and put the power of analysis into more hands, while allowing IT to fulfil its mission of systems and data governance, making analysis a truly collaborative function?
Enterprise data warehouses (EDWs) enable organisations to pull data from disparate systems and house everything in one ‘single source of the truth’. To curate and serve this data to the masses, companies rely on reports produced from their BI systems.
This is obviously important, as consumers of analytics need to know that data came from a trusted source, and there must be consistency in metrics and calculations. However, this model is starting to age and fray at the edges.
Not all data needed for analysis sits within the EDW. We can all imagine scenarios where an analyst needs to pull in data from the data warehouse, merge in a handful of spreadsheets, and connect to cloud applications.
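As a rough sketch of that kind of blending (the tables, columns and join key below are invented purely for illustration), an analyst's workflow often amounts to a simple join and a derived metric:

```python
import pandas as pd

# Hypothetical extract from the enterprise data warehouse
warehouse = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "revenue": [1200.0, 850.0, 430.0],
})

# Hypothetical targets kept in a business-unit spreadsheet
# (in practice this might come from pd.read_excel or a cloud API)
targets = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "target": [1000.0, 900.0, 500.0],
})

# Blend the two sources; a left join keeps every warehouse record
blended = warehouse.merge(targets, on="customer_id", how="left")
blended["vs_target"] = blended["revenue"] - blended["target"]
```

The point is not the three lines of pandas; it is that every blend like this currently sits outside the governed EDW pipeline, which is exactly where trust starts to erode.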
More than ever, businesses need data exploration and predictive analytics capabilities. EDWs just weren’t built for this, and large statistical packages are complex and limited to a group of power users.
Consumers of analytics are savvier than ever. They don’t just want canned reports or Excel spreadsheets with embedded macros. When they see a result, they want to know the ‘how’ in addition to the ‘what’.
In order to fix the analytics supply chain, organisations must evolve and shift the responsibilities of the stakeholders in the chain.
IT should be allowed to set up guardrails around data and systems to comply with corporate security and data governance. This in turn allows analysts to do more analyses without having to keep going back to Ian’s team.
Analysts must also be empowered with the analytical authoring tools to allow them to punch above their weight class. They should be able to construct complex analytical applications and predictive analyses without needing to be a SQL or R guru.
And the business should be empowered to answer its own ‘what if’ questions without running afoul of corporate IT policies, removing that burden from the analysts.
Full transparency is critical to establishing trust between data brokers and key decision makers. The players in the analytics supply chain must understand and believe the data in front of them, along with all the analyses behind it. Only then can true collaboration occur.
Sourced from Johnny Yang, Lavastorm Analytics