Author: Paul Thomas
It has been said, and we will say it again, that ‘data is the new oil.’ Volumes of data and the number of channels through which that data flows are exploding. In 2016, we created more data than in the previous 5,000 years of human existence – about 2.5 quintillion bytes (roughly 10 million Blu-ray discs) per day. And with the proliferation of data from sensors (IoT) and wearables, the wave of data will only grow with time.
Yet for all that data – insight ripe for the picking – less than 0.5% of it is ever analyzed.
The holdup, says computational thinking pioneer Stephen Wolfram, is twofold: accessibility and integration. In truth, our most useful data is not being captured in a way that makes it easily accessible – whether it’s stored out of reach or exists in a form that hinders any kind of actionable insight. We’re not asking much, really. We simply want to be able to ask questions of our data and get useful answers.
A Brief History of Business Intelligence and Analytics
But all hope is not lost. We’ve come a long way in our ability to gather and process data with astounding speed. Consider this: it took the U.S. government seven years to process and analyze the data collected in the 1880 Census. Seven years! Since 1880, we have seen the rise of computing (thanks to Herman Hollerith’s ‘tabulating machine,’ the 1890 census took only 18 months to complete), and it has been an all-out race to uncover easier, cheaper ways to collect data and faster, cheaper ways to store and retrieve it. From that 1890 census onward, and exponentially so after IBM’s breakthroughs in the 1950s, data storage became progressively more affordable and data retrieval quickened. And then came the 1980s.
We can thank the 1980s for any number of valuable contributions to society. Alongside Pac-Man and acid-wash jeans, the 1980s brought the relational database, which allowed users to write SQL to retrieve data, vastly simplifying the data retrieval process. The decade of the Trapper Keeper also brought the “data warehouse.” Championed by the ‘father of data warehousing,’ Bill Inmon, data warehouses are optimized for query response time – perfect for data analysis and reporting. In quick succession, Howard Dresner (of Gartner fame) proposed the term “Business Intelligence” as an overarching phrase for “concepts and methods to improve business decision making by using fact-based support systems.” While that term took decades to really catch fire, the confluence of these factors set the board for the business intelligence and analytics capacity we see today.
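To see why SQL was such a simplification, consider a minimal sketch using Python’s built-in `sqlite3` module. The `transactions` table and its columns here are hypothetical; the point is that we declare *what* we want rather than writing record-traversal code by hand.

```python
import sqlite3

# Build a tiny, in-memory relational database.
# The "transactions" table is an illustrative example, not from the article.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [("alice", 120.0), ("bob", 75.5), ("alice", 30.0)],
)

# One declarative SQL statement replaces a hand-written retrieval loop:
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM transactions "
    "GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # each row pairs a customer with their total spend
conn.close()
```

The database engine, not the analyst, decides how to scan, group, and aggregate – which is precisely the shift that made data retrieval accessible beyond specialist programmers.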
Analytics for Analytics’ Sake
By now, we understand that our capacity to collect, store, and retrieve data has blossomed in recent years, and that this progress is merging with the 2.5 quintillion bytes of data we produce each day to present a fascinating opportunity. (We didn’t even get into the whole new world of open source software like Hadoop or cloud offerings like Amazon’s Redshift and Google’s BigQuery, but that’s another expedition for another day.) This opportunity, however, has to be accessible to those who need it.
There is no shortage of speculation around the application of said data, but there are certainly common threads emerging. These are the trends we’re seeing:
- Use of Artificial Intelligence for high volume, repetitive tasks
- Analysis of big data for governance or competitive advantage
- Focus on business-driven applications that provide both operational and analytical use cases
- Evolving data agility, or the progress of analytic models to understand data in context and take business action
- Maximizing Microservices impact with Machine Learning
- Visualization of data for operational and business decision making
Let’s focus on data visualization for a moment. Many of these talking-head themes around the progress of ‘big data’ and the possibilities it affords focus on the technology powering this wave, or the ‘why’ behind that technology. Visualization, however, brings us to the million-dollar question: “How will we use this data?”
Data visualization takes us all the way back to René Descartes in the 17th century. You know Descartes from Philosophy 101 (“I think; therefore I am”), but like many great thinkers, his interests spanned many fields – one of them mathematics. Descartes founded his philosophical principles on the idea that doubt was a weakness (a very simplified paraphrase of a lifetime of work), and that one should interrogate every doubt until a strong foundation of belief is established. Interestingly enough, this perspective made its way into his mathematics and birthed the bane of middle school statisticians everywhere: the Cartesian coordinate graph, or x and y axes.
It’s not surprising that Descartes’ appetite for certainty expressed itself in the visualization of data. Many studies have shown that the brain processes visual information much more rapidly than verbal information (or nodes and conceptual relationships, for example). Preattentive visual processing is what happens when the brain is first presented with visual information – the preconscious processing of that data. Unfolding over a series of stages, each handled by neurons tuned to perceive particular features of the world, this process makes data analysis more efficient, and often more certain. I see, therefore it is.
Business Intelligence Implications in Financial Services
In a piece titled Real-Time BI: A Banking Perspective, TDWI concluded that banking has kept pace with the adoption of business intelligence and visual analytics, but warned that most applications remain lopsided. In effect, banks have harnessed analytics for strategic use, but operational data remains relatively untouched. That operational data, the same study notes, is where financial services organizations can gain valuable insights for use in cross-selling or refinancing scenarios. So, while most business intelligence can prove business impact, presenting operational and business data in harmony exposes the sweet spot for immediate revenue.
Over the course of history, from Descartes’ 17th-century foray into visual mathematics to the 1880 census to the mainstream adoption of machine learning and cloud computing, each phase of business intelligence and analytics has scaled faster than its predecessor. In the same way, each year exponentially increases the volume and velocity of the data we create. For those organizations that can take hold of data – whether application, transactional, third party, or otherwise – and morph the abstract into the tangible, placing clear and concise outcomes into the hands of the doers, the opportunities are endless.
Pull Back the Curtain on Risk Analytics and Decisioning
Provenir for BI and Analytics Opens a World of Possibilities, Based on Real Data