We take a look back at the history and evolution of analytics – and where it might go next.
The phrase ‘business intelligence’ was first used in an 1865 book of anecdotes about the world of business by author Richard Millar Devens. Since then, it’s become a key component of decision-making for all leaders, enabled by the maturing field of analytics.
Over the decades, analytics has become an increasingly vital part of business management – even before the technology existed to enable data analysis as we understand it today. But only recently have computing capabilities advanced to the point where businesses can dig deep into their data for insights.
For individual organisations, the evolution of analytics can vary in scope and timing as their practice matures (and it’s possible for companies to skip an era entirely). Here we look back at the evolution of the analytics industry as a whole, from its first iterations as a manual, offline process to today’s innovations in predictive decision-making.
The four eras of analytics (so far)
Analytics 1.0
Let’s begin with the early days of analytics. The practice as we recognise it started in the 1950s, with individual companies collecting their own offline data and manually analysing it to look for ways to improve their operations. (Some even claim that it started decades earlier, with punch card tabulating machines.)
As the web found its way into businesses, these companies could start to draw data from other sources. However, this was rudimentary work, and largely the domain of large businesses that were early digital adopters.
In this era, the data analyst’s role was mostly that of a technician. Rather than looking for useful insights, they spent their time handling the technology and managing data. The work was slow, heavily manual, and informed by only limited data sources. Not only did this restrict the depth of the analysis, but the results also often arrived too late in the process to be effective.
Analytics 2.0
The start of the 2000s saw the era of early big data initiatives, as computers and connectivity started to become pervasive among businesses.
Analytics was no longer strictly the domain of large, resource-rich organisations with easy access to the web. Any business that was working with a digital infrastructure could start digging into large data sets to find meaningful, more timely insights.
As the technology matured and began to automate data management, analysts became key personnel who could report on progress and trends, and make recommendations based on the data they were processing.
Analytics 3.0
The range of possible data sources grew hugely in the late 2000s and early 2010s, driven by the explosion of connected devices, widespread data collection initiatives, and increasingly intelligent analytics technologies.
As Neil Mason, Director Emeritus of the Digital Analytics Association, explained: “Digital analytics 3.0 is a fragmented, complicated world, but the tools are now enabling us to tame the data and begin to deliver on the promise of what digital analytics has always been about – understanding how people interact, how they use our products and services, and how we can better serve them by delivering better user and customer experiences.”
In this era, organisations started to think critically about how their data and analytics practice could translate into meaningful change for employees, customers and business processes.
Analytics 4.0
Analytics 4.0, though still in its early days, has seen organisations adopt predictive analytics and advanced decision simulations at scale. They can now pull data from hundreds of sources into highly sophisticated analytics programmes and deploy largely automated decision-making tools using cloud and big data technologies.
Building on the shifts in the data analyst’s role across earlier eras, Analytics 4.0 is enabling analysts to make intelligent, insight-driven recommendations for how businesses can adapt their market strategies, digital transformation initiatives, and more.
They’re playing a crucial role in the future of organisations the world over – but these specialist skills can be difficult to come by in a competitive space where talent is at a premium.
Tools are getting more specialised – and so are we
Many of our data scientists have been working in the analytics industry for decades – and that means they’ve seen the technology, tools and practice evolve in real time.
Our teams are always working to pick up new skills so The Smart Cube can keep up with this evolving environment.
Our technology stack in particular has evolved a lot over the years. For example, we went from using Excel, VBA, PowerPoint and SPSS in 2005 to Tableau, SAS, SQL and Hadoop in 2010. More recently, we’ve seen the technology landscape explode – so the majority of our work now uses open-source languages (R and Python), Spark, and visualisation tools including Tableau, Qlik and Power BI.
The analysis techniques and algorithms have evolved too. Where we once did mostly statistical analysis, we now work predominantly with machine learning and AI algorithms. Companies still want to understand which variables impact key performance indicators – using methods like sentiment analysis – but they also want to automate analytics to encourage self-service adoption. Whatever our clients need to make the best-informed decisions, we’ll strive to provide.
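As a simple illustration of that shift, here’s a minimal sentiment-analysis sketch in Python using scikit-learn. It’s a toy example with made-up feedback data – not our production pipeline – but it shows the kind of machine learning workflow that has largely replaced purely statistical analysis:

```python
# A minimal sentiment-classification sketch using scikit-learn.
# The reviews and labels below are illustrative toy data only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: customer feedback labelled 1 (positive) or 0 (negative)
reviews = [
    "Great service, very happy with the results",
    "The dashboard is intuitive and fast",
    "Terrible support, slow response times",
    "Disappointed with the reporting features",
]
labels = [1, 1, 0, 0]

# Vectorise the text (TF-IDF) and fit a simple linear classifier
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(reviews, labels)

# Score new feedback: probability that the sentiment is positive
new_feedback = ["Happy with the fast support"]
print(model.predict_proba(new_feedback)[0][1])
```

In a real engagement, the same pattern scales to much larger datasets and richer models – the point is that a few lines of open-source code can now do what once required a dedicated statistical toolchain.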
Visit our website to read how our solutions combine advanced analytics, data science and technology to solve our customers’ most pressing problems, or get in touch to learn how you can take advantage of the latest analytics tools and practices.