

Blog Post | 3 minutes

The 80/20 Rule: Time for a Number Change?

Published February 24, 2015

How efficient are your analytics? If you’re like most companies, more time is spent preparing for analysis than pulling out useful bits of information or discovering new data relationships. This is what’s known as the “80/20 rule”: Eighty percent of your time and effort goes into preparing data for analysis, leaving a mere twenty percent for the analysis itself.

Is it possible to flip the equation?

Upside-Down Analytics

As noted in a recent Forbes article, data loading and code writing eat up much of the time that could be spent on analytics. The most expensive and lengthy part of this process comes down to three distinct steps: extraction, transformation and loading (ETL). In other words, until the target data has been extracted from the database at large, transformed into a useful subset and then loaded into analytics tools, you can derive nothing of value; attempting to run analysis on raw, unprepared data not only eats up time but can significantly reduce ROI.
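To make those three steps concrete, here is a minimal sketch of a batch ETL pass in Python with pandas. The database file, table and column names are hypothetical stand-ins for illustration only, not anything drawn from the article.

```python
# A minimal sketch of the extract-transform-load (ETL) steps described above.
# The SQLite file "sales.db" and its "transactions" table are hypothetical.
import sqlite3

import pandas as pd


def extract(db_path: str) -> pd.DataFrame:
    """Extract: pull the raw target data out of the database at large."""
    with sqlite3.connect(db_path) as conn:
        return pd.read_sql_query(
            "SELECT region, amount, sale_date FROM transactions", conn
        )


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Transform: reduce the raw rows to a useful, analysis-ready subset."""
    clean = raw.dropna(subset=["amount"]).copy()
    clean["sale_date"] = pd.to_datetime(clean["sale_date"])
    return clean.groupby("region", as_index=False)["amount"].sum()


def load(subset: pd.DataFrame, out_path: str) -> None:
    """Load: hand the prepared subset to whatever analytics tool comes next."""
    subset.to_csv(out_path, index=False)


if __name__ == "__main__":
    load(transform(extract("sales.db")), "regional_totals.csv")
```

Every minute spent inside those three functions is preparation time; only what happens after the load step counts as analysis.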

So what’s the solution? Flipping the equation. If 80 percent of company time were spent performing analysis and only 20 percent preparing, results could be broadly applicable instead of narrowly focused. Forbes recommends a sea change: “lowering the barriers to advanced analytical topics” and creating analytics solutions that inform not only $5 million decisions but $50,000 and $500 decisions as well. The most important factor in flipping the numbers, however? Expanding the number of people who use data. And it’s here that most organizations start to struggle.

The People Process

Analytics is often described as “understanding relationships within an ocean of data in motion.” If data were a static commodity, for example, analysis would pose little challenge since adding new information would not fundamentally change the nature of existing relationships. But in an ever-changing ocean of data, each new piece interacts with every piece that has gone before — some have no impact on the overall environment, while others create tidal waves. As a result, it can be difficult to integrate additional staff into the analytics process, since each new person brings their own set of variables and expectations, in turn changing the outcome of any data process.

Defining a clear path from data to every user’s desktop therefore requires a re-imagining of current analysis tools, since simply refining current systems to be more efficient only treats the symptoms of the 80/20 problem, not its cause. Instead of getting better at what doesn’t work, companies need to focus on transformation informed by simplicity: what if analytics were built in to every end-user workstation rather than bolted on as an afterthought? What if key performance indicators (KPIs) were monitored, recorded and analyzed without the need for lengthy loading and data preparation? Forget the formula: This is new math.
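One hedged way to picture that “built-in” analytics idea is a KPI computed on demand, straight from an operational export, with no separate extract-and-load cycle in front of it. The sketch below assumes a hypothetical orders.csv file with order_date and order_total columns; those names are illustrative, not from the article.

```python
# A sketch of "analytics at the workstation": compute a KPI directly from the
# operational export, skipping a separate extract/transform/load stage.
# "orders.csv" and its columns (order_date, order_total) are hypothetical.
import pandas as pd


def daily_average_order_value(path: str) -> pd.Series:
    """KPI: average order value per day, read straight from the source file."""
    orders = pd.read_csv(path, parse_dates=["order_date"])
    return orders.groupby(orders["order_date"].dt.date)["order_total"].mean()


if __name__ == "__main__":
    # Print the most recent few days of the KPI.
    print(daily_average_order_value("orders.csv").tail())
```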

The 80/20 split is holding companies back. Take your analytics beyond the numbers.
