A 5-step approach to self-serve analytics to unlock business value from data
To take full advantage of the commercial benefits of data, business teams must be empowered with the tools and intelligence they need to uncover fresh insights, make better decisions, and find new opportunities.
But, in our experience, many organisations face a variety of challenges in delivering and enabling self-service analytics.
In large organisations, intelligence teams often work with legacy systems that generate thousands of individual weekly reports as part of BAU processes – but these do little to support business decision-making.
Often, reporting processes are IT-driven, producing large volumes of data but little of the intelligence needed to direct functional insight. This is a common problem for companies that grow through acquisition, inheriting numerous reporting systems and standards along the way. As a result, business teams get only a partial view of performance from current reporting, and face a real risk of making sub-optimal decisions.
Custom changes are expensive
If business leaders require custom information, there’s typically a huge lead time and a tedious process involved. In a world where everything moves quickly, having to wait for the insights needed to compete is far from ideal.
A lack of dynamic functionality
If any metric within a report needs further investigation, business teams often depend on data scientists, queuing even basic requests for pattern analysis. The result is a very high “time-to-insight”.
At The Smart Cube, we’ve done a lot of work helping organisations to run the most efficient and effective programmes to operationalise analytics. And during that time we’ve come across all of these problems (and a few more, too).
Because of this experience, we’ve been able to develop some best practices that can help self-serve analytics programmes run smoothly. Here’s a five-step approach that we rely on to deliver effective and actionable intelligence.
1. Start with a business need
Every data project has to start with a clear need. It’s no good just gathering data and seeing what comes out of it. Business teams have to know what they’re looking for. Once they’ve decided on that, they can choose the right tools for the job and plan their approach accordingly.
Understand the need by gathering user stories – or what we call ‘collaborative discovery’:
- Data visibility needs: A data discovery project involves drilling down into different levels of data to provide specific information. This could mean comparing prices against competitors, comparing stores in a certain region, getting more granular data on things like individual SKUs, or just providing executives with a complete, top-level view of business performance.
- Scenario planning needs: Scenario planning involves a requirement to input or change certain parameters and evaluate outcomes. For instance, what will happen if production is moved from Factory A to Factory B?
- A specific business need: Many business problems need analytical models to be developed and consumed by the business via a self-service platform.
Production consolidation to generate savings and EBITDA improvement:
Utilising our scenario planner, we recently helped one e-commerce client identify savings of £13.3m by consolidating production across four sites. From 10 different scenarios analysed, our insights enabled the business to identify the three actions that would maximise EBITDA improvement.
2. Accelerate data discovery
Knowing what you want to uncover is the first important lesson in successful analytics work. The second is this: you can’t gain good insights from poor-quality data.
To have any chance of success, businesses need to gather, integrate and harmonise their data from all its different sources into a data lake or data mart, either on-premises or in the cloud. This means everything – ERP systems, flat files, SQL servers, SAP databases, APIs, streaming protocols, consumer surveys – you name it.
Creating a consolidated database is the only way to remove the inaccuracies and inconsistencies that will damage results, and get a cohesive view that can be used for reporting and analytics. It also enables rapid experimentation on data, and the capability to move quickly towards scaled solutions.
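As a minimal illustration of the harmonisation step, the sketch below maps two differently named source extracts onto one canonical schema before combining and de-duplicating them. The column names and values are assumptions made for the example, not a prescription for any particular stack.

```python
# Toy harmonisation example: an "ERP" extract and a flat file use different
# column names for the same fields; we unify them into one canonical table.
import pandas as pd

erp = pd.DataFrame({
    "MATERIAL_NO": ["A-100", "A-101"],
    "NET_PRICE": [12.5, 9.9],
})
flat_file = pd.DataFrame({
    "sku": ["A-101", "A-102"],
    "price_gbp": [9.9, 15.0],
})

# Map each source's schema onto the canonical one, then combine and
# drop duplicate SKUs so each product appears exactly once
canonical = pd.concat(
    [
        erp.rename(columns={"MATERIAL_NO": "sku", "NET_PRICE": "price_gbp"}),
        flat_file,
    ],
    ignore_index=True,
).drop_duplicates(subset="sku", keep="first")

print(canonical)
```

The same pattern scales up: define one target schema, write a mapping per source, and validate the combined output before it feeds any report.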
3. Develop the data connections and algorithms
Once data is unified, it’s time to form connections.
For both data discovery deep dives and scenario planning, the connections between different parameters and KPIs need to be established through a logical or mathematical wireframe.
This requires an understanding of the business and its data, as well as an understanding of data engineering, machine learning and analytics capabilities in line with the solution architecture.
There is often a need for incorporating advanced statistical models, predictive analytics, and machine learning algorithms to generate output at the point of decision making.
Few businesses have all of these skills in-house, so bringing in the experts at this stage can add measurable value in the long run.
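As a toy example of the kind of model that connects a parameter to a KPI, the sketch below fits a single-driver least-squares line using only the Python standard library. The spend and sales figures are invented; production solutions would use richer models and proper ML tooling.

```python
# Illustrative driver model: link one input parameter (promotional spend)
# to a KPI (weekly sales). All numbers are made up for the example.
from statistics import mean

spend = [10, 20, 30, 40, 50]       # promo spend (GBP thousands)
sales = [120, 150, 185, 210, 240]  # weekly sales (GBP thousands)

# Ordinary least squares for the line y = a + b*x
xm, ym = mean(spend), mean(sales)
b = sum((x - xm) * (y - ym) for x, y in zip(spend, sales)) \
    / sum((x - xm) ** 2 for x in spend)
a = ym - b * xm

def predict(x: float) -> float:
    """Predicted weekly sales at a given spend level."""
    return a + b * x

print(f"sales ≈ {a:.1f} + {b:.2f} × spend")
```

Surfacing a fitted relationship like this inside a self-service platform is what puts the output "at the point of decision making" rather than in a data scientist's queue.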
Connecting global fleet data intelligence to improve vehicle performance:
A leading pharmaceutical and consumer goods manufacturing client asked us to help reduce total cost of ownership and CO2 emissions of its fleet across 70 countries. Our data analysts streamlined the data collection process, and plugged gaps by connecting sources and incorporating external market research, resulting in a holistic view of vehicle performance. Key insights were presented in an online fleet intelligence dashboard, which enabled the client to realise a reduction in CO2 emissions of 18 g/km per vehicle.
4. Implement a user platform
When thinking about building the front end of an analytics solution, there are three things businesses need to bear in mind: it should be easy to access, easy to understand, and easy to use. This will aid smooth user training, onboarding, and adoption.
Partnering with key business stakeholders at this stage is vital to building an interactive visualisation layer.
It’s also important to ensure the front end seamlessly integrates with existing portals or application systems, from reporting platforms like Tableau, Power BI, Qlik and MicroStrategy, to any front-end .NET and Java apps.
Building an interactive executive scorecard to visualise KPIs and related insights:
A European asset management group wanted to gain better visibility of the operations of one of its portfolio companies, a global multi-channel bookseller. We recommended a strategy to identify and collate multiple data sources, and Qlik as the software platform to present and visualise insights for client users. The result was a comprehensive, fully automated executive scorecard which tracked 40+ KPIs, with the capability to drill down into each KPI at granular levels.
5. Continually monitor and enhance
Once the tool is deployed, it is critical to ensure user engagement and adoption.
To facilitate this, the project team needs to demonstrate the agility to respond to any feedback and suggestions. There also needs to be a plan for BAU maintenance, including training for new users, ongoing support for existing users, and a mechanism to address user feedback.
Enhancing competitor price benchmarking to deliver actionable intelligence:
Our client, a leading UK supermarket chain, wanted to create a Value Index (VI) benchmark of all its product prices against six major competitors. Our analysts collated and presented data via a series of dashboards, providing graphical comparison of competitor pricing, and analysis at category, sub-category and brand levels. Data is refreshed and enhanced on a weekly basis, providing client teams with the insights and actionable intelligence needed to develop proactive pricing strategies and stay ahead of competitors.
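One common way to define such an index is a sales-weighted ratio of competitor prices to own prices. The sketch below uses that assumed definition with made-up figures – it is not the client’s actual methodology, just an illustration of the calculation.

```python
# Hypothetical value-index calculation: index = 100 × (competitor price /
# our price), sales-weighted across a basket. Definition and figures are
# illustrative assumptions, not the client's actual methodology.
products = [
    # our price, one competitor's price, and weekly unit sales as the weight
    {"sku": "bread", "ours": 1.00, "rival": 1.05, "weight": 500},
    {"sku": "milk",  "ours": 1.20, "rival": 1.10, "weight": 300},
    {"sku": "eggs",  "ours": 2.00, "rival": 2.20, "weight": 200},
]

def value_index(items) -> float:
    """Above 100 means we are cheaper than the competitor, sales-weighted."""
    total_weight = sum(p["weight"] for p in items)
    return 100 * sum(p["weight"] * p["rival"] / p["ours"] for p in items) / total_weight

print(f"VI vs competitor: {value_index(items=products):.1f}")
```

Running the same calculation per category, sub-category and brand gives the drill-down views described above, and a weekly data refresh keeps the index current.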
This approach may seem straightforward, but from our experience, businesses typically find one or more gaps along the journey, which act as major barriers to progress. This is where we can really help – as a third-party specialist, we bring an external perspective and help internal teams see the wood for the trees.
At The Smart Cube we combine advanced analytics, data science and technology to solve our customers’ most pressing problems: from bespoke solutions such as merchandising analytics and revenue growth identification, to comprehensive Analytics Centre of Excellence support.
Nisha is an advanced analytics and consulting professional with over 12 years of experience in retail, CPG and pharmaceuticals. In her current role, Nisha is responsible for managing large analytics accounts, designing and developing data science and analytics solutions for retail and consumer goods. She is an expert in marketing strategy, CRM, measuring promotion and campaign effectiveness, test and learn, forecasting, time series analysis, and driver analysis.
When Nisha isn’t helping clients solve business problems, she can be found reading books, or in the kitchen trying out new recipes. She also enjoys travelling, meeting people of different cultures, and exploring new places.