Why does a data quality project need to be agile? How much flexibility do you really need when deciding what ‘good quality data’ looks like, and what processes and technology you need to manage it?
The truth is: a lot. A successful data quality initiative does not come in a neat package, with ‘good data’ delivered at the end of the project timeline. Data quality is constantly evolving in line with your business’ needs and market expectations. Every new product line, every new customer segment, every deviation in business objectives can change what ‘good data’ should look like, and the platforms and behaviours needed to process it effectively.
A data quality project that only uses traditional waterfall or sequential delivery follows a set path to data quality, defining a goal and the steps needed to get there at the start of the project and following that route until final delivery (often, 2 or 3 years down the line) – regardless of how the business, its customers, or its market have changed in the meantime.
Agile delivery methods, on the other hand, take a more flexible approach. By incorporating short sprints into the delivery lifecycle, they can bend, adapt and detour where needed, progressing the business towards its long-term data goals in increments over time. Instead of delivering one, singular ‘data quality’ result at the end of a project, data quality is treated as something that evolves with the organisation.
If you think of data quality as a roadmap – with the end destination being your business’ commercial data goal – it’s the difference between using a printed A-Z and real-time navigation apps, like Google Maps.
The former provides a route that is fixed at the moment of printing: it can’t account for traffic, or road closures, or changes to the geography over time.
The latter is more flexible, constantly changing to avoid traffic, navigate a new one-way system, or make it easier for you to plot a new course, whether that’s choosing a new destination half-way or taking a detour to pass a particular address on your journey.
Take data quality definitions as an example. In a traditional waterfall data quality project delivery, the definitions that describe what ‘good data’ looks like – its accuracy, completeness, reliability, timeliness and relevance – are fixed at the start of the project. Yet if the business wants to explore a new data-driven initiative, such as adopting a 360-degree customer view, some of those definitions will have to change, with ‘relevance’ and ‘completeness’ requiring a much wider scope of information.
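To make that concrete, here is a minimal sketch of what a flexible data quality definition can look like in practice. The field names and records are hypothetical, invented purely for illustration: the point is that a ‘completeness’ rule is kept as configuration, so widening its scope for a 360-degree customer view changes a definition, not the delivery.

```python
# Illustrative sketch only: field names and records are hypothetical.

REQUIRED_FIELDS = {"name", "email"}  # original 'completeness' definition


def completeness(record: dict, required: set) -> float:
    """Fraction of required fields that are present and non-empty."""
    if not required:
        return 1.0
    present = sum(1 for field in required if record.get(field))
    return present / len(required)


record = {"name": "Ada", "email": "ada@example.com", "purchases": None}

# Under the original definition, the record is fully complete.
print(completeness(record, REQUIRED_FIELDS))  # 1.0

# A 360-degree customer view widens what 'complete' means:
WIDER_FIELDS = REQUIRED_FIELDS | {"purchases", "support_tickets"}
print(completeness(record, WIDER_FIELDS))  # 0.5
```

Because the definition lives in configuration rather than being baked into a fixed delivery plan, each sprint can revise it as the business’ scope changes.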
So what approach should you choose for your data quality project?
Do you stick with the A-Z: traditional, inflexible data quality project delivery?
Or twist to Agile delivery methods: a flexible, real-time approach to data quality that evolves with your business?
The Agile approach to data quality
Data quality is part of our AIM Information Management Framework, which uses agile delivery methods to provide adaptive, responsive data project delivery. With every ‘sprint’ taking no longer than 2 weeks, we deliver incremental data quality improvements that build towards a final goal – allowing data quality definitions and processes to remain flexible and fluid.
Our Agile data quality delivery uses the following steps to ensure that at every leg of the journey, your data quality initiatives are aligned with your business objectives – and delivering the results you need.
- Where do you need to be?
What level of data quality maturity do you need to reach your business goals?
- Where are you now?
What’s your current data quality benchmark: e.g., how do you currently define what good data looks like?
- What’s holding you back?
What issues are stopping you from reaching the level of maturity you need? Taking a deep-dive into your domains, identify how your data quality needs to change to get you to your data destination – be it people, processes, technology or all three.
- How can you get there?
Establish the journey you’re going to take. Create a solution-driven plan to define: what ‘good data’ looks like, how you can collate it accurately, what data governance you need to maintain it and what technology you need to manage it.
- Put a pin in the map
Create a quantitative baseline to identify data KPIs for continuous improvement and monitoring, keeping your data quality on track throughout your journey.
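As a rough illustration of that final step, a quantitative baseline can be as simple as a set of KPI targets checked after each sprint. The KPI names and target values below are hypothetical, chosen only to show the shape of the idea.

```python
# Illustrative sketch only: KPI names and target values are hypothetical.

baseline = {"completeness": 0.95, "accuracy": 0.98, "timeliness": 0.90}


def kpis_off_track(measured: dict, targets: dict) -> list:
    """Return the KPIs whose measured value falls below its target."""
    return [kpi for kpi, target in targets.items()
            if measured.get(kpi, 0.0) < target]


# After a sprint, compare the measured KPIs against the baseline:
sprint_results = {"completeness": 0.97, "accuracy": 0.96, "timeliness": 0.92}
print(kpis_off_track(sprint_results, baseline))  # ['accuracy']
```

Reviewing a check like this at the end of every two-week sprint is what keeps the journey on track – and, because the targets are data, they can be revised whenever the destination moves.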
People, processes and technology: the core issues at the heart of data quality
Data quality issues can usually be traced back to three core areas: people, processes and technology. While traditional delivery methods take a snapshot of those three areas at the start of the project – much like that printed A-Z journey – an Agile data delivery takes a more reactive course of action.
By breaking delivery into sprints, with testing incorporated into each short 2-week cycle, it’s easier to react to the changes and fluctuations of the business, whether it’s personnel changes (new starters, changes to leadership, etc), amendments to processes (new ways of working, such as a shift to a hybrid or work-from-home culture) or new technology requirements or implementations. Just as the real-time navigation app can quickly respond to changes to your journey, an Agile data quality delivery allows for faster response times and more reactive data quality and data governance protocols.
To find out more about the key commercial differences between traditional and Agile data quality project delivery, download our guide - Stick or Twist: An Agile guide to choosing faster, more flexible data solutions or contact the Agile team.