Data & Analytics is a critical function for today’s modern “data-driven” enterprise. Everyone wants happy business consumers, so it is quite common to focus a large amount of time and effort on front-end reporting & data visualization needs. This front-end focus can result in spending too little time implementing the vital rock-solid data delivery back-end. An unbalanced data delivery ecosystem like this can contribute to consumer frustration due to a lack of timely and useful data, trust issues with the data, and a general decrease in user adoption. It can quickly become the “beginning of the end” for any D&A project.
As the highest-rated Data & Analytics company in the world, we have seen what does and does not work. For the D&A space, we love the “Agile Data Delivery” methodology.
The Components of Agile Data Delivery
There are many components that make up agile data delivery, but for now, let’s focus on the key factors of a successful methodology. If you get these key components in place, you will be off to a great start on your journey towards agile data delivery, with the benefits following down the road.
1. Out with Waterfall; In with Agile.
To “deliver data with agility”, one first needs to forget about the traditional waterfall delivery method. It simply does not work in today’s modern data-driven enterprise. Consumers can no longer wait months or years for data to be populated into the Data Warehouse/Data Lake. The modern consumer needs data “now” to make timely data-driven decisions.
2. Data Delivery Team.
Get the right mix of folks in place. Even if this is an IT-driven data project, you will need a mix of IT and business personnel. In addition to IT staff such as data architects, data integration developers, and business intelligence developers, you need to have your key business stakeholders at the table.
3. Data Story Backlog
With your well-rounded team in place, meet at least once a week to discuss your data stories. What data does the business need to have access to? What is the priority for each data story? Do we already have this data? How does this data fall in line with enterprise goals? The aim here is to identify as many data stories as possible for the agile backlog, a.k.a. the stuff we need to do.
Part of the process will also be to identify the individual data stories that will be worked on during the next sprint. Having both IT and the business involved in these discussions will ensure that everyone knows what the deliverables will be for the next sprint. And more importantly, there will be no surprises when delivery is completed.
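The backlog discussion above can be sketched in code. This is a minimal, hypothetical illustration (the field names and priority scheme are assumptions, not a prescribed tool): each data story records its business priority and whether the data already exists, and the highest-priority stories are pulled into the next sprint.

```python
from dataclasses import dataclass

@dataclass
class DataStory:
    title: str
    priority: int       # 1 = highest business priority
    data_exists: bool   # do we already have this data?
    enterprise_goal: str  # the goal this story supports

# A hypothetical backlog built during the weekly story discussion
backlog = [
    DataStory("Inventory aging report", 3, True, "Working capital"),
    DataStory("Daily sales by region", 1, False, "Revenue visibility"),
    DataStory("Customer churn flags", 2, True, "Customer retention"),
]

# Select the highest-priority stories for the next sprint
next_sprint = sorted(backlog, key=lambda s: s.priority)[:2]
print([s.title for s in next_sprint])
```

Because both IT and the business maintain this list together, the sprint selection step doubles as the expectation-setting step: everyone can see exactly which stories made the cut.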
The project manager will meet with all the IT teams and coordinate the delivery of the sprint’s data story. If this is new data, the data integration people will source the data and build ETL/ELT logic around the process. Business Intelligence (BI) developers will take this new data and build out the reporting deliverables as indicated by the data story “card”. Since all these tasks can take quite a while, a single large “epic” data story may be broken up into multiple stories—one for the data integration piece and one for the BI reporting piece. These “epic stories” can then be spread over multiple sprints depending on the complexity. But that is OK, as both IT and the business will already know the story delivery cadence, so expectations will already have been set.
To speed up the delivery of a data story, ensure that you slice your deliverables into workable chunks. Individual stories/deliverables should fit in a single sprint (two-week sprints are relatively common for D&A projects).
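The epic-splitting idea can be sketched as a simple capacity check. This is a hedged illustration, not a real planning tool: the sub-stories, effort estimates, and ten-day sprint capacity are all assumed values, and the rule is simply that a sub-story that does not fit in the current sprint rolls to the next one.

```python
SPRINT_DAYS = 10  # assumed capacity of a two-week sprint

def split_epic(epic_title, estimates):
    """Assign each sub-story of an epic to a sprint.

    estimates: dict mapping sub-story name -> estimated days of effort.
    Returns a list of (story title, sprint number) tuples.
    """
    plan, sprint, used = [], 1, 0
    for name, days in estimates.items():
        if used + days > SPRINT_DAYS:  # doesn't fit; roll to the next sprint
            sprint += 1
            used = 0
        plan.append((f"{epic_title}: {name}", sprint))
        used += days
    return plan

# Hypothetical epic split into a data integration story and a BI story
plan = split_epic("Customer 360", {"source data & ETL": 8, "BI dashboards": 6})
print(plan)
```

Here the ETL work fills sprint 1, so the reporting story lands in sprint 2; because the split is visible up front, both IT and the business already know the delivery cadence.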
Delivering quickly will allow the business to see tangible progress sooner. Based on what the business sees, the data story can be marked as completed as expected, or it will result in some rework, which will be prioritized and slotted into an upcoming sprint.
The magic of “delivering fast” is that you will also “fail faster”. This isn’t a bad thing—the sooner the business sees a problem, the sooner it can be adjusted. With the old waterfall delivery method, the business might not see issues for months, resulting in a lot more rework and backtracking.
Once a data story has been delivered and approved by the data delivery business liaisons, it is time to involve the greater business community. This is a vital step that will really help with overall adoption. The business liaison on the data delivery team will also be a key person to help with onboarding, education, and soliciting feedback from the community. It is critical that the community be educated as to what the new functionality can and cannot be used for. To help with this education, demos given by the business liaison and the BI developers are also vital. Weekly “office hours” can also be set up so that business users can drop in, ask questions, and see focused demos with the data subject matter experts.
Critical to user adoption is constant monitoring and usage auditing of the new data feature. Are consumers using the new feature? How often do they access it? These are just some of the metrics vital to understanding whether our agile project has been successful. If a usage issue is identified, the data delivery team needs to reach out to the business team and craft a proactive plan to tackle the issue. This could include re-running the “onboarding sessions”, running some “lunch and learns”, or just dropping by the user’s desk and “checking in” to see how things are going.
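The two adoption metrics mentioned above (who is using the feature, and how often) can be computed from an access log. This is a minimal sketch assuming a hypothetical in-memory log; in practice the events would come from your BI tool's audit tables.

```python
from collections import Counter
from datetime import date

# Hypothetical audit log of (user, feature, access date) events
usage_log = [
    ("alice", "sales_dashboard", date(2023, 9, 4)),
    ("alice", "sales_dashboard", date(2023, 9, 5)),
    ("bob",   "sales_dashboard", date(2023, 9, 5)),
]

feature = "sales_dashboard"
hits = [user for (user, feat, _) in usage_log if feat == feature]

active_users = len(set(hits))          # are consumers using the feature?
accesses_per_user = Counter(hits)      # how often does each one access it?

print(f"{feature}: {active_users} active users")
print(dict(accesses_per_user))
```

A user who appears rarely (or not at all) in the log is a candidate for the proactive outreach described above, before low adoption hardens into no adoption.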
“Being proactive rather than reactive to user adoption issues can make all the difference for the success of a data delivery project”
The final step is to return to your data story backlog. Reprioritize or adjust your stories, if necessary. Then, kick off the next data sprint.
After a few successfully delivered sprints, users will feel more confident about all the data stories being delivered. They will feel more empowered because they have a say in “what” and “when” data stories get delivered.
“Agile Data Delivery can be magical, if done correctly”
Are you ready to start your agile data delivery journey? Get started today with Infocepts.