A majority of today’s IT portfolios include one or more active business intelligence (BI) or analytics projects. If you’re an IT manager, director, VP, or CIO, a technical project manager or consultant, or the leader of a functional area in your business, you are responsible for at least one strategic initiative involving analytics or business intelligence. Most of us know by now that technical projects don’t fail because of the technology. However, the buzz around cloud/hybrid environments has put a spotlight on the selection of the underlying platform and services for these projects. While these are critical components, it’s important not to forget rule #1 in BI/analytics projects: Most anything is technically possible. It’s the internal politics and governance that get in the way.
This has been the case since reporting systems and data warehousing first emerged over 30 years ago. BI solutions often require data from multiple sources that cross organizational boundaries. How can you ensure buy-in from the source organization? Have they agreed to your accessing the data? Have they provided the right subject matter experts?
In addition, once data sources are initially integrated, loading from these sources is an ongoing process for the life of a BI solution (often five to ten years). Source data is never perfect; there are always exceptions (e.g., data may not be consistently formatted, or null values may be provided for required fields). Do you have the right data quality processes in place? More importantly, is data governance, or data stewardship, in place to ensure that correct data gets reviewed as well as loaded, and that adjustments are being made for exceptions?
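To make the point concrete, here is a minimal sketch of the kind of data quality gate an ongoing load process needs. The field names (`customer_id`, `order_date`, `amount`) and the two accepted date formats are hypothetical, stand-ins for whatever your sources actually deliver; the idea is simply to separate clean records from exceptions that a steward must review.

```python
# Minimal sketch of a data quality check for incoming source records.
# Field names ("customer_id", "order_date", "amount") are hypothetical.
from datetime import datetime

REQUIRED_FIELDS = ["customer_id", "order_date", "amount"]
ACCEPTED_DATE_FORMATS = ("%Y-%m-%d", "%m/%d/%Y")  # sources rarely agree on one

def validate_record(record: dict) -> list:
    """Return a list of data quality issues found in one source record."""
    issues = []
    # Required fields: null or empty values are exceptions, not silent passes.
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            issues.append(f"missing required field: {field}")
    # Inconsistent formatting: dates may arrive in more than one layout.
    raw_date = record.get("order_date")
    if raw_date:
        for fmt in ACCEPTED_DATE_FORMATS:
            try:
                datetime.strptime(raw_date, fmt)
                break
            except ValueError:
                continue
        else:
            issues.append(f"unrecognized date format: {raw_date!r}")
    return issues

def partition_batch(records):
    """Split a batch into clean records and exceptions for steward review."""
    clean, exceptions = [], []
    for rec in records:
        problems = validate_record(rec)
        (exceptions if problems else clean).append((rec, problems))
    return clean, exceptions
```

The key design point isn’t the specific checks — it’s that exceptions are routed somewhere a human steward will actually look at them, for the full five-to-ten-year life of the solution.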
With these issues in mind, I interviewed some of the most senior data architects I know, and asked them to share their experiences about factors that could make or break an analytics project.
“But,” you say, “There are tons of good articles already out there about governance of data projects— do I really need another one?”
Agreed, there is a lot of existing guidance about success factors. (Align your use cases to the business! Have an executive sponsor! Get buy-in from all stakeholders! Define and document your processes! Ensure data integrity!) And much advice exists on worst practices (Don’t get bogged down with minutiae, but don’t stay too high-level either! Don’t assemble a steering committee without a clear purpose and framework! Don’t assume that data governance is a project that will ever end!). You have access to a digital sea of carefully worded, optimistic information about data governance.
Let the following nuggets of truth be your islands of reality in this sea, built on the experience of senior architects who’ve spent years in the data trenches. Ask yourself these six questions about your analytics projects, whether they are still in the planning stages or well underway. I’ve provided red flags to alert you to issues and tips to help you mitigate them. I also included some relevant quotes from the experts’ own experiences.
1. Who Owns the Source?
(Translation: Is Your Executive Sponsor Executive Enough?)
As I talked with the data and analytics experts in my network, this concept was stressed first and foremost: You will only be able to execute a project within the umbrella of your sponsor. Having the right people at the table accelerates progress (corollary: not having the right people could result in little or no progress at all). Success stories poured out about forward-thinking executive sponsors who were senior enough to make things happen. These were closely followed by memories of failure, where the project teams didn’t have access to the data they needed, and as a result, couldn’t make the solution shine.
“Anything we needed that got him closer to his goal, he would sign off, and we had access in 24 hours — new tools, calculations, subject matter experts. In this case everything he needed was in his silo.”
Ownership of the source data is a critical factor. If one group owns all the data sources and they have the incentive to make things happen, the project will progress. If sources are owned by different groups, this exponentially increases the complexity of the project. Some of the departments in your organization may never have shared real-time data before. They might insist on delivering a static text file on a weekly basis, and your Chief Technology Officer (or equivalent) may not have enough authority to force the issue. Some data projects will only succeed if the executive sponsor is the very top executive in the organization.
Red Flag: If the people funding and supporting the project and the people supplying the data are not the same people, you may have trouble getting that data.
Mitigate it: Most BI projects don’t have the level of scale or scope to engage the CEO. The sponsor of a Sales and Marketing BI project might be the VP of Sales, whereas for an IT-level project it’s the CIO or CTO. But if those roles don’t have the authority to break through barriers across the organization, you may need to make sure the CEO is engaged. The CFO could also be a powerful ally, because a data project’s ultimate goal is to benefit the company fiscally.
2. Are You Driving an Agenda Instead of Running a Project?
Most of the experts I interviewed had at least one story about an agenda disguised as an analytics project. The scenarios ranged from driving a wedge in the enterprise by replacing the incumbent technology with a preferred platform, to the career ambitions of a single executive, to cold wars over funding and resources.
“Their agenda had nothing to do with the tool and nothing to do with the data.”
Obviously all projects, IT or otherwise, have some degree of personal agenda behind them. However, your project will fail if it is primarily agenda-driven and not about creating insights that lead to bottom- or top-line benefit to the company.
Red Flag: If you’re not allowed to use a commonly occurring word such as “analytics” or “dashboard” when speaking about the project, you’re probably furthering an agenda.
Mitigate it: This is another good time to make sure the CEO and CFO are engaged, and that the business value is clear.
3. Could People Lose Their Jobs as a Result of This New Technology?
One of my sources told me he’d come to expect lots of infighting about who controls and accesses data. “They’ll ask you for your specific needs, validate those, and give you a snapshot of the data. They realize that once other people get control of the data, their job is called into question. What if they find data that is wrong; what if they find data that’s showing they’re not doing their job?”
The human impact of analytics projects should not be underestimated. On a basic political level, there’s a control aspect. If a group is being asked to share live data for the first time, this means they’re being asked to give up control. They’ll no longer be able to massage out the knots before delivering the data. And they may need to answer questions and make changes based on recommendations from the project team. In short, they’ll be up against the fact that it is the company’s data, not their data. That’s an uncomfortable shift in thinking.
“It is terrifying because they are significantly losing control. They don’t even know what this information looks like or what it’s going to say about them.”
Beyond the control issue, the exposure of data frightens people because it could cost them their jobs. You’re telling them, “We’re digging into all of the data that shows your productivity for as long as you’ve been keeping track. We have tools to show conclusively what the actual productivity was.” For example, in a call center, the data might show that idle times for each caller are higher than they should be. Or it could expose managers not being responsive enough, waiting too long to switch workers from a non-busy pool to a busy pool. The company may be going through a round of tightening or integration and looking for ways to introduce automation or to streamline the number of employees manning the phones. The teams generating the data may not even know the extent of what the numbers will show. As a result, there may be a massive internal scramble to improve the operations and procedures that are generating the data, before it starts to be displayed by the new system. If the teams don’t know yet what the new metrics will be, they will try to make their existing metrics look better, even if those metrics don’t involve the analytic that’s eventually considered critical.
Red Flag: If employees start resorting to digital shredding (for example, deleting their log archives), they’re probably terrified of losing their jobs. You may also experience passive-aggressive behavior in response to requests you make on behalf of the project.
Mitigate it: Recognize the human factors at work and don’t underestimate them. Maybe even — dare I say it — address these concerns openly and communicate empathetically about the strategic change that’s happening in the organization.
4. Has a Similar Project Started and Failed Multiple Times?
If your company has engaged in several failed attempts to launch an analytics solution, you’ll face skepticism from all sides. This is the polar opposite of the “over-optimism” situation that is sometimes the cause of project failure, and it should be managed just as closely. When a lot of emotional baggage weighs down your project before it even starts, take a lean engineering approach: Build, test, and fail fast; iterate rapidly; and show results early and often. Take a product development approach: Rather than waiting until your analytics solution is “perfect,” get a minimum viable product in front of your stakeholders as soon as you can. Then release updates continuously, keeping a functioning version live and accessible at all times. This goes for your governance plan as well as your actual solution — put your straw man up and keep refining as you go.
Red Flag: If the project timeline goes over six months, your likelihood of success is drastically reduced.
Mitigate it: Start small and lean with a several-week Proof of Concept that can evolve into your envisioned solution.
5. Can You Adapt to the Analytics?
Metrics will change as a result of your analytics project. As the new analytics come in, be aware that the people best able to respond to them are the ones best positioned to succeed.
In some cases, however, the metrics and key performance indicators (KPIs) may not be right. The data could be bad, the math could be bad, or the metric may not reflect reality. This underscores the importance of having someone who can evaluate the data and conditions, and make the right decisions.
In one example I heard, a company with high turnover in their customer support function did an analysis to see what kind of people they retained the longest. Training was expensive, so retention was the goal. The data showed that a key component in employee longevity was commute distance: The further away someone lived from the support center, the more short-lived their employment was. The company surmised that they needed to hire closer to the support center. However, that center was located in a downtown financial district, some of the most expensive real estate in the city. Workers from more distant zip codes were more likely to be in legally protected socio-economic groups. So not hiring from those zip codes would effectively have been discriminatory. That was reality, but the analytics only saw zip codes.
This is why it’s critical to be cautious in the creation of metrics and KPIs. Are they right for the business? Are they achievable? Are your employees equipped and empowered to support them? And are you ready to help your team and the larger organization adapt? Because analytics have the potential to impact areas like professional development, performance, funding, and compensation.
“As more and more metrics surfaced, some managers adapted, grew, and became directors, because they knew how to motivate their staff into the numbers.”
Red Flag: If there are people who think they know better than the computer does once the new analytics start coming in, you may not have the right metrics in place. (Then again, you might!)
Mitigate it: Listen to your team leads, remember the support center example above, and make sure you have the big picture in mind as you evaluate and refine your metrics.
6. Are You Playing the Long Game?
Data changes. It requires ongoing care and feeding. Once you finally get everything up and running, people tend to consider the project complete. Companies almost never budget for the following years and the resources necessary to continually ensure data quality and good source data. If you don’t have a dedicated data steward over the lifetime of the solution, there’s a good chance it will fail. This may not happen right away, but farther down the road when everyone’s attention and resources are focused on different initiatives.
Your data steward:
- Ensures that your systems have the right checks in place to find bad data coming in — not only bad data types, but broken taxonomies as well.
- Keeps your governance plan and data policies up to date.
- Stays on top of changing regulatory and compliance requirements.
- Works with vendors to define and enforce requirements pertaining to their services.
- Represents the needs of all the groups across the organization.
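The first steward responsibility above — catching not just bad data types but broken taxonomies — can also be sketched in code. This is a hypothetical example: the field name (`region`) and the governed vocabulary are stand-ins for whatever controlled categories your organization actually maintains.

```python
# Sketch of a taxonomy check a data steward might automate: flag incoming
# values that have drifted outside the governed vocabulary. The category
# set below ("region" codes) is hypothetical.
APPROVED_REGIONS = {"NA", "EMEA", "APAC", "LATAM"}

def find_taxonomy_breaks(records, field="region", approved=APPROVED_REGIONS):
    """Return the set of unapproved values seen in `field` across a batch.

    A non-empty result means a source system has started sending categories
    the governance plan doesn't recognize -- a signal for steward review,
    not an automatic rejection.
    """
    seen = {rec.get(field) for rec in records}
    return seen - approved
```

A type check would have passed every one of those values — they’re all strings — which is exactly why taxonomy drift needs its own check, owned by someone watching over the life of the solution.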
I once said “the data must flow” during an interview. I didn’t get the job.
Red Flag: If the project launches and everyone goes away, the solution may fail when no one is expecting it.
Mitigate it: Budget and plan for a dedicated data steward over the lifetime of the solution.
As technology advances, analytics across hybrid environments become increasingly achievable. But don’t neglect political and governance factors as you work toward your vision of better business insight through data. Ask yourself the six questions I’ve outlined, not only at project inception but at key points throughout the process. They can help you concentrate on the aspects of your project that could mean the difference between failure and success.
Acknowledgement: My thanks to Bill Marriott, Ken Seier, Larry Barnes, and the elusive @TripperDay for their time and input into this chapter.
This article by Sadie Van Buren is one of many great chapters in the book, Improve It! A Collection of Essays on Using Analytics to Accomplish More With SharePoint. With multiple perspectives from Microsoft insiders (including IT Unity favorites Agnes Molnar, Susan Hanley, Christian Buckley, Naomi Moneypenny), leading SharePoint consulting firms, and industry luminaries, find out how using analytics to measure SharePoint for social, collaboration, and engagement enables improved ROI. And if you’re interested in getting started with better measurement of your SharePoint, you can learn more about SharePoint Analytics from Webtrends now.
Get your free copy of Improve It! below.