The closely related issues of siloed data and a lack of data integration – stemming, in part, from reliance on older data warehouses and a general inability to establish proper data governance – aren’t just slowing down your business intelligence efforts. They’re keeping you from competing in today’s data economy. Data silos sap organizational productivity, damage profitability and hamper employee attraction and retention.
Indeed, Database Trends and Applications says poor data quality hurts productivity by up to 20 percent and prevents 40 percent of business initiatives from achieving targets. And a recent Gartner survey found the problem costs businesses an average of $15 million every year, adding that bad data hurts organizations competitively and exposes them to client mistrust.
According to the Gartner survey, organizations can take several steps to help remedy poor data quality, including:
- Measuring the impact. Although the late management consulting guru Peter Drucker once said that “what gets measured gets improved,” it seems like most organizations haven’t heeded this advice when it comes to data integrity. The survey indicates that 60 percent of organizations don’t measure how much all that bad data costs – meaning they’re not just unaware of the scale of the problem, but also in the dark about its potential impact on the rest of the business.
- Creating data stewards. Pythian advises creating an Analytics Center of Excellence to help plan, execute, promote and govern your data program out of the gate. But even if you don’t create a formal group, the Gartner survey says it’s crucial to establish data steward roles (and, if possible, a Chief Data Officer to oversee your data quality initiatives) to improve accountability and proactively prevent your program from going off the rails.
- Optimizing data quality costs. We know not all organizations ignore the data silo problem. But in their attempt to keep data integrity costs down, some overspend in other ways: an annual average of $208,000 on on-premises data quality tools, according to Gartner. Migrating your on-prem data warehouse to the cloud can help in this regard. A cloud-native data platform (which is essentially a cloud data integration platform) is the most cost-effective and scalable way to ensure data quality and unity, especially when dealing with new types of data from edge devices and the IoT.
Building your business case
For a clearer picture of your proposed integration and governance, run a Total Cost of Ownership analysis – initial capital expenditures (CapEx) summed with operational expenses (OpEx) – on both your current and proposed systems. It’s important to determine whether a full-fledged integration program (including governance and a master data management, or MDM, strategy) is a worthwhile investment based on your organization’s size and specific needs. Pythian also recommends considering INCIDEX, or the expenses related to an incident such as a major security breach (something Pythian works with clients to avoid).
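The TCO comparison above can be sketched in a few lines. This is a minimal, illustrative model, not a Pythian methodology; every dollar figure below is an assumption you would replace with your own estimates:

```python
# Hypothetical TCO model: CapEx + OpEx over the planning horizon + INCIDEX
# (expected incident-related costs). All figures are illustrative assumptions.

def total_cost_of_ownership(capex, annual_opex, years, incidex=0.0):
    """TCO = initial capital outlay + operating costs over the horizon + incident costs."""
    return capex + annual_opex * years + incidex

# Compare the current (on-prem) system against a proposed cloud system.
on_prem = total_cost_of_ownership(capex=500_000, annual_opex=120_000, years=5, incidex=75_000)
cloud = total_cost_of_ownership(capex=0, annual_opex=180_000, years=5, incidex=25_000)

print(f"On-prem 5-year TCO: ${on_prem:,.0f}")
print(f"Cloud 5-year TCO:   ${cloud:,.0f}")
```

Running both scenarios through the same formula keeps the comparison apples-to-apples, which is the point of the TCO exercise.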
Building a strong business case for MDM requires the following steps:
- Assess the maturity of your current MDM strategy (if you have one)
- Develop a cross-organizational implementation plan
- Identify your desired end state and understand the costs and benefits
IT managers are also increasingly being asked to provide business cases for data integration. At minimum, a business case should include the project’s estimated costs, its potential risks (and their possible costs), and the value of expected future benefits.
According to Computer Weekly, every project should have a positive Net Present Value and a rate of return that’s higher than the organization’s hurdle rate (its minimum acceptable rate of return). Based on these indicators, organizations can objectively select the projects with the highest potential ROI.
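The NPV screen described above is straightforward to compute. Here is a minimal sketch, assuming a 10 percent hurdle rate and made-up project cash flows (an upfront cost followed by three years of benefits):

```python
# Hypothetical NPV check for a data integration project.
# cash_flows[0] is the upfront cost (negative); later entries are annual benefits.

def npv(rate, cash_flows):
    """Discount each cash flow back to the present at the given rate and sum."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

hurdle_rate = 0.10  # assumed minimum acceptable rate of return
project = [-250_000, 90_000, 110_000, 130_000]  # illustrative figures

value = npv(hurdle_rate, project)
print(f"NPV at a {hurdle_rate:.0%} hurdle rate: ${value:,.0f}")
```

Discounting at the hurdle rate folds both tests into one: a positive NPV at that rate means the project’s return also clears the hurdle.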
The costs of on-prem vs. cloud
If your organization still uses a traditional, on-premises data warehouse, much of your business case calculation must look at the differences between on-prem and cloud solutions.
On-prem systems require a large upfront CapEx investment. They’re expensive to upgrade, need cooling and fire-suppression infrastructure, and often take up valuable office space. Cloud systems, by contrast, typically involve smaller monthly OpEx payments and no dedicated room full of expensive equipment.
Cloud users, of course, receive regular bills. But the plus side is their organizations aren’t hobbled in the short-term by gigantic investments that can quickly eat away cash flow. For a closer look at the costs of cloud solutions, Google Cloud Platform and Microsoft Azure have even released their own cost calculators.
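The CapEx-versus-OpEx trade-off above is easy to see as cumulative cash flows. This sketch uses entirely assumed figures (a $400,000 on-prem build-out with modest running costs versus a pay-as-you-go cloud bill) and simply finds the month, if any, where cumulative cloud spend overtakes on-prem:

```python
# Illustrative cumulative-cost comparison; all dollar figures are assumptions.

def cumulative_cost(upfront, monthly, months):
    """Total spend at the end of each month over the horizon."""
    return [upfront + monthly * m for m in range(1, months + 1)]

on_prem = cumulative_cost(upfront=400_000, monthly=5_000, months=72)
cloud = cumulative_cost(upfront=0, monthly=12_000, months=72)

# First month (if any) where cloud's cumulative spend meets or exceeds on-prem's.
crossover = next(
    (m + 1 for m, (op, cl) in enumerate(zip(on_prem, cloud)) if cl >= op),
    None,
)
print(f"Month 1 outlay: on-prem ${on_prem[0]:,}, cloud ${cloud[0]:,}")
print(f"Cloud overtakes on-prem cumulative cost at month: {crossover}")
```

The short-term cash-flow advantage of cloud is the first print line; whether the curves ever cross, and when, depends entirely on the numbers you plug in.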
While the business case for cloud makes sense for most organizations, cost optimization doesn’t stop once you’ve completed your migration: organizations are starting to realize they can further optimize costs within the cloud. At Pythian, we’ve often helped clients reduce costs when they find themselves paying for servers that sit idle.
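A back-of-the-envelope estimate can show how much idle capacity costs. Everything here is an assumption (the hourly rate, the fleet size, the utilization level), purely to illustrate the calculation:

```python
# Hypothetical idle-capacity waste estimate; all figures are assumptions.
hourly_rate = 2.40        # assumed per-instance hourly cost in dollars
instances = 20            # assumed fleet size
utilized_fraction = 0.35  # assumed fraction of paid hours doing useful work

hours_per_year = 24 * 365
annual_spend = hourly_rate * instances * hours_per_year
annual_waste = annual_spend * (1 - utilized_fraction)

print(f"Annual spend: ${annual_spend:,.0f}; paid-but-idle: ${annual_waste:,.0f}")
```

Even a rough utilization estimate like this is often enough to justify rightsizing or scheduling instances to shut down outside business hours.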
You can read all about how to build your business case for data integration, governance and MDM – along with step-by-step guidance on how to deploy an integration and governance program – in Pythian’s new eBook, The Book of Data Integration Zen.
After all, when your data is cleaned, organized and in one place, you can reach a state of data unity. Or, as we like to call it, data integration zen.
Want to talk with a technical expert? Schedule a tech call with our team to get the conversation started.