When it comes to being data-driven, organizations run the gamut of maturity levels. Most believe that data and analytics provide insights. But only one-third of respondents to a TDWI survey¹ said they were truly data-driven, meaning they analyze data to drive decisions and actions.
Successful data-driven businesses foster a collaborative, goal-oriented culture. Leaders believe in data and are governance-oriented. The technology side of the business ensures sound data quality and puts analytics into operation. The data management strategy spans the full analytics life cycle. Data is accessible and usable by multiple people – data engineers and data scientists, business analysts and less-technical business users.
TDWI analyst Fern Halper surveyed analytics and data professionals across industries and identified the following five best practices for becoming a data-driven organization.
Five Data Management and Analytics Best Practices for Becoming Data-Driven
In a survey, TDWI found that one-third of organizations don’t govern their data – and fewer than 20 percent do any type of analytics governance. Governance is just one discipline that’s essential for becoming data-driven. Learn more in this checklist report from TDWI.
1. Build relationships to support collaboration
If IT and business teams don’t collaborate, the organization can’t operate in a data-driven way – so eliminating barriers between groups is crucial. Achieving this can improve market performance and innovation, but collaboration is challenging. Business decision makers often don’t think IT understands the importance of fast results, and conversely, IT doesn’t think the business understands data management priorities. Office politics come into play.
But having clearly defined roles and responsibilities with shared goals across departments encourages teamwork. These roles should include IT/architecture, business, and others who manage various tasks on the business and IT sides (from business sponsors to DevOps).
2. Make data accessible and trustworthy
Making data accessible – and ensuring its quality – are key to breaking down barriers and becoming data-driven. Whether it’s a data engineer assembling and transforming data for analysis or a data scientist building a model, everyone benefits from trustworthy data that’s unified and built around a common vocabulary.
As organizations analyze new forms of data – text, sensor, image and streaming – they’ll need to do so across multiple platforms like data warehouses, Hadoop, streaming platforms and data lakes. Such systems may reside on-site or in the cloud. TDWI recommends several best practices to help:
- Establish a data integration and pipeline environment with tools that provide federated access and join data across sources. It helps to have point-and-click interfaces for building workflows, and tools that support ETL, ELT and advanced specifications like conditional logic or parallel jobs.
- Manage, reuse and govern metadata – that is, the data about your data. This includes size, author, database column structure, security and more.
- Provide reusable data quality tools with built-in analytics capabilities that can profile data for accuracy, completeness and ambiguity.
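The data profiling idea in the last point can be sketched in a few lines. This is a minimal, illustrative example – real data quality tools layer accuracy and ambiguity checks on top of basic measures like completeness and distinct-value counts, and the record layout here is an assumption.

```python
def profile(records):
    """Profile a list of dict records: per-field completeness
    (share of non-null values) and distinct-value counts."""
    fields = {f for r in records for f in r}
    stats = {}
    for f in fields:
        values = [r.get(f) for r in records]
        non_null = [v for v in values if v is not None]
        stats[f] = {
            "completeness": len(non_null) / len(records),
            "distinct": len(set(non_null)),
        }
    return stats

# Hypothetical customer records with a gap in the email field
customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "a@example.com"},
]
report = profile(customers)
# email is populated in 2 of 3 records, with 1 distinct value
```

A report like this gives data engineers a quick, repeatable check that a data set is fit for analysis before it flows downstream.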
3. Provide tools to help the business work with data
From marketing and finance to operations and HR, business teams need self-service tools to speed and simplify data preparation and analytics tasks. Such tools may include built-in, advanced techniques like machine learning, and many work across the analytics life cycle – from data collection and profiling to monitoring analytical models in production. These “smart” tools feature three capabilities:
- Automation helps during model building and model management processes. Data preparation tools often use machine learning and natural language processing to understand semantics and accelerate data matching.
- Reusability pulls from what has already been created for data management and analytics. For example, a source-to-target data pipeline workflow can be saved and embedded into an analytics workflow to create a predictive model.
- Explainability helps business users understand the output when, for example, they’ve built a predictive model using an automated tool. Tools that explain what they’ve done are ideal for a data-driven company.
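The reusability point above can be illustrated with a small sketch: a saved data-preparation step is embedded into an analytics workflow instead of being rebuilt. The function names and the trivial "scoring" step are hypothetical, standing in for a saved pipeline and a real model.

```python
def clean_step(rows):
    """Reusable source-to-target prep step: drop incomplete rows,
    normalize the name field."""
    return [dict(r, name=r["name"].strip().lower())
            for r in rows if r.get("name")]

def scoring_workflow(rows, prep_steps):
    """Analytics workflow that embeds previously saved prep steps
    before applying a model."""
    for step in prep_steps:
        rows = step(rows)
    # Placeholder "model": score by name length (illustrative only)
    return [dict(r, score=len(r["name"])) for r in rows]

raw = [{"name": "  Alice "}, {"name": None}, {"name": "Bob"}]
scored = scoring_workflow(raw, prep_steps=[clean_step])
# The saved clean_step is reused unchanged inside the new workflow
```

Because the prep step is passed in rather than rewritten, the same vetted logic can feed a predictive model today and a dashboard tomorrow.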
4. Consider a cohesive platform that supports collaboration and analytics
As organizations mature, it’s important for their analytics platform to support multiple roles in a common interface with a unified data infrastructure. This strengthens collaboration and makes it easier for people to do their jobs. For example, a business analyst can use a discussion space to collaborate with a data scientist while building a predictive model, and during testing. The data scientist can use a notebook environment to test and validate the model as it’s versioned and metadata is captured. The data scientist can then notify the DevOps team when the model is ready for production – and they can use the platform’s tools to continually monitor the model.
5. Use modern governance technologies and practices
Governance – that is, the rules and policies that prescribe how organizations protect and manage their data and analytics – is critical for learning to trust data and becoming data-driven. But TDWI research indicates that one-third of organizations don’t govern their data at all; many focus instead on security and privacy rules. The same research indicates that fewer than 20 percent of organizations do any type of analytics governance, which includes vetting and monitoring models in production.
Decisions based on poor data – or models that have degraded – can have a negative effect on the business. As more people across an organization access data and build models, and as new types of data and technologies emerge (big data, cloud, stream mining), data management practices need to evolve. TDWI recommends three features of governance software that can strengthen your data and analytics governance:
- Data catalogs, glossaries and dictionaries. These tools often include sophisticated tagging and automated procedures for building and keeping catalogs up to date – as well as discovering metadata from existing data sets.
- Data lineage. Data lineage combined with metadata helps organizations understand where data originated and track how it was changed and transformed.
- Model management. Ongoing model tracking is crucial for analytics governance. Many tools automate model monitoring, schedule updates to keep models current and send alerts when a model is degrading.
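The model management point can be made concrete with a sketch of ongoing monitoring: track rolling accuracy against a threshold and flag the model when it degrades. The window size and threshold here are assumptions, not prescribed values, and a production tool would add scheduling and alert delivery.

```python
from collections import deque

class ModelMonitor:
    """Track a rolling window of prediction outcomes and flag degradation."""

    def __init__(self, window=100, threshold=0.8):
        self.results = deque(maxlen=window)  # True/False per prediction
        self.threshold = threshold

    def record(self, predicted, actual):
        self.results.append(predicted == actual)

    def accuracy(self):
        return sum(self.results) / len(self.results)

    def is_degrading(self):
        # Only alert once the window is full, to avoid noisy early flags
        return (len(self.results) == self.results.maxlen
                and self.accuracy() < self.threshold)

monitor = ModelMonitor(window=4, threshold=0.8)
for pred, actual in [(1, 1), (0, 1), (1, 0), (0, 0)]:
    monitor.record(pred, actual)
# 2 of the last 4 predictions were correct: accuracy 0.5, below threshold
```

Wiring a check like this into a schedule, with an alert when `is_degrading()` returns true, is the essence of the automated model monitoring described above.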
In the future, organizations may move beyond traditional governance council models to new approaches like agile governance, embedded governance or crowdsourced governance. But involving both IT and business stakeholders in the decision-making process – including data owners, data stewards and others – will always be key to robust governance at data-driven organizations.
¹ What It Takes to Be Data-Driven. A TDWI Best Practices Report by Fern Halper and David Stodder, 2017.