From Governance to Innovation: Aligning Data, IT and the Business

By Dan Finerty, SAS Canada Solutions Lead for Information Management

I had the privilege of kicking off a boardroom session at the recent Evanta Toronto CDO Executive Summit with those immortal words of one of the framers of the U.S. Constitution. It’s a bit tongue-in-cheek, of course, but my point was that with data scattered in repositories across the enterprise, and with different processes and data capture standards, we can’t deliver the full business insight that data can provide.

The session—Aligning Data, IT and the Business—was led by Brian O’Donnell, executive vice-president and chief data officer of the Canadian Imperial Bank of Commerce, and Anca Preda, global director of IT for Linamar Corp., and the round table included CDOs from many verticals, from banking to health care to retail to manufacturing. Regardless of the industry, a number of consistent themes—concerns, goals, opportunities, challenges—emerged.

The session began with a discussion of the three pillars of a data management strategy: governance, centralized data authority, and advanced analytics.


It all begins with data governance. Ensuring that data, regardless of the business function that collects it, meets consistent quality standards so it can be used across the enterprise is at the heart of driving value from it.

Data governance is about the three Ps: Policy, Process and People. An overall policy for how data is formatted and managed must come from observation of how the business works. The policy dictates the processes for collecting and formatting data in a usable way, but those processes must in turn reflect the business processes they serve. It’s a good idea to have the IT people who develop policy and data processes shadow the business units they’re supporting, so they can see how data processes can support business processes.

Policy and process, of course, are nothing without people to create, enforce, and apply them. As in any strategic process, there are four categories of people (the classic RACI model): those who are responsible; those who are accountable; those who must be consulted; and those who must be informed.
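One lightweight way to make those four categories concrete is a per-domain RACI assignment. The sketch below is illustrative only; the domain name, role titles, and assignments are hypothetical examples, not a prescribed structure.

```python
# A minimal, illustrative RACI matrix for data governance.
# The domain and role names here are hypothetical examples.
raci = {
    "customer_master_data": {
        "responsible": ["Data Steward, Retail Banking"],
        "accountable": ["Chief Data Officer"],
        "consulted":   ["IT Data Architecture"],
        "informed":    ["Line-of-Business Leads"],
    },
}

def roles_for(domain: str, category: str) -> list:
    """Look up who holds a given RACI category for a data domain."""
    return raci.get(domain, {}).get(category, [])

print(roles_for("customer_master_data", "accountable"))
```

Keeping the matrix explicit, even in so simple a form, forces each data domain to name one accountable owner, which is where governance disputes usually get resolved.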


Mobility and telemetry are the prime suspects in the acceleration of data collection. Mobile devices generate constant positioning data along with interactions through online apps; telemetry—the world of sensors, or what has popularly become known as the Internet of Things (IoT)—creates data constantly and autonomously. Together they have driven exponential increases in three areas of data collection: volume, velocity and variety.

It doesn’t help matters that data is being captured by a number of different business units and stored in a number of different data silos, often without the consistency of formatting demanded by a data governance regimen. These silos have consequences.

First, different business units may have information that could benefit other business units. For example, in a financial institution, the retail arm might have data that could drive insight for the wholesale division; what questions might the latter ask if they had the former’s data?

Second, this “spaghetti map” of data marts makes it difficult to pull together the information to make strategic business decisions. If it takes months to integrate the various sources of data, then innovation becomes nearly impossible.

This is where the notion of central data authority that manages enterprise data in all its formats—transactional, interactional, structured, unstructured—becomes a critical consideration. Call it a data hub. This is the centre of the spaghetti map, the place where all of the enterprise data collection resources feed.


Once data is aggregated into a central repository—the “one version of the truth” that’s been data management’s holy grail for years—it becomes possible to perform meaningful analytics against it. Data is an asset; your enterprise is spending a lot of budget collecting it. It’s important to demonstrate that from a strategic perspective, it has a dollar value—if it’s used correctly. And it can only be used correctly if it’s usable data.

Data governance can be an eye-roller at the executive table, and analytics can be expensive. So being able to demonstrate use cases that depend on well-curated data and advanced analytics becomes an important part of not only the business process discussion, but also the budget discussion.

Those are the three pillars of a data management strategy. But other themes emerged that are connected to the pillars, and are also simply central to the CDO philosophy.

THE 80/20 RULE

The 80/20 rule (more properly known as the Pareto Principle) pops up just about everywhere—human resources, wealth distribution, even the distribution of peas among pods. As a rule of thumb, it states that 80 per cent of outcomes are the result of 20 per cent of resources. It’s a ubiquitous pattern, and it applies to data science as well.

In the case of data science, this 80/20 split applies to how resources are deployed in an analytics environment, and the anecdotal results aren’t encouraging for those with a bent for efficiency and effectiveness. In general, enterprises spend about 80 per cent of their analytics resources on data crunching—extract, transform and load (ETL) workloads—and 20 per cent on actual analytics. In other words, four-fifths of their data science budget goes toward preparing data to be analyzed rather than analyzing it.
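To make that prep burden concrete, here is a deliberately simplified sketch of the kind of reconciliation work that consumes the 80 per cent: the same record captured in two silos with inconsistent name casing and date formats. The field names and formats are hypothetical.

```python
from datetime import datetime

# Hypothetical example: two silos capture the same customer
# with different casing and date conventions -- typical ETL cleanup.
retail_record = {"name": "JANE DOE", "joined": "03/15/2019"}
wholesale_record = {"name": "jane doe", "joined": "2019-03-15"}

def normalize(record: dict) -> dict:
    """Standardize name casing and coerce dates to ISO 8601."""
    name = record["name"].title()
    raw = record["joined"]
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            joined = datetime.strptime(raw, fmt).date().isoformat()
            break
        except ValueError:
            continue
    else:
        raise ValueError("Unrecognized date format: " + raw)
    return {"name": name, "joined": joined}

# After normalization, the two silos agree on one record.
assert normalize(retail_record) == normalize(wholesale_record)
```

Multiply this by hundreds of fields, dozens of silos, and formats nobody documented, and the 80 per cent figure stops looking surprising; governance standards applied at capture time are what shrink it.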

A strategy based on governance first—data quality standards, custodians in every data-collecting arm of the business armed with those standards, data management tools and the discipline for IT and the business to work together—can start to move the needle to 80 per cent analytics and 20 per cent data preparation. That way innovation lies.


Often, enterprises are better at launching planes than landing them—it’s sometimes hard to sustain a project, especially when it seems peripheral to the bottom-line mission. Data quality and analytics processes have to be sustainable throughout the business—there must be an executive and line-of-business commitment to seeing projects through, even if phrases like “data governance” and “central data authority” might cause glazed eyes at the executive table.

But the business side of the house can be kept engaged in data science projects, and this is possibly the most important task of the chief data officer, whether he or she comes from the business side or the IT side: build a bridge between the two so the business recognizes the value of the technology, and the IT department is committed to supporting business outcomes.

Demonstrating small-scale analytics projects that have a short-term impact on the bottom line can keep the business side engaged. There’s an element of risk; anything that promises short-term ROI also runs the risk of near-term failure. Embracing the “fail fast” model is becoming increasingly acceptable to businesses, as long as the stakes aren’t too high and the potential payoff of a successful project that can be rolled out more broadly across the company is attractive.


Many other themes emerged from our discussion, far too many to include in so brief a summary. But to focus on goals for CDOs to make the volumes of data we collect serve the business: Put governance first; recognize the business needs that you’re feeding; keep the business engaged in data management projects; and move the needle from data crunching to business thinking.

Dan Finerty is a 30-year veteran of the information management discipline. In his current role, Dan is SAS Canada’s solutions lead for information management, responsible for providing guidance to customers on ways to maximize their return on investment in advanced analytics through the strategic deployment of data management capabilities.

Dan started his career at IBM as a System Engineer where he developed his expertise in application and data management. This experience was further augmented in several roles throughout sales, marketing and professional services consulting. After his tenure at IBM, Dan continued his career at Progress Software as a thought leader in mainframe application and data integration and held various other roles at Data Direct Technologies, Informatica and GT Software before coming to SAS.

Dan holds a degree in electrical engineering from The University of Waterloo.