Analytics in Action

By: Kimberly Nevala, Director of Business Strategies for SAS Best Practices

I recently had the opportunity to moderate a session examining the rewards and realities of the analytics journey in the public sector. The executive panel was overwhelmingly positive about the opportunity for data and analytics in both the Government of Alberta and the public sector in general. In fact, most agreed that Alberta has some of the best data in the world, particularly in the health, energy and natural resource sectors. They were equally frank about the real and perceived barriers to making the most of this rich resource.

Here are some of the key considerations and cautions they shared about what it will take to create a culture of evidence and make analytics a core decision-making tool in government.

Start With WHY (in their terms)

Why analytics? Too often, we answer this question generically, without explicitly identifying the need, pain or problem to be solved.

Why analytics or an integrated data ecosystem? Not because the data is siloed, duplicated, and hard to access or understand. The real why comes from applying information to improve the business. In the case of health and human services, it’s literally to save lives. For policy makers, analytics arms deputy ministers with tangible, tactical evidence to drive good policy decisions while also establishing a body of evidence and a record of why key decisions were made.

Of course, the why isn’t the same for every constituent. Making the case requires linking analytics to the intrinsic and extrinsic motivations and outcomes that define each constituent’s success.

Begin With the End in Mind

Although the public sector still largely operates top-down, panelists suggested an organic, bottom-up approach is required for analytical innovation. Front-line staff understand the micro-climate in which they operate. In the case of health delivery, this involves the clinicians who are intimately aware of the pain points and opportunities in the incumbent care delivery process.  They are also the people who ultimately must change clinical pathways and models to incorporate new insights.

That said, the panel cautioned against over-engineering data-driven improvement programs. There can be a tendency to try to predetermine the system from soup to nuts: here’s what we’ll measure, here’s what you will do to change the process based on the data, here’s what we’ll… ad infinitum. A better approach is to expose the evidence and engage those in the know to determine how to respond. In fact, the earlier the engagement the better. Build a solution without your customers’ involvement and not only will they not come, they may actively oppose the solution on principle.

Go to the Light

Rather than worrying about universal acceptance, educate audiences on the art of the possible. Engage with those who see the potential and have a problem they are willing and ready to solve. Finding them is easy: they will raise their hands. Focusing on early adopters and small, tangible problems allows for early wins that make the case with less enthusiastic constituents.

Worried people won’t engage? Evidence to the contrary abounds. Clinicians exposed to departmental quality measures have proactively asked for data on their own performance.  One large US-based health-care provider reports that quality improved – without any systematic program intervention – after key metrics were routinely published.

Evidence is the START of the Story

With the advent of digital everything and always-on communication channels, the decision-making cycle has sped up exponentially. Have answer, will act? Not so fast. Buckets of data do not evidence make. Nor do buckets of evidence create knowledge. 

Time and space are required not only to cultivate the evidence but to appropriately consider it. The onus here is largely on the senior tier to slow down the decision-making cascade enough to allow for such deliberation. Management must also be trained to demand information and to appropriately interpret it once found. One panelist recounted how a new perspective introduced at the eleventh hour led to a heated, last-minute rethink of a proposed policy. In the end, the right answer, rather than the convenient one, emerged.

Evaluate, Implement, Adapt

The best analytics will only create expensive trivia if the organization isn’t prepared to act on the information it finds. That requires a disciplined approach to analytic discovery and information delivery. It starts with asking not just what we are looking for, but what we will do with the information once we find it, followed by a systematic process that allows hypotheses to be tested and interventions to be implemented, monitored and adapted based on discovered results.

Organizations must also mindfully architect and design future systems based on found insights. This includes proactively identifying information requirements during business process and system design, thereby ensuring the data required to monitor performance and adapt is created and captured from the start, rather than treated as a happy byproduct or derivative of running the business.

Practice Open Data

While data access is often cited as a primary barrier to analytic progress, some suggest this is really a cultural and managerial issue, not a legislative one. It is based on a historical legacy that errs on the side of not sharing data, for fear of what someone might do with it once it leaves the nest.

Panelists suggested the onus must shift: measure data producers not on data creation and access but on how effectively information is shared, and judge data consumers not on access but on usage. What do they do with the information available to them?

Such a shift would make data producers obligated and accountable for sharing data while holding consumers accountable for appropriate use, thereby governing value and risk based on information use, not the mere presence of data.

Make the Facts Known

The points above make a simple assumption: that evidence is published. While this sounds a bit tongue-in-cheek, panelists pointed to a common reluctance to share not just raw assets but found insights, perhaps from fear of being proven wrong or of provoking strong opposition, particularly when the evidence contradicts standard beliefs or operating practices.

However, the alternative is arguing on shifting ground rather than using information to support a particular position or to counter those armed with fervently held beliefs but few facts.

Exposing the data comes with risk. But practitioners often view evidence and facts as a means to persuade or change another’s mind, rather than as a basis to engage others in substantive discussion, even when that discussion turns to the basis of the evidence itself.

Facts alone may not change the opinion of a dedicated naysayer. Facts do, however, provide a common basis for discussion. And isn’t that the real point?

Great Artists Steal – Let Them

When it came to cultivating a shared purpose, cross-pollinating key skills and spawning new perspectives, the importance of communities of practice for analytics was top of mind. Such communities need not be over-architected. Creativity spawns creativity. While innovation can’t be mandated, the right environment can encourage out-of-the-box thinking.

Creating time and a safe space for analysts from different departments, within and among ministries, to work together has multiple benefits. Companies that have adopted this approach report increased returns on analytic investments and more engaged, motivated analysts: no small victory given the current premium on core data science resources.

Such collaborative teaming models become particularly important as new business and service models transcend historical departmental or ministry boundaries. Simple examples include the integration of different episodes of care to account for a holistic patient experience and treatment, and the interplay of adult, child welfare and juvenile justice systems.

Partner Up  

Of course, the need for partnership isn’t limited to the data science or analytic community. To maximize shared assets, it will be imperative for like-minded ministries to work together on common problems rather than reinventing the wheel each time. Areas such as fraud and risk are often great starting points for this type of initiative, as they are most effective with broad data sets culled from across ministry boundaries.

And while communities of interest help maximize incumbent analytic and data science skill sets, third-party partners can also play a role. Universities in particular are fertile ground for collaboration. Engaging aspiring data scientists to work on projects with live data benefits both sides: students gain real-world experience and more interesting projects, while the organization bridges the talent gap (be it a lack of available resources or of specific skill sets). Not to mention, it’s never too early to plant seeds with the up-and-coming generation of leaders and data scientists about the power of using data for good in public service.

Conclusion

This broad-ranging and thought-provoking discussion also touched on the need for a collaborative information governance model, the cultivation of analytical (not just statistical) literacy, the creation of a shared data infrastructure and analytic lab environments, and the need to incorporate data and analytic competencies into the Government of Alberta’s (GOA’s) workforce development models.

While the challenges appear daunting, participants unanimously agreed that the opportunity far outweighs these impediments. The GOA in particular, and the public sector in general, are well-positioned to create a rich data resource that is properly governed and available for the public good. But as Zig Ziglar once opined: “The key to getting ahead is getting started.” Why not now?

As the Director of Business Strategies for SAS Best Practices, Kimberly Nevala balances forward thinking with real-world perspectives in business analytics, data governance, analytic cultures and change management.