The Promise of Analytics

Of our clients who have embarked on an analytics program, many have seen the same issues emerge that came up when they introduced Business Intelligence (BI) tools, and historically with traditional reports. That is to say: they are not significantly closer to actionable insight; their decision-making process isn’t informed by data in a meaningful way; and there continues to be as much distrust of data and data providers as there is belief in the data’s power or potential.

The promise of analytics is to help users move beyond simple BI and traditional performance management practices so they can:

  • understand more about the underlying data
  • learn why key performance indicators may be out of line with expectations
  • respond more quickly and ably to changes at the institution or in the marketplace

Used properly, analytics can lead to innovation in the way we generate solutions to problems and the way we try to align decisions and actions with strategic objectives.

So why aren’t analytics, at least in some form, ubiquitous? Let me share some of what we have observed.

We try to do analytics in a vacuum. There is no compelling institutional vision for or strategy around analytics. Some questions to consider:

  1. What do we think analytics will do for our organization?
  2. Is there a specific goal, such as maximizing enrollment revenue or assessing student learning?
  3. Is there a specific outcome, such as making interactive analytical dashboards available to all decision makers?
  4. Are we simply hoping that making more processed data available in new formats will generate insight?

This lack of vision around analytics often goes hand in hand with an organization’s overall lack of strategy for data and data management in general. If there’s no clearly defined role for data in an organization’s strategic or operating plans, it’s certainly a stretch to think that casual analytics, no matter how well-intended, will have a meaningful impact.

Interesting is not the same as useful. Too many organizations don’t know what problem(s) they are trying to solve, or even which questions they are trying to answer. Absent a clear direction for study, whatever analytics we produce is likely to be far more noise than signal. We need clarity about what we want to know and when we want to know it; without that, we lack critical context for the data and analysis we see.

Too often we see analytics generated before anyone has figured out which questions are being asked and which data actually helps answer them. This is why we model a data request process that starts with the purpose of the request rather than the details, and we believe this model translates to analytics.
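
As a minimal sketch of what purpose-first might look like in practice (the structure and field names below are hypothetical illustrations, not the Data Cookbook’s actual request format), the request captures intent and timing before anyone discusses tables or columns:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DataRequest:
    """A purpose-first data request: the why comes before the what."""
    purpose: str                   # the decision or question driving the request
    requested_by: str              # who will act on the answer
    needed_by: str                 # when the answer stops being useful
    questions: List[str] = field(default_factory=list)  # specific questions to answer
    candidate_sources: Optional[List[str]] = None       # chosen only after purpose is clear

# The requester states the why; source and field details come later.
request = DataRequest(
    purpose="Assess whether the new advising model affects first-year retention",
    requested_by="Office of the Provost",
    needed_by="before spring registration opens",
    questions=["Did fall-to-spring retention differ for advised vs. non-advised students?"],
)
```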

Moreover, statistical analysis and visual representation of data ought to make clear that there are many legitimate ways to conceive of and analyze data, and that certainty is virtually impossible. The idea that analytics is a pathway to a “single version of the truth” is misguided at best. Effective analytics involves making very clear what is being investigated, being transparent about methodology, and drawing provisional conclusions, since new evidence will always surface that should modify our priors.
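
To make the point about modifying priors concrete, here is a minimal sketch of Bayesian updating, with illustrative numbers rather than results from any real study:

```python
def bayes_update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return the posterior P(hypothesis | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1.0 - prior))

# Start with a 30% belief that a program change improved retention.
belief = 0.30
# Treat each new semester of data as evidence and update the belief.
for p_if_true, p_if_false in [(0.8, 0.4), (0.7, 0.5)]:
    belief = bayes_update(belief, p_if_true, p_if_false)
    print(f"updated belief: {belief:.2f}")  # 0.46, then 0.55
```

The conclusion stays provisional by design: the next batch of evidence moves the number again.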

Data does not speak for itself. If no one can explain or understand analytics outputs, they have little value. While data fluency is stronger than it has ever been, in higher education it remains immature and in short supply. Data providers should have a plan to communicate both the results and the data lineage behind those results. Analysts ought to be able to speak to the why and how of the data, as well as the what. Smart data consumers will want to know where the data came from and how it may have been manipulated along the way. They will want to know how the analytical models work, at least conceptually, and the potential impact of changes to those models and of decisions based on them.
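
One way to make lineage communicable (a sketch of the idea, not a prescribed or standard format) is to attach a simple provenance record to each analytics output, so a consumer can inspect the where and how alongside the result:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LineageRecord:
    """Minimal provenance a data consumer can inspect alongside a result."""
    source_systems: List[str]   # where the raw data came from
    transformations: List[str]  # what was done to it, in order
    refreshed_on: str           # how current the underlying data is
    steward: str                # who can answer questions about it

retention_report_lineage = LineageRecord(
    source_systems=["SIS enrollment extract", "LMS activity log"],
    transformations=[
        "deduplicated on student_id",
        "filtered to fall term",
        "aggregated by academic program",
    ],
    refreshed_on="nightly, as of the prior business day",
    steward="Institutional Research",
)
```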

Analyze the data that matters, not the data that is easiest to access. Many organizations lack a governed data set from which to generate insightful analytics. Reported figures lack consistency, and the explanations for this inconsistency tend not to inspire confidence in the reliability and utility of reported data. Transactional data in particular is very difficult to govern, given how frequently it changes and the number of potential sources of accuracy and integrity errors. The challenge is even greater in higher education due to the amount of data collected, its dispersal across systems and platforms, and the decentralized way data tends to be gathered and stored at complex organizations.

For data analysis to be actionable it must be timely (this may be the most important thing about analytics) and it must be accessible to those who would act. Here, accessible doesn’t just mean having access, but also having understanding. Even the best-documented and most efficiently built data warehouse will always lag the data collection and processing needs of modern organizations. Timeliness will almost always be more important than completeness.

In our view, data governance is key to increasing data understanding across the enterprise, and to encouraging the recognition that your data is a critical and underutilized asset. Engaged users who trust and understand data are far more likely to rely on and seek out analytics to understand which activities and programs are working, and which need to be improved. Data governance and data intelligence help create those educated users who can rely on data assets and who can contribute to the creation of even better ones.

Just-in-time data governance for analytics. Data governance can be applied to analytics in many ways and from a variety of perspectives. Data governance across the enterprise would of course be ideal, but we know that is a daunting task, and that it is an end, not a starting point. The following principles, however, are attainable and executable on a small-scale analytics project:

  • Begin with a defined and limited data set, one whose quality has been established according to formal institutional standards (see the sketch following this list).
  • Design analytical outputs to answer specific questions, and make sure that everyone involved shares a common understanding of terminology and expectations.
  • The whole process ought to be characterized by collaboration, openness, and iterative development. Gather requirements, prototype deliverables, and strive at every point in the cycle to achieve clarity of expectations, process, and results.
  • Focus on repeatability as much as results. While there may occasionally be an “aha” moment, most insights generated by analytics lead to marginal improvements and minor alterations. The goal is not to find a magic bullet, but to build an environment and organizational culture in which data analysis is routine, and small insights are plentiful.
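
As referenced in the first bullet, here is a small sketch of what “quality established according to formal standards” might look like in code. It assumes a pandas DataFrame and two hypothetical standards (required fields are present and populated; one record per student per term); your institution’s actual standards would differ:

```python
import pandas as pd

def standards_violations(df: pd.DataFrame) -> list:
    """Check a limited data set against hypothetical institutional standards.

    Returns a list of violations; an empty list means the set is fit for analysis.
    """
    violations = []
    # Standard 1: key fields must exist and be fully populated.
    for col in ("student_id", "term_code", "program"):
        if col not in df.columns:
            violations.append(f"missing required field: {col}")
        elif df[col].isna().any():
            violations.append(f"nulls found in required field: {col}")
    # Standard 2: exactly one record per student per term.
    if {"student_id", "term_code"} <= set(df.columns):
        if df.duplicated(subset=["student_id", "term_code"]).any():
            violations.append("duplicate student/term records found")
    return violations

# Analysis proceeds only on a data set that returns no violations.
```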

Analytics whose message is opaque, whose methods are suspect, whose provenance is mysterious, or that take too long to deliver are likely to be spurned by those who are most in need of them.

In summary, successful analytics:

  • require a context
  • ought to address real issues
  • should provide answers that can be easily understood and quickly validated
  • must be timely
  • ought to serve as the foundation for the next set of questions and investigations

Failure to govern data at any point in the analytics chain can lead to breakdowns.

IData is expert in data governance and integration, and the Data Cookbook is the leading data governance solution for higher education institutions. Feel free to Contact Us if you would like to discuss how IData or the Data Cookbook can assist your institution.

(image credit: StockSnap)

About the Author
Aaron Walker

Aaron joined IData in 2014 after over 20 years in higher education, including more than 15 years providing analytics and decision support services. Aaron’s role at IData includes establishing data governance, training data stewards, and improving business intelligence solutions.
