Apply Data Governance Best Practices to Technology Projects

Let's discuss some ways you can apply data governance best practices to your technology projects in an efficient way. Technology projects could include converting to a new ERP solution, implementing a data warehouse, or switching to a new reporting solution. Our recommendation is that this is the time to apply data governance, since you are already looking closely at the data. Applying data governance during technology projects will save time and aggravation and lead to a more successful project. In this blog post we discuss data governance best practices to use during your technology projects.


Here are some best practices:

  • Maintain a knowledge base for information about your data and your technology project. Rather than using the standard project spreadsheets and documents that get emailed around to everyone involved, consider a persistent data catalog knowledge base as the repository for all data-related project documentation and decisions. This knowledge base might be the Data Cookbook, another data governance data catalog tool, or something homegrown. Plan to capture project-related information so it exists beyond the project as a knowledge base people can access. Say you are building a new report specification, or documenting the list of reference code values you are going to use for employee statuses in the new HR system: if you do nothing else, make sure the final documentation goes back into the knowledge base so it persists over time. If you have a tool like the Data Cookbook or another data governance tool that supports workflows, approval, and governance, then you get the necessary routing, approval, versioning, and dependency tracking. Most tools handle content versioning, so you won't have to ask, "What's the most recent version of this?" Putting information in a knowledge base saves time from the beginning and captures it for the long run, supporting data training and transparency. Content such as the business glossary, data system catalog, reference data, report catalog, data flow catalog, request tracking, policy attributes, and data quality rules is all worth capturing throughout the technology project.
  • Leverage requests and workflows. Maybe you are doing this in a ticketing system, or using a tool like the Data Cookbook, but you can save time and improve accuracy and trust by having a request system or process in place. The types of things you might route through a workflow include report requests, approval of a new code value list, or selection criteria for the data migration. A data-related request has a life cycle, with the resolution and deliverable being put into the knowledge base. The solution or process automatically routes the request to the right data steward or subject matter expert. Say someone has a question about a data definition in the payroll area; the system will know who that question goes to. If no person is assigned to the request, it can be sent to a group of people, an oversight or triage group, who can find an expert and bring them in to resolve it. Ultimately, this helps avoid wasting time making content decisions in project meetings. This is where you often get pushback from project teams, because they see workflows and approvals as too slow and too burdensome. Many people have the impression that data governance is controlling and time consuming, but data governance is about helping people, and you want content to be created quickly. Allow yourself either a separate set of project-specific workflows or a temporarily different set of workflows tailored to the technology project's priorities of speed, quick response, and efficiency over perfect data governance, approval, and review.
  • Empower data stewards and focus on the problems you are trying to solve. If you approach a project team saying you are doing data governance and then hand them a six-step approval workflow for every decision, it is not going to happen. Instead say, "We are going to focus on content creation over review and approval in these workflows." In doing that, you prioritize speed and efficiency and empower the data stewards to self-approve. Maybe you have a one-step process, using a solution like the Data Cookbook, where the developer or report writer can elect to have content routed to an expert for a quick review. How much you empower your developers or report writers depends on your culture and how much time you have, but you want someone to be able to route a request for input only when they need it. If you trust your data stewards to move fast while allowing others to validate, then your developers or data stewards can self-approve this content. At the same time, allow others to come by later and say, "I have a different opinion on that" or "There's a reason that might be wrong." That should not be seen or treated as a negative thing; it should be, "Good, we received that feedback."
  • Create a data system inventory and data catalog covering the new and old systems. Document the data systems involved in the technology project. For each data system, know who its owner is and where its data is located. If you have a tool like the Data Cookbook that integrates with the data system and automatically ingests its data models, that is incredibly helpful: it exposes the data to your team members and identifies where there might be issues in the migration or whatever the technology project is. Gathering the data system and data model information does not take a lot of effort or setup time.
  • Document the migration, integration, and ETL specifications involved, as well as any critical reports in the project. A lot of data migrations are one-time affairs, and it is not always critical to remember forever what a value used to be in the old system. But if you are maintaining the old system for reporting purposes, or if anyone could ask about information from the old system, you want to capture as much information as you can. Document new data integrations and ETL processes in a solution like the Data Cookbook so they can be accessed later.
  • Document the reference data involved in the project. Any new code values or valid value lists should be documented in a tool like the Data Cookbook. When migrating or integrating, reference data often does not match up, so it is important to document the values and see how they tie together. This prevents issues later on.
  • Understand and establish data quality rules. A technology project is an opportunity to touch all your data: from a data migration standpoint, from an integration standpoint, and somewhat from a reporting standpoint. The implementation of a new data system is one of the richest sources for understanding your data quality rules, because often those rules are not enforced by a database or software. If you can run data quality issues through the same process as other data requests (using, say, the Data Cookbook) and define the data quality rules there, that is a really good way to capture them for the long run. Certain things might be useful to document as a rule or a contextual rule. For example, "An employee must have a birth date, but customers do not have to have a birth date." The database itself might not enforce that, because the birth date is stored on the person record, which does not require one. The database is never going to tell you that is a problem, but you might notice it as you move data or build reports, and this is a good opportunity to establish a data quality rule and improve data quality across the organization.
  • Build and update your business glossary. The most valuable emergent content you will get during a technology project is your business glossary. With each report, data extraction, or configuration decision, you might end up defining new business glossary terms or updating existing ones. The business glossary should be accessible, in a solution like the Data Cookbook, by everyone who needs it. Often the functional terms stay the same, but look at your existing glossary terms and how they are defined in the new system or the new data warehouse. Add the new technical definitions and link them to your functional definitions.
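
To make the request-routing idea above concrete, here is a minimal sketch of routing a data-related request to the steward for its subject area, with an oversight/triage group as the fallback. The steward assignments, addresses, and request fields are all hypothetical illustrations, not the behavior of any particular tool.

```python
# Hypothetical sketch: route a data-related request to the data steward
# responsible for its subject area, falling back to a triage group when
# no steward is assigned. All names and addresses are made up.

STEWARDS = {
    "payroll": "hr_data_steward@example.edu",
    "finance": "finance_data_steward@example.edu",
}
TRIAGE_GROUP = "data_governance_triage@example.edu"

def route_request(request: dict) -> str:
    """Return the address this request should be routed to."""
    return STEWARDS.get(request.get("subject_area"), TRIAGE_GROUP)

# A payroll definition question goes straight to the payroll steward;
# an area with no assigned steward goes to the triage group.
print(route_request({"subject_area": "payroll",
                     "text": "What does 'active employee' mean?"}))
print(route_request({"subject_area": "admissions"}))
```

The point of the fallback is that a request never sits unowned: someone in the triage group can always find the right expert and bring them in.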
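
The reference data bullet above can likewise be sketched as a simple crosswalk: map the old system's code values to the new system's, and flag any legacy value with no mapping so it gets a documented decision before migration. The status codes here are invented for illustration.

```python
# Illustrative sketch: crosswalk old-system employee status codes to
# new-system values, collecting unmapped legacy codes for follow-up.
# The codes themselves are hypothetical.

STATUS_CROSSWALK = {
    "A": "ACTIVE",
    "L": "LEAVE",
    "T": "TERMINATED",
}

def convert_statuses(old_codes):
    """Split legacy codes into converted values and unmapped leftovers."""
    mapped, unmapped = [], []
    for code in old_codes:
        if code in STATUS_CROSSWALK:
            mapped.append(STATUS_CROSSWALK[code])
        else:
            unmapped.append(code)
    return mapped, unmapped

mapped, unmapped = convert_statuses(["A", "L", "X", "T"])
print(mapped)    # ['ACTIVE', 'LEAVE', 'TERMINATED']
print(unmapped)  # ['X'] -> needs a documented decision before migration
```

Capturing the crosswalk itself (and the resolution of each unmapped value) in the knowledge base is what prevents the same mismatch question from resurfacing after go-live.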
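
The birth date example from the data quality bullet can be expressed as a small contextual rule. Because the database stores employees and customers on a shared person record, it cannot enforce the rule itself; a documented check can. The record fields below are illustrative assumptions.

```python
# Hypothetical sketch of a contextual data quality rule: employees must
# have a birth date, customers need not. Record structure is made up.

def check_birth_date_rule(person: dict) -> bool:
    """Return True if the record passes the rule."""
    if person.get("role") == "employee":
        return person.get("birth_date") is not None
    return True  # customers and other roles are exempt

people = [
    {"id": 1, "role": "employee", "birth_date": "1980-04-12"},
    {"id": 2, "role": "employee", "birth_date": None},
    {"id": 3, "role": "customer", "birth_date": None},
]
violations = [p["id"] for p in people if not check_birth_date_rule(p)]
print(violations)  # [2]
```

Running checks like this during migration or report building surfaces the violations the database would silently accept, which is exactly when the rule should be recorded in the knowledge base.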

Hope you found these best practices useful. Applying data governance to technology projects will save an organization time and effort and help make the project successful, and the content created during the project will be beneficial in the future. Additional data governance resources (blog posts, videos, recorded webinars, etc.) can be found here. Additional data governance-related technology resources can be found here.

IData has a solution, the Data Cookbook, that can aid the employees and the organization in its data governance, data intelligence, data catalog, data stewardship and data quality initiatives. IData also has experts that can assist with data governance, reporting, integration and other technology services on an as needed basis. Feel free to contact us and let us know how we can assist.



Jim Walery
About the Author

Jim Walery is a marketing professional who has been providing marketing services to technology companies for over 20 years and specifically those in higher education since 2010. Jim assists in getting the word out about the community via a variety of channels. Jim is knowledgeable in social media, blogging, collateral creation and website content. He is Inbound Marketing certified by HubSpot. Jim holds a B.A. from University of California, Irvine and a M.A. from Webster University. Jim can be reached at jwalery[at]idatainc.com.
