It appears there is a new quest for the holy grail in Master Data Management: autonomous processing.
The hypothesis is simple: given the proper set of business rules, robust master data management algorithms, and machine learning applications, the need for human review and intervention can be eliminated from the data stewardship process.
I would argue not so fast.
For the foreseeable future, there will always be a need for manual stewardship. While the new breed of technologies should shorten the exception queue, the queue will nonetheless continue to exist.
Manual data stewardship drives quality assurance.
The data stewardship exception queue requires manual review and processing. Through the processing of the queue, additional business rules will be discovered. Once these rules are introduced into the master data management life cycle, the length of the queue will be further reduced.
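The interplay described above can be sketched in code. The snippet below is a minimal, hypothetical illustration (not any vendor's actual algorithm): a match rule scores candidate record pairs, confident results are processed automatically, and everything in between lands in the exception queue for manual stewardship. The function names, thresholds, and scoring logic are all illustrative assumptions.

```python
# Hypothetical sketch of a rule-driven MDM match pipeline with an
# exception queue. Names and thresholds are illustrative assumptions.

AUTO_MERGE = 0.90   # score at or above this: merge records automatically
AUTO_REJECT = 0.40  # score at or below this: treat as distinct records

def match_score(a: dict, b: dict) -> float:
    """Toy similarity: fraction of fields on which both records agree."""
    keys = set(a) | set(b)
    if not keys:
        return 0.0
    return sum(1 for k in keys if a.get(k) == b.get(k)) / len(keys)

def triage(pairs):
    """Split candidate pairs into auto-merged, auto-rejected,
    and an exception queue requiring human review."""
    merged, rejected, exception_queue = [], [], []
    for a, b in pairs:
        score = match_score(a, b)
        if score >= AUTO_MERGE:
            merged.append((a, b))
        elif score <= AUTO_REJECT:
            rejected.append((a, b))
        else:
            # Ambiguous: route to a data steward for manual review.
            exception_queue.append((a, b, score))
    return merged, rejected, exception_queue
```

Each rule discovered while working the queue tightens the thresholds or adds field-level logic, shrinking the ambiguous middle band over time without ever quite emptying it.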
The paradox is that the knowledge workers within the organization, who understand the nuances of the data and business processes, are generally not the resources monitoring the comings and goings of the exception queue. Most organizations look to junior-level staff or interns to fill these positions. Unfortunately, these resources lack the necessary business acumen.
When the results of the data stewardship activity are less than desirable, quality assurance suffers. Business intelligence success hinges upon the quality of master data management and the stewardship activities that follow it.
At Bare Cove Technologies, we understand the interrelationships of data governance, business intelligence, and quality assurance.
What’s your opinion?