In financial services companies, explaining the benefits of good data quality to the business is a long game. Historically, data quality initiatives were deprioritised by CFOs and CEOs while regulatory initiatives took precedence. Data professionals have long wanted a structured framework for data governance and data quality, so after all this time they are delighted that the banking industry is finally coming to terms with data governance and data quality initiatives, driven largely by regulatory pressures such as BCBS 239.
Certain regulators are taking the data quality regime to the next level by defining data quality more holistically. For example, the European Central Bank (ECB) has included “Plausibility” as a measure of data quality, breaking it into two components: “Stability” and “Outlier analysis”.
Stability is the “Change in the total number of reported data points from period to period”.
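Purely as an illustration of how simple this measure is (the function name and figures below are my own, not ECB terminology), stability can be computed as the percentage change in the count of reported data points between two periods:

```python
def stability(prev_count: int, curr_count: int) -> float:
    """Percentage change in the number of reported data points
    between two consecutive reporting periods."""
    if prev_count == 0:
        raise ValueError("previous period has no reported data points")
    return (curr_count - prev_count) / prev_count * 100

# Example: 10,500 data points this quarter vs 10,000 last quarter
print(stability(10_000, 10_500))  # 5.0 (% growth in reported data points)
```

Because both counts come from the bank's own submissions, this metric can be scored internally before filing.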
Outlier analysis is a two-dimensional analysis of data values:
a) Extreme growth compared to peers and the population
b) Extreme levels compared to peers and the population
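A bank cannot actually run this check against real peer data before submission, which is the complaint below. Purely to illustrate what the ECB's two dimensions measure, here is a sketch using hypothetical peer figures; the field names, threshold, and z-score approach are my own assumptions, not the ECB's methodology:

```python
from statistics import mean, stdev

def z_scores(values):
    """Standard score of each value against the peer population."""
    mu, sigma = mean(values), stdev(values)
    return [(v - mu) / sigma for v in values]

def flag_outliers(levels, growths, threshold=2.0):
    """Flag entities whose level or period-on-period growth is
    extreme relative to the peer population (|z| > threshold)."""
    level_z = z_scores(levels)
    growth_z = z_scores(growths)
    return [abs(lz) > threshold or abs(gz) > threshold
            for lz, gz in zip(level_z, growth_z)]

# Hypothetical balance sheet sizes (EUR bn) and quarterly growth (%)
levels = [120, 135, 128, 119, 141, 122, 131, 450]  # last bank: extreme level
growths = [1.2, 0.8, 1.5, 1.1, 0.9, 1.3, 1.0, 1.1]
print(flag_outliers(levels, growths))  # only the last bank is flagged
```

The catch, of course, is the `levels` and `growths` lists: prior to submission, no single bank holds its peers' figures.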
From a data quality metric measurement perspective, it is easy enough to assign a data quality score for stability. Outlier analysis, however, is a very different matter, and it is extremely difficult for a financial institution to measure and monitor. The majority of submissions to the ECB are quarterly or semi-annual, with strict due dates. Though I appreciate the ECB's holistic view of plausibility, it gives banks and financial institutions no room to monitor this metric. Firstly, where on earth are banks going to get peer data prior to submission? In almost all circumstances this is non-public data (such as margins, trading positions and balance sheet size) ahead of any exchange filings. Secondly, the varying legal entity structures of banks make the data incomparable. Thirdly, banks and institutions are highly regulated, which makes sharing data among peers cumbersome.
Banks have no way of predicting by themselves whether the data in their submissions is “plausible”, which invites further investigation by the ECB into the state of affairs of a financial institution.
As a data governance consultant, I like to enforce the principle that data should meet stakeholders’ requirements at first pass. However, the ECB’s “plausibility” dimension means data submissions will rarely be correct at first pass, requiring further analysis and investigation.
Dear ECB, I am glad you are on board with the data quality ideas… but please, can you stop these shenanigans?
This article is published as part of the IDG Contributor Network.