MEDIA RELEASE. Post Royal Commission, financial institutions are cleaning up their act and cleaning up their customer data, according to QMV managing director, Mark Vaughan.
“Banks, superannuation funds, insurers and wealth managers have been working overtime since the Royal Commission, unpicking data issues and, where needed, compensating customers for account errors such as fee-for-no-service and interest miscalculations.
“Data remediation teams in some cases have swollen to hundreds or even thousands of personnel due to the complexity and extent of work required,” he said.
A typical remediation program involves a number of steps: isolating the data error and identifying its root cause, quarantining its spread, fixing the error and compensating where appropriate, communicating with relevant stakeholders, and implementing change to prevent a recurrence.
“In this environment, demand for Remediation as a Service (RaaS) is increasing as external specialists are engaged to run data remediation programs entirely, or to bolster existing internal teams.
“The whole industry will benefit from this movement, with an increased drive to ensure that remediation and preventative measures are effectively implemented.
“Financial institutions have always had a focus on data and we are now seeing them taking data quality to a higher level by actively implementing more robust controls and preventative measures to avoid expensive and time-consuming data errors happening in the first place.”
Mr Vaughan said the most common causes of data error in financial institutions are unit pricing errors, delays in the crediting or calculation of interest, missing or incorrect information such as date of birth or salary, administrative data entry errors, a lack of internal controls, and misinterpretation of fee calculations.
“Invariably on further investigation of an original issue, many other issues are unearthed and scope creep is a major risk in remediation,” he said.
“This type of work requires highly specialised remediation experts with developed processes, calculation models and the right technology to fast-track quality remediation work and reduce costs. Without this, time and cost can blow out extremely quickly and, worse, the outcome may be only partial success and the implementation of inadequate preventative measures.
“The reality is, having issues with data is common and normal. It’s not about having the problem, it’s about how and when you find it, and how quickly you can quarantine and fix it.
“After this initial mass-clean up, the future for financial institutions hinges on having sophisticated controls in place to keep customer data error to a minimum,” he said.
According to Mr Vaughan, it has become essential for all financial institutions to have both a dedicated unit that owns data quality and proven data quality software tools in place.
“In the past, many business units each somewhat owned data quality and therefore no one owned it. Customer data held in disparate data silos is a big part of the problem,” he said.