MEDIA RELEASE

The price tag for remediation issues identified by the Hayne Royal Commission is estimated to top $2 billion, but most of this cost could have been avoided if data protocols had been better maintained, says QMV.
“Unfortunately, many organisations will now pay the price for not having implemented adequate data management tools and processes. With the right tools and systems the costs of inaccurate data can be reduced to literally thousands rather than billions,” says Mark Vaughan, managing director of QMV.
“With the royal commission focusing the spotlight on conduct, the valuable role that technology and data can play in ensuring good client and member outcomes has been overshadowed.
“It is now clear that the evidence used for decision making in institutions has not always been as robust as it could be.
“But so often the answer lies in the data, and data tells the story. Data provides a much firmer foundation for decision making than human judgement alone.”
Mr Vaughan says it is important for all financial institutions and super funds – not just the big four and AMP – to be on the front foot when it comes to the data they hold on members and clients.
“Regardless of the final numbers, there is no doubt that data errors are costly for institutions in many respects – from remediation expenditure, to compensation payments, not to mention reputational damage – and the longer a data quality error goes undetected, the greater the ultimate harm to bottom lines for institutions and members.
“The proliferation of data errors can cause exponential harm – a ‘disease effect’. If data quality issues are not detected and remedied soon after they occur, the error tends to spread and contaminate other data, even jumping to other systems.”
The focus on data quality post royal commission should be on prevention, detection and correction, in that order, Mr Vaughan says.
“For this to occur, there needs to be a turnaround in the corporate attitude to data quality.
“Many organisations view data quality as an expense, even though the return on investment that good quality data can provide is invaluable. Until data quality is seen as an investment, the reactive, and inefficient, approach where data quality spending is heavily geared toward correction will continue.
“Across the industry there has been an attitude that it is ‘ok’ to wait for a problem to occur and then fix it, rather than take a proactive and preventative approach. Post royal commission there is much scrutiny, much ground to make up and much at stake.
“Many organisations tend to forecast an overly optimistic annual spend on data remediation without looking carefully at their ‘actual’ retrospective spend in previous years.
“However, the organisations that will thrive in both corporate culture and consumer trust following the royal commission are those that take the initiative by routinely running internal monitoring and testing for potential breaches or process failures. Furthermore, successful organisations will communicate with their customers to inform them of, resolve and make good any errors or failures that are discovered.
“The true measure of how you value your customers lies in tangible, visible and measurable action; not simply a tagline that’s promoted in an advertising blitz. Industry players who understand this will rebuild consumer trust by demonstrating both investment in and benefits of customer-centric systems and processes.
“This can be a positive, relatively inexpensive and simple step for the industry. Data quality is not just a box-ticking, window dressing measure. It offers real and sustained impacts on systems that benefit both organisations and customers.
“Organisations that continue to bury their heads in the sand, remaining in the same tired cycle of reactive remediation, are building a ‘remediation debt’ that will ultimately carry significant financial and reputational costs – if it hasn’t already,” Mr Vaughan said.