Data plays a central role in banking, arguably more so than in any other industry. Financial institutions rely heavily on quality data to make strategic business decisions, and they also bear the responsibility of keeping sensitive customer information secure and managing it responsibly.
According to recent research, one area where many banks have room for improvement is verifying the accuracy of their data.
Accenture's Banking Technology Vision 2018 report revealed that more than nine out of ten bankers (94 percent) are confident in the integrity of their data sources, yet many could be doing more to guarantee data quality.
Around a quarter (24 percent) of respondents said they validate their data but should do more to ensure quality, while 16 percent attempt to use validation methods but remain unsure of the quality of their data. Around one in ten bankers (11 percent) trust the reliability of their data but don't verify it.
Another key finding showed that more than three-quarters (78 percent) of banking professionals have reservations about the use of data to drive automated decision-making. Fake data, external data manipulation and inherent bias were among the concerns highlighted.
Alan McIntyre, senior managing director and head of Accenture’s banking practice, said: “Inaccurate, unverified data will make banks vulnerable to false business insights that drive bad decisions. Banks can address this vulnerability by verifying the history of data from its origin onward – understanding the context of the data and how it is being used – and by securing and maintaining the data.”
Other trends highlighted in the report include the ongoing development of artificial intelligence, with 79 percent of bankers predicting that AI will work alongside humans as collaborators and trusted advisors within the next two years.
However, concerns remain in this area too, such as whether the decisions made by AI will meet regulatory and ethical standards.