- 16th April 2018
- Posted by: Manolis
A new report from KPMG reveals a growing ‘trust gap’ and uncertainty about who’s accountable for errors and misuse.
A large majority of senior executives don’t have a high level of trust in the way their organization uses data, analytics, or AI, according to a new report from tax and advisory firm KPMG International.
The firm surveyed 2,190 global senior executives, and found that just 35 percent say they have a high level of trust in the way their organization uses data and analytics. Their concerns over the risks of data, analytics, and AI are high, with about two-thirds having some reservations or active mistrust in their data and analytics.
“We often see organizations run dual processes, one managed by humans and one managed by machines, to determine whether the machine-generated insights align to those delivered by their tried-and-true, human-generated processes,” said Brad Fisher, national leader for D&A (data and analytics) at KPMG in the US. “That’s simply because many executives don’t have confidence that the insights are reliable and accurate.”
A huge majority (92 percent) are concerned about the negative impact of data and analytics on corporate reputation. Even so, many senior executives (62 percent) said technology functions, not the C-level and functional areas, bear responsibility when a machine or an algorithm goes wrong.
KPMG’s Guardians of Trust report suggests that the growing interrelationship between humans and machines calls for stronger accountability at the C-level rather than with the technology functions, and for proactive governance with strategic and operational controls that ensure and maintain trust.
As organizations make the shift to fully digital, analytically driven enterprises, the study says, the management of machines is becoming as important as the management of people.
“Once analytics and AI become ubiquitous, it will be imperative and more difficult to manage trust,” said Thomas Erwin, global head of KPMG Lighthouse, the firm’s Center of Excellence for D&A and Intelligent Automation.
“With the rapid take-up of predictive analytics, we should prepare now to bring appropriate governance to this wild west of algorithms,” Erwin said. “The governance of machines must become a core part of the governance of the whole organization, with the goal being to match the power and risk of D&A with the wisdom to use it well.”
Despite their concerns over the reputational and financial risks of analytics errors or misuse, survey respondents weren’t clear about who should be accountable when a poor business decision results in financial loss or the loss of customers.
In addition to the 62 percent of executives who said the primary responsibility should lie with technology functions within their organizations, 25 percent thought it rested on the shoulders of the core business, and 13 percent felt it should lie with regulatory and control functions.
Taking a closer look at which roles within the C-suite should hold the blame when analytics go wrong, the broad distribution of responses suggests a lack of clarity, the report said, with only 19 percent naming the CIO, 13 percent the chief data officer, and just 7 percent C-level executive decision makers such as the CEO.
The report makes five recommendations for building trust within an organization:
- Develop standards to create effective policies and procedures for all organizations.
- Improve and adapt regulations to build confidence in D&A.
- Increase transparency of algorithms and methodologies.
- Create professional codes for data scientists.
- Strengthen internal and external assurance mechanisms that validate and identify areas of weakness.