BoE tells bank boards to take responsibility for machine failings

FILE PHOTO: A pedestrian shelters under a Union Flag umbrella in front of the Bank of England, in London, Britain August 16, 2018. REUTERS/Hannah McKay

By Huw Jones

LONDON (Reuters) - Bank board members will have to pay increasingly close attention to functions which are being automated and no longer carried out by individuals, including anti-money laundering checks, a senior Bank of England official said on Tuesday.

BoE executive director for UK deposit takers supervision James Proudman said managing the risks from the use of artificial intelligence (AI) and machine learning (ML) is an increasingly important strategic issue.

"Some of the largest international investment banks are now declaring that they are technology companies with banking licences," he told a Financial Conduct Authority (FCA) event.

The BoE and FCA surveyed more than 200 banks, insurers and financial market infrastructure firms in March, in their first systematic survey of AI use in money-laundering checks, and will publish the full results in the third quarter.

"The mood around AI implementation amongst firms regulated by the Bank of England is strategic but cautious," said Proudman, adding that boards must watch how data is being used.

"Are data being used unfairly to exclude individuals or groups, or to promote unjustifiably privileged access for others?" Proudman said, adding that recent examples of retailers using overly personalised marketing can seem plain "creepy".

Boards will also need to consider how to allocate individual responsibilities under the Senior Managers Regime, which requires every activity at a bank to come under the direct responsibility of a named official, making it easier for regulators to identify and sanction the responsible individual when things go wrong.

"You cannot tell a machine to 'do the right thing' without somehow first telling it what 'right' is - nor can a machine be a whistle-blower of its own learning algorithm," Proudman said.

"As the rate of introduction of AI/ML in financial services looks set to increase, so too does the extent of execution risk that boards will need to oversee and mitigate."

(Reporting by Huw Jones; Editing by Alexander Smith)

