
Authentaverse Roundtable – a discussion on a new era of authentication and digital trust


In July 2023, a new report – The Future of Digital Trust: Authentaverse – provided a ground-breaking perspective on how establishing digital trust will enable financial services organisations to deliver new innovative services and richer customer engagement.

The report imagines a world where consumers can move effortlessly between online services and experiences: transferring money, making purchases, doing the weekly food shop, booking tickets, streaming videos, subscribing to services, all seamlessly, without having to go through sign-ins for each individual service.

But this utopian world relies on instant, invisible identity authentication taking place constantly behind the scenes, and it will depend on a new trust charter being established between consumers, service providers, technologists, governments and others. The report explores the framework on which this trust charter must be based – four pillars: empowerment, protection, understanding and respect.

Is education the answer?

Should the first pillar, empowerment, rely on educating customers about scams so that they can protect themselves? The truth is that even the best efforts to educate the public on scam methods will have limited effect – scammers constantly evolve ever more sophisticated and convincing methods.

“Customers don’t have the persistent competency to stop themselves being scammed,” one participant suggested. There is a real need for advanced technology to support here: “Even the simplest AI bot could do a better job at avoiding scams and make better decisions than a consumer.” 

As it relates to empowerment, therefore, investment and intellectual focus should be directed towards the development of ‘AI bots’ to handle transactions and answer scam calls on our behalf, a solution that offers the added benefit of disrupting the economics of scam calls.

But why aren’t we already better at stopping scams? After all, “a text from my bank that’s not a real text shouldn’t even come through to my phone… technology should already be clever enough to stop it.” Telcos have had the capability to stop scam texts for 10 years already, but perhaps lack the motivation to act, due to operating outside of the jurisdiction of UK banking regulation. Dating sites provide another apt example – simple address-linked checks could provide an additional level of trust to their users that the people they’re meeting online are who they say they are. 

Protection by design 

So, how to design protection into the customer experience when fraudsters are so adept at turning every process against the customer and grooming them to believe they’re making good decisions? The answer may be to train bank employees (and others in the sector) to be just as adept at ‘grooming’ customers to trust them.

Future user experience design must acknowledge the fact that there is minimal information available to the victim at the point at which the fraud is enacted. What additional information could help them make better decisions? But banks and financial institutions must also become more introspective – asking themselves ‘is the reason fraudsters can easily manipulate our processes because we create systems that are easy to abuse and copycat?’

Solutions must be both progressive and regressive. First, designing friction back into the payment journey, in the form of a natural and compulsory pause for all transfers, would give victims breathing space to reflect on a transfer’s validity and, crucially, cancel it before it’s too late. Secondly, in some cases, financial institutions may need the power to veto payment requests, as the only realistic way to protect some customers from themselves.
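
By way of illustration only – and not as a description of any participant’s actual systems – the short Python sketch below shows how such a compulsory pause might be modelled: a transfer sits in a pending state during a hypothetical cooling-off window, during which the payer (or the bank) can still cancel it. All names and the 24-hour window are assumptions made purely for the example.

from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical cooling-off window; the real length would be set by policy or regulation.
COOLING_OFF = timedelta(hours=24)

@dataclass
class PendingTransfer:
    payer: str
    payee: str
    amount: float
    created_at: datetime = field(default_factory=datetime.utcnow)
    cancelled: bool = False

    def cancel(self) -> None:
        # The payer (or the bank) can cancel at any point before settlement.
        if not self.has_settled():
            self.cancelled = True

    def has_settled(self) -> bool:
        # The transfer only settles once the compulsory pause has elapsed uncancelled.
        return (not self.cancelled
                and datetime.utcnow() - self.created_at >= COOLING_OFF)

In this sketch the pause applies to every transfer by default, so the breathing space described above does not depend on the customer spotting anything suspicious themselves.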

These options go some way to addressing the underlying issue of unlimited compensation for victims – a policy considered ‘too protective’ and ultimately counter-productive by many. There’s a strong feeling amongst industry professionals that it takes all responsibility out of the hands of victims, rendering banks’ repeated attempts to warn customers of impending fraud obsolete. By way of an antidote, a regulator-approved cut-off point after which compensation is surrendered would be a compelling conversation. The inevitable consequence of not acting is that banks will start off-loading risky customers, leading to de-banking.

Who is responsible?

Collaboration, as ever, will play a vital role, but regulation continues to take the blame for impeding progress. The idea that companies are hiding behind GDPR to avoid sharing information is, however, branded “nonsense” by some, who argue that “if there’s a will, there’s a way.”

There’s strong support for revisiting liability – why limit this to the banking sector when fraud begins far beyond those walls? To that end, the PSR (Payment Systems Regulator) falls far short of imposing collective responsibility. Who will bring the telcos and big tech platforms to task?

Can we reasonably hold customers responsible for being scammed? As humans, we’re hardwired to trust – it’s a survival instinct. That’s why fraud is so effective. “We can give them general ideas, like the seatbelt campaigns (of the 1980s), using simple language to build good habits and encourage good behaviours such as ‘never buy at the door’, that avoid pressure decisions.” Research also indicates it can be effective to simply call scammers out: “challenge them. Say you don’t believe them.”

If AI bots answering scam calls on our behalf are indeed to form part of the solution – how to convince traditional financial services providers to invest in these technologies now? Some, but not much, robotic process automation already exists, but the sector lacks general understanding of its potential. A clear line of sight to best practices would benefit all sides. Ultimately, it all comes back to data; “scams are all about patterns, so the AI comes in to identify invisible patterns. But the issue is, we need the data.”
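
As a purely illustrative sketch of that point – assuming, hypothetically, transaction features such as amount, time of day and whether the payee is new – an unsupervised model can flag payments that break a customer’s usual pattern. The feature set and model choice below are assumptions made for the example, not anything described at the roundtable.

from sklearn.ensemble import IsolationForest
import numpy as np

# Hypothetical historical transactions for one customer: [amount, hour_of_day, new_payee_flag]
history = np.array([
    [45.0, 12, 0],
    [30.0, 18, 0],
    [60.0, 9, 0],
    [25.0, 20, 0],
    [52.0, 13, 0],
])

# An unsupervised anomaly detector learns the customer's normal pattern.
model = IsolationForest(contamination=0.1, random_state=0).fit(history)

# A large transfer to a first-time payee in the middle of the night breaks that pattern.
candidate = np.array([[4500.0, 3, 1]])
if model.predict(candidate)[0] == -1:
    print("Flag for review: transaction deviates from this customer's usual pattern")

The harder problem, as the quote above makes clear, is not the model itself but assembling enough shared data for those patterns to be visible at all.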

Data rules

Herein lies another issue – how to get hold of the right data. A discussion of this nature must include Big Tech, which has to date been somewhat reluctant to share – although the tide appears to be turning. Meaningful collaboration and data sharing must be for the benefit of all of society, not individual advantage. Unfortunately, we currently lack the legal infrastructures and frameworks to govern such collaboration. Added to which, the lack of an ID framework leaves the UK wide open to targeting by fraudsters. Physical ID cards may be “last century,” but a national ID framework would protect everyone.

For this we need greater government and public sector support, but only if the existing skills gap can be filled. Recruiting the right people is a big challenge and currently “affecting the UK’s position as a global leader in technology-aided regulation.”

For any legislative framework to operate successfully, it must be independent of the ebb and flow of party politics, so that progress is not disrupted by a change of government. Just as nature abhors a vacuum, if industry fails to come up with a solution, one will inevitably be imposed upon it by desperate legislation designed to placate the voting electorate, potentially “creating a world of pain for everyone.”

Future ‘metaverse’ digital environments comprising underlying digital signatures, blockchain and Web3 technology provide some hope, where identity and security are in-built, in sharp contrast to the incumbent surface web. But this is still some way off.

Small steps

In the near term, banks must rely on hands-on resources and friction-intensive methods, such as a much more rigorous discussion at the point of transfer, as good practice, in addition to the underlying fraud prevention technology already in place.

The recent charter between the UK Government and leading online platforms and services, establishing a voluntary framework agreement to work together to reduce online fraud, is another step in the right direction. Whether our future selves will be downloading AI bots to answer incoming scam calls on our behalf remains to be seen.

At the end of any passionate debate about fraud and digital trust, there’s always optimism to be found: a more secure Web3 framework, better understanding, data sharing, stronger legislative frameworks and redesigned customer journeys less open to abuse. Certainly, the path to a future Authentaverse is paved with both opportunities and challenges, yet the majority of participants agreed that, with the collaboration and action required across financial services, governments and big tech, we can all move closer towards the shared aim of making fraudsters’ jobs as difficult as possible.

 

This article is based on a discussion that took place during an intimate Chatham House roundtable event in Central London, following the launch of The Future of Digital Trust: Authentaverse report, a collaboration between LexisNexis Risk Solutions and The Future Laboratory in July 2023. To find out more, visit: https://risk.lexisnexis.co.uk/insights-resources/research/future-of-digital-trust-authentaverse?trmid=BSUKGN23.BSUKBR23.DgtTrust.CS3P-1079401 

 
