Executive Summary

This report contains findings and recommendations from the second round of consumer experience (CX) research conducted as part of the Authentication Uplift project. Round 2 research focussed on ‘App/Browser-to-App with Biometric’ and ran in November 2022. The purpose of the research was to identify consumer experience considerations to support and inform an expanded approach to Consumer Data Right (CDR) authentication. The objective is to give consumers more choice and freedom when authenticating with data holders, while maintaining financial-grade security. Round 1, conducted in September 2022, benchmarked the existing ‘Redirect with One Time Password (OTP)’ model.

In total, 90 consumers participated in Round 2 research: 10 took part in 1:1 interview sessions of 90 minutes each, and 80 completed unmoderated prototype tests of 30 minutes each. App-to-App and Browser-to-App prototypes were used to facilitate discussion and generate insights about the authentication models shown, as well as about authentication more generally.

Consultation

This project relates to NP280 which is open for consultation from 14 December 2022 to 27 January 2023.

Context

The authentication stage is the second step in the Consent Model and involves a consumer verifying who they are with their Data Holder (DH). This step is required so the data holder can connect the data recipient's authorisation request to the correct CDR consumer.

Authentication in the CDR regime is currently limited to a single, consistent authentication model, referred to as the ‘Redirect with One Time Password’ flow; no other flows are supported. ‘Redirect with One Time Password’ was previously tested in June 2019 against two alternative models, ‘Redirect to Known’ and ‘Decoupled’, and was found to be the authentication model preferred by research participants. The outcomes can be accessed in the Phase 2 Stream 3 report.

This research has been informed by the following:

Findings

The research found that biometric authentication methods (such as FaceID) were not as widely accepted as the research team had initially anticipated, even though all 90 participants were familiar with them and used them frequently. Participants preferred biometrics over traditional passwords in some use cases because of their uniqueness and inherence, but only in scenarios where successful authentication carried little to no risk. As in Round 1, there was general agreement that authentication should adapt to the scenario (i.e. accessing sensitive vs. non-sensitive data), but not all participants shared this view. Their objection was not that less-sensitive data (such as telco or energy data) warranted less stringent authentication methods; rather, these participants expected all of their data to be kept secure and private. Many participants expected a standardised approach to authentication, with consistent and strong authentication required to log in irrespective of the use case or the sensitivity of the data. Participants unanimously preferred Multi-Factor Authentication (MFA) over any authentication model using only one factor.

Of the 2-Factor Authentication (2FA) use cases tested (FaceID + OTP, FaceID + PIN), the research team observed a preference for step-up authentication, in which additional levels of authentication are required as the risk profile and sensitivity of the action increase. Some participants found back-to-back authentication, where two factors are requested in a row, overwhelming. Step-up authentication was perceived as the gentler approach and read as a confirmation of an action, leaving participants feeling confident and in control. This is covered further in Insight #7 of this report.
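The contrast between back-to-back and step-up authentication can be illustrated with a small sketch. This is illustrative only: the risk tiers, factor names, and functions below are hypothetical assumptions and are not drawn from the CDR Standards or the tested prototypes.

```python
# Hypothetical sketch contrasting back-to-back 2FA with step-up
# authentication. Tier names and factor labels are illustrative only.

LOW, MEDIUM, HIGH = "low", "medium", "high"

def back_to_back_factors() -> list[str]:
    # Back-to-back 2FA: both factors are always requested in a row,
    # which some participants found overwhelming.
    return ["FaceID", "OTP"]

def step_up_factors(risk: str) -> list[str]:
    # Step-up: a single factor by default; a second factor is requested
    # only when the action's risk profile or sensitivity warrants it,
    # acting as a confirmation of the action.
    factors = ["FaceID"]
    if risk in (MEDIUM, HIGH):
        factors.append("OTP")
    return factors
```

Under this sketch, a low-risk action completes with one factor, while a sensitive data-sharing action triggers the additional step.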

Participants raised a few issues about being redirected from a consumer app or website to a Data Holder app for authentication and authorisation. Several said that, in the real world, they might flag an automatic redirection as a suspicious “phishing attempt” if they were engaging with a brand they had not yet established trust with. Participants also said they would feel more in control if there were an alert, such as a push notification or a call-to-action button, before moving to their Data Holder app rather than being redirected automatically, especially where the authentication method was FaceID as a single factor. This was because several participants had previously authenticated unintentionally with FaceID simply because they were looking at their phone when the prompt appeared. Lastly, in the use cases where a biometric method was used as a single factor, participants wanted higher levels of friction when the action they were taking carried more risk.

The research found that, while scores were not significantly different, redirecting to a DH app from an Accredited Data Recipient (ADR) app was considered more trustworthy and inspired slightly higher participant confidence than redirecting from a browser-based website. This finding only holds, however, for apps that have been downloaded from a reputable source and carry a pre-established level of user trust and confidence, which makes the model a strong option for the banking sector in particular: many participants had installed, and regularly used, their banking provider's mobile app. Where a participant did not have a Data Holder's app installed on their device, there were two groups of expectations about what should occur: the first was that the user would be taken to a browser to authenticate via the web version of the service; the second was that they would be taken to the App Store or Google Play to download the application.
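The two fallback expectations for a missing Data Holder app can be sketched as a small resolution function. The function name, parameters, and return values here are hypothetical assumptions, intended only to illustrate the branching participants described.

```python
# Illustrative sketch of redirect resolution when deep-linking to a
# Data Holder (DH) app. Names and return values are hypothetical and
# not part of the CDR Standards.

def resolve_redirect(app_installed: bool, fallback: str = "web") -> str:
    """Decide where an App-to-App or Browser-to-App redirect should land."""
    if app_installed:
        return "dh_app"     # deep-link straight into the installed DH app
    if fallback == "web":
        return "browser"    # first expectation: authenticate via web version
    return "app_store"      # second expectation: App Store / Google Play
```

In practice, platform mechanisms such as iOS Universal Links and Android App Links handle much of this fallback behaviour natively, falling back to a web URL when no app claims the link.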

Accessibility and inclusivity continue to play a key role in how users authenticate to a platform. This round also found participants advocating for a risk-based model to protect vulnerable consumers, such as those experiencing Domestic and Family Violence, where there is a risk of one party taking on debt without the other's knowledge, or of coerced consent.

We explore these findings in depth throughout this report and provide some early-stage, high-level recommendations in the summary section.

Research artefacts at a glance