Behavior-based user authentication on mobile devices in various usage contexts

EURASIP Journal on Information Security

Abstract

Many solutions for knowledge- and biometric-based authentication and liveness detection on mobile devices are on the market today. The former provide strong authentication at the cost of usability, namely the need to type a password. Biometric-based authentication solutions improve usability while preserving a low error rate. Despite their widespread integration, they tend to be vulnerable to presentation attacks (spoofing of biometric data).

 

A comparison of modern multifactor authentication and access control solutions for mobile devices is presented below.

 

[Figure: ACS-comparison]

 

A promising approach is provided by modern behavior-based authentication technologies. Typically, they are based on capturing biometric data and extracting user-specific behavioral patterns needed for the analysis of several modalities during user interaction with a mobile device. However, the performance of these systems depends significantly on the context, namely the user's activity (motionless, walking, running) and the application in use. Thus, spoofing-resistant, transparent, and user-friendly methods for user authentication are needed.

 

To provide on-device context-dependent behavior-based authentication, we propose the BehaviorID solution. A comparison of the scope of the state-of-the-art methods and the proposed method is presented below.

 

[Figure: IAMLandscape]

 

The flowchart of user feature processing with the proposed method is presented below.

 

[Figure: BehaviorID-Realization]

 

User authentication with the BehaviorID method starts from a triggering event, such as the launch of a predefined application. Signals from the device's embedded sensors are then gathered until a finalization trigger event, for instance, the start of typing in the launched application. At the second step, context recognition is performed using the preprocessed signals. The recognition model is based on a convolutional neural network (CNN) for feature extraction from the input signals. The prepared signals are combined into modalities to be processed with an advanced A-RNN model. A distinctive feature of this network is the use of a mixture layer, which improves performance when processing sequences with multiple patterns, e.g., a mixture of output signals from the embedded sensors. The output of the context recognition model is used as an external parameter for the mixture layers of the A-RNN to compensate for possible alterations of the user's behavioral profile. Finally, the outputs of the A-RNNs for the individual modalities are processed by the decision-making module. In case of a positive decision (the user is authenticated), the user is notified about the successful authentication, and the extracted features are used to update the A-RNN parameters. Otherwise, a negative decision is reported (the user is not recognized).
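The processing chain above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the CNN context recognizer and the A-RNN are replaced by toy stand-ins, and all function names, dimensions, and the mean-score fusion rule are our own choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def context_probs(signal):
    """Toy stand-in for the CNN context recognizer: maps a raw 1-D sensor
    signal to a softmax distribution over 3 contexts (still/walking/running)."""
    logits = np.array([signal.mean(), signal.std(), np.abs(np.diff(signal)).mean()])
    e = np.exp(logits - logits.max())
    return e / e.sum()

class MixtureRNN:
    """Toy A-RNN-like cell with a mixture layer: one recurrent expert per
    context; the external context distribution weights the experts each step."""
    def __init__(self, n_in, n_hidden, n_ctx=3):
        self.W = rng.normal(0, 0.1, (n_ctx, n_hidden, n_in))
        self.U = rng.normal(0, 0.1, (n_ctx, n_hidden, n_hidden))
        self.v = rng.normal(0, 0.1, n_hidden)
        self.n_hidden = n_hidden

    def score(self, seq, ctx):
        h = np.zeros(self.n_hidden)
        for x in seq:
            # mixture layer: the context vector gates the per-context experts
            h = np.tanh(sum(c * (W @ x + U @ h)
                            for c, W, U in zip(ctx, self.W, self.U)))
        return float(1 / (1 + np.exp(-self.v @ h)))  # similarity-to-owner score

def authenticate(modalities, ctx, models, threshold=0.5):
    """Decision-making module: fuse per-modality scores (here: their mean)
    and compare against a threshold."""
    s = float(np.mean([models[m].score(seq, ctx) for m, seq in modalities.items()]))
    return s, s >= threshold
```

A positive decision would additionally trigger an update of the per-modality models with the freshly extracted features, as described above.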

 

Effective countermeasures against spoofing attacks during user authentication require additional factors. BehaviorID allows "strengthening" of widespread authentication methods by using several modalities, for example:
  • Password-based authentication — the user's behavior parameters are analyzed during typing, namely keystroke dynamics, the device's small motions, and the keyboard hit map. If the user's behavior differs from the saved profile, the device remains locked regardless of the correctness of the entered password.
  • Secure keyboard — the proposed method can be used to strengthen the touchscreen keyboard while working in messengers, social networks, etc. If BehaviorID authentication fails, the message is not sent and the system asks for additional authentication.
  • Strengthening of biometric-based authentication — BehaviorID can be used to increase the robustness of biometric-based authentication to spoofing. This is achieved by analyzing the device's small motions during authentication. Thus, a device remains locked even under spoofed biometric authentication, for example, facial recognition, if the motion patterns differ from the known ones or are absent entirely.
  • Transparent user authentication — the method checks user authenticity in the background (without issuing an authentication request) after a predefined application is launched, for example, a banking app. If the difference between the gathered behavioral data, such as swipe patterns, touchscreen hit map, and the user's gaze tracking, and the reference profile exceeds a threshold, the device is locked within a short time.
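As an illustration of the first item, here is a minimal sketch of strengthening a password check with keystroke dynamics. All names, the z-score distance, and the threshold are assumptions for illustration, not the BehaviorID implementation:

```python
import hashlib
import statistics

def keystroke_distance(timings, profile):
    """Mean absolute z-score of the observed inter-key intervals against the
    stored per-key (mean, stdev) profile; lower means closer to the owner."""
    return statistics.fmean(abs(t - m) / s for t, (m, s) in zip(timings, profile))

def strengthened_login(password, timings, pw_hash, profile, max_dist=2.0):
    """The correct password alone is not enough: the device stays locked
    when the typing rhythm deviates from the saved behavioral profile."""
    pw_ok = hashlib.sha256(password.encode()).hexdigest() == pw_hash
    return pw_ok and keystroke_distance(timings, profile) <= max_dist

pw_hash = hashlib.sha256(b"secret").hexdigest()
profile = [(0.12, 0.02), (0.18, 0.03), (0.15, 0.03)]  # owner's stored template
strengthened_login("secret", [0.12, 0.19, 0.15], pw_hash, profile)  # → True
strengthened_login("secret", [0.30, 0.40, 0.35], pw_hash, profile)  # → False
```

The second call is rejected even though the password is correct, because the typing rhythm is far from the owner's template.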

 

The rich functionality of the proposed BehaviorID method makes it an attractive solution for transparent multifactor on-device user authentication. The performance of the state-of-the-art solutions and the proposed BehaviorID solution was evaluated for both single-modal and multimodal user authentication. The following use cases were considered:

 

  • Estimation of the Spoofing Acceptance Rate (SAR) is performed to check the conformity of BehaviorID with the requirements of the Android Tiered Authentication Model (ATAM) for processing sensitive data on mobile and wearable devices.
  • Keystroke dynamics-based authentication in various contexts corresponds to "strengthening" user authentication with a single modality (e.g., password-based authentication) in several usage contexts.
  • Multimodal authentication under a changing usage context allows evaluating the solution's robustness to changes in a person's behavioral templates, for example, starting to walk after standing still.
  • Long-term tracking of behavioral pattern alterations corresponds to changes in the user's behavior over long-term usage of the ACS.
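For the long-term case, the template update performed after each successful authentication can be as simple as an exponential moving average over the profile features. This is a hypothetical sketch; the update rule and the rate `alpha` are assumptions, not the published update scheme:

```python
def update_profile(profile, features, alpha=0.05):
    """Exponentially weighted update of the stored behavioral template after
    a successful authentication; slowly tracks long-term behavioral drift
    without letting a single session overwrite the profile."""
    return [(1 - alpha) * p + alpha * f for p, f in zip(profile, features)]

# A small alpha keeps the template stable; alpha=0.5 shown for visibility:
update_profile([1.0, 2.0], [2.0, 2.0], alpha=0.5)  # → [1.5, 2.0]
```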

 

BehaviorID performance evaluation was done using a set of public and in-house datasets of behavioral patterns for the considered modalities in various usage contexts, including the ExtraSensory, MotionSense, SherLock, H-MOG, UMDAA-02, and BB-MAS datasets, as well as an in-house fixed-context dataset.

 

For comparison, we considered the following state-of-the-art solutions for behavior-based user authentication on smartphones:

 

  • Abuhamad et al. method — based on a deep RNN, namely LSTM, for modeling temporal dependencies between samples of behavioral templates
  • Reichinger et al. method — based on an unsupervised learning approach applied to the gathered behavioral features using a hidden Markov model
  • MMAuth method — integrates heterogeneous information on user identity from multiple modalities using a time-extended behavioral feature set and a deep learning-based one-class classifier

 

Estimated FAR, FRR, and SAR values for single- and multifactor authentication in a fixed usage context (users sitting still), using the state-of-the-art and proposed methods, are presented below. "Acc" and "Gyr" stand for accelerometer and gyroscope, respectively.

 

[Table 2: Single and multifactor authentication]

 

Moving from single-factor to multifactor authentication decreases the SAR from 37.2 to 2.9% while preserving low FAR (about 2.5%) and FRR (near 8%) values for the proposed method (Table 2). The obtained results are close to the state of the art in the domain of behavior-based authentication. Note that the obtained SAR values for multifactor authentication (SAR < 7%) correspond to the Class 3 (strong) tier of ATAM. This makes the proposed solution an attractive candidate for security-sensitive scenarios.
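The reported metrics can be reproduced from per-attempt score lists in the standard way; a small sketch (the score convention and thresholding are assumptions): FAR is the fraction of impostor attempts accepted, FRR the fraction of genuine attempts rejected, and SAR follows the FAR formula applied to scores from spoofed samples.

```python
def far_frr(genuine, impostor, threshold):
    """Accept when score >= threshold. FAR: accepted impostor fraction;
    FRR: rejected genuine fraction. SAR uses the same rule over scores
    obtained from spoofed (presentation-attack) samples."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

far, frr = far_frr([0.9, 0.8, 0.6, 0.95], [0.2, 0.7, 0.4, 0.1], 0.65)
# far = 0.25 (one impostor accepted), frr = 0.25 (one genuine rejected)
```

Sweeping the threshold trades FAR against FRR, which is how operating points such as those in Table 2 are selected.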

 

The next stage of the performance analysis is aimed at evaluating the considered solutions in the case of multimodal authentication in several usage contexts. The H-MOG and UMDAA-02 datasets were used for the estimation of FAR and FRR in this case. Keystroke dynamics, device fine motions, swipe patterns, application profiling, and gaze tracking were used as authentication factors. Also, the most difficult case, long-term tracking of a behavioral template, was considered. This analysis was done on the SherLock dataset using three modalities, namely keystroke dynamics, application usage logs, and device fine motions.

 

The estimated FAR and FRR values for the state-of-the-art and proposed solutions in the case of a changing usage context (from sitting still to walking) are presented below.

 

Dataset     Metric   Abuhamad et al.   Reichinger et al.   MMAuth   BehaviorID
H-MOG       FAR, %   1.8               0.9                 1.3      0.3
H-MOG       FRR, %   3.0               1.5                 1.9      1.3
UMDAA-02    FAR, %   7.4               6.8                 7.9      7.0
UMDAA-02    FRR, %   5.4               4.1                 5.0      3.5

 

For comparison, the estimated FAR and FRR values for the state-of-the-art and proposed solutions over a 3-week usage period (SherLock dataset) are presented below.

 

Metric   Abuhamad et al.   Reichinger et al.   MMAuth   BehaviorID
FAR, %   9.4               2.8                 6.6      2.1
FRR, %   12.7              4.2                 11.9     3.9

 

Performance analysis showed that BehaviorID outperforms the state-of-the-art multifactor behavior-based authentication methods even in the most difficult case of long-term tracking of behavioral patterns (about 2.1% FAR and 3.9% FRR). Also, the proposed method provides low error rates in various usage contexts (about 0.5% FAR and 1.3% FRR) while preserving fast detection of a non-owner (within 0.5–1.0 s on a Samsung Galaxy S21 smartphone). This makes the proposed BehaviorID method a promising candidate for next-generation user authentication systems on mobile and wearable devices.


Some company, Copyright © 2022