Nowadays, with the proliferation of data exchange across the Internet and the storage of sensitive data on open networks, intensive work is being done to secure data. Recently, researchers have made considerable efforts to use biometric features instead of memorable passwords to authenticate user identity. Building on the theory of chaotic maps, which are known for their pseudorandomness, high sensitivity to initial conditions, and very large key space, we propose a biometric cryptosystem for online user authentication. The main advantage of our proposal is the secure nature of the key: first, the key never needs to be transmitted or revealed, since it is derived from the biometric template; second, even if the same biometric template is presented twice, our system generates different key values. To assess the effectiveness of our approach, a biometric-based online automatic teller machine (ATM) system is proposed and evaluated. Experimental results and statistical analysis, using palmprint as well as palm-vein databases of 200 users, are presented. The obtained results show that the proposed scheme is secure, fast, and achieves increased authentication accuracy. It can therefore be implemented in real-time applications, which should help promote more widespread use of biometric cryptosystems based on chaos theory.
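As a rough illustration only (not the construction used in the paper), the sketch below derives a key from a biometric template using the chaotic logistic map. The SHA-256 seeding, the map parameter r = 3.99, the burn-in length, and the random nonce are all assumptions made for this example; the fresh nonce is what makes two derivations from the same template yield different key values.

```python
import hashlib
import os

def logistic_map_keystream(x0, r=3.99, n_bytes=32, burn_in=1000):
    """Iterate the logistic map x -> r*x*(1-x) and quantize its orbit into bytes."""
    x = x0
    for _ in range(burn_in):              # discard the transient part of the orbit
        x = r * x * (1.0 - x)
    out = bytearray()
    for _ in range(n_bytes):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)    # quantize the chaotic orbit to one byte
    return bytes(out)

def derive_key(biometric_template: bytes, nonce: bytes = None):
    """Derive a session key from a biometric template plus a fresh random nonce."""
    nonce = nonce if nonce is not None else os.urandom(16)
    seed = hashlib.sha256(biometric_template + nonce).digest()
    # Map the 256-bit seed to an initial condition strictly inside (0, 1).
    x0 = (int.from_bytes(seed, "big") % (10**15)) / 10**15
    x0 = min(max(x0, 1e-9), 1 - 1e-9)
    return logistic_map_keystream(x0), nonce

# Two derivations from the same template produce different keys,
# because each call draws a fresh nonce.
key1, nonce1 = derive_key(b"example palmprint template")
key2, nonce2 = derive_key(b"example palmprint template")
assert key1 != key2
```

The nonce would be stored or exchanged alongside the protected data so that the same key can be regenerated later from the live biometric sample, while the key itself is never transmitted.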
Validating a person’s identity is becoming increasingly essential due to the growing demand for high-security systems. A biometric system verifies identity on the basis of specific physiological or behavioral characteristics, and this technology has been successfully applied to both verification and identification systems. We analyze a multispectral palmprint biometric identification system in unimodal and multimodal modes. In an identification system, feature extraction is a crucial step. For this reason, we propose an efficient deep learning feature extraction algorithm called the discrete cosine transform network (DCTNet). The effectiveness of the proposed approach has been evaluated on two publicly available databases, CASIA and PolyU. The obtained results clearly indicate that the DCTNet deep learning-based feature extraction technique achieves performance comparable to the best state-of-the-art techniques.
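For readers unfamiliar with DCTNet, the sketch below shows a simplified single-stage version of the idea: convolve the image with fixed 2D-DCT basis filters, binarize the responses, and build block-wise histograms as the feature vector. The published DCTNet typically stacks two convolution stages, and the filter size (7), number of filters (8), and block size (16) used here are illustrative assumptions, not the parameters of the paper.

```python
import numpy as np
from scipy.signal import convolve2d

def dct_filters(k=7, n_filters=8):
    """Build 2D-DCT basis filters of size k x k, skipping the DC basis."""
    def basis(u, v):
        i = np.arange(k)
        f = np.outer(np.cos(np.pi * (2 * i + 1) * u / (2 * k)),
                     np.cos(np.pi * (2 * i + 1) * v / (2 * k)))
        return f / np.linalg.norm(f)
    pairs = [(u, v) for u in range(k) for v in range(k)][1:n_filters + 1]
    return [basis(u, v) for u, v in pairs]

def dctnet_stage(image, filters):
    """One DCTNet stage: convolve the zero-mean image with each DCT filter."""
    image = image - image.mean()
    return [convolve2d(image, f, mode="same") for f in filters]

def binary_hash_histogram(response_maps, block=16):
    """Binarize the responses, pack them into codewords, and histogram per block."""
    bins = 2 ** len(response_maps)
    code = np.zeros_like(response_maps[0], dtype=np.int64)
    for b, m in enumerate(response_maps):
        code += (m > 0).astype(np.int64) << b          # binary hashing
    h, w = code.shape
    feats = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            hist, _ = np.histogram(code[y:y + block, x:x + block],
                                   bins=bins, range=(0, bins))
            feats.append(hist)
    return np.concatenate(feats)

# Example: extract a feature vector from a synthetic 128 x 128 grayscale image.
image = np.random.rand(128, 128)
features = binary_hash_histogram(dctnet_stage(image, dct_filters()))
```

Because the DCT filters are fixed analytically rather than learned from data, feature extraction requires no training pass over the gallery images, which keeps the method fast.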
The performance of modern automated pattern recognition (PR) systems is heavily influenced by the accuracy of their feature extraction algorithms. Many papers have demonstrated uses of deep learning techniques in PR, but there is little evidence on using them as feature extractors. Our goal is to contribute to this field through a comparative study between the classical methods used for feature extraction and deep learning techniques. To that end, a biometric recognition system, which is a PR application, is developed and evaluated using a proposed evaluation metric called the expected risk probability. In our study, two deep-learned feature representations, based on the PCANet and DCTNet techniques, are used with two biometric modalities, palmprint and palm-vein. The efficiency of these techniques is then compared with that of various classical feature extraction methods. From the obtained results, we conclude that deep learning techniques have a very positive impact on the overall recognition rate and significantly outperform the classical techniques.
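Unlike DCTNet, PCANet learns its convolution filters from the training images themselves. The following is a minimal sketch of the first-stage filter learning only, assuming non-overlapping 7×7 patches and 8 filters; the original PCANet samples overlapping patches densely, and these parameters are illustrative, not those used in the study.

```python
import numpy as np

def pcanet_filters(images, patch=7, n_filters=8):
    """Learn first-stage PCANet filters as the leading eigenvectors of the
    covariance of mean-removed image patches."""
    patches = []
    for img in images:
        h, w = img.shape
        for y in range(0, h - patch + 1, patch):
            for x in range(0, w - patch + 1, patch):
                p = img[y:y + patch, x:x + patch].astype(np.float64).ravel()
                patches.append(p - p.mean())       # remove the per-patch mean
    X = np.stack(patches)                          # shape: (num_patches, patch*patch)
    # Eigenvectors of X^T X with the largest eigenvalues become the filters.
    eigvals, eigvecs = np.linalg.eigh(X.T @ X)
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_filters]]
    return [top[:, i].reshape(patch, patch) for i in range(n_filters)]

# Example: learn filters from a small set of synthetic grayscale images.
filters = pcanet_filters([np.random.rand(128, 128) for _ in range(10)])
```

The learned filters then replace the fixed DCT filters in the same convolve–binarize–histogram pipeline, which is what makes PCANet and DCTNet directly comparable as feature extractors.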