Palm (Biometric) Recognition on Android Phones, Developed Using a Deep Learning Approach

Opeoluwa Iwashokun
3 min read · Apr 10, 2018


[Image: a demonstration of how the palm image is captured]

I just tried something new and I am sooo glad to share. Many thanks to Dr Afolabi, for his unwavering sacrifice of time, tutoring and mentoring to put me through the tasks. I salute you sir! I also make bold to mention my DSN (Data Science Nigeria) family. DSN rocks! Dr Olubayo Adekanmbi, you have brought us this far and we'll go even farther together. My heroes: Dr Afolabi and Dr Bayo, thanks a million for investing in lives! You are such rare gems!!!

So the Android mobile app designed here is a pilot run towards a simple biometric identification tool. Biometrics is a technology that identifies and authenticates individuals based on physical characteristics such as fingerprints, iris and retina patterns, facial features, gait, or voice. In this case, it is the PALMS. Yeah, the Palms!!! The palm (combined with the fingers and thumb) is unique to every individual. Also, it's gonna be even easier to tap the palm than a finger. Are you feeling me!!! **winks** So here is how it was all put together…

How it was achieved

It was designed using the MobileNet model available in TensorFlow (a deep learning approach) and deployed using the Android SDK. To do this, you'll need a laptop, an Android device and internet connectivity. Then follow the 5 steps below:

1. Install TensorFlow and the Android SDK on your computer

2. Clone the MobileNet library from its GitHub repository. In this case, I cloned a sample project used for recognizing pet dogs. https://codelabs.developers.google.com/codelabs/tensorflow-for-poets-2/#1

3. Train the model on the palm images that were previously captured. The images must be placed in folders, with each folder containing a minimum of 10 images of one individual. The model was trained using Python code in the Google Colaboratory cloud environment

4. Import the trained model into an Android SDK project. Build the app and install it on an Android device.

5. Test it live. Point the camera of the mobile device at the palm of a person whose palm images were previously captured and trained on.
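The folder layout described in step 3 can be sketched in Python. This is a minimal sketch of my own (the function name, folder names, and the helper itself are not part of the original project): each sub-folder is one person, the folder name becomes the class label the retraining script learns, and the 10-image minimum is enforced.

```python
from pathlib import Path

MIN_IMAGES_PER_PERSON = 10  # each person's folder needs at least this many samples


def collect_palm_dataset(root):
    """Map each sub-folder (one per person) to its list of image paths.

    The folder name becomes the class label, e.g.
    palm_images/ada/img_01.jpg -> label "ada".
    """
    root = Path(root)
    dataset = {}
    for person_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        images = sorted(
            f for f in person_dir.iterdir()
            if f.suffix.lower() in {".jpg", ".jpeg", ".png"}
        )
        if len(images) < MIN_IMAGES_PER_PERSON:
            raise ValueError(
                f"{person_dir.name}: only {len(images)} images, "
                f"need at least {MIN_IMAGES_PER_PERSON}"
            )
        dataset[person_dir.name] = images
    return dataset
```

A layout like `palm_images/ada/…`, `palm_images/bola/…` would then yield one training class per person, which is the structure the retraining codelab expects.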

The results show the person being recognized by palm, each at a given probability/confidence score, as shown below.

Here is a short demo video:

See Sample Results below:

Palm image correctly identified with 79% confidence; shadow effects on the image reduced the accuracy
Image correctly identified with 98% confidence
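Confidence figures like the 79% and 98% above come from the classifier's final softmax layer, which converts the network's raw scores into probabilities over the enrolled people. A minimal sketch of that step, assuming made-up labels and raw scores purely for illustration:

```python
import math


def softmax(logits):
    """Convert raw classifier scores into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


# hypothetical raw scores for three enrolled people
labels = ["ada", "bola", "chi"]
probs = softmax([2.1, 0.3, -0.5])

# the app reports the highest-probability person as the match
best = max(range(len(probs)), key=probs.__getitem__)
print(labels[best], f"{probs[best]:.0%}")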
