
Skin Undertone Detection

At the age of 17, I noticed that most fashion magazines would offer a quiz or a few celebrity pictures to help women determine their skin's undertone. This method is not very accurate and leaves a lot of room for human error, so I wanted to create an app that could analyze the skin's undertone and produce the most accurate result possible. Here’s how I did it.

Principle Used

An undertone is the color from underneath the surface of the skin that affects its overall hue. By observing the color of the veins at the wrist, just below the palm, one can identify the skin’s undertone.

Red = Warm undertone

Blue = Cool undertone

Green = Neutral undertone

Keeping this fact in mind, I decided to create an Android app that works in the following way:

Method

  1. The user takes a close-up picture of their wrist, just below the palm.
  2. The app analyzes the picture and measures the amounts of red, green, and blue in the image.
  3. If the image has:
  • A higher content of red, the user has a warm undertone;
  • A higher content of green, the user has a neutral undertone;
  • A higher content of blue, the user has a cool undertone (a minimal code sketch of this rule follows the list).
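
As a rough illustration of step 3, here is a minimal Python sketch of the decision rule; the function name and the example channel averages are placeholders rather than the app's actual code:

```python
def classify_undertone(avg_red, avg_green, avg_blue):
    """Return the undertone corresponding to the dominant colour channel."""
    channels = {"warm": avg_red, "neutral": avg_green, "cool": avg_blue}
    return max(channels, key=channels.get)


# Example: a wrist photo whose red channel dominates on average.
print(classify_undertone(142.0, 118.5, 97.3))  # -> warm
```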

The first version of the app was implemented in Java using Android Studio. You can view the app on the Google Play Store by clicking the button below:

Python Implementation

First, I reprogrammed the Java version in Python.
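
As a sketch of what the core of such a port might look like (assuming Pillow and NumPy for image handling; the file name and function names are placeholders, not the published app's code):

```python
import numpy as np
from PIL import Image


def measure_channels(image_path):
    """Average the red, green, and blue values across the wrist photo."""
    pixels = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
    avg_red, avg_green, avg_blue = pixels.reshape(-1, 3).mean(axis=0)
    return avg_red, avg_green, avg_blue


def classify_undertone(avg_red, avg_green, avg_blue):
    """Map the dominant channel to an undertone, as described in the Method section."""
    channels = {"warm": avg_red, "neutral": avg_green, "cool": avg_blue}
    return max(channels, key=channels.get)


if __name__ == "__main__":
    r, g, b = measure_channels("wrist.jpg")  # placeholder file name
    print(f"R={r:.1f} G={g:.1f} B={b:.1f} -> {classify_undertone(r, g, b)}")
```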

Applying Machine Learning

With recent advancements in AI, I decided to polish my original idea and make the app smarter by using K-Means clustering.
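
One plausible way to use K-Means here, sketched below with scikit-learn, is to cluster the pixel colours of the wrist photo and apply the same dominant-channel rule to the centre of the largest cluster; this is an illustrative sketch under those assumptions, not the app's exact implementation:

```python
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans


def dominant_colour(image_path, k=3):
    """Cluster the photo's pixels into k colours and return the centre of the
    largest cluster, taken here as the representative skin/vein colour."""
    pixels = np.asarray(Image.open(image_path).convert("RGB"), dtype=float).reshape(-1, 3)
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
    largest = np.bincount(km.labels_, minlength=k).argmax()
    return km.cluster_centers_[largest]


if __name__ == "__main__":
    r, g, b = dominant_colour("wrist.jpg")  # placeholder file name
    channels = {"warm": r, "neutral": g, "cool": b}
    print(f"Dominant colour R={r:.0f} G={g:.0f} B={b:.0f} -> {max(channels, key=channels.get)}")
```

Clustering makes the result less sensitive to stray background pixels than a plain average, since only the largest colour cluster is considered.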

Code

Dataset

Name: Skin Dataset

Data Set Information:

The skin dataset was collected by randomly sampling B, G, R values from face images of various age groups (young, middle-aged, and old), race groups (white, black, and Asian), and genders, obtained from the FERET database and the PAL database. The total learning sample size is 245057, of which 50859 are skin samples and 194198 are non-skin samples. Color FERET Image Database: [Web Link]; PAL Face Database from the Productive Aging Laboratory, The University of Texas at Dallas: [Web Link].

Attribute Information:

This dataset has dimensions 245057 × 4, where the first three columns are the B, G, R values (features x1, x2, and x3) and the fourth column holds the class labels (decision variable y).
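
For concreteness, here is a minimal pandas sketch for loading the dataset; the file name Skin_NonSkin.txt is how the UCI copy is commonly distributed, and the tab separator is an assumption that may need adjusting for your download:

```python
import pandas as pd

# B, G, R features (x1, x2, x3) and the class label (decision variable y).
columns = ["B", "G", "R", "y"]
data = pd.read_csv("Skin_NonSkin.txt", sep="\t", header=None, names=columns)

print(data.shape)                # expected: (245057, 4)
print(data["y"].value_counts())  # skin vs. non-skin sample counts
```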
