Industry News: Google builds a machine learning model that tells you if an image is pleasing to the human eye.
Human perception is unique by design and differs slightly from person to person. I would compare it to a fingerprint: each of us sees the world a little differently, which makes human perception almost impossible to predict.
Leave it to Google to do the impossible.
In mid-December, Google announced a new machine learning model called Neural Image Assessment, or NIMA. The model is designed to predict which images humans are likely to find aesthetically pleasing. Wait, what? Yep, that’s right: Google has created a new algorithm that can predict and rank images according to their aesthetic appeal.
This is night and day compared to the machine learning model Google currently has in place, which is limited to categorizing images according to their quality.
NIMA is built on a convolutional neural network (CNN), an architecture typically used for image recognition and classification. CNNs of this kind are trained on data that has been labeled and rated by humans, so the model learns to identify the characteristics humans are likely to find aesthetically pleasing. NIMA centers on the well-formed characteristics associated with emotion and beauty in images.
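To make the ranking idea concrete: rather than outputting a single score, NIMA predicts a probability distribution over human ratings (1 through 10), and images can then be ranked by the mean of that distribution. Here is a minimal sketch of that ranking step; the image names and probability values are hypothetical stand-ins, not real model outputs.

```python
import numpy as np

def mean_score(rating_probs):
    """Mean aesthetic score from a distribution over ratings 1..10."""
    ratings = np.arange(1, 11)
    return float(np.dot(ratings, rating_probs))

# Hypothetical per-image rating distributions (each sums to 1.0),
# standing in for what a NIMA-style model would predict.
predictions = {
    "sunset.jpg":   np.array([.01, .01, .02, .04, .07, .10, .15, .25, .25, .10]),
    "blurry.jpg":   np.array([.20, .25, .20, .15, .10, .05, .03, .01, .005, .005]),
    "portrait.jpg": np.array([.02, .03, .05, .10, .20, .25, .20, .10, .03, .02]),
}

# Rank images from most to least aesthetically pleasing by mean score.
ranked = sorted(predictions, key=lambda k: mean_score(predictions[k]), reverse=True)
print(ranked)
```

Ranking by the mean of a full rating distribution, instead of a single predicted number, lets the model distinguish images that most raters agree on from images with polarized ratings.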
To learn more about Google’s new machine learning model, click here for the full article.