A male ex-Russian propagandist is behind an unflattering AI app that shows how women look without makeup

[Image caption: Unflattering. Source: Shona Ghosh/Business Insider]

No one knows how to generate headlines and outrage like a pro-Russia propagandist.

Which might explain how Ashot Gabrelyanov has come to promote his artificial intelligence startup.

Gabrelyanov, who lives in Brooklyn, bills himself as the founder of MakeApp, a new app that uses artificial intelligence (AI) to supposedly show how women really look without makeup.

You take a selfie or upload a picture of someone else, and the free version of the app offers two options: add makeup or remove it. You pick one, and the filter does the rest.

It’s a controversial idea by itself, given that AI tends to be fairly unflattering to anyone with a darker skin tone, and that using AI to judge female beauty is a pretty questionable goal.

But Gabrelyanov, a Russian native, is no stranger to controversy. He is also behind the pro-Russia news outlet LifeNews, which the independent Moscow Times described as showing “obsequious loyalty to the Kremlin.” One current story accuses the US of financing “Russophobe extremists” on social media. The outlet is popular in Russia for its celebrity scoops, but is banned in Ukraine for airing “war propaganda.” In an email to Business Insider, Gabrelyanov said he had stopped working for LifeNews in 2013.

He also denied that he was a propagandist, describing LifeNews as a popular news outlet in Russia and saying that, as its CEO, he had no control over editorial decisions.

Still, he has been described as “pro-Putin” and as a propagandist by Russia-watchers. He was also accused of sharing a doctored image on Twitter showing a poster of Adolf Hitler in Kiev, Ukraine.

Gabrelyanov, however, insisted in an email there was “no proof.”

Gabrelyanov’s previous life aside, Business Insider tested MakeApp on light- and dark-skinned women to see how accurate it is. We quickly found that the app takes its cue from FaceApp, another controversial Russian photo app whose “hotness” filter made dark skin look lighter.

Here’s what MakeApp thinks I look like without makeup

[Image caption: MakeApp tries to show you what a woman looks like without makeup. Source: MakeApp]

It’s pretty obvious from this photo that MakeApp’s AI is not particularly sophisticated, giving me vitiliginous skin and paring off my eyelashes when I’m not wearing mascara.

My fairer-skinned colleague Bobbie didn’t look much better.

[Image: MakeApp applied to Bobbie. Source: Shona Ghosh/Business Insider]

And neither did Business Insider’s video producer Claudia, after being subjected to a puffy make-under.

[Image: MakeApp applied to Claudia. Source: Shona Ghosh/Business Insider]

Finally, here’s a picture of tennis champion Serena Williams, who regularly appears in public without makeup, and she definitely doesn’t look like this.

[Image: MakeApp applied to Serena Williams. Source: Reuters/Brendan McDermid/Business Insider]

MakeApp’s unflattering, malfunctioning AI is the latest in a long line of AI controversies: Snapchat’s offensive Bob Marley filter, FaceApp’s “black” filters, and smartphone cameras that lighten your skin by default.

Gabrelyanov, in response to questions from Business Insider, suggested MakeApp could save lives by identifying women who had been trafficked illegally.

“In most of these cases, makeup is heavily used to disguise the age and/or identity of these people,” he said in an emailed statement. “If human traffickers can hide these victims’ identities, their chances of rescue are low. When security services show an image and say ‘Is this your daughter?’ heavily applied permanent makeup often makes the identification process quite difficult. We hope our technology may help families and authorities identify victims for rescue.”

Gabrelyanov said his firm was already in communication with organisations to identify victims of human trafficking. Asked by Business Insider about potential criticism that the app may be considered racist or sexist, Gabrelyanov said there had been no specific criticism from users.

He wrote: “Our machine learning dataset is based on users’ faces of different nationalities and skin colors.”