
Have beauty filters stopped being fun?

Beauty filters have changed a lot. While puppy ears still linger, many of us are turning to hyper-realistic AI-driven tools that do more than tweak.
A young woman in a black sweatshirt, with headphones round her neck, takes a selfie while making the peace sign with her right hand.

Written by Marie-Anne Leonard

Writer & Editor – Canon VIEW

Admit it, you’ve done it. Swiped for the filter that makes pores disappear, added a little flattering light to that selfie. Had fun by adding freckles, perhaps? Changed your hair colour or given yourself huge cartoon eyes? Maybe you are shameless when it comes to filters, cheerfully smoothing and blurring until you find your best adjusted self. It’s okay – none of us are immune to the temptation of tweaking.

It’s fun, right? After all, we’ve been zhuzhing up our selfies for years; it’s nothing new. In fact, it’s been a whole decade since the infamous Facetune app created the cult of the ‘Instagram Face’, but built-in filters in our socials weren’t far behind, with Snapchat, Instagram and, most recently, TikTok making their use ubiquitous. Today, however, there’s a new breed of beauty filter making waves for its hyper-realism, and suddenly users are asking, ‘How much is too much?’

Traditionally, in-app beauty filters have been a bit basic, adding or subtracting elements, changing colours or the shape of the face in a very particular fashion. That’s because, up until now, the technology behind the filter simply detects a face in the frame and then creates a kind of 3D mesh template, which lies on top of it. The desired effect is attached to that mesh – such as the devil horns, smatterings of freckles or cute bunny noses we’re all familiar with. But if you interfere with the process, perhaps by waving your hand across your face, this type of filter will glitch because the overlay suddenly can’t find the face it’s supposed to be adhering to. These kinds of Augmented Reality filters are pretty obvious to anyone who sees them, so if you wanted any specific, significant or discreet work done to your photographed look, you’d need to come out of your socials and do your retouching elsewhere.

A close-up of a face, showing only the left eye and part of an ear. Overlaid on the face and skin is a mesh of thin white lines connected with tiny dots to make a map of the face.

Traditional augmented reality filters create a kind of topographical mesh over the face for the filter to attach to. New AI filters recreate the image pixel by pixel to apply the new look.
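
For the technically curious, here is a minimal sketch of that ‘detect a face, fit a mesh, attach an effect’ idea in Python, using the open-source MediaPipe Face Mesh model as a stand-in. The social apps don’t publish their exact pipelines, so this is an illustration of the general approach rather than anyone’s actual implementation, and the filenames are hypothetical. It finds the face, reads off the mesh landmarks and marks them on the photo; a real AR filter would pin its artwork to those same points.

```python
import cv2
import mediapipe as mp

image = cv2.imread("selfie.jpg")                 # hypothetical input photo
rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)     # MediaPipe expects RGB

# Detect the face and fit the ~468-point mesh to it.
with mp.solutions.face_mesh.FaceMesh(static_image_mode=True) as face_mesh:
    results = face_mesh.process(rgb)

if results.multi_face_landmarks:
    h, w = image.shape[:2]
    for lm in results.multi_face_landmarks[0].landmark:
        # Each landmark is a normalised (x, y) point on the face; an AR filter
        # anchors its graphics (ears, freckles, horns...) to these points.
        cv2.circle(image, (int(lm.x * w), int(lm.y * h)), 1, (255, 255, 255), -1)
else:
    # If the face can't be found (say, because a hand waved across it), there
    # is nothing for the overlay to attach to: hence the glitch.
    pass

cv2.imwrite("selfie_mesh.png", image)
```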

The new filters in town hit differently. The most famous is TikTok’s ‘Bold Glamour’, but there are plenty of others that are changing faces in all manner of ways. Yes, they can do the same job, slimming down the face, shrinking the nose, defining the eyes, plumping the lips (and, let’s be honest, some of these effects are highly questionable in and of themselves), but the glitching has gone and what’s left is powerfully realistic. When users run their hands over their faces, there is little to no distortion at all, and these new filters naturally move with any face – regardless of age, ethnicity or gender. While the creators of such filters are tight-lipped about their process, experts in Augmented Reality and Artificial Intelligence have been outspoken on the subject, and the consensus is that these filters use a type of machine learning model called Generative Adversarial Networks (or GANs).

A GAN is essentially two neural networks working against each other (hence ‘adversarial’) so that both improve. One is the ‘generator’, which creates synthetic data. The other is the ‘discriminator’, which is there to distinguish between what is real and what is not. They work in a circular process whereby the generator tries to create data that can fool the discriminator, and the discriminator tries to correctly identify whether the data is real or synthetic. Over time, both get better and better at their jobs, but it’s the generator which can eventually create synthetic data that is almost indistinguishable from the real thing. In the context of a beauty filter, the two things being reconciled are the camera’s view of your face and the facial features that will create the desired look. From there it’s a case of blending your face, pixel by pixel, with what the model has learned from a huge dataset of other faces until the two merge and become – Ta-dah! – the new and enhanced you. This is why there is no distortion: there is no overlay. Every single pixel of the image you originally present is regenerated by the GAN to create the new look.
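
For readers who like to see the idea in code, the adversarial loop described above can be sketched in a few lines of Python with PyTorch. This is a toy illustration of the general GAN training recipe only; the sizes, the stand-in ‘photos’ and the network shapes are invented for clarity and bear no relation to how any real filter is built.

```python
import torch
import torch.nn as nn

LATENT_DIM, IMG_DIM = 64, 32 * 32 * 3   # toy sizes, nothing like a real face model

# The 'generator' turns random noise into a synthetic image.
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_DIM), nn.Tanh(),
)

# The 'discriminator' scores how real an image looks.
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.rand(16, IMG_DIM) * 2 - 1       # stand-in for a batch of real photos
    fake = generator(torch.randn(16, LATENT_DIM))

    # Discriminator turn: label real images 1, generated images 0.
    d_loss = (loss_fn(discriminator(real), torch.ones(16, 1)) +
              loss_fn(discriminator(fake.detach()), torch.zeros(16, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator turn: try to make the discriminator call its output 'real'.
    g_loss = loss_fn(discriminator(fake), torch.ones(16, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The detail that matters for beauty filters is that, once training is done, only the generator is kept: it redraws every pixel of the frame rather than pasting anything on top of it.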

“The consensus is that these filters use a type of machine learning model called Generative Adversarial Networks (or GANs).”

The simplicity with which this happens and the astonishingly realistic looks that they achieve are why there’s so much controversy around these new filters. On the one hand, do they go too far? And on the other, how will we ever know what is real and what is filtered? Well, in answer to the first question, celebrities, influencers, mental health experts and technologists alike have all denounced these kinds of filters, variously calling them ‘creepy’, ‘objectifying’, ‘problematic’ and ‘reinforcing an incredibly narrow standard of beauty’. Speaking to The Verge, Memo Akten, Assistant Professor of Computational Art and Design at UC San Diego Visual Arts, made the important point that previously these kinds of apps had a playfulness to them, but the new AI beauty filters feel “ominous”, and this seems to be an opinion that is widely held.

And this is where the remaining question comes into play. It is important to remember what is new here… and what is not. The act of manipulating and enhancing images has been around since the very first photographs were taken. And before social media, the airbrushed images in magazines and advertising were accused of peddling body insecurity and unattainable beauty ideals. So, what is new here? The simplicity is new. The ease of access to the technology is new. Retouching has been available to all for some time, but these filters level the playing field by doing all the work and making ‘the work’ stand up to scrutiny. Instead of measuring ourselves against models, actors and celebrities, we are now in a position to make the same comparisons against our peers and, even more bizarrely, enhanced versions of ourselves.

But when it’s this easy, when everyone is filtered to look so very different, what happens when we meet in real life? It’s a problem frequently seen in online dating, where the gulf between profile pictures and reality occasionally raises a laugh. But en masse, this wholesale presentation of pre-determined looks is no laughing matter and filters are already accused of reinforcing harmful stereotypes about what is considered beautiful or attractive. And for those who don’t fit into these narrow standards? Damaged self-esteem, anxiety and depression through inadequacy and insecurity. But we also have the very real potential for prejudice and discrimination against those who don’t meet the new expectation of beauty. To a certain degree, this is already happening, but has been countered by an incredible army of filter-free influencers and #SkinPositivity activists. People who love beauty, make-up and fashion, but believe that true beauty lies in being comfortable in your own skin – however it looks.

A pair of hands holds a smartphone. On the screen is the same image of a woman six times, each with a different hair colour.

Although beauty filters are intended to be fun, there are concerns that these increasingly powerful tools are problematic for self-image and mental health.

Canon Ambassador Clive Booth is known for his remarkable ability to call upon the skills of the old masters in his portrait photography. He uses complex classical lighting techniques, but in a way that is so casual and comfortable, you’d barely be aware it’s happening. He chats cheerfully with his sitters and does his research, so that every portrait contains as much personality and character as it does technical skill. It’s not something that he foresees Artificial Intelligence being able to replicate any time soon. “I don’t have a problem with AI, particularly,” he says. “Some of the work I’m seeing using it is excellent, but I think true portraiture is one of the few areas that is going to be AI-proof.” However, he too expresses concerns about altered perceptions – those sitting for portraits not having a true understanding of how they actually look, having seen so many altered images of themselves. “In terms of self-worth, I think this is going to have really deep implications.”

From the perspective of his own work, Clive feels deeply uncomfortable that these kinds of AI filters could become the norm away from social media, edging into a world where they are requested or expected by those being photographed. “I’m a purist,” he proudly admits. “I would be really unhappy if someone dropped a filter over my work and I wouldn’t use it in a camera.” And it is here that we find the dividing line between ‘light-hearted fun’ and ‘no longer accepting our authentic selves’, always requiring adjustments to the ‘me’ we see. As long as we enter into the use of beauty filters with a clear sense of self, there’s surely no harm in playing around with our photos and feeling good about how we look, is there? Absolutely not. Add those puppy ears to your selfies with pride. Chisel that chin. Make your eyelashes hit the ceiling. But know and acknowledge the difference between you and the filter. And be proud of who you are without it.
