
Face filters
Dig into any of the biggest social media apps and you’ll find an array of filters that can transform the input from your phone’s front camera. These augmented reality (AR) effects can change a person’s appearance, or that of their background, using computer vision and image processing models that perform real-time video modifications.
Among the millions of filters available, you’ll find all sorts. You can accessorise with some glasses or throw in a few sparkles, as if you’ve been doused in glitter. Other filters are limited only by their creators’ imaginations and range from the ridiculous to the fantastical. The classics include face, age or gender swaps; appended animal ears; or the Pixar version of you. But you can also, if you like, freakishly distort your features, or see what it would look like if your face were plastered onto the body of a giant prawn. Or instead of the boring wall behind you, why not take on the lo-fi graphics of a noughties music video – using the very same background subtraction technology popularised by Zoom during the pandemic?
Face filters rely on computer vision and image processing models that can modify a video feed in real time. Central to the process is a facial detection algorithm, such as the Viola-Jones algorithm (for more on facial recognition technology, see Ingenia 79). This algorithm works out differences in contrast between portions of an image to detect the edges of the features. For example, as any portrait artist knows, the eye sockets, sides of the nose and lower lip are darker than the upper lip, bridge of the nose and middle of the forehead. If the algorithm finds enough of these features, it can detect a face.
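The contrast comparisons described above can be sketched in code. The snippet below is a toy illustration of two Viola-Jones building blocks: the integral image, which makes any rectangle of pixels summable in four lookups, and a two-rectangle Haar-like feature that compares a darker band (say, the eyes) against a lighter band below it (the cheeks). A real detector combines thousands of such features chosen by boosting; the tiny image here is invented for illustration.

```python
def integral_image(img):
    """Cumulative sums so any rectangle sum costs four lookups."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            ii[y + 1][x + 1] = (img[y][x] + ii[y][x + 1]
                                + ii[y + 1][x] - ii[y][x])
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the w-by-h rectangle with top-left (x, y)."""
    return (ii[y + h][x + w] - ii[y][x + w]
            - ii[y + h][x] + ii[y][x])

def haar_two_rect(ii, x, y, w, h):
    """Lower half minus upper half: positive when the top is darker."""
    top = rect_sum(ii, x, y, w, h // 2)
    bottom = rect_sum(ii, x, y + h // 2, w, h // 2)
    return bottom - top

# A 4x4 'image': a dark band (eyes) above a bright band (cheeks).
img = [[10, 10, 10, 10],
       [10, 10, 10, 10],
       [200, 200, 200, 200],
       [200, 200, 200, 200]]
ii = integral_image(img)
response = haar_two_rect(ii, 0, 0, 4, 4)  # large positive value here
```

A detector slides windows like this across the frame at multiple scales, and only windows where many such features respond strongly are declared a face.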
Having detected the facial features, the software then aligns a statistical model of the face with the face in question using machine learning. Called an active shape model, this is derived from hundreds of thousands of facial images, on which people have marked the borders of features. A mesh representing the ‘average face’ from this model is then scaled and aligned with the user’s face, with adjustments made where it doesn’t fit perfectly.
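The scaling and aligning step can be sketched as a least-squares ‘Procrustes’ fit: find the scale, rotation and translation that best map the mean shape onto the detected landmarks. This is a minimal sketch assuming 2D points written as complex numbers; the landmark coordinates are invented, and a real active shape model would then iterate, adjusting individual points where the fit is imperfect.

```python
def align(mean_shape, landmarks):
    """Map mean_shape onto landmarks with the best-fit similarity
    transform. Points are complex numbers x + yj."""
    n = len(mean_shape)
    cm = sum(mean_shape) / n          # centroid of the model shape
    cl = sum(landmarks) / n           # centroid of the detections
    m = [p - cm for p in mean_shape]  # centre both point sets
    t = [p - cl for p in landmarks]
    # Optimal complex scale-rotation in the least-squares sense.
    a = (sum(mi.conjugate() * ti for mi, ti in zip(m, t))
         / sum(abs(mi) ** 2 for mi in m))
    return [a * mi + cl for mi in m]

# Unit-square 'mean face'; the target is the same square scaled by
# two and shifted, so the fit should recover it exactly.
mean = [0 + 0j, 1 + 0j, 1 + 1j, 0 + 1j]
target = [5 + 5j, 7 + 5j, 7 + 7j, 5 + 7j]
fitted = align(mean, target)
```

In practice the shape has dozens of landmark points and the model also constrains how far each point may deviate from the average, which is what keeps the fitted mesh face-like.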
This mesh is where the magic happens. Using special software, it can be distorted, or have accessories attached to it, or have colour changes applied to segments of it (such as the eyes). Importantly, the mesh and any programmed alterations to it must move along in real time with the video of your face, for a smooth viewing experience. But this tracking isn’t quite perfect yet, as can be seen when people turn their heads to the side, as effects can disappear. Plus, in some cases, the algorithms must account for occlusion. This refers to what happens, say, when a hat is seen from a three-quarter profile: the head will block the back of the hat from view. So, for a hat filter, the part of the animation behind the head must be subtracted.
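One simple distortion of this kind can be sketched directly: push mesh vertices radially away from a chosen centre (say, an eye) with a smooth falloff, so nearby points move the most and distant points stay put. The coordinates, radius and strength below are invented for illustration; a production filter would warp the rendered pixels using the mesh, not just the vertices.

```python
import math

def enlarge(vertices, centre, radius, strength=0.4):
    """Radially displace (x, y) vertices within `radius` of `centre`."""
    cx, cy = centre
    out = []
    for x, y in vertices:
        d = math.hypot(x - cx, y - cy)
        if 0 < d < radius:
            # Smooth falloff: full effect near the centre, none at the rim.
            f = 1 + strength * (1 - d / radius)
            out.append((cx + (x - cx) * f, cy + (y - cy) * f))
        else:
            out.append((x, y))
    return out

eye = (100.0, 80.0)
mesh = [(100.0, 80.0), (105.0, 80.0), (160.0, 80.0)]
warped = enlarge(mesh, eye, radius=20.0)
# Only the vertex near the eye moves; the distant one is untouched.
```

Run per frame against the tracked mesh, a warp like this is what makes an eye-enlarging or face-slimming effect follow the head as it moves.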
One type of filter that has drawn increasing concern is the beauty filter. Although these rely on distortion too, the distortion is so subtle you might not immediately notice it on someone else’s photo. However, the effects are worrying: noses or jawlines can be slimmed down, skin can be smoothed, eyes and lips enlarged, and eyelashes lengthened to achieve a so-called ‘Instagram face’ – a Eurocentric standard of beauty often achieved with cosmetic procedures.
What’s more, not all ‘beauty’ filters are labelled as such – sometimes a hat filter might also slim down your nose. All of this adds up to a troubling phenomenon that has been dubbed in the media as ‘filter dysmorphia’, a new version of an old problem to which some young people are particularly susceptible. This is an issue tech companies will need to consider carefully if filters continue their trajectory of popularity.
Beyond purely social media uses, brands are developing filters that allow people to try on clothing, accessories, jewellery and makeup, or see what a piece of furniture could look like at home. While these haven’t gone mainstream yet, in the next few years you could well be investing in your next pair of glasses without ever seeing them in person.
***
This article originally appeared in Ingenia 92 (September 2022).