Bozen – In 2013, ‘selfie’ was Oxford Dictionaries’ Word of the Year, and our acceptance of the phenomenon has skyrocketed since. Back in 2017, Justin Denison, senior vice president of product strategy and marketing at Samsung, posited while introducing the company’s latest camera that each of us will take, on average, 25,000 selfies in a lifetime. That was before Covid, before the collective migration to Zoom meetings and before the digital conquest of the beauty industry.
Nowadays devices, apps and customised formulas promise to recreate the effects of digital filters – in real life. A recently published paper entitled ‘A Pandemic of Dysmorphia: “Zooming” into the Perception of Our Appearance’, co-authored by Shadi Kourosh, a Massachusetts-based dermatologist, introduced a term we are by now familiar with: Zoom dysmorphia. Following endless months of remote meetings and online social gatherings – and seeing our own faces on screen – we have become fixated on our perceived facial flaws. And while psychological studies often equate time spent in front of the mirror with increased insecurity, looking at yourself on a screen is more like looking into a funhouse mirror than a real one.
Front-facing cameras combined with a close focal distance can distort your appearance, making your eyes look smaller, your nose bigger, your lips thinner, wrinkles more pronounced and skin – grey. Little wonder, then, that there has been an almighty boom in the beauty industry, from surgery to Botox, creams, peels and fillers. In the US alone, a survey of more than 1,000 consumers conducted by the American Society of Plastic Surgeons (ASPS) found that 49 percent of those who have never undergone aesthetic plastic surgery are open to cosmetic treatments in the near future. Another ASPS study published the same month found that 64 percent of US plastic surgeons had seen an increase in consultations since the pandemic began.
AI-based technologies in plastic surgery
The uptake has been accompanied by the evolving integration of technology – above all AI, which has become significant for a number of reasons. AI-based technologies that plastic surgeons are embracing include big data, machine learning, deep learning, natural language processing and facial recognition. Machine learning algorithms that reveal associations in large data sets are being utilised both for monitoring postoperative viability, based on image banks of skin tone and colour, and for interpreting genetic data to highlight inherited tendencies and responses involved in skin health, aging and recovery.
Deep learning can also track postoperative progress and even enable the identification of malignant skin cancers from images taken with your smartphone. AI chatbots on the same device can answer post-op FAQs, using natural language processing to interpret your questions.
The commercial use of facial recognition technology is commonplace – if you have a smartphone (and who doesn’t?) you’re already employing it. Facial recognition combines recognition models, image analysis and deep neural networks to take unique biometric measurements that are then used to interpret facial characteristics.
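For the curious, the matching step described above can be sketched in miniature. Assuming a deep network has already reduced each face photo to an embedding vector (the tiny vectors below are made-up stand-ins for the 128- to 512-dimensional vectors real models produce), recognition boils down to a nearest-neighbour search by cosine similarity:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, database, threshold=0.8):
    """Return the name of the closest enrolled face, or None.

    `database` maps names to embedding vectors produced by a
    (hypothetical) deep face-recognition model.
    """
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy 4-dimensional embeddings standing in for real model output.
database = {
    "alice": np.array([0.9, 0.1, 0.0, 0.2]),
    "bob":   np.array([0.1, 0.8, 0.3, 0.0]),
}
probe = np.array([0.85, 0.15, 0.05, 0.18])  # a new photo of "alice"
print(identify(probe, database))            # -> alice
```

The threshold matters: a probe face that resembles nobody in the database should return no match rather than the least-bad one.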
In plastic surgery today, these models are applied to gauge a patient’s facial features against postoperative targets – in other words, to set appropriate expectations before surgery and so improve the odds of satisfaction afterwards. The omnipresence of facial recognition tech in cosmetic surgery has even made it to Hollywood – or should I say Manhattan. And Just Like That…, the Sex and the City spin-off, sees main character Carrie Bradshaw accompany her friend to a consultation with a plastic surgeon, whereupon curiosity gets the better of her and she agrees to a consultation of her own. Using a 3D photograph taken at the practice, the doctor creates a before-and-after simulation, using facial recognition technology to mock up Carrie’s face after laser treatment, a facelift and filler.
Zoom and its ‘touch up my appearance’ feature
But even if you aren’t ready to go under the knife quite yet – or ever – less invasive options are at hand. Back to Zoom, then, and its (godsent) ‘touch up my appearance’ feature: AI software maps biometric facial features, identifies your face in the video feed and automatically applies a soft focus to it. Real-time algorithms of this kind – long ubiquitous among the Instagram, Snapchat and TikTok set – are employed worldwide to fit augmented reality filters ‘naturally’ to your face.
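Conceptually, a touch-up filter of this kind does something quite simple: find the face, blur only that region, and blend the result back into the frame. A minimal NumPy sketch – assuming the face bounding box has already been supplied by a detector, and using a naive box blur in place of whatever proprietary smoothing Zoom actually applies:

```python
import numpy as np

def box_blur(img, radius=2):
    """Naive box blur on a 2D greyscale image (window clamped at edges)."""
    h, w = img.shape
    out = np.empty_like(img, dtype=float)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = img[y0:y1, x0:x1].mean()
    return out

def touch_up(img, face_box, strength=0.6):
    """Soften only the detected face region.

    `face_box` is (top, left, bottom, right) from a hypothetical
    face detector; `strength` blends blurred and original pixels.
    """
    t, l, b, r = face_box
    out = img.astype(float).copy()
    blurred = box_blur(out)
    out[t:b, l:r] = (1 - strength) * out[t:b, l:r] + strength * blurred[t:b, l:r]
    return out

# A noisy 8x8 "frame" with the face in the middle 4x4 patch.
rng = np.random.default_rng(0)
frame = rng.uniform(0, 255, size=(8, 8))
smoothed = touch_up(frame, face_box=(2, 2, 6, 6))
```

Pixels outside the box are untouched, while the face region loses contrast – which is exactly what makes pores and fine lines recede on screen.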
Using filters to preview adjustable effects has elevated the selfie standard. However, these filters have skewed our perceptions to the point that, when confronted by our real selves rather than our virtual ones, we end up filling the beauty industry’s coffers in an attempt to reconcile the two. In pursuit of aligning these two selves, hyper-personalisable tech innovations and augmented reality beauty solutions such as online shade-matching and virtual skin and make-up consultations now mean that a trip to Sephora – and the risk of the latest Covid mutation – can easily be avoided. So how do we come to look like our favourite Instagram filter without stepping out of our home office?
Tailor-made beauty products
Nearly all the big players are in the game: AI offers consumers the opportunity to create special, tailor-made products at a level we’ve never before experienced. After years in the pipeline, Procter & Gamble’s Opté has hit the shelves – a cutting-edge digital skincare device that combines a digital camera with blue LED lights to scan for areas of skin discoloration, its on-board camera able to see up to three times more pigmentation than the human eye. ‘The world’s first real-life beauty filter’ scans the skin at 200 photos per second. An internal minicomputer’s precise colour algorithm then processes 70,000 lines of code to ascertain the size, shape and intensity of each area of discoloration in contrast to the surrounding skin, triggering a state-of-the-art cosmetic inkjet printer to deposit a billionth of a litre of makeup onto each ‘imperfection’. Weighing in at 600 dollars it isn’t cheap, but it’s certainly less expensive than a surgical alternative.
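The detection step such a device relies on can be illustrated with a toy version: compare each pixel against the overall skin tone and flag those that are markedly darker. This is a loose sketch of the idea only, not P&G’s actual algorithm, and the `contrast` threshold is an invented parameter:

```python
import numpy as np

def find_discoloration(skin, contrast=30.0):
    """Flag pixels markedly darker than the overall skin tone.

    `skin` is a 2D greyscale patch (0 = black, 255 = white); a pixel
    counts as discoloured when it falls more than `contrast` grey
    levels below the patch's median tone.
    """
    baseline = np.median(skin)
    return skin < (baseline - contrast)

# Even skin tone (~200) with a small dark spot (~120) near the corner.
patch = np.full((6, 6), 200.0)
patch[1:3, 1:3] = 120.0
mask = find_discoloration(patch)
print(int(mask.sum()))  # -> 4 flagged pixels, where "makeup" would go
```

In the real device, the equivalent of `mask` is what tells the inkjet head precisely where to deposit its billionth of a litre.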
Five years ago, the Japanese beauty brand Shiseido was one of the first to jump on the bandwagon, acquiring the California-based tech start-up MatchCo, which pioneered scanning the skin with a simple app to create entirely personalised cosmetics. The result is Optune, a subscription-based service costing around 80 euros per month that pairs an app with a small countertop gadget: it analyses a combination of daily factors – your skin condition, the weather, pollen levels, pollution, sleep, stress and your menstrual cycle – to dispense a bespoke serum that evolves to address your skincare issues in real time. On the subject of mega M&As: L’Oréal acquired ModiFace, an AI beauty firm founded by engineering professor Parham Aarabi, in 2018, just before the pandemic hit. The futuristic partnership birthed Perso, an AI-powered skincare and cosmetics device which, in a similar vein to Optune, addresses your personal skin concerns, from fine lines and dark spots to pore size, pigmentation and dullness. The resulting product can be a custom-blended moisturiser, serum or under-eye cream, adapted for day or night and even to a preferred texture and hydration level – all in less than two minutes.
DNA test for personalised skin care
If privacy fears and digital clones haunt you, stop reading here. Our final look into the world of beauty and the tech beast reviews gene sampling and its implications for your daily skincare routine. Countless companies have sprung up promising to reveal your body’s inherited tendencies and the responses involved in your skin’s health and aging. All you have to do is share your entire genetic code and allow AI to sift through the resulting data. Apps offering this service are compatible with most DNA tests. Scared yet? You will be when you find out how cheap it is: for around 60 dollars (on top of the already heavily discounted DNA test you’ve taken), these whole-genome sequencing services use data on 100 percent of your genome to prepare personalised skincare.
Individual responses of human skin to environmental stress are determined by variations in anatomy and physiology that are closely linked to genetic characteristics such as pigmentation and sensitivity to distinct extrinsic aging factors. These variations are not only responsible for differences in skin performance after exposure to damaging conditions, but can also affect the mechanisms of drug absorption, sensitisation and other longer-term effects. This actively growing area, with a range of biomedical and commercial applications within the cosmetics industry, promises to boost your health and wellness with evidence-based advice and solid science. But at what cost? As consumers we have grown ever more comfortable with at-home beauty tech during the pandemic, and the industry is likely to keep expanding in this direction in the coming years. And while this may be more accessible, inclusive, convenient and hygienic than ever before, wouldn’t it be better to step away from the screen and invest more time in what’s inside? Pick up a book, learn something new – there’s nothing more beautiful than brains anyway.
THE AUTHOR has been living in South Tyrol for the past eight years. Born in Zimbabwe, she grew up in London, England, holds a degree in Egyptology and a Masters in English, and found her way to the mountains after living and working in France. Her current passions are ethnobotany and losing herself in the forest.