Digital Media Concepts/Snapchat Lenses
Snapchat Lenses enable users of the Snapchat application by Snap Inc. to add special effects to photos and videos through features such as Face Lenses and World Lenses. Face Lenses manipulate the user’s eyes, mouth, head, and shoulders to transform their face, for example turning them into puppy dogs, superheroes, or aliens, or showing them in different makeup or luxury jewelry. World Lenses let users interact with three-dimensional objects overlaid onto the surrounding environment as seen through the outward-facing camera. Lens Studio, also by Snap Inc., is an application that lets Snapchat users build Augmented Reality (AR) experiences. It offers a deep set of built-in features, such as custom shaders and advanced tracking technology, and includes various templates to help users get started making Snapchat Lenses.
General
To use Snapchat Lenses:
- Go to the Camera screen
- Tap on a face to activate the Lens feature
- Swipe on the carousel and tap one of the Lenses
- Tap the circle symbol to take a photo, or press and hold it to record a video, while the effect is active
- Edit and/or send the Snap
For more details, go to: https://support.snapchat.com/en-US/article/face-world-lenses
History
Version | Release Date |
---|---|
3.1 | 2020-08-13 |
3.0.1 | 2020-07-17 |
3.0 | 2020-06-11 |
2.3.1 | 2020-02-04 |
2.3.0 | 2019-12-12 |
2.2.0 | 2019-11-15 |
2.1.0 | 2019-08-27 |
2.0.1 | 2019-04-04 |
1.7.1 | 2018-12-12 |
1.7 | 2018-10-18 |
1.6.2 | 2018-06-19 |
1.6.1 | 2018-06-01 |
1.6.0 | 2018-05-30 |
1.5.1 | 2018-04-30 |
1.5 | 2018-04-17 |
1.0.1 (Win) & 1.0.2 (Mac) | 2018-03-02 |
1.0.1 | 2018-02-05 |
For a detailed description, go to: https://lensstudio.snapchat.com/changelog/
Technology
Computer vision has gained massive traction in recent decades, with applications ranging from depositing checks to self-driving cars. With a daily user count in the tens of millions, Snap's augmented reality lenses have become a ubiquitous application of computer vision.[1] Initially driven by its acquisition of the Ukrainian startup Looksery, Snap has continued to improve its augmented reality technology through 17 acquisitions, including startups such as Zenly and AIFactory.[2] Snap's augmented reality lenses can map faces and other objects in 3D space, taking rotation and occlusion into account so that overlaid effects animate correctly in real time. Face Lenses allow accurate manipulation of the user’s eyes, mouth, head, and shoulders to transform their face. World Lenses appear on the outward-facing camera and can detect and map surfaces and environments.[3]
Overview
Implementation details for Snap's lenses remain confidential, but the underlying technology can be broken down into the following three broad areas.
Object Detection
The object detection algorithm identifies all relevant objects (e.g. faces for Face Lenses) in the input image or video frame and provides their bounding boxes. Advancements in object detection algorithms over recent decades have improved their robustness to rotation and occlusion. Common techniques divide the image into small sections and use a combination of a Histogram of Oriented Gradients (HOG) and a Support Vector Machine (SVM) to determine whether the features of the relevant object are present. In the case of face detection, such features include the bridge of the nose being lighter than its surroundings, the eye sockets being darker than the forehead, and the center of the forehead being lighter than its sides.[4]
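As an illustration of this stage only, the sketch below runs the open-source dlib library's default frontal face detector, which is itself built on HOG features and a linear SVM. It is not Snap's pipeline, and the input file name is a placeholder.

```python
# Minimal HOG + SVM face detection sketch using dlib (not Snap's code).
import cv2
import dlib

detector = dlib.get_frontal_face_detector()        # pre-trained HOG + linear SVM detector

image = cv2.imread("selfie.jpg")                    # placeholder input frame
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)      # HOG operates on intensity gradients

# The second argument upsamples the image once so smaller faces can be found.
faces = detector(gray, 1)

for rect in faces:
    # Each detection is a bounding box handed to the landmark-extraction stage.
    print(f"face from ({rect.left()}, {rect.top()}) to ({rect.right()}, {rect.bottom()})")
```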
Landmark Extraction
After identifying and bounding the objects of interest, detailed object landmarks are extracted. Techniques such as an Ensemble of Regression Trees may be used for low-latency landmark extraction. In the case of facial landmark extraction, local region coordinates for features such as the eyes, lips, nose, and mouth are extracted and updated in real time. For example, changes in eyebrow coordinates relative to other facial features allow algorithms to determine if a user has raised their eyebrows.[4]
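A low-latency version of this stage can be sketched with dlib's 68-point shape predictor, which implements the Ensemble of Regression Trees approach cited above. The model file name is the standard dlib download, and the eyebrow threshold is an arbitrary illustration, not a value used by Snap.

```python
# Ensemble-of-Regression-Trees landmark extraction sketch using dlib.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")  # standard dlib model file

gray = cv2.cvtColor(cv2.imread("selfie.jpg"), cv2.COLOR_BGR2GRAY)

for rect in detector(gray, 1):
    shape = predictor(gray, rect)                   # 68 (x, y) facial landmarks
    points = [(shape.part(i).x, shape.part(i).y) for i in range(shape.num_parts)]

    # Crude "raised eyebrow" check: compare brow-to-eyelid distance against
    # eyelid-to-nose-tip distance so the measure does not depend on face size.
    brow_y = points[19][1]                          # a mid-eyebrow landmark
    eyelid_y = points[37][1]                        # an upper-eyelid landmark
    nose_tip_y = points[33][1]                      # nose tip landmark
    if (eyelid_y - brow_y) > 0.45 * (nose_tip_y - eyelid_y):   # arbitrary threshold
        print("eyebrows appear raised")
```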
Image Processing
An Active Shape Model is trained on large quantities of images and adjusted to create a 3D mesh that can shift and scale with the object of interest. In the case of facial modeling, an "average face" model is adjusted to overlay a mesh on the user's face, mapping to each point frame by frame. Lenses can then distort features of the chosen object by manipulating the mesh overlay.[4]
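The distortion idea can be illustrated with a toy 2D warp: the sketch below magnifies a circular region (for example an eye) by remapping pixel coordinates radially toward its centre with OpenCV. Real Lenses deform a fitted 3D mesh instead, and the centre and radius here are placeholder values that would normally come from the extracted landmarks.

```python
# Toy "feature distortion" sketch: a radial bulge warp with OpenCV (not Snap's mesh pipeline).
import cv2
import numpy as np

def bulge(image, center, radius, strength=0.6):
    """Magnify a circular region by sampling pixels from closer to its centre."""
    h, w = image.shape[:2]
    ys, xs = np.indices((h, w), dtype=np.float32)
    dx, dy = xs - center[0], ys - center[1]
    r = np.sqrt(dx * dx + dy * dy)

    scale = np.ones_like(r)
    inside = r < radius
    # Inside the radius, shrink the sampling offset -> the region appears enlarged.
    scale[inside] = 1.0 - strength * (1.0 - r[inside] / radius)

    map_x = center[0] + dx * scale
    map_y = center[1] + dy * scale
    return cv2.remap(image, map_x, map_y, interpolation=cv2.INTER_LINEAR)

frame = cv2.imread("selfie.jpg")                          # placeholder input frame
warped = bulge(frame, center=(220, 180), radius=40)       # eye centre/radius assumed from landmarks
cv2.imwrite("warped.jpg", warped)
```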
Development
To capitalize on the growing demand for augmented reality capabilities, Snap introduced Lens Studio in December 2017. Lens Studio is an easy-to-use development platform that allows anyone to create lenses, giving creative license to developers outside the company.[5] As research has advanced and smartphones have become increasingly powerful, Lens Studio has been updated over time with features such as full-body tracking, pet tracking, custom materials, and support for custom machine learning models.[6]
Lens Studio 3.0 introduced SnapML, which allows machine learning experts to add custom ML models when creating lenses. The model acts as a black box that allows developers to extend the capabilities of Lens Studio to create a far wider variety of compelling augmented reality features.[7][8]
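Snap has not published SnapML's internals, but the hand-off of a custom model can be sketched as follows: a tiny image-to-image network is defined in PyTorch and exported to ONNX, a common interchange format for importing models into tools of this kind. The architecture, input size, and file names are illustrative assumptions, not SnapML requirements.

```python
# Sketch of preparing a custom ML model for import into a lens-building tool.
import torch
import torch.nn as nn

class TinyStylizer(nn.Module):
    """Toy image-to-image network standing in for a real effect model."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(8, 3, kernel_size=3, padding=1),
            nn.Sigmoid(),                               # keep output pixels in [0, 1]
        )

    def forward(self, x):
        return self.net(x)

model = TinyStylizer().eval()
dummy = torch.randn(1, 3, 256, 256)                     # assumed fixed input resolution
torch.onnx.export(model, dummy, "tiny_stylizer.onnx",
                  input_names=["image"], output_names=["stylized"])
```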
Similar Technology Alternatives
Monetization
Advertising accounted for 98% of Snap's total $1.7 billion in revenue in 2019.[9] Sponsored lenses represent one of Snap's key advertising revenue streams, with lens usage increasing 37% year-over-year in Q2 2020 as adoption of the AR platform accelerated.[10] Lens AR experiences offer not just an impression but sustained “play time” as users interact with the ad, giving brands a memorable way to connect with consumers at scale through augmented reality. For example, some brands use Face Lenses to transform users into their brand icon or movie characters, while others leverage World Lenses to showcase products and product features.[11]
Advertising partners span industries from automobiles and financial services to retail and telecom.[8] Success stories include a McDonald's recruitment campaign which generated over 42,000 applications from Saudi job hopefuls. This ad format has been used to drive results across business objectives from awareness and engagement to consideration and sales lift.[12] Businesses have the option of either creating their own AR experiences with Lens Studio or partnering with Snap's in-house AR team to create large-scale advertising campaigns.[8]
Controversy
editRacism
In 2016, Snap’s “Yellowface” lens was criticized because it gave users facial features, such as “slanted eyes, large front teeth, and rosy cheeks”, that are associated with Asian stereotypes.[13]
Also in 2016, Snap introduced a "4/20" filter that made users appear as Bob Marley. Some users criticized the filter as racist because it effectively showed them in digital blackface and dreadlocks.[13]
Health Concerns
As use of Snap's augmented reality lenses (and other lenses that allow people to reshape their face or body) increased, Neelam Vashi, MD, director of the Ethnic Skin Center at BMC and Boston University School of Medicine, described a phenomenon she named “Snapchat dysmorphia”, which is associated with Body Dysmorphic Disorder (BDD). Vashi argues that Snap’s editing features are leading patients to "seek out surgery to help them appear like the filtered versions of themselves".[14]
Security Concerns
In April 2016, Snap was sued for negligence by the driver of an Outlander who had been struck from behind by a Mercedes 230. The Mercedes driver was allegedly preoccupied with trying to Snap her driving speed, using a feature of the app that overlays a car’s speed on top of photos or videos. The Outlander driver claimed that Snap knew the feature was being used in illegal speed contests but did nothing to prevent its use.[15]
References
1. BASIC®. "Snapchat — Introducing Augmented Reality to the world with the Launch of Lens Studio | BASIC® | Case Study". BASIC®. Retrieved 2020-10-10.
2. "Query Builder | Acquisitions | Crunchbase". Crunchbase. Retrieved 2020-10-10.
3. "Business Center". businesshelp.snapchat.com. Retrieved 2020-10-11.
4. Le, James (2018-07-22). "Snapchat's Filters: How computer vision recognizes your face". Medium. Retrieved 2020-10-11.
5. "Introducing Lens Studio". www.snap.com. Retrieved 2020-10-10.
6. "Release History - Lens Studio by Snap Inc". lensstudio.snapchat.com. Retrieved 2020-10-10.
7. "SnapML Overview - Lens Studio by Snap Inc". lensstudio.snapchat.com. Retrieved 2020-10-10.
8. "Snapchat Ads for Business | Mobile Advertising". forbusiness.snapchat.com. Retrieved 2020-10-10.
9. Johnston, Matthew. "How Snapchat Makes Money: it's main revenue engine is advertising". Investopedia. Retrieved 2020-10-10.
10. "Snapchat Lens Use Grows 37% Annually". Grit Daily News. 2020-08-01. Retrieved 2020-10-10.
11. "Business Center". businesshelp.snapchat.com. Retrieved 2020-10-10.
12. "McDonald's recruitment campaign on Snapchat receives more than 42,000 "Snaplications" from Saudi Job Hopefuls". forbusiness.snapchat.com. Retrieved 2020-10-10.
13. "Why Snapchat pulled its 'anime' filter". Christian Science Monitor. 2016-08-12. ISSN 0882-7729. Retrieved 2020-10-13.
14. "A new reality for beauty standards: How selfies and filters affect body image". EurekAlert!. Retrieved 2020-10-11.
15. Rogers, Katie (2016-05-03). "Snapchat at 107 M.P.H.? Lawsuit Blames Teenager (and Snapchat)". The New York Times. ISSN 0362-4331. Retrieved 2020-10-11.