By Rodrigo Bonilla and Jeremy Lim
Snapchat launches AR lenses that respond to words
This past May, Snapchat released the first AR lenses that could respond to sound. While these lenses were able to distinguish between pitch and volume, they could not identify specific words. That changed last week, when Snapchat took its audio-based lenses to the next level, giving them the ability to recognize specific words and adapt lens animations accordingly. Though these AR lenses only respond to relatively simple words for now - words such as “hi,” “wow,” “yes,” “no,” and “love” - the underlying technology could well lead to more complex interactions in the future, whereby users engage with the app through facial gestures as well as voice. Snapchat’s continued pioneering in augmented reality could eventually open the door to advertisers and creators on the platform, as well as give the app a distinguishing edge over platforms like Facebook and Instagram, which are also playing in the AR lens space.
For a quick look at these new AR lenses, click here.
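At its simplest, the word-triggered behavior described above amounts to mapping recognized keywords to lens animations. Here is a toy sketch of that idea; the keyword table, animation names, and function are all illustrative assumptions, not Snapchat's actual implementation:

```python
# Hypothetical keyword-to-animation mapping; the words come from the
# article, but the animation names are invented for illustration.
KEYWORD_ANIMATIONS = {
    "hi": "wave",
    "wow": "starburst",
    "yes": "thumbs_up",
    "no": "head_shake",
    "love": "floating_hearts",
}

def animations_for(transcript: str) -> list[str]:
    """Return the animations triggered by keywords in a speech transcript."""
    # Normalize case and strip trailing punctuation before matching.
    words = [w.strip(".,!?") for w in transcript.lower().split()]
    return [KEYWORD_ANIMATIONS[w] for w in words if w in KEYWORD_ANIMATIONS]

print(animations_for("Wow, I love this"))  # → ['starburst', 'floating_hearts']
```

A real lens would feed live speech-recognition output into a lookup like this and render the matched animation in AR.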
Facebook and Instagram roll out time management dashboards
In an effort to foster healthier platform habits, Facebook and Instagram have officially rolled out screen-time management dashboards in their apps. The dashboards can be accessed via the “Settings” tab on both apps, under “Your Time on Facebook” and “Your Activity” (Instagram). Though they currently track usage only on mobile, not desktop, the dashboards monitor daily usage and weekly averages, and let users set daily reminders that notify them once they’ve exceeded a specified usage time. While these alerts give users more self-awareness, neither Facebook nor Instagram has gone so far as to block further access to the apps once a limit is reached. The dashboards are currently available only to users in the U.S. but will roll out globally in the coming weeks.
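The arithmetic behind such a dashboard is simple: per-day usage totals, a weekly average, and a reminder once a user-set daily limit is exceeded. A minimal sketch, with invented numbers and field names (this is not Facebook's or Instagram's API):

```python
# Illustrative per-day usage in minutes, most recent day last.
daily_minutes = [42, 55, 30, 61, 48, 38, 70]
daily_limit = 45  # hypothetical user-chosen reminder threshold, in minutes

# Weekly average across the tracked days.
weekly_average = sum(daily_minutes) / len(daily_minutes)
today = daily_minutes[-1]

if today > daily_limit:
    print(f"Reminder: {today} min today exceeds your {daily_limit} min limit")
print(f"Weekly average: {weekly_average:.1f} min/day")
```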
Creating new reality with Octi
Here’s yet another augmented-video contender that might just make Snapchat’s and Facebook’s filters look a little stale. The iOS video app Octi uses computer vision and machine learning to recognize full body forms and apply a range of effects, from chroma-key filters to gesture-triggered actions and lifelike 3D characters that imitate your movements. Because the AI actually sees and understands the humans in the video, users can do things with their phones’ cameras that were never possible with industry stalwarts like Facebook or Snapchat.
Learn more about the app here.
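The gesture-triggered actions described above boil down to checks over body-pose keypoints. A hypothetical sketch of one such trigger, "both hands raised"; the keypoint format, names, and effect are assumptions for illustration, not Octi's actual API:

```python
# A pose frame as named (x, y) keypoints in normalized image coordinates,
# where y grows downward (so "above" means a smaller y value).
def hands_up(keypoints: dict[str, tuple[float, float]]) -> bool:
    """True when both wrists are above the head."""
    head_y = keypoints["head"][1]
    return (keypoints["left_wrist"][1] < head_y
            and keypoints["right_wrist"][1] < head_y)

# Invented example frame: both wrists above the head.
frame = {"head": (0.5, 0.2), "left_wrist": (0.3, 0.1), "right_wrist": (0.7, 0.15)}
if hands_up(frame):
    print("trigger: confetti effect")  # a real app would render an AR effect here
```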