
Find out if 2020 is going to be the year deepfakes go mainstream

Some social media platforms are already racing to be the first to add deepfake technology.


It’s only the first week of the year, and we have already seen Snapchat acquire computer vision start-up AI Factory, along with leaked images showing that TikTok is working on a deepfake-style addition within its app.

All in all, both moves are mainly aimed at helping users create videos and content within the apps.

TikTok, meanwhile, is working on a more direct deepfake-style feature, which asks users to take a multi-angle biometric scan of their face and then lets them insert their likeness into a selection of videos, as reported by TechCrunch.

According to Social Media Today, Snapchat has reportedly purchased AI Factory to power its Cameo feature, which lets users overlay their face onto pre-designed clips on the photo- and video-sharing platform, further improving its capacity to place your image within video content.

The technology is advancing rapidly, and it’s not hard to imagine it being used for more nefarious purposes, such as showing a politician saying something they never said, or depicting a business leader or celebrity in a compromising position. That means these advances may cause conflicts in the future.

To head off any such controversy, Facebook, Google and Twitter have been researching how to detect and flag deepfakes so they are not misinterpreted.

In Snapchat’s case, the Cameo feature is cartoonish and animated, so it cannot easily be misused; TikTok’s variation, however, is far more realistic, which could be a concern.

We will just have to wait and watch as these features roll out; restrictions will surely follow if they are misused. For many users, however, they are a great source of entertainment and a fun way to create content.

What are your thoughts? Let us know in the comments.
Photo Credits: Shutterstock