AI and machine learning have crept into almost every part of modern technology and products, but one of their most whimsical applications is face-swapping. Dubbed “deepfakes”, these programs can almost convincingly stitch your face onto someone else’s. One such app, named ZAO, became an overnight sensation in China because of how impressively convincing its results were. That excitement, however, quickly gave way to fear and concern once it became evident how the technology could be misused for criminal acts.
Most deepfakes simply “cut and paste” your face onto another, but ZAO sets itself apart with its results. You upload just a single picture of your face, choose from a few video clips from movies or of actors, and then magically find yourself as the star of the show. Thanks to AI, of course.
After its raging success, however, opinion on ZAO quickly turned once issues were raised about its privacy and security implications. It turned out that ZAO had a 6,000-character user agreement that gave the developers rights to any image created in the app, for free, without requiring the user’s permission or even knowledge. The user agreement has since been updated to address that, but that only gave way to another fear.
Developer Momo now essentially has a database of users’ faces, faces that can then be used, with the same deepfake AI, to create misleading if not outright illegal content. In fact, people need only use someone else’s photo to create the videos; there is no need to hack ZAO’s database. And in a country like China, where face recognition is a popular security system for apps and services, that has become a major source of fear.
Apple may not yet see the dangers of the app, as it remains the top iOS download in China, but the Chinese government itself could put a stop to it if it deems it too controversial. On the other hand, the government could also benefit from ZAO’s database of faces, uploaded by users only too happy to trade theirs for a few seconds of imagined fame.