Major Camera Companies Are Working To Fight Deepfakes: Here's How

Before the ubiquity of artificial intelligence (AI), humans imagined a world in which machines would take over the most mundane and repetitive tasks. In this ideal world, chores like cleaning and organizing would be a thing of the past, freeing people to do less manual labor and more creative work. Unfortunately, this is not what is happening.

These days, AI is being used to fulfill roles that many of us perceive to be distinctly human, especially in the arts. In 2023, content behemoths like Netflix announced plans to create increasingly more content with AI. Knowing this, it is unsurprising that AI was a core concern for performers: the harvesting of their "digital likeness" to train AI models was one of the key issues raised during the SAG-AFTRA strike that same year.

With generative AI making several aspects of movie magic and photography obsolete, many photographers and videographers are struggling to protect their work and their livelihood. Thankfully, some of the largest camera companies are on their side. Here's how they're planning to help turn things around.

How photographers can take back their power

Before the age of generative AI, photographers and videographers used watermarks to tag their work. When AI-generated content became more common, many groups called for transparency and sought to mandate watermarks that would show these images were not created by humans.

Even so, watermarks have never fully kept bad actors at bay, even before AI, largely because accessible technology can easily remove them. In September 2023, a study claimed that watermarking AI content still has several issues, including the risk that forged watermarks can lead to misattribution. Still, all hope is not lost.

In December 2023, Nikon, Sony, and Canon announced their bid to keep photographers and videographers safe from misattribution and deepfakes of their work. Rather than relying on watermarks alone, these camera manufacturers have proposed digital signatures as a new global standard for media professionals. That said, ordinary photographers, videographers, and other hobbyists also stand to benefit from these efforts.

Nikkei Asia reports that these tamper-resistant signatures will include metadata such as the date, time, GPS location, and the photographer's details. Nikkei Asia also reports plans to launch Verify, a web-based verification tool that can check the digital signature of these images going forward.
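The general shape of such a scheme is straightforward to sketch. The example below is purely illustrative and not based on any published Nikon, Sony, or Canon specification: it fingerprints the image bytes together with their metadata at "capture" time, standing in for the signature a camera would compute with a hardware-protected private key, and later checks that neither has been altered. All names and data are hypothetical.

```python
import hashlib
import json

def sign_capture(image_bytes: bytes, metadata: dict) -> str:
    # A real camera would sign with an asymmetric private key;
    # a plain SHA-256 hash stands in for that signature here.
    payload = json.dumps(metadata, sort_keys=True).encode() + image_bytes
    return hashlib.sha256(payload).hexdigest()

def verify_capture(image_bytes: bytes, metadata: dict, signature: str) -> bool:
    # Recompute and compare: any change to the pixels or the
    # metadata produces a different digest, so the check fails.
    return sign_capture(image_bytes, metadata) == signature

# Hypothetical capture
photo = b"fake-image-bytes"
meta = {"date": "2023-12-05", "gps": "35.68,139.69", "author": "Jane Doe"}
sig = sign_capture(photo, meta)

print(verify_capture(photo, meta, sig))                             # untouched: True
print(verify_capture(photo, {**meta, "author": "Imposter"}, sig))   # tampered: False
```

In a real deployment the verification step is what a tool like Verify would perform server-side, using the camera maker's public key rather than recomputing a shared hash.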

Which cameras will get deepfake-fighting features

According to Nikkei, these three Japanese brands hold a whopping 84.3% global market share in the camera industry, spanning everything from compact cameras to higher-end DSLRs. After a 15.2% decrease in sales in the last year, it's no wonder they're leading the charge against AI-generated images and working to protect their customers.

Sony camera users can expect a firmware update for existing mirrorless cameras. Nikon and Canon users, on the other hand, will have to wait for future models to get the digital signature feature. That said, Canon plans to offer its users some additional features, such as built-in authentication with video watermarking, in 2024.

While the digital signature can help professional users and media publications determine the authenticity of a photo, having to check a verification website manually is still a significant hurdle for the average user. Even so, it is a welcome halfway solution.

As deepfakes and other AI-generated content become more common and harder to distinguish from human-made content, here's hoping that more effective technology will be developed. In the meantime, it's up to consumers to be mindful of the content they consume, follow publications with stringent due diligence practices, and make an effort to verify news stories regularly.