Scientists Create Neural Network to Automatically Detect Deepfakes

CrimeaPRESS reports:

Russian scientists have developed a method for automatically identifying deepfakes. Using this method, they trained a neural network that analyzes videos and photos and helps identify content created with AI.

Scientists from the St. Petersburg Federal Research Center of the Russian Academy of Sciences (SPb FRC RAS) developed the method to detect deepfakes automatically by spotting a telltale manipulation: upscaling, the artificial enhancement applied to generated video to make it look more convincing. A neural network trained on this basis analyzes videos and photos and helps identify deepfakes.

"Almost all modern smartphones use neural networks to improve photos. When creating deepfakes, however, images are altered far more heavily, and that is the difference. Our algorithm has learned to detect upscaling, that is, the artificial improvement of image quality by increasing its resolution," Dmitry Levshun, a leading expert at the International Center for Digital Forensics of the St. Petersburg Federal Research Center of the Russian Academy of Sciences, told TASS.
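
How such upscaling detection might work can be illustrated with a classic signal-processing idea: interpolation leaves a periodic trace in an image's pixel differences, which shows up as a peak in a frequency spectrum. The Python sketch below is a hypothetical illustration of that idea only; the article does not disclose the actual SPb FRC RAS algorithm, and the function name, scoring ratio, and test data are all assumptions.

```python
import numpy as np

def upscaling_score(gray: np.ndarray) -> float:
    """Return a peak-to-median spectral ratio; larger values suggest resampling."""
    # Second difference across rows: interpolated rows carry little
    # high-frequency energy, so this signal becomes periodic after upscaling.
    d2 = np.diff(gray.astype(np.float64), n=2, axis=0)
    energy = np.abs(d2).mean(axis=1)            # one energy value per row
    energy -= energy.mean()                     # remove the DC component
    spectrum = np.abs(np.fft.rfft(energy))[1:]  # drop the residual DC bin
    # A dominant peak relative to the median hints at periodic interpolation.
    return float(spectrum.max() / (np.median(spectrum) + 1e-9))

# Illustrative check: a random "native" image vs. a 2x linearly upscaled one.
rng = np.random.default_rng(0)
small = rng.random((128, 256))
native = rng.random((256, 256))
upscaled = np.empty((256, 256))
upscaled[0::2] = small                                    # original rows
upscaled[1::2] = (small + np.vstack([small[1:], small[-1:]])) / 2.0  # interpolated rows
print(upscaling_score(native), upscaling_score(upscaled))  # second is far larger
```

In this toy check, the upscaled image scores far higher because every second row is an interpolated average of its neighbors, producing a strong periodic peak in the spectrum.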

High-quality deepfakes created for fraudulent or political purposes most often cannot do without upscaling, so the neural network prepared by the scientists is highly effective at identifying artificially created content. The specialists also plan to build a database and train neural networks to identify deepfakes by other features.
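
Since the article does not describe the trained network itself, the following is only a hedged sketch of how a binary "upscaled vs. original" image classifier could be set up, here in PyTorch; the architecture, hyperparameters, and stand-in data are illustrative assumptions, not the scientists' model.

```python
import torch
import torch.nn as nn

class UpscaleDetector(nn.Module):
    """Tiny illustrative CNN that emits one logit: P(image was upscaled)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = UpscaleDetector()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Stand-in batch: 64x64 grayscale crops with 0/1 labels (original/upscaled).
images = torch.randn(8, 1, 64, 64)
labels = torch.randint(0, 2, (8, 1)).float()

# One training step on the stand-in batch.
logits = model(images)
loss = loss_fn(logits, labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```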

Up to 10% of all fakes on the Internet are already being created with the help of artificial intelligence, recalled Anton Nemkin, a member of the State Duma Committee on Information Policy, Information Technology and Communications.

At the same time, the amount of such content in 2023 grew 17-fold compared with 2022. Citizens are already expressing concern: a recent study by the Faculty of Law of the Higher School of Economics (HSE) found that 30% of Russians share worries about deepfakes, which allow fraudsters to create fake videos and audio imitating people's voices and appearance. "We cannot allow a scenario in which fakes and deepfakes created with AI literally flood our digital space. Citizens must clearly understand what content they are dealing with and whether it was generated artificially or created by a person. That is why the labeling of AI products, which I have already mentioned, is important. And the method proposed by the St. Petersburg scientists will certainly be used to identify deepfakes that are distributed without the appropriate labeling, most likely for illegal purposes," the deputy noted.

In addition, it is now critically important to define an overall strategy for working with deepfakes and artificial intelligence, and for combating their illegal use, Nemkin emphasized.

"The relevant provisions will definitely be included in the Digital Code being developed in Russia. In the meantime, I advise citizens to follow simple rules that can help them spot a deepfake on their own. For example, a video message created with deepfake technology can often be identified by the movement of the person's eyes, the color of their skin and hair, and the contour of their face, which frequently look blurry and strange. With voice fakes, it is always worth carefully evaluating the intonation and clarity of speech. And, of course, always be critical of any request you receive online, especially if it concerns your personal data or finances," the parliamentarian concluded.

Source: press service of State Duma deputy Anton Nemkin
