Facebook uses AI to remove facial data from videos to prevent personal privacy leaks
Face recognition technology based on biometric data is being applied in many fields, but no verification system is absolutely secure. When you record a video of yourself and publish it online, there is no guarantee that a criminal won't extract facial data from it and impersonate you to pass verification. To address this, Facebook's artificial intelligence lab has released a new research paper describing an algorithm that protects privacy by removing facial data from video. Video processed by the algorithm cannot be recognized by other face recognition algorithms, making it essentially impossible for criminals to extract usable facial data from it.
Facebook's researchers released videos processed by the algorithm. To the naked eye, the processed footage is nearly indistinguishable from the original. However, the algorithm adjusts the footage in ways that prevent face recognition systems from accurately extracting facial data. In short, the researchers say, machine learning modifies key facial features in the video in real time so that facial recognition systems misidentify the face.
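The paper itself is not reproduced here, but the general idea of making a face unreadable to a recognizer while keeping it visually unchanged can be illustrated with an adversarial-perturbation sketch. The snippet below is a minimal, hypothetical example, not Facebook's actual method: it uses a toy stand-in embedding network (`ToyRecognizer`) and a per-frame optimization loop (`deidentify_frame`) that pushes the frame's face embedding away from the original while keeping the pixel change small and bounded.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-in face-embedding network. In practice this would be a pretrained
# recognizer (e.g. a FaceNet/ArcFace-style model); this toy CNN is only a
# placeholder so the sketch runs end to end.
class ToyRecognizer(nn.Module):
    def __init__(self, emb_dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, emb_dim)

    def forward(self, x):
        return F.normalize(self.fc(self.features(x).flatten(1)), dim=1)


def deidentify_frame(frame, recognizer, steps=50, eps=0.03, lr=0.01):
    """Learn a small perturbation that pushes the frame's face embedding
    away from the original while keeping the pixel change imperceptible."""
    original_emb = recognizer(frame).detach()
    delta = torch.zeros_like(frame, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        perturbed = (frame + delta).clamp(0, 1)
        emb = recognizer(perturbed)
        # Maximize embedding distance (defeat recognition) while penalizing
        # visible change (keep the video looking the same to humans).
        loss = -F.mse_loss(emb, original_emb) + 10.0 * delta.abs().mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # hard bound on per-pixel change
    return (frame + delta).detach().clamp(0, 1)


if __name__ == "__main__":
    recognizer = ToyRecognizer().eval()
    frame = torch.rand(1, 3, 112, 112)  # dummy video frame
    protected = deidentify_frame(frame, recognizer)
    with torch.no_grad():
        shift = F.mse_loss(recognizer(protected), recognizer(frame))
    print(f"embedding shift after de-identification: {shift.item():.4f}")
```

Applied per frame (or via a learned generator, as real-time systems would require), this kind of objective illustrates the trade-off the researchers describe: the image stays visually the same, but its machine-readable facial signature is deliberately shifted.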
The Facebook Artificial Intelligence Lab emphasizes only that the technology is designed to improve privacy, and the company says it has no plans to commercialize it. However, media speculate that Facebook may deploy it on its own platform: when a user publishes a video containing a person's face, the platform would automatically process it with the algorithm to prevent misuse. The point is not to stop others from downloading the user's video, but to corrupt the facial data in it so that criminals cannot extract the key features needed for facial recognition. For example, more and more corporate access control systems use face recognition; if criminals extract an employee's facial data, they could impersonate that employee to get through the door.
Via: The Verge