Intel doubles its investment in data sharing and privacy protection technology
At the online Intel Research Institute Open Day event in early December, two of the company's researchers said that homomorphic encryption and federated learning allow companies and researchers to jointly analyze data and build machine learning models without exposing the underlying data, effectively avoiding the risk of data leakage.
With federated learning, collaborating teams start from a shared, general machine learning model, each train a copy on their own internal data, and then securely collect and combine these scattered models into a more accurate iteration that incorporates every party's data. Homomorphic encryption is more versatile: a product of a specialized branch of cryptography, it focuses on computing over data while it remains encrypted, for example searching encrypted records or training machine learning algorithms on them. In this way it keeps the information usable while effectively protecting privacy.
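The federated workflow described above can be sketched in a few lines. In this toy example, which is not Intel code, three parties each fit the same one-parameter model (y ≈ w·x) on their own private samples, and a central server combines only the resulting weights, weighted by sample count, without ever seeing the raw data. The party data is invented for illustration.

```python
# Minimal federated-averaging sketch: parties share model weights,
# never their raw samples.
parties = [
    [(1.0, 2.0), (2.0, 4.0)],               # party A's private samples
    [(3.0, 6.0), (4.0, 8.0), (5.0, 10.0)],  # party B's
    [(6.0, 12.0)],                          # party C's
]

def local_fit(samples):
    # Least-squares slope through the origin: w = sum(xy) / sum(x^2).
    return sum(x * y for x, y in samples) / sum(x * x for x, _ in samples)

# Each party sends only (local_weight, sample_count) to the server.
updates = [(local_fit(s), len(s)) for s in parties]

# Server: sample-count-weighted average of the local models.
total = sum(n for _, n in updates)
global_w = sum(w * n for w, n in updates) / total
print(global_w)  # every party's data follows y = 2x, so this is 2.0
```

Real systems (and the secure aggregation the researchers describe) add encryption around this exchange so the server cannot inspect even the individual weight updates.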
Intel has doubled down on both technologies, backing them in hardware with its Software Guard Extensions (SGX). Jason Martin, chief engineer of Intel's security intelligence team, said this can reduce the cost of applying homomorphic encryption and federated learning.
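To make the "computing on encrypted data" idea concrete, here is a toy implementation of the Paillier cryptosystem, a classic additively homomorphic scheme (not the specific scheme Intel uses). Multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a server can add values it cannot read. The primes here are tiny and the code is for illustration only, never for real security.

```python
import math, random

# Toy Paillier setup with deliberately tiny primes.
p, q = 17, 19
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)   # lambda = lcm(p-1, q-1)
g = n + 1                      # standard simple generator choice
mu = pow(lam, -1, n)           # mu = lambda^-1 mod n (valid for g = n+1)

def encrypt(m):
    # c = g^m * r^n mod n^2, with random r coprime to n.
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) / n; m = L(c^lambda mod n^2) * mu mod n.
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

c1, c2 = encrypt(20), encrypt(35)
# The party holding only ciphertexts multiplies them...
c_sum = (c1 * c2) % n2
# ...and the key holder decrypts the sum of the plaintexts.
print(decrypt(c_sum))  # prints 55
```

Fully homomorphic schemes of the kind discussed at the event go further, supporting both addition and multiplication on ciphertexts, which is what makes tasks like encrypted machine learning training possible.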
“Unprocessed data is data that’s not useful,” he said. “The primary tools that we have for making sense of the increasing volumes of data are machine learning and statistics technologies, but companies are worried about sharing that data because of security and privacy concerns.”