From banking to telecommunications, our daily lives revolve around data, and that reliance raises privacy concerns. A new study by researchers at the École polytechnique fédérale de Lausanne (EPFL), published in Nature Computational Science, argues that many of the promises made about privacy-protection mechanisms will never be kept, and that we must accept these inherent limitations.
Data-driven innovations such as personalized medicine, better public services and more efficient, less polluting industrial production promise huge benefits to citizens and the planet, and broad access to data is essential to fueling this future. However, aggressive methods of collecting and analyzing information raise red flags about societal values and fundamental rights.
This has made expanding access to data, while protecting the privacy of the most sensitive information, one of the biggest challenges in the field. According to the researchers responsible for the new study, believing that any use of data is possible while fully respecting the right to privacy is like believing in a fairy tale.
According to the study’s co-author, Associate Professor Carmela Troncoso, there are two traditional approaches to privacy. “One path is to use encryption: the data is processed in the encrypted domain, and only the result is revealed. But to achieve this, you have to design purpose-built algorithms rather than perform general-purpose computations.”
The study’s authors write that the problem with this method is that it forces a choice: it cannot both share high-quality individual-level data in a way that protects privacy and allow analysts to query the entire database flexibly.
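As a toy illustration of what “obtaining a result without exposing the underlying data” can look like, here is a minimal sketch using additive secret sharing, a standard building block of secure computation. This is an assumption for illustration only: the study does not specify this particular scheme, and the names (`share`, `reconstruct`, the hospital scenario) are invented.

```python
import secrets

P = 2**61 - 1  # large prime modulus; individual shares are uniform mod P

def share(value, n=3):
    """Split `value` into n additive shares that sum to value mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recombine shares; only the sum of all shares reveals the value."""
    return sum(shares) % P

# Each "hospital" splits its patient count; no single share reveals it.
counts = [120, 85, 240]
all_shares = [share(c) for c in counts]

# Each server sums the shares it holds; combining only these totals
# yields the aggregate without anyone seeing an individual count.
server_totals = [sum(col) % P for col in zip(*all_shares)]
assert reconstruct(server_totals) == sum(counts)  # 445
```

Note the trade-off the authors point to: this computes one pre-agreed statistic (a sum), not arbitrary analyses over the raw records.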
The second solution generally offered is to anonymize the data by removing names, locations, postal codes and other identifiers, but Ms Troncoso says the problem often lies in the data itself.
“There is a famous example involving Netflix, where the company decided to publish a database and run a public competition to produce better recommendation algorithms. The company removed customers’ names, but when researchers compared the movie ratings with those on other platforms where people rate films under their own names, they were able to re-identify some individuals.”
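The re-identification Ms Troncoso describes is essentially a linkage attack: matching records across datasets by their overlapping attributes. A minimal sketch, with invented names, ratings and a hypothetical `reidentify` helper (none of this is taken from the actual Netflix study):

```python
# "Anonymized" ratings: names removed, but the rating patterns remain.
anonymized = {
    "user_17": {("Heat", 5), ("Alien", 4), ("Brazil", 2)},
    "user_42": {("Heat", 3), ("Up", 5)},
}

# Public profiles from another platform where people rate under their name.
public = {
    "alice": {("Heat", 5), ("Alien", 4), ("Brazil", 2), ("Up", 1)},
    "bob":   {("Heat", 3), ("Jaws", 4)},
}

def reidentify(anon, pub, threshold=3):
    """Match each anonymous user to the public profile sharing the
    most (movie, rating) pairs, if the overlap is large enough."""
    matches = {}
    for anon_id, ratings in anon.items():
        best, overlap = max(
            ((name, len(ratings & profile)) for name, profile in pub.items()),
            key=lambda pair: pair[1],
        )
        if overlap >= threshold:
            matches[anon_id] = best
    return matches

print(reidentify(anonymized, public))  # {'user_17': 'alice'}
```

Removing the name column did nothing here: the ratings themselves act as a fingerprint once a second dataset is available to link against.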
New method, same problem
Recently, synthetic data has emerged as a new way to anonymize data, but the study notes that, contrary to the promises of its proponents, it is subject to the same limitations as anonymized data. “In our work, we point out that researchers and practitioners must accept the inherent trade-off between high flexibility in data use and strong privacy guarantees,” said study co-author Theresa Stadler.
“This could mean that the range of data-powered applications will be reduced, and data owners will have to make explicit decisions about the most appropriate data-sharing approach based on their needs,” Ms Stadler added.
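The trade-off Ms Stadler describes can be seen in a toy example: synthetic data drawn independently from each column’s marginal distribution reveals little about any individual record, but it also destroys the cross-column structure that analysts need. The dataset and numbers below are invented purely for illustration.

```python
import random

random.seed(0)

# Toy "real" dataset: income is strongly correlated with age.
real = [(age, 1000 * age + random.randint(-5000, 5000)) for age in range(20, 70)]

def corr(pairs):
    """Pearson correlation between the two columns of `pairs`."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs) / n
    vx = sum((x - mx) ** 2 for x, _ in pairs) / n
    vy = sum((y - my) ** 2 for _, y in pairs) / n
    return cov / (vx * vy) ** 0.5

# "Synthetic" data: each column sampled independently from its marginal.
ages = [a for a, _ in real]
incomes = [i for _, i in real]
synthetic = [(random.choice(ages), random.choice(incomes)) for _ in range(2000)]

print(round(corr(real), 2))       # strongly correlated (close to 1)
print(round(corr(synthetic), 2))  # the correlation is gone (close to 0)
```

A generator that instead preserved the joint age–income structure would be more useful to analysts, but would then carry more information about the real individuals: that is the trade-off the study says cannot be wished away.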
Another important message from the study is the need for slower, better-controlled commercialization of technology. Ultra-fast launches are the norm these days, with the idea of “fixing it later” if something goes wrong, an approach Ms Troncoso considers particularly dangerous.
“We have to start accepting that there are limits. Do we really want to continue this data-fueled rush, where there is no privacy, with major implications for the democratic process? It feels like Groundhog Day: we’ve been talking about this for 20 years, and the same thing is now happening with machine learning. We run the algorithms, they turn out to be biased, and the hope is that they’ll be fixed later. But what if they can’t be?”
However, limited functionality and improved privacy are not a business model for the tech giants, and Ms Troncoso urges the public to think harder about how to tackle this problem.
“A lot of what Google and Apple do is whitewash their harmful practices while closing off the market. Apple, for example, doesn’t allow apps to collect data, but collects that data itself in a supposedly ‘privacy-protecting’ way and then sells it. What we are saying is that there is no privacy-protecting way to do that,” the researcher added.