

Understanding Privacy-Enhancing Technologies (PETs): Non-Cryptographic Solutions - Part 2



In the first part of this series, we delved into some cryptographic privacy-enhancing technologies (PETs), along with the blockchains and companies that utilize them. In this second part, we'll be taking a look at privacy-enhancing technologies that are non-cryptographic, i.e., that don't rely on encryption.



Communication Anonymizers


Communication anonymizers are tools that keep online activities untraceable by hiding a user's identifiable information or replacing it with one-time identifiers. The underlying concept behind anonymizers is the anonymous proxy server.


Proxy servers are intermediary servers that sit between users and the web, acting as a privacy shield over the information users consume from the internet and the data they share on it. Think of a proxy server as a "bodyguard" that gatekeeps the information passing between users and the websites they browse. Because the proxy hides the user's IP address, websites are unable to tie browsing histories back to the user.


How proxy servers work: when a user requests a website, the request first goes to the proxy server, which forwards it to the site. The website responds to the proxy server, which then relays the content back to the user. This prevents the website from learning the IP address of the real user behind the request. There are different types of proxy servers, each with its own function; anonymous proxy servers are our interest here because they work like communication anonymizers.
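The relay described above can be sketched in a few lines. This is a toy simulation, not a real networking stack: the function names, IP addresses, and request dictionaries are all hypothetical, chosen only to show that the website ends up seeing the proxy's address rather than the client's.

```python
def website_handler(request):
    # The site logs whatever source address reaches it.
    return {"body": "page content", "seen_ip": request["source_ip"]}

def anonymous_proxy(request, proxy_ip="203.0.113.7"):
    # The proxy rewrites the source address before forwarding,
    # then relays the site's response back to the client unchanged.
    forwarded = {**request, "source_ip": proxy_ip}
    return website_handler(forwarded)

client_request = {"url": "https://example.com", "source_ip": "198.51.100.42"}
response = anonymous_proxy(client_request)

print(response["seen_ip"])  # the proxy's IP, not the client's
```

The client still receives the page body, but from the website's point of view the request originated at the proxy.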


Anonymizers conceal users' identities and data while they access the internet. There are various kinds of anonymizers, some of which are familiar to us: a VPN, for instance, is a type of anonymizer because it establishes a secure connection over the internet from the user's device. Anonymizers are useful for preventing data theft and unwanted access to browsing histories.



Federated Learning


Federated learning is a machine-learning technique that trains a model across several decentralized datasets. Because the data stays distributed across many devices or servers rather than being collected on a single server, data minimization is achieved: no single server holds all the data. For businesses, data minimization is very useful because the less data a company holds in one place, the less harm a breach can do.


Rather than the data being fed into a central model, copies of the model are trained and run locally on each device, and only the resulting model updates are shared for aggregation. IoT applications are one area where federated learning is notably employed.
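The local-train-then-aggregate loop can be illustrated with a deliberately tiny sketch. Here the "model" is just a sample mean, and the function names are illustrative, not from any real federated-learning library; the point is that only model parameters, never raw records, leave each device.

```python
def local_train(local_data):
    # Each client trains on its own data; the "model" here is the sample mean.
    return sum(local_data) / len(local_data)

def federated_average(client_models):
    # The server aggregates parameters without ever seeing the raw data.
    return sum(client_models) / len(client_models)

# Raw data stays on three separate devices.
device_data = [[1.0, 2.0, 3.0], [4.0, 5.0], [6.0]]
client_models = [local_train(d) for d in device_data]  # [2.0, 4.5, 6.0]
global_model = federated_average(client_models)
print(global_model)
```

Real systems (e.g. FedAvg-style training) aggregate neural-network weight updates instead of means, but the data-minimization property is the same: the server only ever sees the three local parameters.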




Trusted Execution Environments (TEEs) or Secure Enclave


TEEs are non-cryptographic privacy-enhancing technologies: isolated physical regions within a computer's main processor, separated from the other parts of the central processing unit, where data and code are stored to prevent tampering by unauthorized parties. A TEE is also referred to as a secure enclave. The term "enclave" comes from a French word meaning "to enclose," which in turn is based on a Latin root meaning "key." From this, we can think of TEEs, or secure enclaves, as secured black boxes locked with a key.


By design, TEEs allow encrypted data to be stored inside them such that no one, not even the owners of the servers, can access the users' data within. Encrypted data is decrypted and computed on only inside the TEE. TEEs thus protect the confidentiality and integrity of the code and data they hold: unauthorized parties are unable to modify or replace the data, preserving its privacy.
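The enclave boundary can be mimicked in software to make the idea concrete. This is only a conceptual sketch, not real TEE hardware, and the XOR "sealing" below is a stand-in for proper encryption: the key lives only inside the object, plaintext never crosses the boundary, and callers get back only computed results.

```python
from itertools import cycle

class SecureEnclave:
    """Toy model of an enclave: key and plaintext never leave it."""

    def __init__(self):
        self._key = b"secret-enclave-key"  # held only inside the "enclave"

    def _xor(self, data: bytes) -> bytes:
        # Stand-in for real encryption; XOR with the key is involutive.
        return bytes(b ^ k for b, k in zip(data, cycle(self._key)))

    def seal(self, plaintext: bytes) -> bytes:
        # Produce ciphertext safe to store outside the enclave.
        return self._xor(plaintext)

    def compute_length(self, sealed: bytes) -> int:
        # Decryption and computation happen inside; only the result leaves.
        return len(self._xor(sealed))

enclave = SecureEnclave()
sealed = enclave.seal(b"sensitive record")
print(sealed != b"sensitive record")   # stored form is unreadable
print(enclave.compute_length(sealed))  # 16
```

A real TEE enforces this boundary in processor hardware rather than with a class, which is why even the machine's operator cannot read what is inside.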


TEEs are advantageous in that their computation-time overhead is lower than that of cryptographic techniques. However, because TEEs are hardware-dependent, they pose certain risks: the hardware itself can be exploited. With regard to use in blockchains, permissioned blockchains are better suited to employing TEEs than public blockchains. To avoid making this lengthy, you can learn more about why this is so in this Medium article.


An example of a blockchain that uses TEEs is Oasis Network.



Synthetic Data


Synthetic data generation is the process of taking an original dataset or data source and using it to create new, non-identifiable artificial data with comparable statistical characteristics. Maintaining those statistical qualities means ensuring that an analyst working with the synthetic dataset reaches the same statistical conclusions as they would if they were working with the real (original) data.

Such artificial data is often produced by machine-learning algorithms.
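A minimal sketch of the idea, using only the Python standard library: fit a Gaussian to an original (hypothetical) numeric column, then sample a synthetic column with comparable mean and spread that contains none of the real records. Production systems use far richer generative models, but the principle of preserving statistics without exposing records is the same.

```python
import random
import statistics

random.seed(0)  # reproducibility for this illustration

# Hypothetical original column (e.g. patient weights in kg).
original = [52.1, 48.3, 50.7, 49.9, 51.4, 47.8, 50.2, 49.5]
mu = statistics.mean(original)
sigma = statistics.stdev(original)

# None of these values come from the original dataset, yet analyses on
# them should reach similar statistical conclusions.
synthetic = [random.gauss(mu, sigma) for _ in range(1000)]

print(round(statistics.mean(synthetic), 1))
```

An analyst given only `synthetic` can estimate the population mean and spread without ever seeing an individual's real value.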



Use Cases of Privacy Enhancing Technologies (PETs)


By offering transparency, choice, and auditability within data systems, PETs have the potential to enable increased security and confidentiality of data across use cases and industries. Here are some highlighted use cases.



Financial institutions and DeFi protocols: Finance is one of the sectors where PETs are applied. Privacy-enhancing technologies are very useful for safeguarding the financial histories and balances of users who wish to remain private, whether in centralized (traditional) finance or decentralized finance.


Data analysis: Data analysts cannot work effectively without data, so they are routinely exposed to large, sensitive datasets. This poses a risk: even when the analysts themselves are trustworthy, their computers can be deliberately hacked to steal users' data. To mitigate this, certain privacy-enhancing technologies let analysts compute over data without being exposed to the sensitive contents of the datasets.


AI/Machine learning: AI and machine learning can themselves reduce the risk of privacy breaches: personal data can be encrypted, human error reduced, and potential cybersecurity events detected. Conversely, PETs can be employed when training AI and machine-learning models, yielding privacy-preserving machine-learning techniques that help protect sensitive data.


Healthcare: The healthcare industry holds some of the largest stores of individuals' data and personal information (medical records included), making it a prime target for data breaches. Employing PETs is beneficial not just for hospitals and healthcare centers but also for patients.


In addition to PETs, other classes of technologies are designed to protect users’ privacy. These technologies are said to be complementary to PETs and they are Transparency Enhancing Technologies (TETs) and Intervenability Enhancing Technologies (IETs). Let’s take a brief look at what they offer.



Transparency Enhancing Technologies (TETs)


While PETs are designed to minimize the amount of sensitive data accessible to third parties, TETs are tools and techniques that give users more insight into how organizations and online service providers use their data and private information, along with control over how they share it.

It is believed that when organizations must account to their users for how they use their data, they will be kept in check to safeguard that data.



Intervenability Enhancing Technologies (IETs)


As the name suggests, IETs are tools and techniques that let users intervene in data processing wherever their own data is concerned: users can delete data, give or withdraw consent, choose which data to share, and so on, giving them better control over their data.



Challenges of Privacy Enhancing Technologies (PETs)


  1. Complexity: The most effective PET solutions are complex to deploy and manage.

  2. Cost: Because some PET approaches involve heavy computation, the hardware components they require are usually not cheap to acquire.

  3. Lack of expertise: Some companies lack the internal capacity to integrate PETs into their systems, and outsourcing to third-party services defeats the whole idea of keeping users' data private.

  4. Compliance issues: Many PET solutions do not comply with the data-privacy laws set in place by regulatory bodies.


Privacy-enhancing technologies come in various forms: some solutions focus on hiding the data, some alter the original data, and others focus on encrypting it. In all of these, one goal is common: to ensure that users' data is protected both within and outside the blockchain.


