Glossary

What are privacy-enhancing technologies?

Published on January 16, 2024

Find out about the most important types of privacy-enhancing technologies and how they help maintain the confidentiality and integrity of data.


Privacy-enhancing technologies (PETs) are tools, techniques, and practices designed to protect individuals' privacy. They achieve this by safeguarding personal data during storage, processing, and transmission.

PETs include methods like encryption, anonymization, access controls, and solutions such as differential privacy, synthetic data generation, and confidential computing. They help organizations and individuals maintain control over their data and mitigate privacy risks in an increasingly data-centric world.

[Image: aerial view of a neighborhood. PETs ensure that individual privacy is safeguarded.]

Privacy-enhancing technologies: Definition

Privacy-enhancing technologies are tools and methodologies designed to protect sensitive data and maintain the confidentiality and integrity of information. These technologies act as a safeguard, ensuring that personal information remains private, even in the face of data collaboration and analytics. Here are the most important types of PETs:


Synthetic data

Synthetic data generation allows organizations to create artificial data that closely mimics real-world data while still preserving privacy.

Organizations can safeguard sensitive information by generating synthetic datasets. These datasets closely resemble the real data, matching not only its shape but also its statistical properties. While they contain no private details, they preserve the correlations and patterns found in the real data. This enables companies to conduct analyses and develop machine learning models without the risk of data exposure.

For example, a well-designed synthetic medical dataset would still show the correlation between age and heart disease present in the original data, preserving the statistical characteristics crucial for accurate analysis.
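As a rough illustration only (not a production-grade generator), the Python sketch below builds a toy age and heart-disease dataset, estimates simple per-age-bin statistics, and samples fresh artificial records. No real individual appears in the synthetic table, yet the age/disease relationship it was built from remains visible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "real" dataset: age and a heart-disease flag whose probability rises with age
ages = rng.integers(30, 80, size=1000)
disease = rng.random(1000) < (ages - 30) / 100          # older -> more likely

# A very simple generator: estimate the age distribution and the
# age-conditional disease rate, then sample brand-new artificial records
age_mean, age_std = ages.mean(), ages.std()
bin_edges = np.arange(30, 81, 10)
rate_per_bin = [disease[(ages >= lo) & (ages < lo + 10)].mean() for lo in bin_edges[:-1]]

synthetic_ages = np.clip(rng.normal(age_mean, age_std, 1000), 30, 79).astype(int)
bin_idx = np.clip((synthetic_ages - 30) // 10, 0, len(rate_per_bin) - 1)
synthetic_disease = rng.random(1000) < np.array(rate_per_bin)[bin_idx]

# The synthetic records contain no real person, but the age/disease
# correlation is preserved
print(np.corrcoef(ages, disease)[0, 1], np.corrcoef(synthetic_ages, synthetic_disease)[0, 1])
```

Production synthetic data tools use far more sophisticated generative models, but the principle is the same: learn the statistical structure, then sample new records from it.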


Differential privacy

Differential privacy is a mathematical method used in data analysis. It works by introducing randomness or noise into query responses, making it harder to pinpoint individual data points.

While there are numerous techniques for adding noise, they don’t all meet the criteria to qualify as differential privacy. Rather, differential privacy is the science of determining the precise amount of noise to incorporate in data queries to attain specific statistical privacy assurances.

Differential privacy is typically applied to aggregate queries: data is summarized and generalized so that meaningful insights can be derived while individual records stay protected.

Because only aggregate, noise-protected information leaves the dataset, organizations can gain useful insights without revealing specific details about any individual. Differential privacy serves as a robust barrier against re-identification attacks, making it a valuable tool in data analytics.
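As a minimal sketch of the textbook Laplace mechanism (the dataset, query, and epsilon value below are illustrative), noise scaled to the query's sensitivity is added to a count before it is released:

```python
import numpy as np

def dp_count(records, predicate, epsilon=1.0):
    """Counting query protected by the Laplace mechanism.

    A count has sensitivity 1 (adding or removing one person changes it
    by at most 1), so Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy for this single query.
    """
    true_count = sum(1 for r in records if predicate(r))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

ages = [23, 35, 41, 29, 67, 54, 38, 72, 45, 31]
print(dp_count(ages, lambda a: a > 40, epsilon=0.5))  # noisy count of people over 40
```

Smaller epsilon values add more noise and give stronger privacy guarantees, at the cost of accuracy.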


Confidential computing

Confidential computing enables data processing within secure enclaves. This innovative approach prevents unauthorized access to data during computation, offering a new level of security in data processing and analysis.

Confidential computing keeps sensitive data safe even while it is in use, relying on two key security mechanisms: isolation and remote attestation. Isolation shields data and code from the rest of the system during processing, while remote attestation lets the data owner verify that protection, and what the data will be used for, before computation even begins.
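To make the attestation step concrete, here is a minimal, illustrative sketch of the verifier-side checks. The report fields and expected measurement are hypothetical; in real deployments the attestation report is produced and signed by the hardware (for example Intel SGX or AMD SEV-SNP) and verified against the vendor's certificate chain, rather than built in application code.

```python
import hashlib
import hmac

# Hypothetical expected measurement: a hash of the audited enclave binary
EXPECTED_MEASUREMENT = hashlib.sha256(b"approved-enclave-build-v1.2").hexdigest()

def verify_attestation(report: dict) -> bool:
    """Sketch of the checks a data owner runs before releasing data to an enclave."""
    # 1. The code measurement must match the reviewed enclave build
    if not hmac.compare_digest(report["measurement"], EXPECTED_MEASUREMENT):
        return False
    # 2. The enclave must not be in debug mode and its platform must be up to date
    return not report["debug_enabled"] and report["tcb_up_to_date"]

report = {
    "measurement": EXPECTED_MEASUREMENT,
    "debug_enabled": False,
    "tcb_up_to_date": True,
}
if verify_attestation(report):
    print("Attestation passed: sensitive data may be sent to the enclave")
```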


Homomorphic encryption

Homomorphic encryption enables computations on encrypted data without decrypting it first. This ensures data privacy while still allowing meaningful operations to be carried out on the encrypted information.
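As a toy illustration only, the snippet below uses insecure textbook RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts produces a ciphertext of the product of the plaintexts. Production homomorphic encryption relies on dedicated schemes and libraries (such as Paillier, BFV, or CKKS), not raw RSA.

```python
# Toy, insecure textbook RSA used only to show the homomorphic property.
p, q = 61, 53
n = p * q                 # public modulus
phi = (p - 1) * (q - 1)
e = 17                    # public exponent
d = pow(e, -1, phi)       # private exponent (Python 3.8+ modular inverse)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

a, b = 12, 7
product_ciphertext = (encrypt(a) * encrypt(b)) % n   # computed without decrypting
assert decrypt(product_ciphertext) == (a * b) % n    # equals the product of the plaintexts
```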


Secure multiparty computation

Secure multiparty computation relies on cryptographic protocols using encryption and mathematical techniques to enable multiple parties to jointly compute a function over their individual inputs while keeping those inputs private. It ensures no party learns anything beyond the computation output, even if some of the parties are malicious.
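One simple building block is additive secret sharing, sketched below: three parties learn only the total of their private inputs (salaries are used purely as example values), never each other's individual numbers. Real MPC protocols layer much more machinery on top of this idea, such as secure multiplication and protection against malicious participants.

```python
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a public prime

def share(value, n_parties):
    """Split a value into additive shares that sum to the value mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

salaries = [50_000, 62_000, 58_000]          # each party's private input
all_shares = [share(s, 3) for s in salaries]

# Each party sums the shares it received (one per input) and publishes the result;
# the partial sums reveal only the total, not any individual salary.
partial_sums = [sum(column) % PRIME for column in zip(*all_shares)]
total = sum(partial_sums) % PRIME
assert total == sum(salaries)
```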


Federated learning

Federated learning is a decentralized machine learning approach. Here, a model is trained across multiple decentralized devices or servers holding local data samples, without exchanging them. Instead of sending raw data to a central server, only model updates (gradients) are communicated, preserving data privacy.
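The sketch below shows the core loop of federated averaging on a toy linear model with randomly generated client data: each client computes an update on its own data, and the server aggregates only the resulting model weights, never the raw records.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1):
    """One gradient-descent step on a client's local data (linear model, squared loss)."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]
global_weights = np.zeros(3)                 # the shared model held by the server

for _ in range(10):                          # communication rounds
    # Clients train locally and send back only their updated weights
    client_weights = [local_update(global_weights, X, y) for X, y in clients]
    # Federated averaging: the server combines model updates, not data
    global_weights = np.mean(client_weights, axis=0)

print(global_weights)
```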


Trusted execution environments

Trusted execution environments are secure hardware or software environments within a computer system. They provide a secure and isolated area for executing sensitive code or operations. They protect code and data within them from external tampering, even from the operating system or other software layers.

Enclaves and trusted execution environments are a key part of confidential computing, and the terms are broadly interchangeable. They typically imply that the environment is hardware-based; a few software-based “enclaves” exist as rare exceptions, but they provide less robust security.


Why are privacy-enhancing technologies essential?

The erosion of privacy in the digital age has raised significant concerns for individuals and organizations alike. Here are several compelling reasons why PETs should be at the forefront of every organization's data strategy:


Data breaches and privacy violations

We often hear about data breaches in the news. These incidents can expose sensitive information, such as credit card details and personal records. PETs can help prevent such breaches and protect your data.


Regulatory compliance

As data privacy regulations like GDPR and CCPA become increasingly stringent, organizations must implement robust data protection measures. Privacy-enhancing technologies provide a practical way to ensure compliance with these regulations, helping avoid hefty fines and legal consequences.


Cross-organization collaboration

PETs facilitate secure data exchanges among organizations, ensuring confidentiality in collaborative projects and research. This enables them to make data-driven decisions and to close deals that depend on stringent privacy measures more quickly.



PET use cases

PET use cases are commonly found where organizations must safeguard personal data while still enabling valuable data-driven insights. Some examples are:


Healthcare: Healthcare providers, researchers, and institutions use PETs to collaborate on and analyze patient data while preserving patient privacy.


Financial services: PETs help protect financial data during transactions, fraud detection, and risk assessment while adhering to regulatory requirements.


Digital advertising: PETs enable personalized advertising without exposing individuals' personal information, allowing ad targeting without privacy infringement.


Market research: Companies can collaborate with anonymized data, preserving individual privacy while gaining insights into market trends and consumer behavior.


Cybersecurity: PETs help protect sensitive security data, detect threats, and analyze network traffic without exposing vulnerabilities.


Compliance and reporting: Organizations use PETs to meet data privacy regulations like GDPR and CCPA while maintaining operational efficiency.



How PETs enable data collaboration

Across these use cases, secure data collaboration emerges as a key application of PETs: sharing insights while keeping the underlying data private. Several types of platforms employ privacy-enhancing technologies to enable data partnerships while maintaining the confidentiality of the data:


Data clean rooms

These are secure environments where organizations can safely collaborate on or share data while it stays protected. Different providers employ different combinations of PETs to give their data clean rooms privacy-preserving capabilities.


Walled garden solutions

Closed ad platform ecosystems ("walled gardens") have their own versions of data clean rooms. A main motivation for using walled garden solutions is to perform measurement without also opting in to targeting.

However, because they control the access, rules, and data within their platform, these clean rooms pose a significant privacy challenge. Historically, they also have not integrated PETs.

Therefore, it's essential to acknowledge that there is no absolute assurance of data separation within walled garden data clean rooms. Instead, this separation rests on an agreement in which the technology company commits that data within the environment serves a single purpose and won't be intermingled with other data streams. The only enforcement of that agreement is the company's ongoing commitment, not any technological guarantee.


Google Privacy Sandbox

The Google Privacy Sandbox seeks to balance user data protection with advertisers' need for insights to serve relevant ads. To achieve this balance, the Privacy Sandbox employs privacy-enhancing technologies, curbing invasive tracking like third-party cookies in favor of privacy-friendly alternatives.

The Privacy Sandbox primarily focuses on data within the web browser environment. As a result, it may not provide the same level of data usability, collaboration, and historical data retention as data clean rooms, which are designed specifically for advanced data collaboration and analytics while preserving user privacy.


The most privacy-preserving way to collaborate on data

Privacy-enhancing technologies are indispensable in today's data-driven landscape. They empower organizations to protect sensitive information, comply with regulations, build trust, and gain a competitive edge. When integrated into solutions like data clean rooms, PETs provide a secure environment for collaborative data analysis.

Decentriq's use of PETs in data clean rooms highlights their potential in securing the future of data-driven collaboration. By using techniques such as differential privacy, synthetic data, and confidential computing, our clean rooms ensure data encryption at rest, in transit, and in memory. This approach provides verifiable proof of data privacy throughout the entire data collaboration process.
