News Analysis

4 key takeaways from the UK’s privacy-enhancing technology guidance

September 8, 2022
4 min read

This week, the UK Information Commissioner's Office (ICO) published much-awaited guidance on privacy-enhancing technologies (PETs). This is an important signal from one of the world's leading regulators that PETs are the building blocks of a future that puts privacy, security, and trust first.

PETs are a suite of “technologies that embody fundamental data protection principles by minimizing personal data use, maximizing data security, and empowering individuals.” Developed in response to persistent insecurity, dark privacy patterns, and increasingly stringent (and necessary) data protection regulations, these technologies include federated learning, homomorphic encryption, and secure multiparty computation, among others. They are being employed to tackle some of humanity’s greatest challenges, in domains ranging from precision medicine and drug discovery to open banking and Industry 4.0.

Organizations building this future should take note of four key takeaways from the ICO guidance: 

PETs can unlock new opportunities for innovation

Observers posit that only 1% of the world’s data is used to its full extent. In our experience, this critical data is often underutilized due to some combination of technological, regulatory, and trust barriers. For example, two hospitals may want to train a machine learning model on their joint patient data to predict a rare disease but be barred from sharing the data directly due to sensitivity, security, and regulatory considerations. The new ICO guidance highlights that “you can use PETs to give access to datasets which would otherwise be too sensitive to share, while ensuring individuals’ data is protected.” Organizations should review their innovation opportunity space through a PETs lens to unlock new opportunities for collaboration and innovation.

A regulatory green light for PET adoption

Despite the magnitude of the opportunity, a core barrier to the adoption of PETs has been trepidation within industry about how regulators view them. PETs have not been specifically contemplated by law or precedent, and the parameters for their usage, their adequacy, and their impact on an organization's compliance posture have been largely uncertain. The new ICO guidance provides important license for organizations to adopt PETs confidently. The ICO positions PETs as a key way for organizations to demonstrate ‘data protection by design and by default’, a core requirement of much data protection legislation.

Combining PETs can unlock multiple objectives

The real power of PETs lies in combining them in smart ways to achieve the level of privacy, security, and trust that your use case requires. The ICO states that “whether a specific PET, or combination of PETs, is appropriate for your processing depends on your particular circumstances.” Indeed, different PETs provide different levels and types of protection, and entail different performance and technical limitations. Federated learning provides strong input privacy by ensuring that raw data never moves and that custodians always retain control. Adding differential privacy protects the resultant local and global models from privacy leakage. Other options to protect data flows include homomorphic encryption, which enables computation on data that is never decrypted. Your unique objectives, threat model, data sensitivity, stakeholders, and technical requirements should drive this decision. What is clear from the new ICO guidance is that regulators recognize the power of PETs in combination.
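To make the federated learning plus differential privacy combination concrete, here is a minimal, illustrative sketch (not any particular library's API, and not a production-grade privacy accounting scheme): two sites each compute a local model update, clip it, and add Gaussian noise before a server averages the updates. The gradients, clipping norm, and noise scale are all hypothetical values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def private_update(local_gradient, clip_norm=1.0, noise_std=0.1):
    """Clip the update's L2 norm, then add Gaussian noise (toy DP step)."""
    norm = np.linalg.norm(local_gradient)
    clipped = local_gradient * min(1.0, clip_norm / norm)
    return clipped + rng.normal(0.0, noise_std, size=clipped.shape)

# Each site's gradient stands in for training on its own local data;
# raw records never leave the site.
site_a = private_update(np.array([0.9, -0.4, 0.2]))
site_b = private_update(np.array([0.7, -0.5, 0.1]))

# The server only ever sees clipped, noised updates.
global_update = (site_a + site_b) / 2
print(global_update)
```

The two techniques address different threats: federated learning keeps raw data in place (input privacy), while the clipping and noise bound what any single record can reveal through the shared updates (output privacy).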

Implementing PETs should be done carefully

The ICO is correct in stating that “PETs should not be regarded as a silver bullet to meet all of your data protection requirements.” Processing personal data with a legal basis, transparently, and with integrity will always be important. Moreover, PETs vary in maturity and can be complicated, both in how they work and in how to implement them in a production context. One practical example we’re working on now is how to choose differential privacy parameters in an interpretable and optimized way.
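The parameter-setting challenge can be seen in even the simplest differentially private release. The toy sketch below (not our production approach) uses the standard Laplace mechanism for a counting query, where the noise scale is sensitivity divided by epsilon: a smaller epsilon means stronger privacy but a noisier answer, and choosing that trade-off interpretably is exactly the hard part. The count and epsilon values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def laplace_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with epsilon-DP via the Laplace mechanism.

    For a counting query, adding or removing one record changes the
    result by at most 1, so sensitivity = 1 and noise scale = 1/epsilon.
    """
    scale = sensitivity / epsilon
    return true_count + rng.laplace(0.0, scale)

true_count = 120  # e.g., records matching some cohort definition
for eps in (0.1, 1.0, 10.0):
    print(f"epsilon={eps}: noised count = {laplace_count(true_count, eps):.1f}")
```

Running this shows the tension directly: at epsilon = 0.1 the released count can be off by tens, while at epsilon = 10 it is nearly exact but offers far weaker protection.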

We’ve got your back on that last takeaway. Our team has years of expertise in developing privacy-enhancing technologies to obtain insights from sensitive datasets. Connect with us to learn more about opportunities to partner on building a future that puts privacy, security, and trust first.
