
Privacy by Design: A New Era of Proactive Security

Episode: 78

In a world increasingly dominated by data, the role of privacy has evolved from a regulatory afterthought to a fundamental pillar of system design. To unravel the complexities of this shift, we speak with Apoorvaa Deshpande, a senior privacy engineer at Google Cloud. Her journey, from a PhD in cryptography to privacy engineering roles at Snap Inc. and now Google Cloud, offers a unique perspective on the field. Her work spans privacy design, data governance for Gen AI products, and the development of innovative Privacy Enhancing Technologies (PETs).

Privacy engineering

You can read the complete transcript of the episode here >

Apoorvaa’s entry into privacy engineering was serendipitous. While finishing her PhD in computer science and cryptography at Brown University, she attended a “Real World Crypto” conference to explore how her academic expertise could be applied in industry. There she heard a talk by a future manager at Snapchat, which introduced her to the fascinating world of privacy engineering and end-to-end encryption at scale. This discovery proved to be a perfect fit, blending her passion for cryptography with the practical challenges of safeguarding user privacy.

Her day-to-day work is multifaceted, falling into four main buckets: privacy reviews and analysis, problem-solving, execution and design, and communication. Apoorvaa begins by deeply understanding products and features, identifying potential privacy vulnerabilities and attack surfaces. The next step, and what she finds most interesting, is devising solutions using techniques like cryptography, which often leads to new patents and papers.

From there, she focuses on execution, designing and architecting solutions in collaboration with various teams. The final, and crucial, component is communication, as privacy engineering is a highly cross-functional activity requiring clear articulation of ideas and constant alignment across teams, including product, engineering, and legal. Apoorvaa sees the privacy engineer as a “technical advocate for an organization for their end users’ privacy,” defining the problem and solution spaces by bridging the gap between product requirements, legal perspectives, and user needs.

Core Concepts: Privacy by Design and Privacy by Default

The conversation delves into two core concepts: “Privacy by Design” and “Privacy by Default”. Apoorvaa clarifies that Privacy by Design is a framework for embedding privacy into the entire product development lifecycle, ensuring it is a proactive consideration, not an afterthought. This framework is about being responsible stewards of user data, treating it with the same respect as a friend’s secret, using it only for its stated purpose, and being transparent with users.

Privacy by Design also encourages a “win-win” or “positive-sum” approach. Apoorvaa firmly believes her role is not to simply say “no” to feature teams but to collaborate and find solutions that work for everyone, including the user. She views privacy as a product feature that can build user trust and enhance a brand, citing Apple’s marketing strategy as a prime example.

Privacy by Default, on the other hand, is a specific implementation of the Privacy by Design framework. It refers to settings that prioritize privacy from the outset, such as a device’s location services being off by default, messages being end-to-end encrypted, or data being processed only on the device. These concepts help privacy engineers engage in early discussions with product teams, “shifting privacy left” to avoid conflicts and costly reworks that arise when privacy issues are discovered late in the development cycle.

The Role of PETs: Utility without Compromise

Privacy-enhancing technologies, or PETs, are a critical part of the privacy engineer’s toolkit. Apoorvaa defines PETs as technical solutions that “provide utility or value from data while preserving privacy”. She categorizes them into two main areas:

  1. Execution Privacy: This involves maintaining privacy while operating on sensitive data. Examples include multi-party computation (MPC), which allows multiple parties to compute a function on their data without revealing the individual inputs, only the output. This is being explored in fields like machine learning, where one entity has a model and another has sensitive data, enabling computation without sharing raw information. Other technologies in this category are confidential computing and end-to-end encryption.
  2. Output Privacy: This focuses on ensuring that released insights and statistics remain private. Differential privacy is a key technology here, as it guarantees that any output, even after post-processing, maintains its privacy properties. Another fascinating PET is zero-knowledge proofs, which enable a party to prove they know a secret without revealing the secret itself, such as proving they solved a Sudoku puzzle without showing the solution.
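The output-privacy idea can be made concrete with a small sketch: a counting query has sensitivity 1 (one person changes the count by at most 1), so adding Laplace noise with scale 1/ε yields an ε-differentially-private count. This is a minimal illustration in plain Python; the function names are my own, not from any production DP library:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling of a zero-mean Laplace distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1, so Laplace noise with
    scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# How many people in the dataset are 40 or older? (true answer: 3)
ages = [23, 35, 41, 29, 52, 61, 38]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=1.0)
```

Real deployments rely on hardened libraries that also handle privacy budgeting across queries and subtle floating-point attacks; this sketch only shows the core mechanism.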

Apoorvaa provides compelling examples of PETs in action, including the use of differential privacy for US census reports and federated learning for smartphone auto-complete features. She also highlights a personal case study: her PhD research on zero-knowledge proofs was implemented by WhatsApp to allow users to verify that they are truly communicating with the person they intend to, which is particularly crucial for sensitive conversations.
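The federated-learning example above can be sketched in a few lines: each phone trains on its own data and uploads only model parameters, and the server averages them weighted by local dataset size (the FedAvg idea). This toy sketch uses made-up numbers and is not any production API:

```python
def federated_average(client_weights, client_sizes):
    """FedAvg: the server combines locally trained weights without
    ever seeing the clients' raw data, only their model updates.

    client_weights: list of per-client parameter vectors
    client_sizes:   number of local examples each client trained on
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Three phones each fit a tiny two-parameter local model;
# the server only ever sees these parameter vectors.
global_model = federated_average(
    client_weights=[[0.2, 1.0], [0.4, 0.8], [0.3, 0.9]],
    client_sizes=[100, 300, 100],
)
```

In practice, federated learning is often combined with secure aggregation and differential privacy so that even individual model updates are protected.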

Striking a Balance: User Experience vs. Privacy

The conversation also touches on the friction that privacy measures can sometimes create for users. Apoorvaa references Apple’s App Tracking Transparency (ATT) framework, which disrupted the advertising industry by requiring apps to ask permission before tracking users across apps, making no tracking the default. While a positive step for user privacy, she points out that the suddenness of the change negatively impacted small businesses and could even incentivize more invasive tracking methods.

The key, she argues, is to find a balance where user experience is not compromised. The goal is not to eliminate ads, but to make them less intrusive. Advertisers don’t need to know that a specific individual, like Purusottam, bought a product after seeing an ad. Instead, they care about aggregate trends, which can be provided using PETs like private set intersection without revealing individual identities. Apoorvaa advocates for greater user transparency and involvement, allowing them to express their interests so that they receive relevant ads, creating a “win-win” scenario.
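The aggregate-measurement idea can be illustrated with a deliberately simplified private-set-intersection-style sketch: both parties hash their identifiers under a shared salt and compare only the hashes, releasing just the intersection size. This toy is not a secure PSI protocol (real constructions use OPRFs or Diffie-Hellman blinding so neither side can brute-force the other's set); it only shows the shape of the computation:

```python
import hashlib

def blind(items, salt: bytes):
    """Hash identifiers under a shared salt so raw values are never
    exchanged. Toy only: a real PSI protocol prevents either party
    from brute-forcing the other's set."""
    return {hashlib.sha256(salt + i.encode()).hexdigest() for i in items}

def conversion_count(ad_viewers, purchasers, salt: bytes) -> int:
    # Only the SIZE of the intersection is released -- the
    # advertiser never learns which individuals converted.
    return len(blind(ad_viewers, salt) & blind(purchasers, salt))

# Advertiser's set vs. merchant's set: one overlapping user.
count = conversion_count({"alice", "bob", "carol"}, {"bob", "dave"}, b"shared-salt")
```

The design point is that the released quantity is an aggregate (a count), matching the advertiser's actual need for trend data rather than individual identities.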

However, this increased choice can lead to “decision fatigue”. She notes that complex cookie pop-ups are often designed to confuse users, leading them to simply “accept all”. The solution, she suggests, is to design privacy features in a more user-friendly way, perhaps through a more conversational interface or even by leveraging AI agents that can manage user preferences.

Generative AI and the Future of Privacy

The discussion concludes by exploring the intersection of privacy and generative AI. The biggest privacy vulnerability in this space is “memorization,” where large language models (LLMs) can memorize private information from their training data verbatim and later output it. Another risk is “membership inference attacks,” where an attacker can determine whether a specific data sample was used to train a model.

To mitigate these risks, organizations must have a strong control on their data lineage, ensuring proper sanitization and processing before training. They should also invest in “privacy test suites” to assess models before deployment.
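One common ingredient of such a privacy test suite is a canary check: plant unique secret strings in the training data, then measure how often the trained model reproduces them verbatim. A minimal sketch follows; the function and the stand-in model are hypothetical, for illustration only:

```python
def canary_leak_rate(generate, canaries, prompts) -> float:
    """Fraction of planted canary strings a model reproduces verbatim.

    `generate` is any callable prompt -> completion; the canaries are
    unique secrets inserted into the training set before training.
    """
    leaked = sum(
        1 for canary, prompt in zip(canaries, prompts)
        if canary in generate(prompt)
    )
    return leaked / len(canaries)

# A fake "model" that memorized one of its two canaries.
def toy_model(prompt: str) -> str:
    return "my pin is 8861" if "pin" in prompt else "I don't know."

rate = canary_leak_rate(
    toy_model,
    canaries=["my pin is 8861", "ssn 123-99-0000"],
    prompts=["what is my pin?", "what is my ssn?"],
)
```

A nonzero leak rate before deployment is a signal to revisit data sanitization or to train with differential privacy.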

Looking ahead, Apoorvaa is excited about the future of PETs, seeing the field as a ripe area for innovation. She highlights promising research in areas like fully homomorphic encryption, which allows computation on encrypted data, and “machine unlearning” which aims to make a model forget specific data without requiring a full retraining. These proactive solutions will be crucial for building a more privacy-conscious digital future.
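The idea of computing on encrypted data can be demonstrated with textbook RSA, which is multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product. This toy uses tiny, insecure parameters and supports only multiplication, whereas fully homomorphic schemes support arbitrary computation:

```python
# Textbook RSA with tiny, insecure parameters (illustration only).
p, q = 61, 53
n = p * q            # modulus: 3233
e, d = 17, 2753      # e * d == 1 (mod lcm(p-1, q-1))

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

# Homomorphic property: E(a) * E(b) mod n decrypts to a * b,
# so a third party can multiply values it cannot read.
a, b = 7, 6
product_ciphertext = (encrypt(a) * encrypt(b)) % n
assert decrypt(product_ciphertext) == a * b
```

Fully homomorphic encryption extends this to both addition and multiplication, and hence to arbitrary circuits, at a far higher computational cost.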

The Human Side of Security

In a world where security and privacy professionals often face burnout, Apoorvaa shares her personal approach to managing stress. She finds energy in working on projects she is passionate about, and she believes in balancing work with a multi-dimensional life, embracing diverse interests like music and parenting. This concept of being a “polypath”—someone who pursues and excels at multiple paths—is key to a holistic and fulfilling life, and she encourages others to explore it.

Ultimately, Apoorvaa’s insights reinforce the idea that privacy engineering is a collaborative, proactive, and human-centric discipline. It’s about more than just technology; it’s about fostering a culture where every stakeholder has a voice, and the collective goal is to build a more secure, trustworthy, and user-friendly digital world.
