Designing Systems That Know Less About Us

Privacy is often treated as a feature or a promise. Add encryption, write a policy, or comply with a regulation, and privacy is assumed to follow. Our work this past year reinforced a longstanding core principle: privacy holds only when it is enforced by system design. When privacy depends on process, policy, or good intentions, it erodes under scale, convenience, or changing incentives.

Over more than a decade of work applying the principle of least authority, we have seen the same architectural insight hold consistently. Systems tend to be more resilient when individual components are designed to know only what they need to function, and no more. This is not because such systems are inherently harder to attack, but because they reduce the amount of information that can be correlated or misused in the first place. In practice, durable privacy emerges from architectural choices that deliberately limit correlation, separate responsibilities, and minimize the need for trust where it is not strictly required.

One example of this approach is Double Privacy Pass with Commitments (DPP+C), a privacy-preserving system architecture and protocol we are developing for a client. The architecture is designed around a simple constraint: a person’s real identity, payment information, and activity while using the service must remain unlinkable. Instead of centralizing sensitive data for convenience, the system deliberately partitions responsibility across separate components, each of which knows only what it needs to function. This partitioning reduces the risk of correlation by design and limits the impact of compromise, misuse, or data exposure.
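The partitioning idea can be illustrated with a deliberately simplified sketch. This is not the DPP+C protocol itself (which is not described here); the component names, the one-time receipt, and the HMAC check are all hypothetical stand-ins. The point is only the shape of the design: the payment component sees identity but never activity, the content component sees activity but never identity, and an anonymous token is the sole bridge between them. A real Privacy Pass-style deployment would use blind signatures or a VOPRF so that even the token issuer cannot link issuance to redemption, which this naive version does not provide.

```python
import hashlib
import hmac
import secrets

class PaymentService:
    """Sees identity and payment, never activity."""
    def __init__(self):
        self.records = {}  # identity -> payment status only

    def pay(self, identity, card_number):
        self.records[identity] = "paid"
        # One-time receipt: random, carries no identity inside it.
        return secrets.token_hex(16)

class TokenIssuer:
    """Exchanges receipts for access tokens; sees neither identity nor activity."""
    def __init__(self, key):
        self.key = key
        self.spent = set()

    def issue(self, receipt):
        if receipt in self.spent:
            raise ValueError("receipt already used")
        self.spent.add(receipt)
        token = secrets.token_hex(16)
        mac = hmac.new(self.key, token.encode(), hashlib.sha256).hexdigest()
        return token, mac

class ContentService:
    """Sees activity and valid tokens, never identity or payment."""
    def __init__(self, key):
        self.key = key
        self.activity_log = []

    def serve(self, token, mac, resource):
        expected = hmac.new(self.key, token.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(mac, expected):
            raise PermissionError("invalid token")
        self.activity_log.append((token, resource))
        return f"contents of {resource}"
```

Even in this toy version, no single component's data set joins identity to activity: compromising the content service yields tokens and resources but no names, and compromising the payment service yields names but no browsing history.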

The same architectural pattern showed up repeatedly in our work this year, across systems that at first glance appear unrelated. Whether the problem involved storage, payments, authorization, or publication, the core question was the same: what is the minimum each component of the system needs to know in order to function? When systems were designed to accumulate context “just in case,” privacy weakened and risk increased. When responsibilities were narrowly scoped and knowledge was minimized, privacy became easier to maintain, and failures became easier to contain.
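The difference between “just in case” accumulation and narrow scoping often shows up at the level of interfaces. The following hypothetical sketch (the types, fields, and pricing are invented for illustration) contrasts a handler that receives a full user profile with one that declares exactly the two fields it needs; a leak or compromise in the narrow component cannot expose anything it never received.

```python
from dataclasses import dataclass

# Broad interface: the component receives everything "just in case",
# so its logs, caches, and crash dumps can all leak unrelated data.
@dataclass
class FullProfile:
    name: str
    email: str
    card_number: str
    postal_code: str
    birthday: str

# Narrow interface: the shipping estimator declares the minimum it
# needs to function, and nothing else ever reaches this component.
@dataclass(frozen=True)
class ShippingQuery:
    postal_code: str
    weight_kg: float

def estimate_shipping(q: ShippingQuery) -> float:
    # Illustrative pricing only: a flat base rate by region plus weight.
    base = 4.0 if q.postal_code.startswith("9") else 6.0
    return base + 1.5 * q.weight_kg
```

Callers project the broad record down to the narrow query at the boundary, so the knowledge reduction is enforced by the type system rather than by convention.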

Taken together, these experiences reinforced a lesson that has long guided our work: privacy does not emerge from isolated techniques or well-intentioned controls. It emerges from systems that are designed to know less, correlate less, and retain less by default. Across different use cases, the most resilient designs were those that treated minimization, separation, and verifiability as foundational constraints rather than optional enhancements. As systems grow more complex and interconnected, these architectural choices become even more important, not just for protecting individual users, but for preserving trust.

As designs like DPP+C see broader real-world use, the next set of lessons will come from operating privacy-preserving systems in everyday conditions. Experience at scale tends to clarify where architectural choices support usability as intended and where further refinement is needed. These deployments offer an opportunity to validate that systems designed around minimization and separation can remain reliable, adaptable, and practical as they grow.

There is also reason for cautious optimism around perception. Privacy is often framed as a tradeoff, and perceived convenience frequently wins out when people choose services. Fortunately, real-world systems increasingly challenge that assumption. When privacy is enforced by design rather than added as a feature, it can fade into the background and simply become part of how a service works. As more systems demonstrate that strong privacy does not require sacrificing convenience or functionality, the conversation may shift from whether privacy is possible to why it was ever treated as optional in the first place. Looking toward 2026 and beyond, continued deployment and iteration may help make privacy-by-design feel less exceptional and more expected.
