The Threat Modeling Podcast
Chris Romeo is going on a journey. A journey to understand threat modeling at the deepest levels. He thought he understood threat modeling but realized he could go deeper. Chris shares his findings and talks with some of the best-known experts in the space to experience continuous learning. Join along for the ride -- you will learn something.
Chris Romeo is the CEO of Devici (THE Threat Modeling Company) and a General Partner at Kerr Ventures.
Nandita Rao Narla -- Privacy Threat Modeling
Nandita Rao Narla introduces the basics of privacy in software. She discusses privacy threats, privacy threat modeling, and privacy by design. If you write or handle software that touches user information, you need to understand privacy, know how to assess and mitigate privacy concerns, and know when to build privacy into a design. This episode of the Threat Modeling Podcast is the perfect primer to raise awareness of the critical role privacy concerns should play in your next project.
Helpful Links:
Daniel J. Solove's "A Taxonomy of Privacy": https://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2074&context=faculty_publications
Welcome to Smart Threat Modeling. Devici makes threat modeling simple, actionable, and scalable. Identify and deal with threats faster than ever. Build three free models and collaborate with up to ten people in our Free Forever plan. Get started at devici.com and threat model for free! Smart threat modeling for development teams.
Chris Romeo: Privacy is important. And it's an individual right. Edward Snowden says, “Arguing that you don't care about the right to privacy, because you have nothing to hide, is no different than saying you don't care about free speech, because you have nothing to say.” As a human being, you should have ownership of the data you generate and the data that describes your being.
Let's journey on the privacy road, exploring stops for privacy, privacy by design, and privacy threat modeling.
Nandita Narla: I'm Nandita Rao Narla. I am the Head of Technical Privacy and Governance at DoorDash, where my team supports building tooling and features for privacy, as well as owning privacy assurance and privacy operations functions.
Chris Romeo: Before we move too far, we need to learn an expert's definition of privacy.
Nandita Narla: Privacy is respecting individuals' preferences and handling data the way individuals intended it to be used. So if you think about privacy and security, the two overlap, and that intersection is where a privacy breach lives.
Chris Romeo: Don't sleep on this statement that you just heard. The intersection of privacy and security is a privacy breach. Loss of personally identifiable information is a top-level threat in anything that you build.
Now that we have a working definition of privacy, let's introduce privacy threat modeling. We must first define a privacy threat before we can land at privacy threat modeling.
Nandita Narla: A privacy threat is a potential source of privacy harm, and a privacy harm is a negative privacy consequence. For example, chilling effects: the feeling of being watched all the time limits free speech and leaves you unable to express your thoughts. Another example is manipulation, where you have a loss of autonomy because deceptive designs, dark patterns, force you to give more information than you intend to. There are discrimination harms due to biased AI or improper personal data in the training set.
These are risks or negative outcomes that could happen if a privacy threat is exploited.
A more in-depth discussion of privacy harms is in Dan Solove's taxonomy of privacy harms. I think that's a really good starting point if someone wants to identify what privacy harms are.
Once you have that understanding, for privacy threat modeling I like to use the Threat Modeling Manifesto definition: it is assessing a representation of a system, and by that I mean architecture diagrams, data flow diagrams, or some narrative about what the system is, to find the privacy concerns or privacy harms and to address them proactively.
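To make that definition concrete, here is a minimal sketch of how a team might record privacy threats found against elements of a data flow diagram. The schema, field names, and example threat are illustrative assumptions, not a standard from the episode or from any framework.

```python
from dataclasses import dataclass

# Illustrative sketch only: one way to record privacy threats found while
# reviewing a representation of a system (e.g., a data flow diagram).

@dataclass
class PrivacyThreat:
    element: str          # the DFD element under review (data flow, data store, process)
    harm: str             # the negative privacy consequence, e.g., a chilling effect
    description: str      # how the harm could arise in this system
    mitigation: str = ""  # proposed control; empty until the team decides

threats = [
    PrivacyThreat(
        element="analytics data flow",
        harm="surveillance / chilling effect",
        description="Raw clickstream tied to user IDs is retained indefinitely.",
        mitigation="Aggregate events and drop user identifiers after 30 days.",
    ),
]

for t in threats:
    print(f"[{t.element}] {t.harm}: {t.description} -> {t.mitigation or 'unmitigated'}")
```

The point is simply that each finding names the harm to the individual, not just a technical weakness, which keeps the output aligned with the definition above.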
Chris Romeo: Privacy harms are a new concept for me. Privacy harms focus us on the outcomes of the breach, or loss of privacy, and make it about the entity that experiences that loss.
Security and privacy threat modeling have been maintained as separate things. This leads me to wonder if they should be approached in the same way, simultaneously, with the same resources.
Nandita Narla: I have seen it done together, and doing it together brought a lot of process efficiencies. That happens because, at least for privacy threat modeling, you need a large cross-functional team to participate in the threat modeling exercise, which includes privacy engineers and architects. There is an overlap between the people involved in security threat modeling and those involved in privacy, so there is a massive efficiency gain in having it together.
The second component is cross-pollination: when security and privacy threat modeling are done together, everybody gets smarter and learns about the other's processes, and this feedback loop of education and awareness is a side benefit of doing it together.
The other aspect is that when it's done together, the privacy component is usually a much more lightweight approach. I've seen companies use a STRIPED model, which is basically STRIDE plus privacy. And by privacy, they're not including all seven of the LINDDUN threat categories; it's a much more lightweight approach, typically focusing on privacy non-compliance, because most organizations are compliance focused when it comes to privacy. They're not risk focused. So, it's almost like the bare minimum of privacy threat modeling you can do, but it's still better than not doing it at all.
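For reference, the category names below are the standard STRIDE and LINDDUN mnemonics; how a team folds the single "P" into a STRIPED checklist varies by organization, so that last step is an assumption shown for illustration.

```python
# The STRIDE and LINDDUN category names are standard; how a team builds its
# STRIPED checklist from them varies, so the STRIPED line is an assumption.

STRIDE = [
    "Spoofing",
    "Tampering",
    "Repudiation",
    "Information Disclosure",
    "Denial of Service",
    "Elevation of Privilege",
]

LINDDUN = [
    "Linkability",
    "Identifiability",
    "Non-repudiation",
    "Detectability",
    "Disclosure of Information",
    "Unawareness",
    "Non-compliance",
]

# A lightweight STRIPED pass: all of STRIDE plus one privacy category,
# rather than the full seven LINDDUN categories.
STRIPED = STRIDE + ["Privacy (non-compliance)"]

for category in STRIPED:
    print(f"Consider: {category}")
```

A team that outgrows the single privacy bucket can swap the full LINDDUN list into the "P" step instead.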
Chris Romeo: The more I think about this, the more I wonder if privacy could get lost in a conjoined threat modeling session.
Nandita Narla: It's usually a very lively discussion, and it's all done together; you're not trying to follow the acronym step by step. In most cases, the privacy professional who is part of this STRIPED methodology is a legal professional, not an engineer. They're not really looking at the interfaces, data flows, and assets, but taking a more general view and coming up with threats.
But the benefit is that as you continue down this journey, a lot of security engineers pick up privacy harms and requirements, eventually become the privacy champions, and start coming up with more technical considerations that should be included in the prioritization and risk mitigation exercise.
Chris Romeo: Privacy by design is a hot topic these days, but how does privacy by design fit into the bigger picture?
Nandita Narla: Privacy threat modeling is a component of privacy by design. If you were to think of concentric circles, the outermost would be risk management, a subset of risk management would be privacy by design, and a subset of that would be privacy threat modeling. The difference between privacy threat modeling and privacy by design is that privacy by design starts much earlier, even in the ideation phase of a product.
So even when you're thinking about building a feature or a particular user story, that's when someone from product counsel, or a privacy engineer, or someone from the privacy team would lean in and ask: is this even possible? Should we even do this? Whereas, for privacy threat modeling, I feel you need to start at the design phase.
There has to be some sort of definition or scope that you can use to go forward with the abstraction of threats. If you think about it from a linear perspective, privacy threat modeling comes in a little bit later but is part of the overall risk reduction or harm reduction exercise, which is what all three aim to do.
Chris Romeo: Privacy, privacy threats, privacy threat modeling, privacy by design. We must interweave these concepts into the things that we build. Anything you build that has users has data, and that data must be protected. The release of data against a user's wishes could cause them catastrophic loss in multiple areas of their life.
It's time that we as an industry embrace the discipline of privacy and bring it forward from afterthought to the primary focus of the design.