Rob Pinna
Originally published at blog.ironcorelabs.com.

Agile Approaches to Privacy and Security for SaaS Vendors

Highlights From the Agile Amped Podcast With Rob Pinna and Leslie Morse

Listen to the Agile Amped Podcast episode "Baking Privacy and Security into the Technical Architecture" above, or read our summary of the podcast below.

What do enterprise software and SaaS businesses get out of protecting their customers’ information?

Revenue protection. As large enterprise deals come up for renewal, SaaS companies are being asked how they are protecting customer data stored in SaaS systems. Those renewals are increasingly tied to strong protections for customer data.

And many companies are now looking at data privacy for competitive differentiation. Apple, for example, is putting a lot of time and effort into establishing privacy as a brand value, and I think we’ll see more of that.

Can you give me an example?

Since we’re at an Agile conference, let’s consider an agile project management tool. One common function is to be able to attach a screenshot or a mockup to a user story. That’s considered sensitive intellectual property by an enterprise customer, and their regulators will want to ensure that it’s protected. So the ability for a tooling vendor to offer zero-trust or end-to-end encryption for attachments might be important to secure a renewal.

Making a product secure isn’t the function of just one team — it impacts the entire company. How do you build that mentality into a company that’s siloed?

SAFe or other kinds of scaled agile techniques have a portfolio layer where cross-cutting concerns are defined. That’s the best place to do privacy and security work. It’s typically done by chief architects and baked into the underlying system. It should be possible for developers to focus on domain logic and have privacy and security handled for them automatically by a framework.

From a people point of view, agile programs must pull in stakeholders from areas that are not typically involved, like information security (including the chief information security officer) and data compliance. Invite them to your next big-room planning.

How do you ship privacy and security early and often?

One of the principles of agile development is that we focus on “cake slices” as opposed to “cake layers.” With privacy, you slice the cake by “following the data.”

Let’s say we are writing web conferencing software. We know that we have meeting audio and video, chat, attendee lists and assorted metadata. If we’re creating the privacy and security backlog, we might focus on just one of those data items.

The meeting recording is potentially very sensitive. In fact, a data compliance professional would call it toxic data — we have liability risk in holding it. There would be a lot of value in protecting meeting recordings, so I would start there and worry about chat, attendees, and metadata later.

What I’ve observed is that Agile teams fall back to waterfall “all-or-nothing” thinking when it comes to privacy and security. They don’t understand how to split the work into smaller epics and stories so that it’ll flow down to the teams. The follow-the-data technique is one key to splitting. Here’s an example from our agile project management scenario, in story format:

As a (customer) product owner,
When entering a user story with an attached mockup,
I want the mockup to be considered private data,
So that our company’s intellectual property is protected,
And so that we meet our regulatory and compliance obligations.

Acceptance Criteria:

  • I can view the mockup
  • My tooling vendor cannot view the mockup
  • I can search for user stories that do or do not have mockups
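
To make those acceptance criteria concrete, here is a minimal sketch in TypeScript, using Node’s built-in crypto module rather than any particular vendor SDK: the mockup is encrypted on the customer’s side with a key the vendor never sees, while a plaintext hasMockup flag stays searchable. The StoryAttachment shape and the function names are invented for illustration.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

interface StoryAttachment {
  storyId: string;
  hasMockup: boolean; // searchable metadata, never sensitive
  ciphertext: Buffer; // what the vendor stores; opaque to the vendor
  iv: Buffer;
  authTag: Buffer;
}

// Runs in the customer's environment; the key never leaves their control.
function encryptMockup(storyId: string, mockup: Buffer, customerKey: Buffer): StoryAttachment {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", customerKey, iv);
  const ciphertext = Buffer.concat([cipher.update(mockup), cipher.final()]);
  return { storyId, hasMockup: true, ciphertext, iv, authTag: cipher.getAuthTag() };
}

// Only the customer, holding the key, can view the mockup.
function decryptMockup(att: StoryAttachment, customerKey: Buffer): Buffer {
  const decipher = createDecipheriv("aes-256-gcm", customerKey, att.iv);
  decipher.setAuthTag(att.authTag);
  return Buffer.concat([decipher.update(att.ciphertext), decipher.final()]);
}

// The vendor can still answer "which stories have mockups?" without seeing them.
const customerKey = randomBytes(32);
const stored = encryptMockup("STORY-42", Buffer.from("fake mockup bytes"), customerKey);
console.log(stored.hasMockup, decryptMockup(stored, customerKey).toString());
```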

Are there other approaches to splitting epics (and stories)?

A second approach is to consider how far your data protection extends. Let me explain. The ideal in privacy and security is to protect data from the point where it originates to the point where it is used.

Let’s go back to our web conferencing example where we decided to protect conference recordings. If we encrypted at the point of origination, it might be hard for us to generate a transcript. We may instead decide to generate the transcript as we record, then immediately encrypt both the recording and the transcript. That’s a faster approach to delivering value, with minimal risk exposure.

In practice, the “how far” question often comes down to server-side versus end-to-end (client-side) protection. In many cases, you can start with server-side encryption and extend to end-to-end in later iterations.
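
As a rough illustration of that server-side starting point, here is a sketch that generates the transcript while the plaintext recording is still in memory, then encrypts both before anything is written to storage. The transcribe function is a hypothetical stand-in for whatever speech-to-text step you use, and the encryption is plain Node crypto rather than any specific product; the same storage format could later be produced on the client for end-to-end protection.

```typescript
import { createCipheriv, randomBytes } from "crypto";

interface EncryptedBlob { ciphertext: Buffer; iv: Buffer; authTag: Buffer; }

function encrypt(plaintext: Buffer, key: Buffer): EncryptedBlob {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  return { ciphertext, iv, authTag: cipher.getAuthTag() };
}

// Hypothetical stand-in for a speech-to-text service call.
function transcribe(recording: Buffer): string {
  return "[transcript of " + recording.length + " bytes of audio]";
}

function processMeetingRecording(recording: Buffer, meetingKey: Buffer) {
  const transcript = Buffer.from(transcribe(recording), "utf8"); // last plaintext use
  return {
    recording: encrypt(recording, meetingKey),   // the toxic data is now ciphertext
    transcript: encrypt(transcript, meetingKey), // at rest, only ciphertext remains
  };
}
```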

Does this mean developers need to be really focused on privacy and security every day?

No. Developers don’t want to think about privacy and security day in and day out. If you’re relying on a thousand developers to think about privacy and security with every line of code they write, you’re in trouble. Privacy and security are an example of where we need to consider not just process, but technical excellence as an Agile value.

Let’s go back to our agile tooling example. Some parts of a user story could be considered very sensitive — for example, let’s imagine that there are custom fields on the user story that disclose contractual information. We could build into our platform a way to attach metadata to any field that would classify it from a privacy and security perspective. As long as developers applied the metadata, we could treat privacy policy as a separate concern.
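
As an illustration, a minimal sketch of that kind of field-level metadata might look like the following. The field names and classification levels are invented for the example; the point is that developers declare what a field is, not how to protect it.

```typescript
type Classification = "public" | "internal" | "confidential";

// Developers declare what a field is, not how to protect it.
const userStoryFields: Record<string, Classification> = {
  title: "internal",
  description: "internal",
  contractValue: "confidential",   // a custom field disclosing contractual terms
  mockupAttachment: "confidential",
};

// The platform, not the feature developer, decides what "confidential" means;
// see the policy sketch below.
function classify(field: string): Classification {
  return userStoryFields[field] ?? "confidential"; // fail closed for unknown fields
}
```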

Business specialists (in this case, security specialists) actually write the policy, which needs to be able to change frequently as laws and regulations change or as you decide to classify something differently. Audits can be conducted on the policy itself.
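
A sketch of what that separately owned policy might look like, assuming the classification levels from the previous sketch: security specialists map each classification to handling rules, and those rules can be versioned, changed, and audited without touching application code. The rule names and values are illustrative, not a real product’s policy format.

```typescript
type Classification = "public" | "internal" | "confidential";

interface HandlingRule {
  encryptAtRest: boolean;       // protect on the server side
  endToEndEncrypt: boolean;     // protect from the client onward
  retentionDays: number | null; // null means no automatic expiry
}

// Version the policy so that audits can run against a specific revision.
const policyV7: Record<Classification, HandlingRule> = {
  public:       { encryptAtRest: false, endToEndEncrypt: false, retentionDays: null },
  internal:     { encryptAtRest: true,  endToEndEncrypt: false, retentionDays: null },
  confidential: { encryptAtRest: true,  endToEndEncrypt: true,  retentionDays: 365 },
};
```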

It all comes down to separating the work by concern. You have developers making sure that they’re using the correct infrastructure. You have architects creating the architectural runway and you have security specialists defining the business rules.

Are there other benefits to this policy-driven approach?

There are several. One of the things that’s interesting about the policy-driven framework we just discussed is that you can test the policy rather than the implementation. You test the implementation once, and then you test each policy change against that infrastructure. It’s much more automated and time-efficient. In fact, we see customers integrate checks into their deployment pipelines to ensure all data has a policy attached.
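
Here is a sketch of the kind of pipeline check described above, assuming the schema and policy shapes from the earlier sketches: it fails the build if any field is missing a classification or if any classification has no policy rule behind it. The function and data shapes are illustrative, not a real CI integration.

```typescript
type Classification = "public" | "internal" | "confidential";

function auditCoverage(
  schemas: Record<string, string[]>,                               // entity -> declared field names
  classifications: Record<string, Record<string, Classification>>, // entity -> field -> classification
  policy: Partial<Record<Classification, unknown>>,                // classification -> handling rule
): string[] {
  const problems: string[] = [];
  for (const [entity, fields] of Object.entries(schemas)) {
    for (const field of fields) {
      const cls = classifications[entity]?.[field];
      if (cls === undefined) {
        problems.push(`${entity}.${field} has no classification`);
      } else if (!(cls in policy)) {
        problems.push(`${entity}.${field} is "${cls}" but the policy has no rule for it`);
      }
    }
  }
  return problems;
}

// In the deployment pipeline: a non-empty result fails the build.
const problems = auditCoverage(
  { userStory: ["title", "contractValue", "mockupAttachment"] },
  { userStory: { title: "internal", mockupAttachment: "confidential" } }, // contractValue forgotten
  { public: {}, internal: {}, confidential: {} },
);
if (problems.length > 0) {
  console.error(problems.join("\n"));
  process.exit(1);
}
```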

What’s the benefit of having data privacy built into the infrastructure of your product?

Today, most privacy and security is legal and administrative, with no underlying support in the technical infrastructure. That’s why you see so many issues in the news. If you’re old enough to remember programming languages before garbage collection, that’s an example I sometimes use. For decades, software blew up on memory leaks and malloc/free errors. It just wasn’t realistic to expect thousands of developers to reliably handle memory allocation. We had to bake it into the system before those types of errors diminished.

Consider Facebook and Cambridge Analytica as an example of what happens when we don’t have a system for securing data, but instead rely on administrative and legal approaches.

Cambridge Analytica gained access to the social graphs of users through a partner app. They correlated that data with other information to predict how those users would vote, which in turn was used to target advertising. This was against Facebook’s partner agreement, and Facebook did discover it through an offline complaint. But they had no technical mechanism for doing anything about it. Instead, they had their legal department send a letter saying, “Please delete the data,” which Cambridge Analytica ignored. Then they sent a follow-up, “Did you delete the data?” And Cambridge Analytica said “Yes.”

There was no technical system that identified that Cambridge Analytica was mistreating this data, sharing it, and using it in an anomalous way. Facebook did not have a system in place to automatically revoke their access. And that’s why Zuckerberg ended up in front of Congress.

That’s really what we’re trying to fix. Privacy and security countermeasures today are all administrative and legal. As technologists, we have an opportunity to bake them into the system and we’ll all do a lot better when that happens.

What advice do you have for teams wanting to be well prepared going forward?

Be responsible. Don’t think it’s all on the CISO, or DevOps or your lawyers. Stop creating privacy and security technical debt for future developers. At the end of the day, when sensitive data is disclosed it’s a failure on the part of software architects and engineers. Agile is about technical excellence and craftspersonship. As a technologist, own privacy and security. You can do it.


Read more from IronCore Labs

The California Consumer Privacy Act goes into effect January 1, 2020. Download our CCPA white paper for examples and in-depth analysis of the impact of CCPA on enterprise SaaS businesses.
