
How do you integrate security into the software development cycle without slowing down the project?

February 3, 2022.

An interview with Garett Spencley-Sales, software architect, lead developer and security specialist.

Spiria: Security is often implemented as an afterthought at the end of the software development cycle. What are your thoughts on this?

Garett Spencley-Sales: Security deserves primary consideration. It needs to be dealt with upfront, whereas there’s a tendency in our industry to think of security as a technical problem to be overcome. There’s also an assumption that, “Well, if there’s a security problem, it’s a bug because the developers didn’t code correctly.” That’s a disastrous way to think about security. You’re leaving yourself wide open to all kinds of issues. In order to address security properly, you need to do the risk analysis upfront and you need to identify your most valuable assets. Whether it’s customer data and information, company systems—whatever you’re protecting, or whatever would be a liability if it were compromised—you need to identify those things before development even starts.

Threat modeling should also be complete before you get underway, so that if development goes sideways, you can say, “Okay, here’s how we anticipated these fault lines; these high-risk assets could be attacked and potentially compromised, but here are the features we included in order to protect them on the technical front.”
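To make that concrete, a first-pass threat model or risk register can be quite lightweight. The sketch below is a minimal, purely illustrative Python example (the assets, ratings, and mitigations are hypothetical, not drawn from the interview): each entry ties an asset to a threat, a rough impact and likelihood, and the feature planned to protect it, so the highest-risk items can be prioritized before development starts.

```python
# Minimal illustration of an upfront risk register: each entry names an asset,
# a threat against it, rough impact/likelihood ratings, and a planned mitigation.
# All asset names and ratings here are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class RiskEntry:
    asset: str        # what is being protected (data, system, credential...)
    threat: str       # how it could be compromised
    impact: int       # 1 (minor) to 5 (severe) if the threat succeeds
    likelihood: int   # 1 (rare) to 5 (expected)
    mitigation: str   # feature or control planned to reduce the risk

    @property
    def score(self) -> int:
        # Simple impact x likelihood ranking; enough to order priorities.
        return self.impact * self.likelihood

register = [
    RiskEntry("customer PII database", "credential stuffing against admin accounts", 5, 4,
              "MFA and IP allow-listing on admin endpoints"),
    RiskEntry("payment service API keys", "keys leaked through source control", 5, 3,
              "secrets vault, no keys in repos, rotation policy"),
    RiskEntry("internal wiki", "phishing leads to account takeover", 2, 4,
              "SSO with a phishing-resistant second factor"),
]

# The highest-scoring risks are the ones worth planning security features around first.
for entry in sorted(register, key=lambda e: e.score, reverse=True):
    print(f"[{entry.score:>2}] {entry.asset}: {entry.threat} -> {entry.mitigation}")
```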

And then, of course, organizations need to be aware that technical risk is but one of the risks they run. Over the decades, we’ve seen so many attacks that preyed on social vulnerabilities rather than on technical ones. Even today, phishing campaigns, which are social attacks, not technical attacks, are one of the most common ways to compromise a system.

Clearly, the concern for security starts well before any line of code is written. The considerations run from the choice of tools and frameworks to the definition of the requirements, right?

Garett: Sure. You need to think of security as an important feature. You can’t just assume that the system will be inherently secure. What it means to be secure varies from project to project. It depends on what the risks are and what needs to be protected. What does security entail for your system and for your data? It’s not cookie cutter, as every project will have different requirements. When you understand what your priorities are, you can treat security as a feature and you can make sure that it’s addressed from the start. Then the tools, the processes and the systems you use will be the right ones to solve those particular problems.

Integrating security into the software development cycle is a time-consuming task that comes with its own cost. Don’t you need extra hands to keep the project on schedule? Does that make it a bigger investment?

Garett: Getting security right inevitably comes at a cost. Properly developing features and software costs money. But if we treat security as an afterthought and security problems as bugs, this will add even more to the cost.

When dealing with security, bugs are especially harmful because they represent a huge liability for businesses, who could face regulatory fines, lawsuits for breach of privacy if user data is compromised, or a loss of market share because of decreased trust in their service. Security has many financial implications for a company.

So that projects don’t grind to a halt, the smart way to go is to handle security as a feature from the get-go. This way, you ensure that those types of bugs are caught early, when they can be addressed cheaply, quickly and efficiently, instead of developing an unsecured system because no-one felt security was a prerequisite. You don’t want to be saying after the fact, “Oh no, we have to do something about our security problem, but we can’t because the system wasn’t designed to do what we now need it to do.”

So the unfortunate answer is yes, you do need to invest a little bit more upfront. But the business reason to make that investment is that this will potentially save you a massive amount of liability. And if you want to do the cost-benefit analysis, then do that risk assessment upfront. Identify what’s most valuable and what your liabilities could be, because if you don’t, you’re taking a huge risk without even being aware that you’re at risk. And that’s what we’re seeing now with all these ransomware attacks, breaches and so on.

If you don’t tend to the security aspect continually, you accumulate a kind of technical debt, right? Could you talk about “security debt” and what problems such a debt creates?

Garett: You could frame it as a type of technical debt. But now, luck unfortunately enters the picture. Out of sheer dumb luck, some companies never find themselves targeted. They don’t suffer those liabilities and they don’t pay the cost of technical debt. But if you do get breached, if there is a vulnerability at the technical level and you only address it after the fact, other security vulnerabilities in your system will also come to light. It wouldn’t be inappropriate to consider that as a form of debt that eventually comes due.

In order to minimize this debt, would you say that it is better to have several minor slowdowns during development than one major one at the very end?

Garett: Well, the topic of slowdowns is an interesting one, because I think there’s an implicit expectation that the system will be secure. We expect security because we recognize that we are dealing with risk. Whether it’s a product company or a client contracting an agency such as Spiria, nobody commissioning a project intends to buy unsecured software. I would argue that planning for security is not a slowdown.

Security is a feature, whether it’s treated as such or not. The smart thing to do is to treat it as such, and bake it into the cost of developing the software’s features. You have to build security in, and that just comes with a cost: the cost of developing the feature. No one needs an unsecured system that will cost the company in terms of liability, or cost our customers, which blows back on the company as well.

That’s all well and good, but in practice, how does it work? You’re a senior developer with a lot of experience, you’ve seen many projects and teams. Do things always work out that way? Is security truly integrated into the development process?

Garett: To be honest, in my observation and experience, security is all too often not considered at all. It simply doesn’t enter the conversation. Companies race to go to market with shiny features that they expect will make them a lot of money, while security is never mentioned. Or if it is, it tends to be brought up by developers who understand that, “Hey, you have risks here. This database has an insane amount of customer data and information, but there are actually very few access controls and measures in place to safeguard it.” The typical answer is “That’s a very good point” and “We swear we take it seriously,” but it’s not a priority right then. It’s something they want to get to someday, but that day never comes.

It’s no surprise to me, because we software developers and engineers have been raising these concerns for the twenty years that I’ve been in the industry, and now I’m like, “I told you so.” We predicted that there would be more large-scale attacks, data leaks and breaches, such as the ransomware attacks that are making the news.

Forgive my bluntness, but I don’t think things will really improve until business leaders face jail time. There are simply no consequences. It’s a matter of risk assessment. Managers, business leaders, and executives who are not particularly tech-savvy believe that it takes some sort of genius like you only see in the movies to compromise a system. They think, “This can’t happen to us.” But anyone who doesn’t treat this seriously is putting the system at risk. Any system can be breached if someone is determined to get in. If you skip the risk assessment, you don’t know what’s vulnerable and what you need to safeguard, and everything is left exposed.

So we’re seeing that the benefit of going to market quickly to give users new features outweighs the risk of fines related to database breaches, for example…

Garett: It’s out of sight, out of mind. I think it’s not even considered early on, when in fact that’s when it should be highlighted. Security should be included as a feature. Clients assume that they’re buying a secure system, but security is never mentioned during the discovery phase or during planning. We’re never given documents that outline the risk assessment and threat modeling, nor are we ever asked to do that modeling. I’ve never seen a report that said, “Here’s the threat modeling, the risk assessment, our priorities, so here’s what we have to put in place.” Security consulting agencies are hired to do that in some cases, and I did it once when I was on a project for a major US bank that had its own systems in place.

What’s more, I recently listened to a lecture by someone who is contracted by banks to do physical security assessments. They hire him to deliberately try and rob the bank, to see what the bank would do in case of an attack. He found that even banks’ physical security is much weaker than one would expect. He argues that if they can’t do physical security right, then there’s no way they can do digital security either. If planning for security features isn’t deemed important, the liability becomes the incentive.

It’s assumed that developers will write secure code, but policies to control access to database servers have little to do with code. That’s one potential attack vector; then there are social attack vectors, infrastructure attack vectors … there are many ways to compromise systems. It varies between projects and between systems. You can’t proceed through planning with assumptions only, because you won’t get it right, you won’t recognize what the risks are.

Do you have specific recommendations?

Garett: That depends on the project and on the system. I’m going to sound like a broken record, but the primary recommendation is to do the threat modeling and risk assessment because without that, you don’t know what your maintenance requirements are.

Some organizations have done their risk assessment upfront only to decide that the legacy systems didn’t represent much of a risk. The worst-case scenario isn’t so bad if someone were to attack them. Those legacy systems can be left as-is because no serious damage could be inflicted, while there might be other systems within the company and network that are much higher priority and that deserve full attention.

Risk assessment is also a cost-saving measure, because companies can use it to define their liabilities and then prioritize their maintenance efforts accordingly.

To go back to the question, which was how to integrate security into the software development cycle without slowing down the project, is that a catch-22? If it requires more time, people, and maintenance, do we run around in circles? Or is the solution to do it upfront so that you can plan for the right amount of people and a realistic timeframe, so things go smoothly?

Garett: If this software is expected to be secure, the question becomes “How do I write fully-functioning software without slowing down the dev cycle?” And that is the fundamental problem that every project manager faces, whether they’re talking about security or anything else. How do you plan the development of software that functions correctly, and how do you develop it on a tight schedule without introducing delays? You do that by planning for security upfront. If you treat it as any other feature, if you identify the important security requirements, then you can plan their development instead of circling back to fix an unsecured system.

It’s much more efficient and cost effective to do that little bit of work upfront and to bake it in as part of the planning and process. If you leave it for later, you’re basically thinking wishfully, hoping that it’s not going to be too much work to somehow magically make your system secure when you haven’t even identified what security means for your project and your system.

Thank you so much for your time, Garett. Your advice was straightforward and enlightening. Do you have anything else to add?

Garett: Let me summarize the key recommendations and takeaways. Do the risk assessment upfront to identify your most valuable assets and your potential liabilities, treat security as a feature, and include it in the project planning. It won’t necessarily be hugely expensive; as a matter of fact, it’s probably much cheaper to do it first thing. Develop with security in mind from the start, rather than trying to secure an unsecured system after the fact.

Organizations should start thinking about security concerns as a potential liability issue. One of two things will happen: either high insurance premiums will become part of normal operating costs, or companies will finally take measures to get security right by treating it as a priority from the get-go.