Cybersecurity is usually examined from the practitioner's point of view, which is why the DevSecOps company Jit took a different tack and asked developers about their views on application security (AppSec).
The authors of the survey, which garnered the opinions of 150 developers across industries and company sizes, noted: "While the challenges and needs of security professionals are well researched, this report aims to build a clearer understanding of how developers experience application security and to foster better alignment between their needs and organizational security goals."
With all eyes on the shift-left movement's failures — and the rise of AI as savior for all things security — what development teams have to say might surprise you. Here are four key takeaways from the Jit survey.
[ Get White Paper: How the Rise of AI Will Impact AppSec ]
1. Security is not a cultural priority among development teams
The survey found that 61% of participants said security is either not a priority or only "somewhat important" in their development culture. That is consistent with another finding: a "lack of organizational priority" ranked as one of the top challenges to AppSec.
Chris Romeo, CEO of the threat modeling company Devici, said this finding was not surprising, and reflected enterprise priorities.
"Teams discount security due to other institutional or team priorities. Unfortunately, secure code has not yet reached the priority level of delivering features or quality. Cultural priorities are defined from the top, and corporate and engineering leadership are not embracing secure code with the correct level of emphasis."
—Chris Romeo
One of the biggest factors driving developers to downplay security is the pressure teams face to deliver — and deliver fast, said MJ Kaufmann, an author and instructor at O'Reilly Media.
"They face tight deadlines and rapid iteration cycles, where security is perceived as a potential bottleneck rather than an integral part of the development process, driving it down as a priority."
—MJ Kaufmann
Mike McGuire, senior security solutions manager at Black Duck Software, said that top DevOps teams aim for a cycle time of less than one day, and even slower teams are pressured to get newly committed code into production within a week.
"Simply put, the main priority for development teams is time to market, whereas security and speed haven’t historically been synonymous with one another. Developers are motivated by, and compensated for, how quickly they can get a functional, high-quality product into their customers’ hands before their competitors do. If security isn’t one of the customer requirements, it will not be a priority for development teams."
—Mike McGuire
2. Software complexity and organizational obstacles rank as top security challenges
The survey found that developers perceive software complexity in different ways, ranging from the variety of technologies and modern environments to large dependency trees. The top organizational obstacles included a lack of knowledge, training, and guidelines; a lack of organizational priority; and a lack of time.
Gene Spafford, a computer science professor at Purdue University, said complexity is the enemy of good security and code quality, and that the more complex something is, the more likely it is to be misunderstood.
“The more complex something is, the more chance that small mistakes or edge cases result in unexpected and unwanted behavior. When more people are involved in complex code development and maintenance, differing assumptions and poor communication can result in important concerns being missed."
—Gene Spafford
Spafford said that organizational issues such as conflicting deadlines and priorities, uncertain decision chains, shifting requirements, and discontinuities of involvement all deflect developers' attention away from security. "Any of those may add to the misunderstanding and complexity of developing quality code," he said.
“Organizations that are successful with building and maintaining complex systems have built detailed mechanisms and personnel systems to be able to address these issues. Organizations that don't yet have such systems in place often run into difficulties.”
—Gene Spafford
Software supply chain issues can also add to complexity, he maintained. “If management decides to use complex third-party code to save money or add compatibility, the developers are unlikely to have the means and management support to do thorough code audits and security analyses,” Spafford said.
Romeo said that complexity shouldn’t be an issue for developers, because they are more than capable. It all comes down to priorities and time, he stressed.
“I disagree that application complexity is a top challenge. It’s like the mechanic telling you they can’t fix your new car because it is too complex of a device. Secure coding is a fundamental skill that can be applied to a one-line pull request or building an entirely new product.”
—Chris Romeo
3. Automation is key, but tool integration and noisy results stifle progress
Developers ranked automated security testing, using tools such as static application security testing (SAST), software composition analysis (SCA), and secrets detection, as the most effective strategy for securing applications. However, integrating those tools into development workflows and coping with their noisy results were often cited as core challenges to code security. More than half of the survey participants (53%) indicated that they had access to automated security tools, making automated tooling the most common strategy for improving code security.
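To make the "noisy results" complaint concrete, here is a minimal, hypothetical sketch of the kind of check a secrets-detection tool performs: scan source files for patterns that look like credentials and fail the build if anything matches. It is an illustration only, not a description of any particular product, and the patterns and file filter are assumptions; it also shows where noise comes from, since a broad rule like the generic API-key pattern below will flag test fixtures and example code just as readily as real leaks.

```python
import re
import sys
from pathlib import Path

# Hypothetical, simplified patterns; real scanners ship hundreds of rules,
# entropy checks, and allowlists to cut down on noise.
PATTERNS = {
    "AWS access key ID": re.compile(r"AKIA[0-9A-Z]{16}"),
    "generic API key assignment": re.compile(
        r"(?i)(api[_-]?key|secret)\s*=\s*['\"][^'\"]{16,}['\"]"
    ),
}

def scan_file(path: Path) -> list[str]:
    """Return human-readable findings for one file."""
    findings = []
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        return findings
    for lineno, line in enumerate(text.splitlines(), start=1):
        for label, pattern in PATTERNS.items():
            if pattern.search(line):
                findings.append(f"{path}:{lineno}: possible {label}")
    return findings

if __name__ == "__main__":
    root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
    # Only Python sources are scanned here, for brevity.
    results = [f for p in root.rglob("*.py") for f in scan_file(p)]
    print("\n".join(results) or "No potential secrets found.")
    # A nonzero exit code lets a CI job fail the build on findings.
    sys.exit(1 if results else 0)
```

Run in a pipeline step (for example, `python scan_secrets.py src/`), a check like this is cheap to integrate, which is exactly why the survey respondents lean on automation even while complaining about the follow-up triage it creates.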
Automated tools have their limitations, though. For example, an SCA tool is often used with a software bill of materials (SBOM) to identify flaws in code, said Lori Flynn, a senior software security researcher with the CERT division at Carnegie Mellon University's Software Engineering Institute.
“Currently, SCA and SBOM-generation tools lack capabilities for some programming languages, plus they don’t always correctly identify vulnerable packages in binaries or packages pulled into builds as dependencies."
—Lori Flynn
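As a simplified illustration of the lookup an SCA tool performs against an SBOM, the sketch below reads a CycloneDX-style JSON SBOM and compares each listed component with a small, hypothetical advisory list (the package names and advisory IDs are placeholders, not real vulnerability data). Real tools resolve transitive dependencies, match version ranges, and query curated databases; the gaps Flynn describes arise when a component never makes it into the SBOM, or is recorded under the wrong name or version, so it never matches anything.

```python
import json
import sys

# Placeholder advisory data; real SCA tools query curated vulnerability
# databases and match semantic-version ranges, not exact pins.
KNOWN_VULNERABLE = {
    ("example-logging-lib", "2.14.0"): "EXAMPLE-2021-0001: remote code execution",
    ("example-yaml-parser", "5.3"): "EXAMPLE-2020-0002: unsafe deserialization",
}

def check_sbom(path: str) -> list[str]:
    """Flag SBOM components that appear in the advisory list."""
    with open(path) as f:
        sbom = json.load(f)
    findings = []
    # CycloneDX JSON lists dependencies under the top-level "components" array.
    for component in sbom.get("components", []):
        key = (component.get("name"), component.get("version"))
        if key in KNOWN_VULNERABLE:
            findings.append(f"{key[0]}@{key[1]}: {KNOWN_VULNERABLE[key]}")
    return findings

if __name__ == "__main__":
    issues = check_sbom(sys.argv[1])
    print("\n".join(issues) or "No known-vulnerable components found.")
    sys.exit(1 if issues else 0)
```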
SAST tools can also leave security gaps in software and frustrate developers, said Will Klieber, a software security engineer who is also at Carnegie Mellon University. "Static-analysis tools examine code without running it. They can discover vulnerable code but suffer from false alarms, meaning they warn about code that is actually perfectly okay. Often the volume of false alarms is too high to manually examine all the alerts," he said.
Michael Nov, co-founder and CEO of Prime Security, said automated security testing is very useful on a tactical level, but it's a late-stage control. “It catches code-level issues after development is done, when fixes are costly and disruptive,” Nov said.
“Worse, these tests often miss design flaws, like broken authorization models or insecure data flows, which aren't simple code mistakes but fundamental architectural issues. Security needs to shift earlier, ensuring that potential risks are addressed before they turn into vulnerabilities.”
—Michael Nov
Devici's Romeo said that tools are great and necessary, but depending on security tools for secure coding is like depending on the fire alarm to turn off your stove before the fire starts.
“Secure coding is an educational challenge, and developers must learn the principles and apply them to every feature and product. Then use the tools to verify that secure coding principles have been correctly applied.”
—Chris Romeo
4. Developers are not confident AI will help them secure their code
Coding assistants like GitHub's Copilot were ranked in the survey as the least effective strategy for securing code, and AI was cited as the second-least-common source for answers to questions about code security.
Feng Li, a computer information technology professor at Purdue University, said that AI-driven security tools often feel like black boxes, making it hard for developers to trust their recommendations. If they don’t understand why a tool flagged something, they’re less likely to act on it.
"At the same time, AI itself introduces new attack surfaces. Attackers can manipulate model biases, evade detection using adversarial inputs, or even poison training data. Developers already struggle to debug AI-driven systems in general — so for security, they’re understandably cautious about handing over control to something they can’t fully verify.”
—Feng Li
Rosario Mastrogiacomo, VP of strategy and solutions engineering at Sphere Technology Solutions, said AI seems to be the go-to solution for everything, but that it is not living up to the hype.
“While AI is a valuable tool for writing better, more secure code, it’s not a silver bullet. Just like automated security testing alone isn’t enough, AI can't replace critical thinking and human oversight.”
—Rosario Mastrogiacomo
Eric Schwake, director of cybersecurity strategy at the API security firm Salt Security, said that AI is becoming vital to application security, especially API security. “The vast amount of code and the ever-changing nature of APIs present significant challenges for humans trying to spot and address vulnerabilities effectively. AI quickly analyzes large datasets and recognizes patterns, making it essential for API security,” Schwake said.
“AI-powered tools can identify anomalies and vulnerabilities in real-time, automate security testing, and offer proactive defense mechanisms. While human expertise is still key, AI serves as a formidable partner in safeguarding code and shielding organizations from evolving threats to APIs. Developers and security teams need to adopt AI to bolster their API security efforts.”
—Eric Schwake
But Purdue's Li pointed out that security isn’t just a tooling problem. “Developers need to think about security as they code, not just when they run a scan before release. The challenge is that security often competes with deadlines, and without strong incentives, it gets pushed aside.”
“At least for now, no tool can replace human intuition in cybersecurity. An experienced engineer reviewing an authentication flow can spot subtle logic flaws that an automated tool would miss. That’s why human-in-the-loop security is still critical.”
—Feng Li
Going beyond legacy application security is key
Jasmine Noel, senior product marketing manager for ReversingLabs, said the reason developers get annoyed with SAST and SCA tooling is that code vulnerabilities do not inherently cause harm. The most dangerous vulnerabilities are those being actively exploited by malicious actors.
"Traditional AppSec tools simply doesn’t have threat intelligence data to prioritize that risk. Similarly, those tools are not designed to find attacks on build and release infrastructure and workflows."
—Jasmine Noel
Such attacks embed malware and make unauthorized functionality changes that are specifically designed to harm, exploit, or infiltrate systems, Noel said.
"It is also fairly common to find those threats embedded in vulnerability-free components, dependencies and artifacts – making it even harder to find indicators of these potentially business crippling threats."
—Jasmine Noel
Josh Knox, a former evangelist for ReversingLabs, said what's needed is a new approach such as complex binary analysis, which complements traditional security testing with a final test for packages before release.
Modern AppSec strategies like binary analysis also play an important role in managing risks accrued by software complexity. Foundational to that is ensuring that code review and testing are conducted across the entire software development lifecycle (SDLC). For software engineering teams, those checks should start before code commit, Knox said.
"Back when I worked in government contracting, even before you could commit the code, we had what they called pre-commit hooks. That was a fancy way of saying that when I go to commit the code, the code that I'm going to commit gets checked for certain things before it will even be allowed to be added to the codebase."
—Josh Knox
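What Knox describes can be as simple as a script wired into a repository's pre-commit hook. The sketch below is a hypothetical Python example, not the tooling his team used: it inspects the dependency manifests staged for commit and rejects the commit if a disallowed package appears. The blocklist and file names are placeholders; a real hook would read its policy from shared configuration and cover more manifest formats.

```python
#!/usr/bin/env python3
"""Hypothetical git pre-commit hook: block commits that add disallowed packages.

Install by copying to .git/hooks/pre-commit and making it executable.
"""
import subprocess
import sys

# Placeholder policy; a real hook would load this from a shared config file.
DISALLOWED_PACKAGES = {"leftpad-clone", "abandoned-crypto-lib"}

def staged_files() -> list[str]:
    """List files staged for this commit."""
    out = subprocess.run(
        ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in out.stdout.splitlines() if line]

def violations() -> list[str]:
    """Check staged dependency manifests for disallowed packages."""
    problems = []
    for path in staged_files():
        if not path.endswith(("requirements.txt", "requirements.in")):
            continue
        # Read the staged version of the file, not the working copy.
        staged = subprocess.run(
            ["git", "show", f":{path}"],
            capture_output=True, text=True, check=True,
        ).stdout
        for line in staged.splitlines():
            name = line.split("==")[0].strip().lower()
            if name in DISALLOWED_PACKAGES:
                problems.append(f"{path}: package '{name}' is not allowed")
    return problems

if __name__ == "__main__":
    issues = violations()
    if issues:
        print("Commit blocked by pre-commit policy check:")
        print("\n".join(issues))
        sys.exit(1)  # A nonzero exit aborts the commit.
    sys.exit(0)
```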
Those hooks would warn developers early on if a package wasn't allowed, or if a coding style or other component wasn't permitted. But code review and testing are not just a matter of shifting left; testing in the middle and at the right end of the development timeline is also needed.
"Good code review needs to happen," Knox said. The review might be automated or involve a human looking at the code and asking, "Why do we need that? Why did you pick that one? Why are you adding it?"
"But all of those things add time and add effort that a lot of development teams feel like they don't have because they're under such pressure to get these new features out quickly."
—Josh Knox
Keep learning
- Get up to speed on securing AI/ML with our white paper: AI Is the Supply Chain. Plus: See RL's research on nullifAI and join our Webinar to learn how RL discovered the novel threat.
- Upgrade your software security posture with RL's new essential guide, Software Supply Chain Security for Dummies.
- Learn how commercial software risk is under-addressed: Download the white paper — and see our related Webinar for more insights.
- Explore RL's Spectra suite: Spectra Assure for software supply chain security, Spectra Detect for scalable file analysis, Spectra Analyze for malware analysis and threat hunting, and Spectra Intelligence for reputation data and intelligence.