Study: Open-Source Software Risks Are Rampant

A newly released study finds that the vast majority of software systems that businesses use to manage their operations rely to at least some extent on open-source software — and the vast majority of that open-source code contains multiple high-risk vulnerabilities. 

So says the 2025 Open Source Security and Risk Analysis Report, released Tuesday by Black Duck Software, a document sure to leave auditors, CISOs, and cybersecurity risk managers everywhere banging their heads against the keyboard. The report raises pointed questions about companies’ ability to manage their software supply chains, which in turn spills into questions of internal control, vendor risk management, and regulatory compliance in our IT-centric, interconnected world.

Let’s start with the findings themselves. Black Duck researchers studied nearly 1,000 “commercial codebases” that were widely used by corporations in 2024. A codebase is the complete software package that a company might use to build a software application: the code itself, programming instructions, resource libraries, and other documentation. Black Duck examined 965 commercial codebases across 16 industry sectors.

The researchers found that 86 percent of commercial codebases contained open-source software vulnerabilities, and 81 percent contained high- or critical-risk vulnerabilities. Black Duck also found that the number of open-source files in an average application has tripled, from roughly 5,300 in 2020 to more than 16,000 in 2024. Figure 1, below, shows which industries were at highest risk.

Figure 1. Open-source risk by industry sector. Source: Black Duck

There’s nothing inherently wrong in using open-source software as part of your software development process. The code itself is free; anyone can grab it off the internet and tailor it to their specific needs. The problem is that you need to assure that the code is secure, and that can be tricky. The code you use might have incomplete or erroneous documentation, or be so picked over and modified through the years that you can’t easily discern the vulnerabilities that might be hidden inside.

In other words, using open-source code as part of your software development is an inherently risky process, and the Black Duck report reminds us that it is getting riskier over time.

So companies must develop policies, controls, and audit procedures that can keep pace with that increasing threat. Otherwise they risk regulatory compliance violations (say, a privacy breach) or an operational risk disaster (say, a ransomware attack caused by the tech of some vendor in their supply chain).

Getting Ahead of Open-Source Risk

The Black Duck report offers several suggestions that companies could adopt to get ahead of open-source software vulnerabilities. Most of them are deeply technical stuff that only CISOs, IT auditors, and software development teams will readily understand, but the rest of us can still follow along.

For example, your company could engage in software composition analysis (SCA), where you or an outside IT team study the codebase in detail to identify vulnerabilities and root them out. (By coincidence, I’m sure, SCA is a service that Black Duck offers to paying clients.)

That analysis can help identify which parts of your codebase depend on other parts, so you can better understand, “OK, if this piece here is bad code, what are the consequences of leaving it in place? What are the consequences of removing it?” It can also uncover any issues with software licenses, in case you’re unwittingly using unlicensed software, which could expose your business to intellectual property disputes. 
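To make that idea concrete, here’s a minimal sketch in Python of how a reverse-dependency map answers the “what are the consequences?” question. The component names and graph here are made up for illustration; this is not Black Duck’s actual tooling.

```python
from collections import deque

# Hypothetical component graph: each entry maps a component to the
# components that depend on it (its reverse dependencies).
DEPENDENTS = {
    "openssl-1.0.2": ["payment-service", "auth-lib"],
    "auth-lib": ["web-portal", "admin-console"],
    "payment-service": [],
    "web-portal": [],
    "admin-console": [],
}

def impacted_by(component: str) -> set:
    """Breadth-first walk of the reverse-dependency graph, returning
    everything affected if `component` is patched or removed."""
    impacted, queue = set(), deque([component])
    while queue:
        for dependent in DEPENDENTS.get(queue.popleft(), []):
            if dependent not in impacted:
                impacted.add(dependent)
                queue.append(dependent)
    return impacted

# If this old OpenSSL build is the bad code, what else is exposed?
print(impacted_by("openssl-1.0.2"))
# -> {'payment-service', 'auth-lib', 'web-portal', 'admin-console'}
```

The point of the exercise: one vulnerable library deep in the stack can ripple out to every application built on top of it, which is exactly why the analysis matters.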

More than anything else, however, software composition analysis can help you generate a rather nerdy thing known as a “Software Bill of Materials.” An SBOM is an inventory of all the parts of a software application, including open-source libraries, third-party modules, frameworks, licenses, metadata, and everything else.
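To make “SBOM” less abstract, here’s a rough sketch of what a one-component SBOM might look like, loosely modeled on the CycloneDX JSON format. The component name, version, and license shown are illustrative examples, not findings from the Black Duck report.

```python
import json

# A loose, illustrative sketch of an SBOM with a single component,
# modeled on the CycloneDX JSON format. The name, version, license,
# and package URL are examples only.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "components": [
        {
            "type": "library",
            "name": "jquery",
            "version": "3.7.1",
            "licenses": [{"license": {"id": "MIT"}}],
            "purl": "pkg:npm/jquery@3.7.1",
        }
    ],
}

print(json.dumps(sbom, indent=2))
```

A real SBOM runs to thousands of such entries, one for every library, module, and framework in the application, which is why generating it by hand is a non-starter and SCA tooling does the work instead.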

SBOMs are important because they are the evidence you can provide to customers to prove that your software is reliable to use. Essentially, an SBOM is proof that you’ve done your bit to be a good do-bee in the software supply chain, so that others will feel more confident using the tech you supply to them.

By the way, if that acronym sounds familiar, that’s because SBOMs were mentioned in an executive order on cybersecurity that the Biden Administration issued back in 2021. The goal then was to bring more cybersecurity into the federal government, which does rely heavily on third-party software from IT vendors. The order directed the Commerce Department and NIST to develop standards for what an SBOM should contain, so that vendors could provide the material to federal agency purchasers. 

Notably, that cybersecurity order is one of the few from the Biden era that so far the Trump Administration has not rescinded. President Trump is moving to weaken the country’s cybersecurity protections and oversight in other ways, but at least for now, SBOMs still have Uncle Sam’s blessing as a good idea.

Other Compliance Issues

We can leave the rest of the technical stuff to the CISOs and IT developers. Compliance officers and auditors, however, should keep a few other points about this issue on their radar screens.

First, what assurances are you extracting from your suppliers about their own use of software? Do you have contracts in place that allow you to raise this issue of secure software development? Do you have people on staff with the right skills to review the vendors’ code, so you’ll know what you’re looking at? If you’re relying on some sort of SOC 2 audit report, have you scoped that SOC report appropriately? 

Second, what processes or controls have you put in place for your own software development efforts? For example, it’s easy to draft a policy for your developers: “Any software that relies on jQuery must undergo rigorous security audits.” (jQuery is a popular JavaScript library, but one notorious for unpatched vulnerabilities.) But how would you enforce such a policy over time, and at scale? What automated tools would you want to implement, or what post-development testing should be conducted before pushing that software into a live environment? Those are questions CISOs, IT audit teams, and even compliance officers will need to ponder, to assure that your developers don’t blunder into something reckless. A minimal sketch of what such automated enforcement could look like appears below.
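Here is that sketch: a simple CI gate, written in Python, that checks a project’s package.json for a jQuery version below some approved minimum and fails the build if it finds one. The file path, the 3.5.0 version floor, and the policy logic are all assumptions for the sake of example, not a recommendation.

```python
import json
import sys

# Sketch of a CI gate: fail the build if package.json pins a jQuery
# version older than a minimum the security team has approved.
# The 3.5.0 floor is illustrative; set your own based on advisories.
MINIMUM_JQUERY = (3, 5, 0)

def parse_version(spec: str) -> tuple:
    """Crudely strip range prefixes like ^ or ~ and split on dots."""
    return tuple(int(part) for part in spec.lstrip("^~><=").split(".")[:3])

with open("package.json") as f:
    manifest = json.load(f)

deps = {**manifest.get("dependencies", {}),
        **manifest.get("devDependencies", {})}

if "jquery" in deps and parse_version(deps["jquery"]) < MINIMUM_JQUERY:
    print(f"POLICY VIOLATION: jquery {deps['jquery']} is below the "
          f"approved minimum {'.'.join(map(str, MINIMUM_JQUERY))}")
    sys.exit(1)  # a non-zero exit blocks the pipeline
```

A script like this runs on every code check-in, which is the real answer to “how do you enforce a policy at scale”: you don’t rely on developers remembering the rule; you make the pipeline refuse to ship violations.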

Third, what assurances are you currently providing to the public, and how do you know those statements are accurate? This is on my mind because the Black Duck report is really about how you can assure a safe, secure software development process at your company. If you misunderstand how secure that development process is, or somehow misstate that process to the public, your company could end up facing lawsuits or enforcement action.

This is precisely what happened to SolarWinds, which suffered a massive cybersecurity breach in 2020. For years the company had published a “Security Statement” on its website, promising that it used only the strongest software development practices. After the breach, the Securities and Exchange Commission found numerous internal messages from employees scoffing at that notion. The SEC eventually filed a lawsuit against the company, and while part of that suit was dismissed last summer, the complaint about misleading disclosures was allowed to proceed.

So the compliance or legal team needs to confirm that statements you make publicly about your cybersecurity and software development efforts align with the actual efforts you take internally to develop software in a safe, secure manner. That’s a complicated effort involving multiple parts of the enterprise, so tread carefully.