In 2025, open-source consumption hit 9.8 trillion downloads across the four largest registries, a 67 percent increase year-over-year.
This volume reflects a consumption model where CI/CD pipelines, ephemeral build environments, and aggressive caching strategies pull dependencies relentlessly. However, while shared building blocks accelerate delivery, the sheer weight of this consumption is cracking the commons.
Brian Fox, Co-founder and CTO of Sonatype, commented: “In our eleventh year of this analysis, the open-source bargain holds true: we all move faster because we share. What’s changed is the scale and the stakes.”
The ecosystem has matured into vital infrastructure, yet it often operates with the fragility of a hobbyist project. The challenge for 2026 is surviving the operational and security costs of using open source at scale.
Malware industrialisation
The days of isolated script kiddies defacing libraries for notoriety have largely passed. The threat environment has shifted toward industrialised and often state-sponsored campaigns designed to compromise the very people building the software.
In 2025, Sonatype identified nearly 455,000 new malicious packages, bringing the total number of known malicious components to more than 1.233 million. Attackers now treat open-source registries as reliable delivery channels for malware, optimised to bypass perimeter defences and execute directly on developer workstations.
Lazarus Group, a notorious North Korean state-linked actor, exemplifies this evolution. Their campaigns are manufacturing lines rather than opportunistic strikes.
Analysis shows that 97 percent of Lazarus-associated activity concentrated on npm, targeting the JavaScript ecosystem where dependency churn is highest. These actors utilise “social engineering mimicry,” publishing packages that impersonate standard front-end tooling like Tailwind or Vite plugins to trick developers.
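To make the mimicry concrete, here is a minimal Python sketch of the kind of name-similarity check a defender might run against incoming package names. The “popular” list and the 0.85 threshold are illustrative assumptions, not Sonatype’s detection method; real typosquat detection is considerably more sophisticated.

```python
# Sketch: flag package names that closely mimic popular front-end tooling.
# The POPULAR list and the 0.85 similarity threshold are illustrative
# assumptions, not a production detection rule.
from difflib import SequenceMatcher

POPULAR = ["tailwindcss", "vite", "postcss", "autoprefixer", "eslint"]

def mimicry_score(candidate: str) -> tuple[str, float]:
    """Return the popular package the candidate most resembles, and the ratio."""
    best = max(POPULAR, key=lambda p: SequenceMatcher(None, candidate, p).ratio())
    return best, SequenceMatcher(None, candidate, best).ratio()

for name in ["tailwindcsss", "vite-plugin-legit", "left-pad"]:
    target, score = mimicry_score(name)
    if score >= 0.85 and name != target:
        print(f"Suspicious: {name!r} resembles {target!r} (similarity {score:.2f})")
```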
Most concerning is the emergence of self-replicating malware. In late 2025, a worm dubbed “Shai-Hulud” demonstrated the ability to propagate autonomously. It did not wait for a developer to manually install it; it compromised maintainer credentials and spread across repositories and machines without human intervention.
For DevOps teams, this implies that the developer environment is no longer behind the firewall; it is the new perimeter.
The intelligence gap of AI agents
As organisations rush to integrate Large Language Models (LLMs) into their development workflows, a quiet crisis of data quality is emerging. AI coding assistants are powerful, but they suffer from a temporal blindness that makes them risky advisors for dependency management.
Tests involving nearly 37,000 upgrade recommendations revealed that GPT-5 hallucinated version numbers at a rate of 27.8 percent. The models frequently recommend package versions that do not exist, or worse, suggest upgrading to versions that were compromised after the model’s training cut-off date.
The danger amplifies through developer trust in these tools. Katie Norton, Research Manager at IDC, explains that “developers accept an average of 39 percent of AI-generated code without revision, highlighting how often AI output is incorporated as-is.”
Because these models lack a live connection to registry data, they cannot know that a previously safe package was hijacked yesterday. In one instance, an AI agent confidently recommended sweetalert2 version 11.21.2, a version known to contain “protestware” that executes a political payload.
While upgrading to the “latest” version is often touted as a best practice, doing so blindly is expensive. Analysis suggests that an unmanaged “latest version” strategy costs approximately $29,500 (£23,000) per application in developer hours spent fixing breaking changes. AI-driven recommendations also fared poorly in efficiency tests, often suggesting upgrades that degraded an application’s security posture.
For developers, the lesson is straightforward: AI needs guardrails. Automation without live intelligence leads to confident errors.
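As a minimal example of such a guardrail, the sketch below checks an assistant-suggested npm version against the live registry before anything is installed. It assumes network access and uses only the public registry metadata endpoint; error handling is kept minimal for brevity.

```python
# Sketch: verify that an AI-suggested package version actually exists on the
# live npm registry before installing it. Existence is a necessary check,
# not a sufficient one.
import json
import urllib.request

def version_exists(package: str, version: str) -> bool:
    """Check the public npm registry for the exact version the assistant suggested."""
    url = f"https://registry.npmjs.org/{package}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        meta = json.load(resp)
    return version in meta.get("versions", {})

suggested = ("sweetalert2", "11.21.2")
if not version_exists(*suggested):
    print(f"Refusing {suggested[0]}@{suggested[1]}: version not on the registry")
else:
    print(f"{suggested[0]}@{suggested[1]} exists; still check advisories before installing")
```

Existence alone is not enough, as the sweetalert2 case shows: a version can exist and still be dangerous, so an advisory lookup belongs in the same gate (a sketch of one appears in the next section).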
Collapse of vulnerability data increases open-source risks
Even when teams attempt to do the right thing by scanning for vulnerabilities and patching promptly, they are often let down by the data layer. The global system for tracking vulnerabilities is struggling to keep pace with the speed of software release cycles.
In 2025, nearly 65 percent of open-source CVEs (Common Vulnerabilities and Exposures) lacked a severity score from the National Vulnerability Database (NVD). This leaves security teams flying blind, unable to prioritise effectively. When researchers analysed these unscored vulnerabilities, they found that 46 percent were actually ‘High’ or ‘Critical’ severity.
The median time for the NVD to score a vulnerability was 41 days last year, with some taking up to a year. In an era where exploits are often available within hours of disclosure, a six-week lag renders the official data nearly useless for immediate triage.
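Teams need not wait on the NVD alone. As a supplement, the sketch below queries OSV.dev, an aggregation service that tracks open-source advisories independently of NVD scoring; the Log4j example is chosen only for illustration.

```python
# Sketch: query the OSV.dev aggregator for known advisories against a specific
# package version, rather than waiting for NVD severity scores to land.
import json
import urllib.request

def osv_advisories(ecosystem: str, name: str, version: str) -> list[dict]:
    """Return OSV advisory records for one exact package version."""
    payload = json.dumps({
        "version": version,
        "package": {"name": name, "ecosystem": ecosystem},
    }).encode()
    req = urllib.request.Request(
        "https://api.osv.dev/v1/query",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp).get("vulns", [])

# A Log4Shell-era version: this returns advisories immediately, no NVD lag.
for vuln in osv_advisories("Maven", "org.apache.logging.log4j:log4j-core", "2.14.1"):
    print(vuln["id"], vuln.get("summary", ""))
```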
This failure cascades down to the consumption layer. Organisations continue to download known vulnerable components at alarming rates, not necessarily due to negligence, but because the signal-to-noise ratio in security tooling is so poor.
“Alert fatigue” often causes teams to ignore warnings, leading to the persistent use of vulnerable libraries like Log4j years after fixes became available. In 2025 alone, Log4j versions vulnerable to Log4Shell were downloaded 42 million times, exposing organisations to a critical flaw patched more than four years ago.
Transparency as a licence to operate
Governments and regulators have lost patience with the “move fast and break things” ethos. Transparency is becoming a legal requirement for doing business.
Estimates suggest 90 percent of global organisations now fall under one or more regulatory mandates requiring evidence of software assurance. In the EU, the Cyber Resilience Act (CRA) and NIS2 Directive are forcing companies to demonstrate control over their supply chain. In the US, Executive Order 14028 has made Software Bills of Materials (SBOMs) a prerequisite for federal procurement.
This changes compliance from a box-ticking exercise into an engineering discipline. A security policy alone is no longer sufficient; organisations must produce artifacts (SBOMs, signed provenance, attestations) that can be shown to have been generated during the build process.
For vendors and enterprise development teams, this means the era of “just trust us” is over. Procurement teams and auditors now demand machine-readable evidence that software was built securely. Organisations that cannot produce a clean SBOM on demand will find themselves locked out of markets and contracts.
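What “on demand” looks like in practice can start with automated checks over the SBOM itself. The sketch below, assuming a CycloneDX JSON document, flags components missing the fields auditors typically ask about; a real attestation pipeline would also verify signatures and provenance.

```python
# Sketch: a quick completeness check over a CycloneDX JSON SBOM before
# handing it to a procurement team. Field names follow the CycloneDX JSON
# schema; this is a sanity check, not an attestation.
import json
import sys

def incomplete_components(sbom_path: str) -> list[str]:
    """List components missing the version or purl auditors expect to see."""
    with open(sbom_path) as f:
        sbom = json.load(f)
    problems = []
    for comp in sbom.get("components", []):
        missing = [k for k in ("name", "version", "purl") if not comp.get(k)]
        if missing:
            problems.append(f"{comp.get('name', '<unnamed>')}: missing {', '.join(missing)}")
    return problems

if __name__ == "__main__":
    for line in incomplete_components(sys.argv[1]):
        print(line)
```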
Reducing the burden on open-source registries
The strain on registries like Maven Central is largely self-inflicted. Automated consumption is straining shared infrastructure, with 86 percent of Maven Central traffic in 2025 coming from Cloud Service Providers (CSPs).
Redundant downloads – where CI/CD systems pull the same artifacts thousands of times – account for a large share of this traffic. Simple changes, such as implementing durable caching and using private repositories to proxy public registries, can dramatically reduce the load.
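A repository manager acting as a caching proxy is the standard answer, but the principle fits in a few lines. The sketch below illustrates a durable, content-addressed cache that hits the network only once per artifact; it is a teaching sketch, not a substitute for a proper proxy.

```python
# Sketch: a durable local cache in front of a public registry, so repeated
# CI runs fetch an artifact once instead of thousands of times.
import hashlib
import urllib.request
from pathlib import Path

CACHE = Path(".artifact-cache")

def fetch_cached(url: str) -> bytes:
    """Return the artifact at `url`, hitting the network only on a cache miss."""
    CACHE.mkdir(exist_ok=True)
    key = hashlib.sha256(url.encode()).hexdigest()
    path = CACHE / key
    if path.exists():
        return path.read_bytes()   # cache hit: no registry traffic
    with urllib.request.urlopen(url, timeout=30) as resp:
        data = resp.read()
    path.write_bytes(data)         # cache miss: store for future runs
    return data
```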
Addressing the security crisis demands a “block by default” approach. Organisations should employ repository firewalls to prevent known malicious components and “shadow downloads” from entering the development lifecycle in the first place. If a component cannot be vetted, it should not be installable.
Finally, we must acknowledge that software ages. End-of-Life (EOL) components that are no longer supported by maintainers constitute a permanent risk. Data indicates that 18.5 percent of NuGet components and over 25 percent of npm packages in enterprise environments are effectively abandoned. These are liabilities that cannot be patched; they must be removed.
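Staleness is one imperfect but useful signal of abandonment. The sketch below reads the npm registry’s publish timestamps and flags packages untouched for two years; the cutoff is an illustrative assumption, and a stale package is not automatically EOL.

```python
# Sketch: flag npm dependencies whose last publish is years in the past,
# a rough proxy for abandonment. The two-year cutoff is an illustrative
# assumption; "stale" is a signal to review, not proof of EOL.
import json
import urllib.request
from datetime import datetime, timedelta, timezone

STALE_AFTER = timedelta(days=2 * 365)

def last_publish(package: str) -> datetime:
    """Read the registry's 'modified' timestamp for a package."""
    url = f"https://registry.npmjs.org/{package}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        meta = json.load(resp)
    return datetime.fromisoformat(meta["time"]["modified"].replace("Z", "+00:00"))

for pkg in ["left-pad", "react"]:
    age = datetime.now(timezone.utc) - last_publish(pkg)
    if age > STALE_AFTER:
        print(f"{pkg}: last published {age.days} days ago; review for removal")
```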
As Christopher Robinson, CTO at the Open Source Security Foundation, states: “The report demonstrates how package repositories and the software housed within them are critical assets that need support if they hope to continue providing services to the developers and consumers using them.”
The software supply chain has become the nervous system of the global economy. Protecting it requires moving beyond passive consumption to active and intelligent governance.