Three stories broke this week that, taken together, paint a clear picture of where the technology industry is heading: AI is rewriting the economics of legacy software, trust in digital platforms is fraying under pressure from surveillance concerns, and the infrastructure powering the AI boom is running out of room. Here is what happened, and what it means for anyone working in tech.
IBM Shares Drop After Anthropic Reveals AI's COBOL Rewriting Capability
IBM shares fell 13 percent on Monday after Anthropic published a blog post describing how its Claude Code tools could dramatically accelerate the refactoring of COBOL applications. The market reaction was immediate and, in some respects, disproportionate.
The real story is more complicated than the stock drop suggests. COBOL is not some niche curiosity: the language runs payroll systems for federal governments, reservation platforms for major airlines, and transaction processing for the largest banks in the world. The problem is not that COBOL exists. The problem is that the pool of programmers who understand it is shrinking by attrition, and modernization projects have historically cost hundreds of millions of dollars with a high failure rate. Anthropic's post argued that Claude Code can read COBOL, understand its business logic, and produce equivalent code in a modern language at a pace that was previously impossible.
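To make the translation task concrete, here is a toy sketch of the kind of work involved: a short, hypothetical COBOL paragraph and a hand-written Python equivalent of its business logic. The COBOL snippet, variable names, and figures are illustrative assumptions, not examples from Anthropic's post; real migrations involve decades of interdependent programs and copybooks.

```python
# Hypothetical COBOL being translated:
#   COMPUTE WS-INTEREST ROUNDED = WS-BALANCE * WS-RATE / 100.
#   IF WS-BALANCE GREATER THAN 10000
#       MOVE "PREMIUM" TO WS-TIER
#   ELSE
#       MOVE "STANDARD" TO WS-TIER.

from decimal import Decimal, ROUND_HALF_UP


def interest_charge(balance: Decimal, rate_pct: Decimal) -> Decimal:
    """Mirror COBOL's fixed-point COMPUTE ... ROUNDED arithmetic."""
    interest = balance * rate_pct / Decimal(100)
    # COBOL's ROUNDED clause rounds half away from zero; for positive
    # amounts, ROUND_HALF_UP reproduces that behaviour.
    return interest.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)


def account_tier(balance: Decimal) -> str:
    """Mirror the IF / MOVE tier logic above."""
    return "PREMIUM" if balance > Decimal(10000) else "STANDARD"


print(interest_charge(Decimal("12500.00"), Decimal("4.5")))  # 562.50
print(account_tier(Decimal("12500.00")))                     # PREMIUM
```

Even this trivial case shows where the difficulty lives: the hard part is not the syntax but preserving behaviour exactly, down to COBOL's fixed-point rounding rules, which is why `Decimal` rather than floating point is the natural target.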
IBM's mainframe business has long benefited from that complexity. The harder it is to leave, the less likely customers are to try. If AI genuinely lowers the exit barrier, the competitive dynamics shift. Whether Anthropic's claims hold up at enterprise scale, in production systems with decades of undocumented interdependencies, remains an open question. But investors clearly decided they did not want to wait for the answer.
For developers, the story is less about disruption and more about opportunity. COBOL modernization is one of the most underserved problems in enterprise software, and engineers who combine AI tooling skills with knowledge of legacy systems are going to be in demand regardless of which platforms handle the translation work. If you are thinking about where programming skills are heading, any assessment of the programming languages with the strongest career outlook in 2026 now has to weigh AI-assisted development seriously. And for those curious about which AI tools are actually useful day to day, the best AI tools ranked by real-world usefulness covers the landscape beyond the headlines.
IBM was notably skeptical of AI infrastructure spending earlier this year. That skepticism looks different now that the tools are targeting its own installed base.
Discord Severs Verification Service Linked to US Surveillance Efforts
Discord ended its partnership with Persona Identities after security researchers discovered nearly 2,500 files sitting openly on a U.S. government server, with no exploit required to access them. What the files revealed was striking: Persona does not just check whether a user is old enough to view a platform. It performs 269 distinct verification checks, including facial recognition against watchlists, screening for politically exposed persons, and scanning for adverse media across 14 categories including terrorism and espionage.
The partnership lasted less than a month and affected only a small number of test users, according to Discord. Any submitted data was deleted within seven days. But the episode raises questions that go well beyond this particular integration. Verification services have become a standard feature of platforms responding to regulatory pressure around age and identity. Most users have no idea what those services actually do once they hand over a photo ID or agree to a facial scan.
Persona is partially backed by Peter Thiel, a venture capitalist with significant investments in defence and intelligence technology. The presence of references to active intelligence programs in the exposed files suggests that the verification ecosystem is more entangled with government infrastructure than its consumer-facing presentation implies.
For anyone working in cybersecurity or building software that handles user data, this is a useful case study in third-party risk. The vulnerability was not a clever hack but a misconfigured server: the files were simply left in a place where anyone could find them. Understanding how to identify and close that kind of exposure is core to any serious security practice. Ethical hacking courses increasingly cover third-party attack surface analysis as a distinct discipline, and cybersecurity certifications that include cloud configuration auditing are worth prioritising if this is your field. For a broader look at why VPNs and network privacy tools matter in this environment, the full guide to what a VPN actually does is a practical starting point.
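That kind of exposure is detectable before an outsider finds it. As a minimal sketch, the function below scans a storage bucket's access policy, modeled loosely on AWS S3's JSON policy format, and flags statements that grant read access to any principal. The policy document here is hypothetical, and a real audit would cover far more than read permissions; the point is that "anyone can fetch these files" is a configuration state you can check programmatically.

```python
# Flag policy statements that allow object reads to the public.
# Policy shape is modeled on AWS S3 bucket policies (hypothetical example).

def public_read_statements(policy: dict) -> list[dict]:
    """Return statements that Allow read actions to any principal."""
    flagged = []
    for stmt in policy.get("Statement", []):
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        # Principal "*" (or {"AWS": "*"}) means "everyone on the internet".
        is_public = stmt.get("Principal") in ("*", {"AWS": "*"})
        allows_read = any(
            a in ("s3:GetObject", "s3:ListBucket", "*") for a in actions
        )
        if stmt.get("Effect") == "Allow" and is_public and allows_read:
            flagged.append(stmt)
    return flagged


policy = {
    "Statement": [
        {"Effect": "Allow", "Principal": "*", "Action": "s3:GetObject",
         "Resource": "arn:aws:s3:::example-bucket/*"},
        {"Effect": "Allow",
         "Principal": {"AWS": "arn:aws:iam::123456789012:root"},
         "Action": "s3:PutObject",
         "Resource": "arn:aws:s3:::example-bucket/*"},
    ]
}

for stmt in public_read_statements(policy):
    print("PUBLIC READ:", stmt["Resource"])
```

Running a check like this against every third-party integration's storage, not just your own, is exactly the attack surface analysis the courses above now treat as its own discipline.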
AI Surge Triggers Massive Storage Shortage in Data Center Market
The AI boom has a hardware problem, and it is not the one most people are focused on. While GPU shortages dominated the conversation through 2024, the constraint now quietly reshaping the industry is storage. Data centers are buying solid-state drives and hard disk drives at a pace that manufacturers cannot match, driven by the enormous storage demands of AI training runs and inference workloads.
Enterprise-grade hard drives now carry lead times stretching beyond two years in some product categories. Hyperscalers are pivoting toward cheaper quad-level cell SSDs as a stopgap, but that substitution is creating its own pressure elsewhere in the supply chain. Prices for some storage components have climbed more than 100 percent in six months. Manufacturers are warning that the shortage will persist well into 2026.
The consumer side is beginning to feel it too. When enterprise buyers absorb production capacity at scale, the components that would otherwise reach retail markets either disappear or become significantly more expensive. Anyone who has tried to buy high-capacity drives for a home server or workstation recently may have noticed prices moving in the wrong direction.
This is worth watching for anyone building systems that depend on storage-intensive workloads, including machine learning pipelines and data engineering infrastructure. The cost assumptions that made certain architectures practical six months ago may not hold through the end of 2026. Data engineering careers are directly exposed to this dynamic, since the infrastructure decisions made at the pipeline design stage have real cost implications when component prices are volatile. More broadly, the shortage is a reminder that the AI build-out is a physical phenomenon with physical constraints, and the data center capacity story is not only about power grids. Storage, cooling, and interconnect are all under pressure simultaneously.
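A back-of-envelope model makes the sensitivity obvious. The sketch below amortizes the hardware cost of a replicated storage tier and shows what a doubling in per-terabyte prices does to the monthly figure; every number in it (capacity, price, replication factor, amortization window) is an illustrative assumption, not market data.

```python
# Illustrative cost model: amortized monthly hardware cost for a
# replicated storage tier. All figures are assumptions for the sketch.

def monthly_storage_cost(capacity_tb: float, price_per_tb: float,
                         replication_factor: int = 3,
                         amortization_months: int = 36) -> float:
    """Amortized monthly hardware cost for replicated storage."""
    hardware = capacity_tb * replication_factor * price_per_tb
    return hardware / amortization_months


baseline = monthly_storage_cost(500, price_per_tb=20.0)  # assumed $20/TB
squeezed = monthly_storage_cost(500, price_per_tb=40.0)  # after a 100% rise

print(f"baseline:             ${baseline:,.2f}/month")
print(f"after price doubling: ${squeezed:,.2f}/month")
```

Because hardware price is a linear factor, a 100 percent component price rise doubles the amortized cost outright, which is why pipeline architectures sized against six-month-old price assumptions deserve a second look.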
A Unifying Theme
These three stories share an underlying theme. The AI wave is moving fast enough to outrun the assumptions that existing systems were built on. IBM's mainframe business assumed migration would always be too expensive and risky to be worth attempting. Discord's verification integration assumed that third-party services would stay within their stated scope. Data center planners assumed that GPU supply would remain the primary constraint. All three assumptions are being tested at once.
For developers and engineers, the practical implication is the same in each case: people who understand the new tools alongside the legacy systems, the security implications, and the infrastructure realities will be better positioned than those who understand only one layer. The technology is moving. The foundations it is being built on are complicated, and understanding them is not becoming less important.
Frequently Asked Questions
Why does IBM's mainframe business depend on COBOL being hard to replace?
IBM earns significant revenue from mainframe hardware, software licenses, and support contracts tied to COBOL-based systems. The complexity and risk of migrating decades-old production code have historically made customers reluctant to switch, keeping them locked into IBM infrastructure. If AI tools genuinely lower the cost and risk of migration, that lock-in weakens.
What is COBOL and why is it still in use?
COBOL is a programming language developed in the late 1950s and designed for business data processing. It remains in use because critical systems, particularly in banking, government, and insurance, were built on it and have been running reliably for decades. The cost and risk of rewriting them have consistently outweighed the benefits, until now.
What did Persona's verification software actually do beyond age checks?
According to the exposed files, Persona performed 269 distinct checks including facial recognition against watchlists, screening for politically exposed persons, and adverse media scanning across categories including terrorism, espionage, and financial crime. The scope went well beyond the age verification function that Discord publicly described.
How does the storage shortage affect AI development costs?
Storage is a significant input cost for AI training runs and inference infrastructure. When enterprise buyers absorb available production capacity, prices rise and lead times extend. This makes it more expensive to build and operate the data pipelines and model training systems that AI development depends on, particularly for smaller organisations that cannot secure long-term supply contracts.
What should developers know about third-party verification and data privacy risk?
The Discord-Persona case illustrates that third-party services integrated into platforms may collect and process significantly more data than their public documentation suggests. Developers building applications that rely on verification or identity services should audit exactly what data those services collect, how it is stored, and whether its exposure through misconfiguration or breach would create legal or reputational liability.