The Server You Buy Today Could Last A Decade
Thanks to scalable virtualization-ready hardware, there's never been a better time to refresh your server infrastructure.
It used to be easy to tell when a server was ready to be put out to pasture. With CPU speeds clicking upward like a shuttle launch, it wasn't terribly hard to justify aging out a 1.3GHz Pentium III server for a 2.26GHz Pentium 4 box. But the yearly boost in clock speed hasn't been in effect for a while now. What's more, we're suddenly very aware of just how little clock speed really matters for a large swath of the data center.
A huge number of older single- and dual-core servers are still running, chewing up tons of power and heat, spinning 3.5-inch U320 SCSI drives that seem more like dinosaurs every day. But they're quickly becoming relics of the pre-virtualization era. We're starting to look at those servers much like we looked at a 1969 GTO Judge in 1976 -- only a few years old and in perfect running condition, but so expensive to drive that you had to get rid of it.
Today, you can't buy a midrange server without getting NUMA and all the massive performance benefits it represents, and you can't buy even a low-end server without at least a quad-core CPU. Even Dell's "My First Server" bottom-line specs include a quad-core option, and the absolute cheapest box comes with a 2.7GHz dual-core Pentium G630 and 4GB of RAM for $299. That's 75 percent of the cost of the low-end iPad, for crying out loud. I've had far, far bigger bar tabs -- just ask my editor.
As we roll headlong into the virtualization era, we're tossing all these older boxes out the door and realizing the server that once took up four rack units, weighed 95 pounds, and had three 1,000-watt power supplies can now be virtualized -- and run as one of many virtual servers on a 1U server with a few Opterons -- while still enjoying a performance boost. In many companies, the measure of a virtualization initiative's success is not just the number of servers removed, but the number of server racks removed. The biggest initiatives measure success in entire data centers.
But what does that mean for the current generation of servers? Exactly how long do we expect them to live? They require less power and generate less heat, which should increase their lifespans. Many have no local disk, so we're left with boxes that have hot-swappable fans as their only moving parts. I think this is all a fantastic development, naturally. (After all, we all know how I loathe hard drives in servers.)
I think we'd need two major improvements occurring simultaneously to mark the next significant step up in server hardware. What lifted the current generation was a combination of mature hardware support for virtualization and the multicore NUMA boosts from AMD and then Intel. Those elements worked hand in glove to get us to a position where most medium-size to large enterprises find they need a remarkably small number of physical servers to run their entire data center. Unless we see a massive rise in software resource requirements, the same state of affairs could hold true for the next decade.
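As a quick, hypothetical illustration of the hardware side of that combination, a Linux admin can confirm that a box exposes the virtualization extensions (Intel VT-x reports the vmx CPU flag, AMD-V reports svm) and see how many logical processors it offers by reading /proc/cpuinfo:

```shell
# Count logical processors the kernel sees.
grep -c '^processor' /proc/cpuinfo

# Hardware-assisted virtualization shows up as a CPU flag:
# vmx = Intel VT-x, svm = AMD-V.
if grep -qE '\b(vmx|svm)\b' /proc/cpuinfo; then
    echo "hardware virtualization extensions present"
else
    echo "no vmx/svm flag found"
fi
```

Hypervisors such as KVM depend on one of those flags to run hardware-accelerated guests, which is why the maturing of VT-x and AMD-V mattered as much as the rising core counts did.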
That's really the key. It's simply no longer true that most corporate applications require massive computing resources. You may find a few hulking Oracle database servers that will devour whatever hardware you throw at them, but in many cases that's because they're designed and implemented poorly. Most of what actually runs the company requires very few resources on modern hardware.
If you take all this into account and add the hot-swap RAM and CPU features in the Nehalem-EX chip, the future looks far less like the recent past of the x86 server and more like the ancient past of the mainframe.
With all these elements converging now, it's quite easy to see there's never been a better time to pull the trigger on that big server upgrade or that virtualization expansion. Go forth and buy servers with as many cores as possible, damn the clock speed. They're likely to be useful far longer than you might think.