When you think of the government, chances are you don’t think of the word “efficiency.”
But these days, money is tight for everyone. As a result, Americans are finally pushing their government to be more financially responsible than it has been in the past, albeit with limited success.
While the government, for whatever reason, can’t seem to cut spending on education, entitlements or the military, advances in technology have made it possible to significantly reduce costs by deploying the latest solutions on the market.
And nowhere is this sentiment truer than in the data center realm.
Between 1998 and 2010, the federal government quadrupled the number of data centers it operated. Yet according to the White House, those data centers used only 27 percent of their computing capacity.
Seems awfully wasteful, right?
To rein in some of the country’s operating expenses, President Barack Obama announced plans to shut many of them down. According to the White House, the goal is to close more than 800 data centers by 2015, saving taxpayers over $3 billion.
At the end of the day, if you can implement new technology that would easily reduce expenses, what excuse do you have for saying no?
How Do You Consolidate Increasing Resources Anyway?
As we as a society begin relying more and more on technology — particularly computing resources that are delivered through the cloud — the need to house all of the infrastructure that powers the country’s collective IT becomes that much more pronounced.
A large data center can consume as much electricity as an entire town, so it’s imperative that data centers operate as efficiently as possible. Their electric bills may be high, but the energy should at least be doing useful work. Because a significant share of a facility’s power draw goes to fixed overhead such as cooling, lighting and power distribution, one data center running at full capacity is much more efficient than two running at half capacity.
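To see why, consider a toy model in Python. The power figures below are invented for illustration, not measured from any real facility, but the shape of the math holds wherever each site carries a fixed overhead:

```python
# Toy model: why one full data center beats two half-empty ones.
# The kW figures are illustrative assumptions, not real measurements.

FIXED_OVERHEAD_KW = 400.0   # cooling, lighting, power distribution per site
IT_LOAD_FULL_KW = 1000.0    # IT load of one site running at full capacity

def total_power(sites: int, utilization: float) -> float:
    """Total draw for `sites` facilities, each at the given utilization."""
    return sites * (FIXED_OVERHEAD_KW + IT_LOAD_FULL_KW * utilization)

one_full = total_power(sites=1, utilization=1.0)    # 1,400 kW
two_half = total_power(sites=2, utilization=0.5)    # 1,800 kW

# Same useful IT work either way, but the consolidated site draws
# roughly 22% less power because the overhead is paid only once.
print(one_full, two_half)
```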
So how can we pack more computing resources into smaller spaces? The question might sound paradoxical, but several factors are making data center consolidation possible. With that in mind, let’s take a look at three of them:
Moore’s Law
First and foremost, Moore’s law tells us that the number of transistors on a chip, and with it computing power, roughly doubles every two years. Gordon Moore made the observation in 1965 and settled on the two-year cadence in 1975, so you can imagine how much compounded change we’re talking about by now.
Simply put, you’ll need fewer computing resources to complete tasks when those resources are considerably more powerful than they were just a short while ago.
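To get a feel for the compounding, here’s a quick back-of-the-envelope sketch. The 1975 start date and a strict two-year doubling are the usual textbook idealization rather than a measured trend:

```python
# Back-of-the-envelope Moore's law compounding.
# Assumes an idealized two-year doubling period starting in 1975.

START_YEAR = 1975
DOUBLING_PERIOD_YEARS = 2

def growth_factor(year: int) -> float:
    """Rough transistor-count multiplier relative to START_YEAR."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return 2 ** doublings

# By 2014 that's (2014 - 1975) / 2 = 19.5 doublings,
# a multiplier of roughly 740,000x.
print(f"{growth_factor(2014):,.0f}")
```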
Virtualization
Back to that stat about 27 percent of computing power being used in federal data centers. That number seems low, doesn’t it?
That’s where server virtualization comes into the picture — something that’s integral to the federal government’s ability to reduce the number of data centers it oversees.
Rather than dedicating a separate physical server to each operating system or application, virtualization lets you run multiple virtual servers side by side on a single physical machine, driving that machine’s utilization much closer to its full computing capacity.
In other words, virtualization would let the federal government put far more of each server’s capacity to work. If servers are averaging 27 percent utilization, then simply by virtualizing them, the government could consolidate roughly every four servers onto one physical machine (1 / 0.27 ≈ 3.7).
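Here’s that arithmetic as a quick sketch. The 27 percent figure is the White House stat cited above; the fleet size and the assumption of uniform, consolidation-friendly workloads are purely illustrative:

```python
# Rough consolidation estimate from an average utilization figure.
# Assumes uniform workloads and no headroom reserved for demand spikes.

avg_utilization = 0.27      # White House figure for federal data centers
physical_servers = 10_000   # hypothetical fleet size, for illustration

consolidation_ratio = 1 / avg_utilization          # ~3.7 old servers per host
hosts_needed = physical_servers * avg_utilization  # capacity actually in use

print(f"~{consolidation_ratio:.1f}:1 consolidation")
print(f"{physical_servers:,} servers -> ~{hosts_needed:,.0f} virtualized hosts")
```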
Additionally, in the past, provisioning a new machine meant ordering the equipment, waiting for it to arrive, loading applications, configuring it and the rest of the routine. With virtualization, you can spin up a new virtual instance in minutes. That accelerates innovation and keeps the data center running optimally, on top of consolidating physical space and reducing hardware, storage, and heating and cooling expenses.
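As a simplified illustration of how fast programmatic provisioning can be, here’s a sketch using the libvirt Python bindings. It assumes a QEMU/KVM host with libvirt-python installed, and the domain definition is deliberately bare-bones; a real deployment would attach disks, networking and boot media:

```python
# Minimal sketch of programmatic VM provisioning via libvirt.
# Assumes a QEMU/KVM host; disks and networking omitted for brevity.
import libvirt

DOMAIN_XML = """
<domain type='qemu'>
  <name>demo-vm</name>
  <memory unit='MiB'>512</memory>
  <vcpu>1</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
</domain>
"""

conn = libvirt.open("qemu:///system")   # connect to the local hypervisor
dom = conn.createXML(DOMAIN_XML, 0)     # define and start a transient VM
print(f"Started {dom.name()} in seconds, no purchase order required.")
conn.close()
```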
Because of the wealth of benefits, it comes as no surprise that the virtualization market continues to thrive.
Green Construction
Thanks to advances in construction, the government can also shrink its environmental footprint while using less electricity to keep all of the physical machines inside the data center cool. With green construction tactics, facilities become measurably more efficient because energy is distributed more effectively.
The decision to build a green data center certainly isn’t one to make lightly, since it can be pricey up front, but the savings over the long term are considerable. Given how much electricity data centers consume, building with sustainability in mind is almost a no-brainer.
Taxpayers will certainly appreciate paying lower energy bills over the useful life of the data center.
Finally, the Government Does Something Right
Okay, maybe that subheading is an exaggeration.
While it appears the government is moving in the right direction by consolidating its data centers, what were the decision makers thinking when they quadrupled the number of data centers between 1998 and 2010? That kind of sprawl hardly exudes foresight.
But the government could afford to take things a step further by taking a page from Facebook’s playbook. The social networking juggernaut launched the Open Compute Project, which open-sources server and data center designs in pursuit of the most efficient data center possible.
The end result? Over $1 billion in savings and a 24 percent reduction in operating costs.
One can reasonably expect Facebook’s project to yield increasingly efficient data center designs. The question is: if the goal is data center efficiency, will the government participate?