Editor’s Note: The original article (posted below the line) was published on February 24, 2012. An update on the subject matter was sorely needed.
In 2014, we saw much wider adoption of cloud computing technologies. In 2015, we believe energy efficiency will continue to make strides, especially in the data center field.
With more websites and more data-intensive projects like data mining come greater needs for housing that data. But it won’t be easy. Just a couple of weeks ago, Senator Tom Coburn of Oklahoma blocked the Energy Efficiency Improvement Act of 2014, a bill that would have called for energy efficiency improvements in buildings.
The bill would also have updated the government’s estimate of energy usage across all data centers, a figure that hasn’t been refreshed since 2007 and is now badly outdated. Simply put, the bill attempted to bring the government up to date with modern data center needs. Since Senator Coburn is retiring at the end of the Congressional session, however, the bill may still be viable in the following session.
Of course, a huge concern is the impact on the environment. Data centers mean lots of machines, lots of heat, and lots of cooling costs, which is why managing energy is so important for keeping those costs down. Data usage is only going to go up, but technology keeps answering that call, and innovation in the cloud and related technologies will continue to limit the energy these centers consume.
Some of the more obvious ways to increase data center energy efficiency include building newer, more efficient centers and incorporating alternative cooling methods such as seawater cooling. The companies that use the most data, like Google, are the most invested in finding a solution. From acquiring companies that are making waves in the industry to recruiting teams to research and develop new technology, we’re excited to see what’s next for energy saving in this field.
Original:
As technology moves on, it comes under increasing pressure to operate more efficiently. With the focus on cost, energy saving and efficiency, what is the vision for the data center of the future?
Last year brought the idea of seawater cooling for data centers. Google bought a former paper mill in Finland and now uses its long underground tunnels to draw in seawater and cool the data center from beneath. The seawater passes through a filtration process and a heat transfer system; after cooling the data center, the water is itself cooled down before being pumped back into the sea. Beforehand, local ecosystems were monitored to ensure the returning water wouldn’t harm marine life (hence cooling it down prior to re-entering the ocean). This shows Google is keeping a watchful eye on the ecosystem surrounding the data center, making sure nothing is harmed in the process.
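To make that tempering step a little more concrete, here is a minimal back-of-the-envelope sketch in Python. The flow rates and temperatures are purely hypothetical, and the mass-weighted mixing formula is a simplification; it only illustrates how blending the warm return water with cold intake seawater (one approach reportedly used at the Finnish facility) brings the discharge temperature back toward that of the surrounding sea.

```python
# Back-of-the-envelope sketch with hypothetical figures (not Google's actual
# numbers): tempering warm return water by blending it with cold intake
# seawater before discharge.

def blended_discharge_temp(warm_flow_lps: float, warm_temp_c: float,
                           cold_flow_lps: float, cold_temp_c: float) -> float:
    """Mass-weighted average temperature of two mixed water streams.

    Flows are in litres per second, temperatures in degrees Celsius.
    Assumes equal density and specific heat for both streams.
    """
    total_flow = warm_flow_lps + cold_flow_lps
    return (warm_flow_lps * warm_temp_c + cold_flow_lps * cold_temp_c) / total_flow

if __name__ == "__main__":
    # Illustrative only: 200 L/s of 20 C return water blended with
    # 600 L/s of 8 C raw seawater.
    t_out = blended_discharge_temp(200, 20.0, 600, 8.0)
    print(f"Discharge temperature: {t_out:.1f} C")  # ~11.0 C
```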
Energy efficiency is a key consideration in all aspects of IT. In the data center, new approaches to lowering PUE (Power Usage Effectiveness, the ratio of total facility energy to the energy used by the IT equipment itself) include more effective cooling and more efficient delivery of power to the data center.
The average data center in the US has a PUE of around 2.0, but this is likely to change in the future. Industry standards are evolving, and colocation providers must keep improving their efficiency ratings to comply with future requirements across all aspects of their IT infrastructure.
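As a quick illustration of what a PUE of 2.0 means in practice, here is a minimal sketch (with hypothetical power figures) that computes PUE as total facility power divided by IT equipment power: at 2.0, every watt delivered to the servers is matched by another watt spent on cooling, power conversion and other overhead.

```python
# Minimal PUE sketch with hypothetical figures. PUE is defined as total
# facility energy divided by the energy consumed by the IT equipment alone.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: 1.0 is the theoretical ideal."""
    return total_facility_kw / it_equipment_kw

if __name__ == "__main__":
    it_load_kw = 500.0    # hypothetical server/storage/network load
    overhead_kw = 500.0   # hypothetical cooling, UPS and lighting load
    facility_kw = it_load_kw + overhead_kw

    print(f"PUE: {pue(facility_kw, it_load_kw):.2f}")  # 2.00, the US average cited above
    # Cutting overhead to 150 kW would bring PUE down to 1.30.
    print(f"Improved PUE: {pue(it_load_kw + 150.0, it_load_kw):.2f}")
```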
With some already exciting examples of new cooling methods, and with the effect cloud computing is having on data centers, there seem to be good things in store. Reduced downtime, energy-efficient operation and smaller components are obvious areas to watch, and it will be exciting to see what form the data center of the future takes.
Author Bio: Amy writes for Direct Sight, a leading retailer of glasses and sunglasses online.