Amir Michael, Facebook's hardware design manager, introduces the company's new server during a "behind the scenes" media event showing off the latest technology powering Facebook, at its headquarters in Palo Alto, Calif., on April 7, 2011.

Facebook has announced plans to open a five-acre facility in the city of Lulea, located just south of the Arctic Circle. The city of 74,000 people offers easy access to hydroelectric power, and it stays cold there much of the year.

The cold air makes a pretty big difference when you imagine five acres' worth of computers running at full steam. Ever felt the heat of a laptop computer on your lap? Well, multiply that by a zillion. Facebook will be able to, essentially, open the windows and doors and cool the machines with the air already available.

It won't be the first time, either. Facebook's data center in Prineville, Ore., relies on desert air to cool the banks of servers the company has running.

Glenn Fleishman recently toured that facility for a series of articles he wrote in The Economist. He says between new approaches to cooling and machines that can run hotter without breaking down, air conditioning bills aren't as high. "It used to be servers needed to run in a very narrow range of acceptable temperatures," he says, "and you had to air condition like crazy so you didn't overheat them. Well, instead of adding more air conditioning, they've redesigned server equipment to operate at temperatures far above what they used to, and so you simply need less cooling to get them to run at optimum conditions."

Cloud computing is becoming increasingly popular, and a lot of tech companies are getting their capacity ready for it. Fleishman says, "The approach in the past was you take off-the-shelf equipment and throw it in there, throw a lot of cooling at it and power, and you just make it work. That's where data centers evolved from not even that many years ago. And now, in order to reduce the enormous cost of cooling and of power, you instead design from top to bottom to use the least amount of power in the most efficient way and to operate in a much broader range of circumstances."

He offers an apt, if somewhat gross, analogy. "It's like a slime mold. This is a bunch of symbiotic organisms that are working together. I no longer buy the server from Company A and the switch from Company B and the rack from Company C and then stick it in a warehouse. It's how do we design it as one complete organism. All the way down from a resistor on a circuit board up to the entire scale of the building where it's located and which direction it faces. And it's that integrative holistic thinking about data centers that's managed to reduce the power consumption already by large factors."

But while air conditioning can be assisted by an open window to the Arctic, you still need the power to run the actual machines. A large data center, according to Greenpeace, uses as much power as 30,000 to 40,000 homes. Facebook's new plant will run on hydroelectric power, unlike its Oregon facility, which is powered mostly by coal.

Casey Harrell from Greenpeace International says the difference is significant: "It's the difference between deciding you're going to run 30,000 to 40,000 U.S. homes on renewable, low- or no-carbon energy versus one of the highest emitters of greenhouse gases, and in the U.S. the largest source, which is coal. So it's a huge impact that most people don't think about when they update what they did over the weekend or post pictures of their kids online."

Also in today's program, a look at Skylanders. Ars Technica's Ben Kuchera tells us about this video game that involves actual physical objects. He says it's a lot of fun.

Follow John Moe at @johnmoe