Finding a physical space to store our voluminous cloud-based data is a problem, sure, but keeping the servers cooled down is another, much bigger problem--and an environmentally unfriendly one at that. Instead of installing expensive cooling systems, future networked data centers could use the waste heat of computing to keep people warm.
A new paper from Microsoft Research proposes using servers as "data furnaces," installed in homes or businesses and connected to the air ducts. A data furnace would be a rectangular metal cabinet attached to the ductwork and hot-water pipes, and would look like any other furnace. Homeowners wouldn't even notice the difference — except, of course, for the huge power draw that a server requires.
Microsoft researchers Jie Liu, Michel Goraczko, Sean James and Christian Belady, working with Jiakang Lu and Kamin Whitehouse at the University of Virginia, explain that the exhaust from a typical computer server is not hot enough to use for electricity generation. But server exhaust typically runs between 104 and 122 degrees F, Gizmag points out, which is enough to heat a home. In data centers, servers are usually cooled with fans and air-conditioning systems, which is why data centers are often sited in chilly regions or far from populated areas, but that's not very efficient.
Data furnaces would be much smaller than typical data centers, consisting of 40 to 400 Internet-connected CPUs depending on the size of the home or business, and could conceivably save homeowners and IT firms money and resources. There would be less need to construct huge new spaceship-like data centers, for instance, and micro-centers distributed throughout a residential area or office park would provide lower network latency, Gizmag notes.
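For a rough sense of scale, here is a back-of-the-envelope sketch of how much heat a unit that size would throw off. The ~200 W per server is an assumption, not a figure from the paper; essentially all of the electricity a server draws leaves it as heat.

```python
# Back-of-the-envelope heat output of a "data furnace".
# ASSUMPTION: ~200 W average draw per server (not from the article;
# plausible for a modest 1U machine). Nearly all of it becomes heat.

WATTS_PER_SERVER = 200  # assumed average electrical draw, in watts

for servers in (40, 400):  # the range quoted in the article
    heat_kw = servers * WATTS_PER_SERVER / 1000
    btu_per_hour = heat_kw * 3412  # 1 kW is about 3,412 BTU/h
    print(f"{servers:>3} servers ~ {heat_kw:5.1f} kW ~ {btu_per_hour:9,.0f} BTU/h")
```

Under that assumption, 40 servers put out about 8 kW (roughly 27,000 BTU/h), in the neighborhood of a small home furnace, while 400 servers would be closer to a commercial boiler.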
Liu et al. point out that old servers could easily be recycled into homes, serving as backup capacity during disk maintenance.
Security is one obvious question — how could IT companies ensure that a client's confidential data is safe in some random family's basement? What about floods, power outages, or server snafus?
Microsoft answers these questions by suggesting that host households agree to change the air filters occasionally and to shut off the servers if required, in exchange for free heat. What about free Windows updates? No word on that, sorry.
Has anyone run any efficiency calculations on these things? I certainly wouldn't want one in my house hogging my power.
While I see nothing wrong in utilizing the waste heat coming out of a server stack to heat a building's airspace (provided the server stack was already in use there), adding the servers to a house for no other reason than to supplement the existing heating system is ridiculously inefficient.
Clearly, no one put much thought into this idea.
I'm doing this already! I live in New York; last winter my computer ran non-stop in my 10-by-13-foot room, which kept it toasty, in the 70s. At night I would turn the heater on, just because I've spent the last five years in the South.
Consensus is that it works great. I'm using the computer anyway, and it's putting off a lot of heat either way, so why waste heat when you can put it to good use?
To Mars or bust!
Sure, it's 105 degrees here, and like I need a furnace.
Just because summer's hot doesn't mean winter won't be cold.
Why learn from your own mistakes when you could learn from the mistakes of others?
So one night you bring your date home. She looks around the room and says, "Uh, what's that?" You reply, "Oh, that thing? I use it to play 'Halo' on."
What kind of engineer thinks that 104 to 122 degrees F is not enough heat to generate electricity? Any temperature differential could drive a Stirling system or a thermoelectric generator, to say the least (see the sketch after this comment for what the Carnot limit says about that range). Whatever money these guys are making, it's too much.
Also, Jefro brings up a good point: the servers would still need to be actively cooled during the summer, which would increase the cooling load for the whole structure, so it would be like cooling a house or office building plus a server room.
And, as my final counterpoint: even in winter the exhaust heat may be entering the house or building, but without cold air coming in from outside they would still need to air-condition the servers. And if they're going to use outside air for cooling, they could skip putting servers in private homes entirely and just build external air intakes into server facilities to use the cold outside air in the winter. The whole thing seems unnecessary to me.
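For context on the electricity-generation question, here is a minimal sketch of the Carnot limit over the quoted exhaust range, assuming a room-temperature (~68 degrees F) cold sink; real Stirling or thermoelectric hardware would recover only a fraction of this ceiling.

```python
# Carnot ceiling for converting 104-122 F server exhaust to electricity.
# ASSUMPTION: the cold sink is room air at ~68 F (20 C).

def fahrenheit_to_kelvin(f):
    return (f - 32) * 5 / 9 + 273.15

T_COLD = fahrenheit_to_kelvin(68)  # assumed ambient sink temperature

for t_hot_f in (104, 122):  # the exhaust range quoted in the article
    t_hot = fahrenheit_to_kelvin(t_hot_f)
    eta = 1 - T_COLD / t_hot  # Carnot efficiency bound
    print(f"exhaust at {t_hot_f} F: Carnot limit ~ {eta:.1%}")
```

That works out to a ceiling of roughly 6 to 9 percent before any real-world losses, which is some electricity, but it helps explain why the researchers treat this as heating-grade rather than generation-grade heat.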
Why don't they just make users store their own information? Like, all of my Facebook and Google info is on my computer, and it heats my house; and for extra security, if I want to take everything offline I can do so instantly and temporarily.
In Austin, Texas, we hit 105 today. Now, if you can convert the heat energy into a cooling medium, then you are on to something.
I don't know about you guys, but do I really need 40 power-hogging, Internet-connected CPUs just to run my house? Come to think of it, I might need them to heat my showers and my lunch.
If people actually read the article, it suggests that the data centers would be owned by private firms who would also pay the electric bill to operate them. Duh. They are only talking about ducting the heat away into nearby living spaces, which is standard practice with industrial waste heat and is usually called "district heating." However, the quality of the heat is not likely to be great, even for things like heating water for a shower, though it might be OK for pre-heating. As a source for a heat pump it would probably suffice (see the sketch after this comment), but I would imagine that it would have to be supplemented by some other source.
Furthermore, computer designers are constantly working on ways to bring down the amount of heat generated by processors. Somebody has to pay the electric bills for the servers, and it's not really "free" heat so much as energy that failed to be converted into computing power.
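On the heat-pump point, a minimal sketch of the ideal (Carnot) heating COP shows why warm server exhaust makes a far better source than winter air. The 55 C delivery temperature and the 0 C outdoor air are illustrative assumptions, not figures from the article.

```python
# Ideal (Carnot) heating COP: warmer source -> smaller lift -> higher COP.
# ASSUMPTIONS: 55 C hot-water delivery; 0 C winter air vs. 40 C exhaust.

def carnot_heating_cop(t_source_c, t_sink_c):
    """Upper bound on heating COP between Celsius source and sink temps."""
    t_source = t_source_c + 273.15
    t_sink = t_sink_c + 273.15
    return t_sink / (t_sink - t_source)

for source_c in (0, 40):
    cop = carnot_heating_cop(source_c, 55)
    print(f"source at {source_c:>2} C -> ideal COP ~ {cop:.1f}")
```

That comes out to an ideal COP near 6 from freezing outdoor air versus roughly 22 from 40 C exhaust; real machines reach only a fraction of either ceiling, but the relative advantage of the warmer source carries over.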
"Mildred, it's getting cold in here! Fire up the P vs NP problem!"
I'm a facilities manager at an office complex in the northeast. In my buildings, we're cooling almost 9 months per year. We start up at the tail end of March and run through the middle of November. Even on warmer winter days we're not producing heat but using outside air to reduce the temp before it builds up to 80 degrees inside.
The heat load is tremendous. All of the lighting, printers, computers, servers, and misc. equipment produce a ton of heat. The sun can heat a window shade to 120 degrees even in the fall. The massive number of people crammed into cubicles also produces heat. If it's 50 degrees outside and sunny, we may have to run the chillers if we can't cool the buildings with outside air. I don't see how this would work, at least at our complex.
I have actually been recycling server heat during the past three winters at my Utah home. I have a micro data center (30+ CPUs) in my basement, which I created from the ground up, DIY. It is fully enclosed within 8-inch concrete walls, fully insulated with anti-static material, and is about 100 square feet. It is fully redundant in all senses: three bonded broadband connections (BGP), UPS systems and an unattended power generator for blackouts, redundant gateways and routers, and an offsite redundant rack at my office in Japan, with a backup server at my Japan home. Anyway, I have been wondering what to do with the heat produced by the servers during the summer and whether it would be possible to generate electricity from it. Maybe I should use it to heat my water. Any ideas?
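For a ballpark on the water-heating idea, here is a minimal sketch; the ~150 W of heat per machine and the 50 percent heat-exchanger capture are guesses, not measured figures.

```python
# Ballpark: how much water could ~30 servers' waste heat pre-heat?
# ASSUMPTIONS: ~150 W of heat per machine; a heat exchanger captures
# about half of it; water is pre-heated by 30 C (e.g. 15 -> 45 C).

N_SERVERS = 30
WATTS_EACH = 150        # assumed heat output per machine
CAPTURE = 0.5           # assumed heat-exchanger capture fraction
C_WATER = 4186          # specific heat of water, J/(kg*K)
DELTA_T = 30            # temperature rise, K

captured_w = N_SERVERS * WATTS_EACH * CAPTURE
flow_kg_per_s = captured_w / (C_WATER * DELTA_T)
print(f"captured heat: {captured_w / 1000:.2f} kW")
print(f"continuous pre-heat flow: {flow_kg_per_s * 3600:.0f} liters/hour")
```

Under those assumptions that's about 2.25 kW captured, enough to raise roughly 65 liters of water per hour by 30 C, while the Carnot math above caps electricity generation from the same heat below 10 percent, so pre-heating water looks like the more practical use.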