by AnotherHustler on 2/9/17, 11:49 AM with 2 comments
Does anyone have any experience in this space?
Aside from cost reduction, the elephant in the room is ease of server hardware maintenance. If the dead server has been immersed in oil then it's a messy job to replace components. Also, I'm wondering about the oil itself degrading components - for example, I think oil eats network cables?
I'm wondering if this might be the next big thing? Or just a hassle?
by chha on 2/9/17, 12:02 PM
If immersion cooling doesn't significantly reduce the lifetime of components and doesn't add enough cost (additional mops?) to offset the savings, there is no reason why it shouldn't be useful for most datacenters. Intel was working on this a few years back [2], and as far as I can tell the Cray-2 used immersion cooling for some components in the mid-80s.
[1] - http://perspectives.mvdirona.com/2010/09/overall-data-center...
[2] - https://www.technologyreview.com/s/429179/intel-servers-take...
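The cost-offset argument above can be made concrete with a rough break-even sketch. All figures below (IT load, PUE values, energy price, extra maintenance opex) are hypothetical placeholders for illustration, not numbers from this thread:

```python
# Rough break-even sketch: immersion cooling vs. air cooling.
# Every figure here is a hypothetical placeholder for illustration only.

def annual_cost(it_load_kw, pue, energy_price_per_kwh, extra_opex=0.0):
    """Yearly cost: facility energy (IT load * PUE) plus any extra opex,
    e.g. messier oil-immersed maintenance."""
    hours_per_year = 8760
    energy_kwh = it_load_kw * pue * hours_per_year
    return energy_kwh * energy_price_per_kwh + extra_opex

# Hypothetical assumptions: 500 kW IT load, $0.10/kWh,
# air-cooled PUE 1.6 vs. immersion PUE 1.05 with $40k/yr extra opex.
air = annual_cost(it_load_kw=500, pue=1.6, energy_price_per_kwh=0.10)
oil = annual_cost(it_load_kw=500, pue=1.05, energy_price_per_kwh=0.10,
                  extra_opex=40_000)

savings = air - oil
print(f"air: ${air:,.0f}/yr  oil: ${oil:,.0f}/yr  savings: ${savings:,.0f}/yr")
```

Under these made-up assumptions the energy savings dominate the added maintenance cost; with different opex or a smaller PUE gap the conclusion flips, which is exactly the offset chha is asking about.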
by brudgers on 2/9/17, 2:08 PM
In commercial buildings it is also not uncommon for transformers to be immersed in oil. But oil's flammability creates a substantial fire hazard, and modern building codes address this by limiting the density at which oil-filled transformers can be distributed, specifying fire-resistant separation, and requiring other mitigations as the hazard increases.
Since the oil around servers burns just like the oil around transformers, there probably won't be any special exception for oil-immersed servers in the building code any time soon. So I'd bet against large commodity data center installations; a state-sponsored agency's data center might be another matter.