Tuesday, February 17, 2009

Server Cost Adders for Higher-temp Operation

Numerous industry notables, including Microsoft's Christian Belady, have been advocating the operation of data centers at higher ambient temperatures. Reducing or eliminating cooling plant costs could yield considerable savings. But what does it take to build servers designed to operate at these higher temperatures?

As mentioned in a previous post, telecommunications equipment is typically designed to meet the NEBS standards (55°C maximum ambient). Cost adders for NEBS equipment include the following:
  • Higher-temperature integrated circuits (ICs). Commercial-grade ICs are generally rated to 70°C; higher ambient temperatures could force the use of extended-temperature components.
  • Heat sink costs. Higher temperatures often drive more expensive heat sink materials (e.g., copper rather than aluminum) and more use of heat sinks on components that don't need them at lower temperatures. For example, some servers rated to operate at higher temperatures need heat spreaders on their DIMMs.
  • Corrosive gases tolerance. Telecommunications equipment generally needs to pass tests to ensure reliability in the presence of corrosive gases, including high sulfur-content air. Before dismissing this requirement, consider the case of air-side economizers: if you're bringing in outside air, do you need to worry about contaminants in the air, such as diesel exhaust from nearby trucks or from diesel generators?
  • Wider humidity range. Most NEBS equipment is designed for a wider range of allowable humidity exposure than most data center equipment. The broader use of economizers might make a wider humidity range desirable for data centers.
  • Flame tests. NEBS flame tests may be overkill for most data center equipment, in part because most data centers have sprinklers or other fire suppression controls (unlike telecom central offices, which do not have sprinklers).
  • Shake and vibe tests. NEBS equipment is generally tested to meet seismic Zone 4 earthquake requirements. These tests could just as well apply to data center equipment, but they go beyond what most data center equipment is validated against.
  • Materials selection. The use of V0-rated plastics and HF-1 or better foams in data center equipment is not necessarily a cost adder if designed in up front, but it can add appreciable expense if retrofits have to be made after-the-fact.
  • Air filters. Data center equipment generally doesn't need air filters, so these can be eliminated.
  • Long life. This actually encompasses two aspects: extended availability of certain components and long-life reliability. Telecom products often require the availability of the same components for 5-7 years, much longer than typical data center products. Similarly, telecom products often are designed to meet usable lifetimes that are much longer than most data center refresh cycles.
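The IC temperature rating and heat sink bullets above come down to thermal headroom: steady-state junction temperature is roughly T_j = T_ambient + P × θ_ja, where θ_ja is the total junction-to-ambient thermal resistance. A minimal sketch of that arithmetic follows; all power, limit, and thermal-resistance numbers are hypothetical illustrations, not vendor data, but they show why raising ambient from 25°C to 55°C can push a cheap aluminum heat sink out of spec while a copper one still passes.

```python
# Rough junction-temperature estimate: T_j = T_ambient + P * theta_ja.
# All component values below are illustrative assumptions, not vendor data.

def junction_temp(t_ambient_c, power_w, theta_ja_c_per_w):
    """Steady-state junction temperature for a given heat sink."""
    return t_ambient_c + power_w * theta_ja_c_per_w

POWER_W = 15.0   # hypothetical component dissipation
T_J_MAX = 85.0   # hypothetical junction limit for a commercial-grade part

# Hypothetical heat sink options (total junction-to-ambient resistance, C/W)
ALUMINUM = 2.5
COPPER = 1.8

for ambient in (25.0, 35.0, 55.0):
    for name, theta in (("aluminum", ALUMINUM), ("copper", COPPER)):
        tj = junction_temp(ambient, POWER_W, theta)
        verdict = "OK" if tj <= T_J_MAX else "over limit"
        print(f"ambient {ambient:4.0f}C, {name:8s}: Tj = {tj:5.1f}C ({verdict})")
```

With these assumed numbers, the aluminum option yields 92.5°C at 55°C ambient (over the 85°C limit) while copper yields 82°C, which is exactly the kind of margin calculation that drives the material and component-grade cost adders above.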

Which of these attributes are needed for equipment in data centers with higher temperatures? What other attributes are needed for higher temps?
