An interview with Gordon Johnson, a certified data center design professional, Data Center Energy Practitioner (DCEP), and CFD and electrical engineer, regarding the use of CFDs and containment.
Data center airflow management engineers have used Computational Fluid Dynamics (CFD) programs for years to model the complex movement of air in data centers. CFD models pinpoint areas where airflow can be improved in order to provide consistent cooling and energy savings.
Gordon, what is the principal way CFDs are used with regard to containment?
We use CFDs to develop two basic data sets. The first is the baseline, or current airflow pattern. This initial CFD model shows the supply intake temperature at each cabinet and determines the effectiveness of each AC unit in terms of airflow volume, return air temperature, delta T, and supply air temperature.
The second is the proposed design, in which the CFD engineer uses information from the baseline model to apply airflow management best practices that separate supply from return airflow. Typically, several models are created in order to adjust airflow volume, set point temperatures, and individual aisle supply volumes.
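To make the baseline data set concrete, here is a minimal sketch of the per-cooling-unit metrics such a model captures, with delta T derived as return temperature minus supply temperature. The field names and example values are illustrative assumptions, not output from any actual CFD study or tool.

```python
from dataclasses import dataclass

# Minimal sketch of the baseline data set described above. Field names and the
# example values are illustrative assumptions, not output from a real CFD study.

@dataclass
class CoolingUnitBaseline:
    name: str
    airflow_cfm: float        # measured airflow volume delivered by the unit
    return_temp_f: float      # return air temperature entering the unit
    supply_temp_f: float      # supply air temperature leaving the unit

    @property
    def delta_t_f(self) -> float:
        """Delta T across the unit: return air minus supply air temperature."""
        return self.return_temp_f - self.supply_temp_f

# Example: a unit returning 75 F air and supplying 55 F air has a 20 F delta T.
crac_1 = CoolingUnitBaseline("CRAC-1", airflow_cfm=12000, return_temp_f=75.0, supply_temp_f=55.0)
print(crac_1.delta_t_f)   # 20.0
```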
Gordon, are there situations in which the CFD engineer does not recommend containment?
Not really, because the entire basis of airflow management is the full separation of supply and return airflow. Any time these two airflows mix, energy is lost and the supply temperature delivered to the IT thermal load becomes inconsistent.
We have seen CFDs used by manufacturers to prove product effectiveness. What are some ways CFDs are made to exaggerate product effectiveness?
Exaggerations usually stem from the principle known as GIGO, short for Garbage In, Garbage Out. This refers to the fact that computers operate by logical processes, and thus will unquestioningly process unintended, even nonsensical input data (garbage in) and produce undesired, often nonsensical output (garbage out).
Let me give you an example. Recently I recreated a CFD model that had been used to demonstrate the effectiveness of airflow deflectors. The purpose of the CFD was to show the difference in energy savings between airflow deflectors and full containment. We found that certain key data points in the models did not reflect industry standards: key settings had been adjusted to fully optimize energy savings without regard to potential changes in the environment. Any adverse effect such changes might have on the cooling system’s ability to maintain acceptable thermal parameters was not revealed in the CFD model. As a result, the model was operating on a fine line that could not be adjusted without a significant impact on its ability to cool the IT load.
Can you give us any specifics?
The airflow volume was manually changed from 154 CFM per kW to 120 CFM per kW, while the industry-standard airflow is 154 CFM per kW. The formula most commonly used is the sensible-heat airflow equation: CFM = (kW × 3412) / (1.08 × ΔT°F).
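The short sketch below simply evaluates that equation to show where the two airflow figures come from; the delta T values are assumptions chosen to reproduce roughly 154 and 120 CFM per kW, not numbers taken from the original models.

```python
# Sensible-heat airflow estimate: CFM = (kW * 3412 BTU/hr per kW) / (1.08 * delta T in F).
# The delta T values below are assumed rack temperature rises, chosen only to show
# how the 154 and 120 CFM-per-kW figures fall out of the formula.

BTU_PER_HR_PER_KW = 3412   # 1 kW of IT load dissipated as heat
AIR_FACTOR = 1.08          # BTU/hr removed per (CFM x degree F) for air near sea level

def cfm_required(kw: float, delta_t_f: float) -> float:
    """Airflow (CFM) needed to remove `kw` of heat at a given air temperature rise."""
    return kw * BTU_PER_HR_PER_KW / (AIR_FACTOR * delta_t_f)

print(round(cfm_required(1.0, 20.5)))  # ~154 CFM per kW, the industry-standard planning figure
print(round(cfm_required(1.0, 26.3)))  # ~120 CFM per kW, i.e. a hotter exhaust with no margin
```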
An airflow of 120 CFM per kW does not give the cooling system any margin for potential changes to the environment.
Another key area of unrealistic design is the placement of cabinet thermal load and high-volume grates. The base model places high-kW loads in specific, isolated areas surrounded by high-volume grates. What happens, then, if additional load is placed in an area of low-volume airflow? Changes to rack kW in areas without high-volume grates cannot be accounted for. At the end of the day, any change to the IT load would require an additional airflow management audit to determine how it would affect the cooling solution. The proposed model is therefore unrealistic, because no data center would adopt a cooling solution that requires regular modification.
Are you recommending a CFD study every time you make changes to the data center thermal load?
No. A full separation of supply and return airflow eliminates the guesswork with regard to the effects of air mixing. It also eliminates the need to place specific high-volume perforated tiles or grates in front of high-kW loads. Instead, a CFD model would incorporate expected increases to the aisle thermal load, in line with a “plus 1” approach to cooling. Creating a positive pressure of supply air has many additional benefits, such as lowering IT equipment fan speeds and ensuring a consistent supply temperature across the face of the IT intake.
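As an illustration of that positive-pressure idea, here is a minimal sketch, not a tool or threshold Gordon describes, that checks whether a contained aisle’s total supply airflow exceeds the airflow its IT load is estimated to demand, using the 154 CFM-per-kW figure above; the 10% headroom is an assumed margin.

```python
# Minimal sketch: does a contained aisle receive slightly more supply air than its
# IT load demands (positive pressure)? The 10% headroom is an assumed margin.

CFM_PER_KW = 154          # industry-standard planning figure discussed above
SUPPLY_MARGIN = 0.10      # assumed headroom; not a figure from the interview

def aisle_has_positive_pressure(aisle_it_load_kw: float, supplied_cfm: float) -> bool:
    """True if supplied airflow covers the estimated IT demand plus the margin."""
    demanded_cfm = aisle_it_load_kw * CFM_PER_KW
    return supplied_cfm >= demanded_cfm * (1 + SUPPLY_MARGIN)

# A 40 kW contained aisle demands about 6,160 CFM; 7,000 CFM of supply leaves
# roughly 13% headroom, while 6,200 CFM barely covers demand with no margin.
print(aisle_has_positive_pressure(40, 7000))   # True
print(aisle_has_positive_pressure(40, 6200))   # False
```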
Data centers should not be operated with little margin for changes or adjustments to the thermal load. That is why I always recommend a full containment solution with as close to 0% leakage as possible. This is always the most efficient way to run a data center, and it yields the best return on investment. A full containment solution, with no openings at the aisle-end doors or above the cabinets, easily allows the contained cold aisles to operate with a slightly greater supply of air than is demanded. This in turn ensures a minimal temperature change from the bottom to the top of each rack in the contained aisle, which allows the data center operator to choose predictable and reliable supply temperature set points for the cooling units. The result? Large energy savings, a longer mean time between failures, and a more reliable data center.
What do you recommend as to the use of CFD studies and containment?
It’s important to create both an accurate baseline and a sustainable cooling solution design. The baseline model gives data center operators an accurate representation of how the facility is currently being cooled, and the proposed cooling solution can be used in numerous ways:
- Accurate energy savings estimates
- Safe set point standards
- Future cabinet population predictions
- The ability to cool future kW increases
- Identification and elimination of potential hot spots
Subzero Engineering endorses accurate and realistic CFD modeling that considers real-world situations in order to create real-world solutions.