
Search for: All records

Creators/Authors contains: "Agonafer, Dereje"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Abstract Transistor density trends until recently followed Moore's law, doubling every generation and thereby increasing power density. After the breakdown of Moore's law, computational performance gains were achieved with multicore processors, leading to nonuniform power distribution and localized high temperatures that make thermal management even more challenging. Cold plate-based liquid cooling has proven to be one of the most efficient technologies for overcoming these thermal management issues. Traditional liquid-cooled data center deployments provide a constant flow rate to servers irrespective of the workload, leading to excessive consumption of coolant pumping power; the efficiency of liquid cooling in data centers can therefore be improved further. The present investigation proposes the implementation of dynamic cooling using an active flow control device to regulate coolant flow rates at the server level. This device can save pumping power by controlling flow rates based on server utilization. The device consists of a V-cut ball valve connected to a micro servo motor used to vary the valve angle. The valve position was varied by servomotor actuation at predetermined rotational angles to change the flow rate through the valve. The device was characterized by quantifying the flow rates and the pressure drop across it at different valve positions, using both computational fluid dynamics and experiments. The proposed flow control device was able to vary the flow rate between 0.09 lpm and 4 lpm across valve positions.
    Free, publicly-accessible full text available December 1, 2023
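The valve characterization described above (discrete servo angles mapped to measured flow rates) lends itself to a simple lookup-plus-interpolation scheme. The sketch below assumes a hypothetical calibration table: only the 0.09–4 lpm endpoints come from the abstract, and the intermediate angle/flow pairs are illustrative, not measured data.

```python
# Hypothetical calibration of a V-cut ball valve: servo angle (degrees) vs.
# measured flow rate (lpm). Only the 0.09-4 lpm range comes from the entry
# above; intermediate points are illustrative assumptions.
ANGLES_DEG = [0, 15, 30, 45, 60, 75, 90]
FLOW_LPM   = [0.09, 0.3, 0.8, 1.6, 2.6, 3.4, 4.0]

def flow_for_angle(angle_deg: float) -> float:
    """Linearly interpolate the flow rate for a commanded servo angle."""
    if not ANGLES_DEG[0] <= angle_deg <= ANGLES_DEG[-1]:
        raise ValueError("angle outside calibrated range")
    # Walk the calibration segments and interpolate within the matching one.
    for (a0, q0), (a1, q1) in zip(zip(ANGLES_DEG, FLOW_LPM),
                                  zip(ANGLES_DEG[1:], FLOW_LPM[1:])):
        if a0 <= angle_deg <= a1:
            t = (angle_deg - a0) / (a1 - a0)
            return q0 + t * (q1 - q0)
    raise AssertionError("unreachable")
```

In a deployment, a controller would command a predetermined angle and use such a table (built from the CFD/experimental characterization) to know the resulting flow rate.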
  2. Abstract Over the last decade, several hyperscale data center companies such as Google, Facebook, and Microsoft have demonstrated the cost-saving capabilities of airside economization with direct/indirect heat exchangers by moving to chiller-less air-cooled data centers. Under pressure from data center owners, information technology equipment OEMs like Dell and IBM are developing information technology equipment that can withstand peak excursion temperature ratings of up to 45 °C, clearly outside the recommended envelope and into ASHRAE's A4 allowable envelope. As popular and widespread as these cooling technologies are becoming, airside economization comes with its challenges. There is a risk of premature hardware failures or reliability degradation posed by uncontrolled fine particulate and gaseous contaminants in the presence of temperature and humidity transients. This paper presents an in-depth review of the particulate and gaseous contamination-related challenges faced by modern data center facilities that use airside economization. The review summarizes specific experimental and computational studies that characterize the airborne contaminants and the associated failure modes and mechanisms. In addition, standard lab-based and in-situ test methods for measuring the corrosive effects of the particles and gases under different temperature and relative humidity conditions, as a means of testing the robustness of equipment against these contaminants, are also reviewed. The paper also outlines cost-sensitive mitigation techniques, such as improved filtration strategies, that can be utilized for efficient implementation of airside economization.
    Free, publicly-accessible full text available September 1, 2023
  3. Abstract Structural components such as printed circuit boards (PCBs) are critical in the thermomechanical reliability assessment of electronic packages. Previous studies have shown that geometric parameters such as thickness and mechanical properties such as the elastic modulus of PCBs directly influence the reliability of electronic packages. Elastic material properties of PCBs are commonly characterized using equipment such as tensile testers and then used in computational studies. In certain applications, however, viscoelastic material properties are important. Viscoelastic behavior becomes evident once the glass transition temperature of a material is exceeded, and operating conditions or manufacturing processes such as lamination and soldering may expose components to temperatures above the glass transition temperature. Knowing the viscoelastic behavior of the different components of electronic packages is important for performing accurate reliability assessments and for designing components such as PCBs that remain dimensionally stable after manufacturing. Previous researchers have used creep and stress relaxation test data to obtain the Prony series terms that represent viscoelastic behavior and perform analysis. Others have used dynamic mechanical analysis to obtain frequency-domain master curves that were converted to the time domain before obtaining the Prony series terms. In this paper, nonlinear solvers were applied to frequency-domain master curves from dynamic mechanical analysis to obtain Prony series terms, and finite element analysis was performed to assess the impact of adding viscoelastic properties to reliability assessment. The computational results were used in a comparative assessment of the impact of including viscoelastic behavior in reliability analysis under thermal cycling and drop testing for wafer-level chip-scale packages.
    Free, publicly-accessible full text available June 13, 2023
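The Prony-series representation mentioned in the abstract models a relaxation modulus as G(t) = G_inf + Σᵢ gᵢ·exp(−t/τᵢ). One common simplification (not necessarily the paper's exact procedure) is to fix the relaxation times τᵢ on a logarithmic grid, which turns the fit into a linear least-squares problem. The sketch below uses synthetic data from a known two-term model; all numerical values are illustrative.

```python
import numpy as np

# Prony-series fit with fixed, log-spaced relaxation times: with the tau_i
# fixed, G(t) = G_inf + sum_i g_i * exp(-t / tau_i) is linear in
# (G_inf, g_i), so ordinary least squares suffices. Data below is synthetic.

def fit_prony(t, G, taus):
    """Least-squares fit of Prony coefficients for fixed relaxation times."""
    A = np.column_stack([np.ones_like(t)] + [np.exp(-t / tau) for tau in taus])
    coeffs, *_ = np.linalg.lstsq(A, G, rcond=None)
    return coeffs  # [G_inf, g_1, ..., g_n]

def eval_prony(t, coeffs, taus):
    """Evaluate the fitted Prony series at times t."""
    G_inf, gs = coeffs[0], coeffs[1:]
    return G_inf + sum(g * np.exp(-t / tau) for g, tau in zip(gs, taus))

# Synthetic "measured" relaxation data from a known two-term model.
t = np.logspace(-2, 3, 200)
G_true = 1.0 + 3.0 * np.exp(-t / 0.1) + 2.0 * np.exp(-t / 10.0)

taus = np.logspace(-2, 2, 5)           # fixed, log-spaced relaxation times
coeffs = fit_prony(t, G_true, taus)
err = np.max(np.abs(eval_prony(t, coeffs, taus) - G_true))
```

The fitted terms can then be supplied directly to a finite element solver's viscoelastic material card; the paper itself used nonlinear solvers on frequency-domain master curves, which the linear fixed-τ fit above only approximates.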
  4. Abstract The continuous rise of cloud computing and other web-based services propelled the data center proliferation seen over the past decade. Traditional data centers use vapor-compression-based cooling units that not only reduce energy efficiency but also increase operational and initial investment costs due to the redundancies involved. Free air cooling and airside economization can substantially reduce information technology equipment (ITE) cooling power consumption, which accounts for approximately 40% of the energy consumption of a typical air-cooled data center. However, this cooling approach entails an inherent risk of exposing the ITE to harmful ultrafine particulate contaminants, potentially reducing equipment and component reliability. The present investigation attempts to quantify the effects of particulate contamination inside the data center equipment and ITE room using computational fluid dynamics (CFD). The boundary conditions were established through detailed modeling of the ITE and the data center white space. Both two-dimensional and three-dimensional simulations were performed for detailed analysis of particle transport within the server enclosure. The effect of the primary pressure-loss obstructions inside the server, such as heat sinks and dual inline memory modules, was analyzed to visualize localized particle concentrations within the server. A room-level simulation was then conducted to identify the locations of highest particle concentration within the data center space. The results show that parameters such as higher velocities, heat sink cutouts, and higher-aspect-ratio features within the server tend to increase the particle concentration inside the servers.
    Free, publicly-accessible full text available June 1, 2023
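The finding that higher velocities increase particle concentration inside servers is consistent with particle-inertia scaling: the Stokes number Stk = ρ_p·d_p²·U / (18·μ·L) grows with velocity, so particles deviate more from streamlines and impact obstructions. The back-of-the-envelope sketch below uses illustrative values (particle density, size, airflow speed, and feature length are assumptions, not taken from the study).

```python
# Back-of-the-envelope Stokes number for a fine particle in server airflow:
#   Stk = rho_p * d_p**2 * U / (18 * mu * L)
# Stk << 1 means the particle closely follows the air; larger Stk means more
# inertial impaction on obstructions such as heat sink fins.
def stokes_number(rho_p, d_p, U, mu, L):
    """Dimensionless Stokes number from particle and flow properties (SI units)."""
    return rho_p * d_p**2 * U / (18.0 * mu * L)

# Illustrative case: a 2.5 um particle (density ~2000 kg/m^3) carried at
# 3 m/s in air (mu ~ 1.8e-5 Pa*s) past a 5 mm characteristic fin gap.
stk = stokes_number(rho_p=2000.0, d_p=2.5e-6, U=3.0, mu=1.8e-5, L=5e-3)
# Doubling U doubles Stk, pushing deposition toward impaction-dominated
# behavior near high-velocity, small-gap features, in line with the CFD trend.
```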
  5. Free, publicly-accessible full text available January 1, 2023
  6. Abstract Data centers are large groups of networked servers used by organizations for computational and storage purposes. In 2014, data centers consumed an estimated 70 billion kWh in the United States alone. It is incumbent on thermal engineers to develop efficient methods that minimize cooling expenditure, given the limited available power resources. One key area of electronic cooling research is the issue of nonuniform power distribution at the rack, server, and even package levels. Nonuniform heating at the chip level creates hotspots and temperature gradients across the chip, which in turn significantly increase the cost of cooling, since cooling cost is a function of the maximum junction temperature. This challenge has increased the use of temperature-sensing mechanisms to help find ways to mitigate the gradients. A very effective way to conserve pumping power and address hotspots on single- or multichip modules is targeted delivery of liquid coolant. One way to enable such targeted delivery is to use dynamic cold plates coupled with a self-regulating flow control device that adjusts the flow rate based on temperature. This novel technology is more effective when coupled with a good control strategy. This paper addresses the development, optimization, and testing of such a control strategy with minimal sensors and low latency.
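A self-regulating, temperature-based flow control loop of the kind described can be sketched as a clamped proportional controller. The setpoint, gain, and limits below are illustrative assumptions (the flow range reuses the 0.09–4 lpm device range from entry 1); the paper's actual control strategy is not reproduced here.

```python
# Minimal sketch of a temperature-based coolant flow control step: raise the
# flow rate when the sensed temperature exceeds a setpoint, clamped to the
# valve's operating range. Setpoint, gain, and limits are illustrative.
FLOW_MIN, FLOW_MAX = 0.09, 4.0   # lpm, valve range from entry 1 above
SETPOINT_C = 65.0                # hypothetical target temperature, deg C
KP = 0.2                         # hypothetical proportional gain, lpm per deg C

def next_flow(current_flow_lpm: float, temp_c: float) -> float:
    """One proportional-control update, clamped to the valve's range."""
    flow = current_flow_lpm + KP * (temp_c - SETPOINT_C)
    return min(FLOW_MAX, max(FLOW_MIN, flow))
```

With few sensors, each control step needs only the current flow command and one temperature reading, which keeps the loop latency low; a real implementation would add integral action or hysteresis to avoid valve chatter.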
  7. Abstract The adoption of single-phase liquid immersion cooling (Sp-LIC) for information technology equipment provides an excellent cooling platform coupled with significant energy savings. There are, however, very limited studies of the reliability of this cooling technology. The JEDEC Accelerated Thermal Cycling (ATC) test is relevant only for air cooling; no comparable standard exists for immersion cooling. The ASTM D3455 benchmark, with appropriate adjustments, was adopted to test material compatibility, because air and dielectric fluids differ in heat capacity and hence in the ramp rates achieved during thermal cycling. For this study, accelerated thermal degradation of printed circuit boards (PCBs), passive components, and fiber optic cables submerged in air, white mineral oil, and synthetic fluid at an elevated temperature of 45 °C and 35% humidity was undertaken. This paper serves multiple purposes, including designing experiments and testing and evaluating the material compatibility of PCBs, passive components, and optical fibers in different hydrocarbon oils for single-phase immersion cooling. Samples of different materials were immersed in different hydrocarbon oils and in air and kept in an environmental chamber at 45 °C for a total of 288 hours. Samples were then evaluated for their mechanical and electrical properties using a dynamic mechanical analyzer (DMA) and a multimeter, respectively. The cross-sections of some samples were also investigated for structural integrity using SEM. The literature gathered on the subject and the quantifiable data gathered by the authors provide the primary basis for this research.
  8. Thermally conductive gap filler materials are used as thermal interface materials (TIMs) in electronic devices due to their numerous advantages, such as high thermal conductivity, ease of use, and conformability. Silicone is a class of synthetic materials based on a polymeric siloxane backbone that is widely used in thermal gap filler materials. In electronic packages, silicone-based thermal gap fillers are widely used in industry, whereas silicone-free thermal gap fillers are emerging as alternatives for numerous electronics applications. Characterization of these TIMs is of immense importance, since they play a critical role in heat dissipation and the long-term reliability of electronic packages. The scarcity of studies on the effects of various chemical compounds on the properties of silicone-based and silicone-free TIMs motivated this work, which focuses on the effect of thermal aging on the mechanical, thermal, and dielectric properties of silicone-based and silicone-free TIMs and on the chemical compounds that cause the changes in these properties. Characterization techniques such as dynamic mechanical analysis (DMA), thermomechanical analysis (TMA), differential scanning calorimetry (DSC), Fourier transform infrared spectroscopy (FTIR), and broadband dielectric spectroscopy (BbDS) are used to study the mechanical, thermal, and dielectric characteristics of these TIMs, guiding a better understanding of their applicability and reliability. The experiments demonstrate that upon thermal aging at 125 °C, the silicone-free TIM becomes hard, while the silicone-based TIM remains viscoelastic, indicating its applicability to higher-temperature applications over long durations. Although the silicone-based TIM displays better mechanical and thermal properties at elevated temperatures, dielectric measurements indicate lower conductivity for the silicone-free TIM, making it the better candidate for silicone-sensitive applications where higher electrical insulation is desired.
  9. To reproduce a Digital Twin (DT) of a data center (DC), input data is required, which is collected through site surveys. Data collection is an important step, since an accurate representation of a DC depends on capturing the necessary detail for the various model fidelity levels of each DC component. However, guidance is lacking as to which components within the DC are crucial for achieving the desired accuracy of the computational model, and determining the input values of component object parameters during a site survey is an exercise in engineering judgment. Sensitivity analysis can be an effective methodology to determine how the level of simplification in component models affects model accuracy. In this study, a calibrated raised-floor DC model is used to study the sensitivity of DC model accuracy to each component's representation. The commercial CFD tool 6SigmaDC Room is used for modeling and simulation. A total of 8 DC components are considered and eventually ranked on the basis of the time and effort required to collect model input data. For each parameterized component object, the full range of input parameter values is considered and simulations are run. The results are compared with the baseline calibrated model to understand the trade-off between survey effort/cost and model accuracy. For the calibrated DC model and the 8 components considered, it was observed that the chilled-water piping branches, data cables, and the cable penetration seal (found within cabinets) have considerable influence on the tile flow rate prediction accuracy.
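The component-ranking procedure described above can be sketched as a one-at-a-time sensitivity sweep: perturb each component parameter of a model, record the change in the predicted output, and rank components by that change. The surrogate model, parameter names, and values below are entirely made up for illustration; the actual study swept full parameter ranges in a calibrated CFD model, not a closed-form stand-in.

```python
# One-at-a-time sensitivity sketch: perturb each parameter of a stand-in
# model and rank parameters by the change in a predicted output, analogous
# to ranking DC components by their effect on tile flow rate prediction.
# The surrogate function and all values are hypothetical.
def predict_tile_flow(params):
    """Toy surrogate for a CFD tile-flow prediction (illustrative only)."""
    return (100.0 - 20.0 * params["cable_fill"]
                  - 10.0 * params["pipe_branch_loss"]
                  - 1.0 * params["cabinet_seal_gap"])

baseline = {"cable_fill": 0.3, "pipe_branch_loss": 0.5, "cabinet_seal_gap": 0.2}

def rank_sensitivity(predict, baseline, delta=0.1):
    """Rank parameters by |output change| under a fixed perturbation."""
    base = predict(baseline)
    scores = {}
    for name in baseline:
        perturbed = dict(baseline)
        perturbed[name] += delta
        scores[name] = abs(predict(perturbed) - base)
    return sorted(scores, key=scores.get, reverse=True)
```

The resulting ranking identifies which component surveys repay their time and effort with model accuracy, which is the trade-off the study quantifies.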