Title: Three-objective shape optimization and parametric study of a micro-channel heat sink with discrete non-uniform heat flux boundary conditions
A water-cooled multi-die heat sink with parallel rectangular micro-channels was designed to satisfy the operational requirements of a multi-die processor. A shape optimization strategy based on the RSM (response surface method) was used to minimize pressure drop and die maximum case temperatures. The effects of the thermal interface materials and heat spreader between the dies and heat sink were captured by the numerical simulation. The optimization was performed for constant values of coolant flow rate and inlet temperature, as well as the power, location, and surface area of the dies. The influence of channel hydraulic diameter, Reynolds number, thermal entrance length, and total heat transfer surface area on the hydraulic and thermal performance of the heat sink was determined using CFD (computational fluid dynamics) simulations at the RSM design points. A sensitivity analysis was performed to evaluate the effect of the design parameters on the response parameters. The optimum designs were obtained by minimizing a weighted objective function, defined from the response parameters, using the JAYA algorithm. The results of the weighted-sum method were compared with a Pareto-based three-objective optimization using NSGA-II (non-dominated sorting genetic algorithm II). Finally, a parametric study was performed to examine the effect of the design parameters on the response parameters.
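To make the weighted-sum step concrete, the sketch below shows a minimal JAYA loop minimizing a weighted combination of three response parameters (pressure drop and two die case temperatures). The quadratic surrogate functions, weights, and bounds are placeholders for illustration only; in the study the actual surrogates come from the RSM fit to CFD results at the design points.

```python
import numpy as np

# Placeholder quadratic surrogates standing in for the RSM response
# surfaces (pressure drop and the two die case temperatures).
def surrogate_dp(x):  return 1.0 + 2.0 * (x[0] - 0.3) ** 2
def surrogate_t1(x):  return 1.0 + 1.5 * (x[1] - 0.6) ** 2
def surrogate_t2(x):  return 1.0 + 1.0 * (x[0] + x[1] - 1.0) ** 2

def weighted_objective(x, w=(0.4, 0.3, 0.3)):
    # Weighted sum of normalized response parameters (weights assumed).
    return w[0] * surrogate_dp(x) + w[1] * surrogate_t1(x) + w[2] * surrogate_t2(x)

def jaya_minimize(objective, bounds, pop_size=20, iters=200, seed=0):
    """Minimal JAYA loop: candidates move toward the current best and
    away from the current worst; JAYA needs no algorithm-specific tuning."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, lo.size))
    cost = np.apply_along_axis(objective, 1, pop)
    for _ in range(iters):
        best, worst = pop[cost.argmin()], pop[cost.argmax()]
        r1, r2 = rng.random(pop.shape), rng.random(pop.shape)
        trial = np.clip(pop + r1 * (best - np.abs(pop))
                            - r2 * (worst - np.abs(pop)), lo, hi)
        trial_cost = np.apply_along_axis(objective, 1, trial)
        better = trial_cost < cost
        pop[better], cost[better] = trial[better], trial_cost[better]
    return pop[cost.argmin()], cost.min()

x_opt, f_opt = jaya_minimize(weighted_objective,
                             np.array([[0.0, 1.0], [0.0, 1.0]]))
print(x_opt, f_opt)
```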
Award ID(s):
1738793
PAR ID:
10094471
Author(s) / Creator(s):
Date Published:
Journal Name:
Applied Thermal Engineering
Volume:
150
ISSN:
1359-4311
Page Range / eLocation ID:
720-730
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like This
  1. Abstract This paper proposes a computational fluid dynamics (CFD) simulation methodology for the multi-design-variable optimization of heat sinks for natural convection single-phase immersion cooling of high power-density Data Center server electronics. Immersion cooling can remove higher power densities than air cooling, so retrofitting Data Center servers originally designed for air cooling to operate under immersion cooling is of interest. A common area of improvement is optimizing the air-cooled component heat sinks for the fluid and thermal properties of liquid-cooling dielectric fluids. Current heat sink optimization methodologies for immersion cooling demonstrated in the literature rely on a server-level optimization approach. This paper proposes a server-agnostic approach to immersion cooling heat sink optimization by developing a heat sink-level CFD model that generates a dataset of optimized heat sinks over a range of input parameters: inlet fluid temperature, power dissipation, fin thickness, and number of fins. The optimization objective is to minimize heat sink thermal resistance. This research demonstrates an effective modeling and optimization approach for heat sinks. The optimized heat sink designs exhibit improved cooling performance and reduced pressure drop compared to traditional heat sink designs. This study also shows the importance of considering multiple design variables in the heat sink optimization process and extends immersion heat sink optimization beyond server-dependent solutions. The proposed approach can also be extended to other cooling techniques and applications where optimizing the design variables of heat sinks can improve cooling performance and reduce energy consumption.
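As a rough illustration of the heat sink-level objective, the sketch below sweeps a hypothetical fin-count and fin-thickness grid at one operating point and keeps the geometry with the lowest thermal resistance R_th = (T_case − T_inlet)/Q. The case-temperature model, variable ranges, and operating point are placeholders; in the paper each evaluation is a CFD solve.

```python
import itertools

# Placeholder model: the CFD solve would provide T_case in practice.
def case_temperature(t_inlet_c, power_w, fin_count, fin_thickness_mm):
    # Crude proxy: more/thinner fins give more wetted area and a lower
    # temperature rise (flow blockage effects are not modeled here).
    area_proxy = fin_count * (1.0 - 0.05 * fin_thickness_mm)
    return t_inlet_c + power_w / (0.8 * area_proxy)

def thermal_resistance(t_case_c, t_inlet_c, power_w):
    # R_th = (T_case - T_inlet) / Q  [°C/W]
    return (t_case_c - t_inlet_c) / power_w

best = None
for fins, thk in itertools.product(range(10, 41, 5), (0.5, 1.0, 1.5, 2.0)):
    t_case = case_temperature(t_inlet_c=35.0, power_w=250.0,
                              fin_count=fins, fin_thickness_mm=thk)
    r_th = thermal_resistance(t_case, 35.0, 250.0)
    if best is None or r_th < best[0]:
        best = (r_th, fins, thk)

print(f"min R_th = {best[0]:.4f} °C/W at {best[1]} fins, {best[2]} mm thick")
```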
  2. Abstract The increased power consumption and continued miniaturization of high-powered electronic components have presented many challenges to their thermal management. To improve the efficiency and reliability of these devices, the large amount of heat they generate must be properly removed. In this paper, a three-dimensional numerical model has been developed and experimentally validated for several manifold heat sink designs. The goal was to enhance the heat sink's thermal performance while reducing the required pumping power by lowering the pressure drop across the heat sink. The considered designs were benchmarked against a commercially available heat sink in terms of their thermal and hydraulic performance. The proposed manifolds were designed to distribute fluid through alternating inlet and outlet branched internal channels. It was found that the manifold design with 3 channels reduced the thermal resistance from 0.061 to 0.054 °C/W, with a pressure drop 0.77 kPa lower than that of the commercial cold plate. A geometric parametric study was performed to investigate the effect of the manifold's internal channel width on the thermohydraulic performance of the proposed designs. The thermal resistance decreased as the manifold's channel width decreased, down to a certain width, below which the thermal resistance started to increase while pressure drop values remained low. In the 7-channel design, the thermal resistance decreased significantly, by 16.4%, while the pressure drop was kept below 0.6 kPa.
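For context, pumping power scales as the product of pressure drop and volumetric flow rate, so the reported 0.77 kPa reduction translates directly into pump energy savings. The snippet below illustrates the arithmetic with an assumed flow rate (the abstract does not report one), alongside the relative thermal-resistance improvement of the 3-channel design.

```python
# Illustrative only: a nominal 1.5 L/min flow rate is assumed here just
# to show pumping power = pressure drop x volumetric flow rate.
flow_rate_lpm = 1.5
flow_rate_m3s = flow_rate_lpm / 1000.0 / 60.0       # L/min -> m^3/s

dp_reduction_pa = 0.77e3                             # 0.77 kPa -> Pa
pump_power_saving_w = dp_reduction_pa * flow_rate_m3s

r_th_commercial, r_th_3ch = 0.061, 0.054             # °C/W (reported)
r_th_improvement = (r_th_commercial - r_th_3ch) / r_th_commercial

print(f"pumping power saving ≈ {pump_power_saving_w * 1e3:.1f} mW")
print(f"thermal resistance improvement ≈ {r_th_improvement:.1%}")
```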
  3. Abstract Data centers have started to adopt immersion cooling for more than just mainframes and supercomputers. Because air cooling cannot handle recent highly configured servers with higher Thermal Design Power, current thermal requirements in machine learning, AI, blockchain, 5G, edge computing, and high-frequency trading have resulted in a larger deployment of immersion cooling. Dielectric fluids are far more efficient at transferring heat than air. Immersion cooling promises to help address many of the challenges that come with air cooling systems, especially as computing densities increase. Immersion-cooled data centers are easier to expand, quicker to install, more energy-efficient, able to cool almost all server components, more cost-effective for enterprises, and more robust overall. By eliminating active cooling components such as fans, immersion cooling enables a significantly higher density of computing capabilities. When utilizing immersion cooling for server hardware that was designed to be air-cooled, immersion-specific optimized heat sinks should be used. The heat sink is an important component for server cooling efficacy. This research optimizes heat sinks for immersion-cooled servers to achieve the minimum possible case temperature, using multi-objective, multi-design-variable optimization with pumping power as the constraint. A 3.76 kW high-density server, consisting of 2 CPUs and 8 GPUs with heat sink assemblies at their Thermal Design Power along with 32 Dual In-line Memory Modules, was modeled in Ansys Icepak. The optimization is conducted for aluminum heat sinks with pressure drop and thermal resistance as the objective functions, while fin count, fin thickness, and heat sink height are the design variables for all CPU and GPU heat sink assemblies. Optimization for the CPU and GPU heat sinks was done separately, and the optimized heat sinks were then tested in the full server model in Ansys Icepak. The dielectric fluid for this numerical study is EC-110, and the cooling is carried out using forced convection. A Design of Experiments (DOE) is created from the input ranges of the design variables using a full-factorial approach to generate multiple design points. The effect of the design variables on the objective functions is analyzed to establish which parameters have the greater impact on the performance of the optimized heat sink. The optimization study is performed in Ansys optiSLang, with the Adaptive Metamodel of Optimal Prognosis (AMOP) as the sampling method for design exploration. The results provide total-effect values of the heat sink geometric parameters, which, together with 2D and 3D response surface plots for each heat sink assembly, are used to choose the best design point.
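A full-factorial DOE over the three design variables is straightforward to enumerate; the sketch below shows one way to generate the design points. The variable levels are assumptions for illustration, since the abstract does not list the actual ranges used in optiSLang.

```python
import itertools

# Hypothetical levels for each design variable (not the study's values).
fin_count_levels    = [20, 30, 40, 50]
fin_thickness_mm    = [0.5, 1.0, 1.5]
heat_sink_height_mm = [15, 25, 35]

# Full-factorial DOE: every combination of levels is one design point.
design_points = [
    {"fins": n, "t_fin_mm": t, "height_mm": h}
    for n, t, h in itertools.product(fin_count_levels,
                                     fin_thickness_mm,
                                     heat_sink_height_mm)
]

print(f"{len(design_points)} design points")   # 4 x 3 x 3 = 36
# Each point would then be solved in the CFD model to record the two
# objectives (thermal resistance and pressure drop) before fitting the
# metamodel and running the optimization.
```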
  4. The rapid advancement of 5G technology necessitates the development of efficient thermal management solutions to handle the increased heat dissipation demands of high-power electronic components. This study presents an optimization strategy for a microchannel cold plate designed for a prototype 5G front-end system, featuring four 22-Watt chips as heat sources. The cold plate, constructed from aluminum, incorporates multiple rectangular flow channels, evenly spaced to facilitate uniform heat distribution and fed by an inlet runner. The primary objective of this study is to optimize the geometry of the flow channels and the coolant mass flow rate at the runner entrance to minimize entropy generation, thereby enhancing the heat dissipation capability of the cold plate while minimizing pressure drop. To this end, the research applies Bayesian Optimization (BO), Response Surface Methodology (RSM) paired with a Genetic Algorithm (GA), and FMINCON (a sequential quadratic programming optimizer built into MATLAB). These methods are used to fine-tune the channel dimensions and coolant flow rate, and the data used to evaluate the entropy generation of the system are obtained from conjugate heat transfer simulations solved in Ansys Fluent. Using a Gaussian Process model to build the response surface and predict the entropy generation function, the results indicate that BO outperforms RSM paired with GA and FMINCON in terms of entropy reduction for the same number of samples.
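The sketch below outlines a basic Bayesian Optimization loop of the kind described: a Gaussian Process surrogate fit to sampled points, with new candidates chosen by expected improvement. The closed-form entropy-generation stand-in, bounds, and sample counts are placeholders; in the study each evaluation is a conjugate heat transfer simulation in Ansys Fluent, and this is not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Placeholder entropy-generation model over (channel width [mm],
# mass flow rate [kg/s]); a CFD solve would replace this in practice.
def entropy_gen(x):
    w, mdot = x
    return (w - 0.8) ** 2 + 4.0 * (mdot - 0.02) ** 2 + 0.1

rng = np.random.default_rng(1)
bounds = np.array([[0.3, 1.5], [0.005, 0.05]])

# Initial space-filling samples, then iterate: fit GP, pick the candidate
# with the largest expected improvement, evaluate it, repeat.
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(6, 2))
y = np.array([entropy_gen(x) for x in X])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(20):
    gp.fit(X, y)
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(500, 2))
    mu, sigma = gp.predict(cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    imp = y.min() - mu
    ei = imp * norm.cdf(imp / sigma) + sigma * norm.pdf(imp / sigma)
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, entropy_gen(x_next))

print("best design:", X[np.argmin(y)], "entropy proxy:", y.min())
```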
  5. The increasing prevalence of high-performance computing data centers necessitates the adoption of cutting-edge cooling technologies to ensure the safe and reliable operation of their powerful microprocessors. Two-phase cooling schemes are well suited for high heat flux scenarios because of their high heat transfer coefficients and their ability to enhance chip temperature uniformity. In this study, we perform experimental characterization and deep learning-driven optimization of a commercial two-phase cold plate. The initial working design of the cold plate comprises a fin height of 3 mm, a fin thickness of 0.1 mm, and a channel width of 0.1 mm. A dielectric coolant, Novec/HFE-7000, was delivered to the microchannel fins through impinging jets. A copper block simulated an electronic chip with a surface area of 1 in × 1 in. The experiment was conducted at three coolant inlet temperatures of 25 °C, 36 °C, and 48 °C, with heat flux levels ranging from 7.5 to 73.5 W/cm². The effects of coolant inlet temperature and flow rate on the thermo-hydraulic performance of the cold plate were explored. In two-phase flow, increasing the coolant inlet temperature activates more nucleation sites and consequently improves thermal performance. Thermal resistance drops with flow rate in single-phase flow, while it is not affected by flow rate in the nucleate boiling region. The cold plate design was then improved by cutting the original fins to create channels perpendicular to the original channels, with the goal of increasing the number of bubble nucleation sites and the flow velocity at the fin roots. Three design parameters were defined: fin height, width of the machined channels, and height of the short fins preserved within the machined channels. It was observed that widening the machined channels and cutting the fins up to a point can improve the thermal performance of the cold plate. However, removing fins excessively degrades the thermal performance because of the loss of heat transfer surface area. Moreover, preserving the short fins within the machined channels decreases thermal resistance, as they increase heat transfer surface area and nucleation sites. Furthermore, a deep learning-based compact model is demonstrated for the two-phase cold plate design over the studied range of geometries and flow conditions. The developed compact model is used to drive single- and multi-objective optimization to arrive at globally optimal results.
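As a small worked example of the reported metric, the snippet below converts heat flux on the 1 in × 1 in heater into power and computes thermal resistance from the case-to-inlet temperature rise. The case temperatures used here are assumed values for illustration, not measurements from the paper.

```python
# 1 in x 1 in heater area in cm^2.
heater_area_cm2 = 2.54 * 2.54

def thermal_resistance(t_case_c, t_inlet_c, heat_flux_w_cm2):
    # Convert heat flux to total power, then R_th = dT / Q  [°C/W].
    power_w = heat_flux_w_cm2 * heater_area_cm2
    return (t_case_c - t_inlet_c) / power_w

# Assumed case temperatures at a 36 °C inlet for two heat flux levels.
print(thermal_resistance(t_case_c=52.0, t_inlet_c=36.0, heat_flux_w_cm2=7.5))
print(thermal_resistance(t_case_c=71.0, t_inlet_c=36.0, heat_flux_w_cm2=73.5))
```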