Ecologi

Data Center Cooling

The data center industry is growing rapidly, with an ever greater focus on faster connections and higher uptime. Cloud data center traffic grows every year, and the need for more online storage will continue to drive server capacities higher.

The worldwide energy consumption of data centers increased nearly 56% between 2005 and 2010, reaching 237 terawatt-hours (TWh) in 2010, or about 1.3% of the world’s electricity usage [1]. Cooling systems (primarily air conditioners) account for a large share of this consumption: in 2009, about 40% of the energy consumed by data centers went to cooling [2,3].
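A quick calculation puts the figures above in perspective: if data centers consumed 237 TWh in 2010 and roughly 40% of that went to cooling, the cooling load alone was on the order of 95 TWh.

```python
# Back-of-the-envelope estimate of the cooling energy implied by the
# figures cited above (237 TWh total, ~40% for cooling).
total_twh = 237.0
cooling_share = 0.40
cooling_twh = total_twh * cooling_share
print(f"{cooling_twh:.0f} TWh")  # -> 95 TWh
```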

Cooling Equipment:

A common cooling method for data centers is the computer room air conditioner (CRAC). These units come in several configurations:

  1. Aisle coolers:

Aisle coolers supply cold air to the IT equipment racks through a raised floor. The air flows across the IT equipment and carries the dissipated heat away from the back of the rack. To avoid mixing hot and cold air, which reduces cooling efficiency, the typical practice is to arrange alternating rack rows into “hot aisles” and “cold aisles.” Since hot air is lighter than cold air, the hot exhaust air from the IT equipment rises and recirculates to the CRAC, where it is cooled and supplied to the racks again.

  2. In-rack coolers:

In-rack cooling is the most precise option available, since the rack and the air conditioner operate in a closed loop with one another. Cold air has no choice but to pass through the servers; hot air has no choice but to pass through the heat exchanger. The airflow paths are short, requiring less fan energy. In addition, the exhaust air is captured at its hottest point, maximizing the temperature difference across the cooling coil.

Highly Efficient Cooling Systems:

Cooling systems consume roughly 40% of the overall energy used by data centers, so energy-saving cooling systems offer substantial opportunities to reduce power consumption [4].

Data centers tend to overcool to prevent equipment downtime, maintaining an operating environment of about 20 °C and 50% RH. “Smart” or “adaptive” cooling solutions allow dynamic adjustment of airflow and temperature setpoints based on heat-load monitoring throughout the data center. These methods avoid the excess energy consumption of overcooling while also preventing the formation of hot spots [5][6].
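An adaptive setpoint policy of this kind can be sketched as a simple control loop: raise the supply-air setpoint when every monitored rack inlet is comfortably within limits, and lower it when a hot spot forms. All the thresholds below are illustrative assumptions, not values from any particular product.

```python
# Minimal sketch of an "adaptive" cooling policy. All limits are
# illustrative; real systems use vendor- and site-specific thresholds.

MAX_INLET_C = 27.0          # assumed upper rack-inlet temperature limit
MARGIN_C = 2.0              # safety margin before we call something a hot spot
SETPOINT_MIN_C, SETPOINT_MAX_C = 18.0, 24.0
STEP_C = 0.5

def next_setpoint(current_setpoint_c, rack_inlet_temps_c):
    """Return the new CRAC supply-air setpoint given rack inlet readings."""
    hottest = max(rack_inlet_temps_c)
    if hottest > MAX_INLET_C - MARGIN_C:
        # Hot spot forming: cool more aggressively.
        return max(SETPOINT_MIN_C, current_setpoint_c - STEP_C)
    # Everything is within limits: save energy by easing the setpoint upward.
    return min(SETPOINT_MAX_C, current_setpoint_c + STEP_C)

print(next_setpoint(20.0, [21.5, 22.0, 23.1]))  # no hot spot -> 20.5
print(next_setpoint(20.0, [21.5, 26.0, 23.1]))  # hot spot    -> 19.5
```

The key design choice is the deadband-free, small-step adjustment: it avoids both the energy waste of a fixed low setpoint and abrupt swings that could stress equipment.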

  1. Free Air Cooling (FAC):

Free air cooling (FAC) is one of the simplest and most promising methods of reducing cooling energy consumption. FAC uses air from outside the data center to cool equipment directly (within prescribed temperature and humidity limits). When the outside air is cooler than the return air, an airside economizer exhausts the hot return air and replaces it with cooler, filtered outside air, essentially “opening the windows” to cool the data center equipment.
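The economizer decision described above can be sketched as a single predicate: use outside air only when it is cooler than the return air and within the prescribed temperature and humidity band. The limits below are illustrative assumptions, not a standard.

```python
# Hedged sketch of an airside-economizer decision. The temperature and
# humidity limits are illustrative, not taken from any standard.

SUPPLY_MAX_C = 27.0           # assumed hottest outside air we will supply
RH_MIN, RH_MAX = 20.0, 80.0   # assumed acceptable relative-humidity band (%)

def use_free_cooling(outside_c, outside_rh, return_air_c):
    """True if the economizer should exhaust return air and pull in outside air."""
    return (outside_c < return_air_c
            and outside_c <= SUPPLY_MAX_C
            and RH_MIN <= outside_rh <= RH_MAX)

print(use_free_cooling(15.0, 45.0, 30.0))  # cool, dry day -> True
print(use_free_cooling(33.0, 45.0, 30.0))  # too hot       -> False
```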

To address the indoor air quality problem of the system above, heat exchangers can be added between the indoor and outdoor air streams; rotating-wheel heat exchangers are widely used for this purpose. The wheel rotates at 10–12 RPM while the two air streams flow through separate paths to avoid mixing. After the heat exchange, the indoor air returns to the data center for space cooling, while the heated outdoor air is exhausted. Because free cooling alone cannot always meet the load, electrical cooling equipment is often integrated with the rotating-wheel heat exchanger to guarantee a reliable refrigeration system.

FAC has been investigated by companies including Intel, Google, Microsoft, and Vodafone.

In 2007, Intel conducted a 10-month test evaluating the impact of using only outside air via FAC to cool a high-density data center in New Mexico. The center housed 900 heavily utilized production servers. The system provided 100% air exchange, with supply-air temperatures ranging from 18 °C to more than 32 °C, no humidity control (4–90% RH), and minimal air filtration. The results showed savings of about $2.87 million, a 67% reduction in total energy costs, from the new cooling method [7].

  2. Water Free Cooling (WFC):

The main difference between a water-based free cooling system and a traditional air conditioning system is that a heat exchanger is installed in parallel with the electrical chiller. This makes full use of free cooling capacity, using a cooling tower or underground water to exchange heat with the water that cools the data center. Depending on climatic conditions (especially the wet-bulb temperature), the system can work in three different modes:

(a) when the outdoor temperature is low (winter), the cooling water can be used to produce chilled water directly through the heat exchanger and the chiller can be turned off, so that the system works under “free cooling” mode;

(b) when the outdoor temperature is high (summer), the chiller is activated instead while the cooling tower is only used to handle the condensation heat, so that the system works under “electrical cooling” mode;

(c) when the outdoor temperature is moderate (spring and fall), the chiller and heat exchanger work together in parallel, so the system works in “free cooling + electrical cooling” mode. The operating mode of a water-based free cooling system is therefore strongly driven by the ambient temperature [8].
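The three-mode dispatch above can be sketched as a simple selection function. The switchover temperatures below are illustrative assumptions; real systems key off the wet-bulb temperature and the approach temperatures of the cooling tower and heat exchanger.

```python
# Sketch of the three-mode dispatch for water-side free cooling.
# Switchover thresholds are illustrative, not from any cited design.

FREE_COOLING_BELOW_C = 5.0    # mode (a): chiller off, heat exchanger only
ELECTRICAL_ABOVE_C = 15.0     # mode (b): heat exchanger off, chiller only

def cooling_mode(outdoor_wet_bulb_c):
    """Select the operating mode from the outdoor wet-bulb temperature."""
    if outdoor_wet_bulb_c <= FREE_COOLING_BELOW_C:
        return "free cooling"               # mode (a): winter
    if outdoor_wet_bulb_c >= ELECTRICAL_ABOVE_C:
        return "electrical cooling"         # mode (b): summer
    return "free + electrical cooling"      # mode (c): spring and fall

print(cooling_mode(0.0))   # -> free cooling
print(cooling_mode(25.0))  # -> electrical cooling
print(cooling_mode(10.0))  # -> free + electrical cooling
```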

References:

[1] J.G. Koomey, Growth in data center electricity use 2005 to 2010 (Analytics Press, Oakland, 2011).

[2] A. Almoli, A. Thompson, N. Kapur, J. Summers, H. Thompson, G. Hannah, Computational fluid dynamic investigation of liquid rack cooling in data centres, Appl. Energy (2012).

[3] P. Johnson, T. Marker, Data center energy efficiency product profile, Pitt & Sherry, Report to equipment energy efficiency committee (E3) of The Australian Government Department of the Environment, Water, Heritage and the Arts (2009).

[4] A. Bar-Cohen, B.A. Srivastava, B. Shi, Thermo-Electrical Co-Design of 3D ICs: Challenges and Opportunities. Computational Thermal Science (2013).

[5] A. Bar-Cohen, J.J. Maurer, J.G. Felbinger, Keynote Lecture, “DARPA’s Intra/Interchip Enhanced Cooling (ICECool) Program”, in Proceedings, IEEE CSMantech, New Orleans, La, May 2013.

[6] M.M. Ohadi, S.V. Dessiatoun, K. Choo, M. Pecht, Air Vs. Liquid and Two-Phase Cooling of Data Centers, in Semi-Therm Proceedings, San Jose, CA, March 2012.

[7] S. O’Donnell, “IBM Claim that Water Cooled Servers are The Future of It at Scale”, the Hot Aisle, June 2009.

[8] Y. Zhang, Z. Wei, M. Zhang, Free cooling technologies for data centers: energy saving mechanism and applications.
