
Ice Cooling System and Pseudo Roof Design for a Data Centre

CAMRAN MANIKFAN (X16137256) [email protected]

DIVYESH PATEL (X16135521) [email protected]

VIKAS SHARMA (X01628059) [email protected]

Abstract

Cloud computing is the practice of storing data on a provider's servers and accessing it over the Internet. Keeping that data somewhere secure, accessible, and protected is therefore a vital requirement, and data centres are growing and changing every day to meet it. The data racks in a centre store a plethora of information; these machines work continuously, radiate heat at every moment, and must be kept cool for the data centre to operate normally. This paper researches the benefits of cooling a data centre with the natural ice and snow available from the environment in Greenland, in place of the water and air cooling technologies used by most major corporations. It compares this technology with current ones to determine which would be the most efficient at keeping the data centre at the right temperature for it to function.

Keywords—cooling; cooling system; data centre; CapEx; OpEx; ice; Greenland; snow

Introduction

A data centre lies at the heart of cloud computing. Most of us see it as a massive repository for the billions of bytes of data generated each day by zealous users, which must be kept secure so that nothing is stolen. While this is accurate to an extent, there is a lot more to a data centre than meets the eye. A data centre is extremely critical to the functioning of a cloud service: cloud services store, generate, process, and handle volumes of data, so the place where that data is stored is a very important part of their setup. A data centre provides a customised, dedicated system that gives an organisation full control over its data and equipment, and allows it to run many different kinds of applications and complex workloads. The operational costs of a data centre include maintenance, personnel required to manually work on the data racks, storage space on hard disks, and the most important requirement of all: keeping the data centre cool and not letting it overheat. Heat destroys data integrity and makes the operation of a data centre extremely inefficient.

What is a Data Centre?

A data centre is a large building housing networks of computers and storage devices connected by enormous lengths of cabling between the ports of the devices. Data centres are the powerful processing houses of the modern day, and they are the heart of any company dealing with cloud computing; a single data centre can handle billions of transactions every day, which is the main reason their number grows every year. Every company either owns a data centre or rents computational resources from a cloud service provider and pays on a pay-as-you-go basis, an arrangement known as utility computing. Data centres involve both one-time costs, such as land, construction, and equipment, and running costs, such as employees' salaries, security, and much more. Billions of dollars are required to build and operate a data centre, but given its importance, the cost can be justified.
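The pay-as-you-go arrangement above can be sketched as simple metered billing. The rates and usage figures in this example are illustrative assumptions, not real prices from any provider:

```python
# Sketch of utility-computing ("pay-as-you-go") billing.
# All rates and usage quantities below are assumed for illustration.

RATES = {
    "compute_hours": 0.05,     # $ per instance-hour (assumed)
    "storage_gb_month": 0.02,  # $ per GB-month (assumed)
    "egress_gb": 0.09,         # $ per GB transferred out (assumed)
}

def monthly_bill(usage):
    """Sum each metered quantity multiplied by its per-unit rate."""
    return sum(RATES[item] * qty for item, qty in usage.items())

# One small server running all month, 500 GB stored, 100 GB served out.
usage = {"compute_hours": 720, "storage_gb_month": 500, "egress_gb": 100}
print(f"Monthly bill: ${monthly_bill(usage):.2f}")  # → $55.00
```

The customer pays only for what is metered, with no capital expenditure of their own, which is precisely what makes renting from a provider attractive.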

Capital Expenses and Operational Expenses

An organisation creating a data centre has to spend money on many different aspects, which fall under capital expenditure and operational expenditure. These expenditures are described below: [1]

Capital expenditure

Capital expenditure is a one-time disbursement by an organisation. It includes buying land, necessary physical equipment such as servers and HDDs, installation costs, construction of buildings, and other one-time investments made when deploying a data centre. Leading cloud service providers invest millions of dollars to construct a single data centre in a given region. However, alternatives are springing up with infrastructure already set up that companies can rent, for example Amazon Web Services, Microsoft Azure, and Google Cloud Platform. These companies offer cloud services to companies, organisations, and individuals at affordable prices; their objective is to provide low-cost cloud services to anyone who needs them.

Operational expenditure

These are the ongoing, continuous expenditures on a data centre. They must be paid at regular intervals of time, which may be hourly, daily, monthly, quarterly, or yearly.

Electricity: It is required to make the data centre function. Every component is wired to work with electricity, so it is essential to have an uninterrupted supply of power including a backup. This takes up about 20% of the total cost of a data centre.

Engineering and installation manpower: This includes setting up the infrastructure and skeleton of the data centre, as well as the rudimentary building blocks needed such as power grids, a water supply, and fibre optic cables, or twisted pair/coaxial cables for centres that don't require high-speed data transfer. It takes up 18% of the total cost.

Power and Server Equipment: This is the setup procedure for a continuous power supply and buying hardware and equipment for the servers which will operate the data centre. The servers carry out any operations or computational operations needed by the data centre. It takes up 18% of the total cost.

Facility space: This is the land area which the data centre takes up. It may be rented or owned land. Either way, taxes are paid on this land, and it takes plenty of effort to maintain the land for proper operation. It takes up 15% of the total cost.

Service and maintenance: The data centre requires regular service and maintenance, which can be automated or manual. Manual maintenance relies on service personnel, who check for data integrity, failed equipment, and problems with power or data cables. Automated maintenance can include timed backups, regular checks carried out to ensure data integrity (like hashes and checksums), SMART drive monitoring, etc.

HVAC equipment: HVAC stands for heating, ventilation, and air conditioning. It is a mechanism to keep the indoor temperature and air quality optimal and is therefore useful for data centres. It uses the principles of thermodynamics, fluid mechanics, and heat transfer to ensure that indoor air quality and temperature are acceptable for the sensitive equipment a data centre houses. It takes up 6% of the total cost.

Project Management: This refers to the cost of planning, managing, creating, and executing a specific project requested by a client within the deadline they specify. It takes up 5% of the total cost.

Rack hardware: These are the racks where physical hard drives are placed. They hold every single bit and byte of data a data centre contains and are very crucial to maintain and secure. It takes up 2% of the total cost.

System monitoring: Monitoring the system for unusual activity is critical from both a maintenance and security point of view. If there is unusually high activity at some point of time, for example, it could mean a data breach and extra steps are needed to secure data. It is also possible that some CPUs are overclocked and will overheat if the system is not cooled, so we would need to amp up the cooling system of the data centre to handle such a problem. If the SMART status of a hard drive indicates it is failing, it is time to take steps to back up the drive, check for any data loss, and replace the malfunctioning disk. It takes up 1% of the total cost. [2]
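The cost shares listed above can be tabulated to see how an operating budget would be allocated. The $10M annual budget here is an assumed figure for illustration; the percentages are those stated in the text:

```python
# Tabulation of the operational-cost shares listed above.
# The annual budget is an assumed figure, not from the text.

COST_SHARES = {                          # percentage of total cost
    "Electricity": 20,
    "Engineering and installation": 18,
    "Power and server equipment": 18,
    "Facility space": 15,
    "HVAC equipment": 6,
    "Project management": 5,
    "Rack hardware": 2,
    "System monitoring": 1,
}

ANNUAL_BUDGET = 10_000_000               # assumed total, in dollars

for item, pct in sorted(COST_SHARES.items(), key=lambda kv: -kv[1]):
    print(f"{item:32s} {pct:3d}%  ${ANNUAL_BUDGET * pct / 100:>12,.0f}")

listed = sum(COST_SHARES.values())
print(f"Listed shares total {listed}%; the remaining {100 - listed}% covers "
      "items without a stated share, such as service and maintenance.")
```

Note that the stated shares sum to only 85%, so the unquantified items (service and maintenance among them) must account for the rest.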

State of the Art

CRAC (Computer room air conditioning) units

The first type of cooling used in data centres was provided by air conditioning units much like the ones we use in our homes today. The system is very basic in its fundamentals: it uses chemicals which can easily change from a gas into a liquid and vice versa, carrying heat from inside a data centre to the outside.

An air conditioner consists of a compressor, a condenser, and an evaporator, along with a working fluid which carries out the cooling process.

This fluid first enters the compressor as a cool, low-pressure gas and, due to the compression, leaves it as a hot, high-pressure gas. It then flows into the condenser, which dissipates heat from the gas it receives, so the fluid leaves the condenser as a cooler, high-pressure liquid. The liquid finally enters the evaporator, where the pressure is reduced and it turns back into a gas; as it evaporates, it extracts heat from the air around it. It exits the evaporator as a cool, low-pressure gas and returns to the compressor to begin the cycle all over again. [3]
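A back-of-the-envelope estimate shows why running this cycle continuously is expensive. The sketch below uses the Carnot coefficient of performance, a theoretical upper bound, so real units need considerably more compressor power; the 100 kW heat load and the temperatures are assumed figures:

```python
# Rough power estimate for the vapour-compression cycle described above.
# Heat load and temperatures are assumptions; Carnot COP is an ideal bound.

def carnot_cop_cooling(t_cold_c, t_hot_c):
    """Ideal refrigeration coefficient of performance (temperatures in C)."""
    t_cold = t_cold_c + 273.15           # convert to kelvin
    t_hot = t_hot_c + 273.15
    return t_cold / (t_hot - t_cold)

heat_load_kw = 100.0                     # assumed IT heat load to remove
cop_ideal = carnot_cop_cooling(t_cold_c=20.0, t_hot_c=35.0)
compressor_kw = heat_load_kw / cop_ideal

print(f"Ideal COP: {cop_ideal:.1f}")
print(f"Minimum compressor power for {heat_load_kw:.0f} kW load: "
      f"{compressor_kw:.1f} kW")
```

Real CRAC units achieve a small fraction of the ideal COP, and they run around the clock, which is the root of the cost problem discussed below.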

Using precision air conditioners for cooling has several advantages:

Accurate temperature monitoring and processor-based temperature control.

Precise control of humidity because electronic devices are sensitive to changes in humidity levels.

They are optimised to be used 24 × 7 × 365 continuously, unlike room A/Cs.

They are designed to manage high levels of sensible heat, i.e. heat without added humidity, emitted by machines.

They distribute air better than standard A/Cs. They can move larger volumes of air at higher speeds.

They can tolerate higher heat and load densities.

Multiple A/C units can communicate with each other and automatically regulate the temperature.

They have regular firmware/software updates, so new features can easily be introduced or existing ones can be enhanced.

Their support and maintenance is much better than regular A/Cs.

Troubleshooting and monitoring them remotely is possible. [4]

There are also disadvantages to using air conditioners for cooling a data centre:

The energy consumption of running air conditioners 24 × 7 × 365 is very high.

Running A/Cs in a data centre is very costly due to the need to leave them on continuously.

Air is not a very good conductor of heat. Water is much more efficient at absorbing heat and transporting it away.

It is a very costly proposition to cool air to the temperatures required by a data centre and then transport it to where it is required.

Data centres are shrinking in the space they occupy, which means the large fans they once used to move air around are no longer feasible today.

HVAC System

Another cooling system used by organisations is HVAC (heating, ventilation, and air conditioning). This system works within the data centre: whatever hot air is produced by the physical equipment is cooled down by the HVAC hardware, and the power consumed depends on the amount of hot air. The greater the volume of hot air, the more power the system uses to cool the interior. [5]

The benefits of an HVAC system are as follows:

An HVAC system gives better cooling with effective power efficiency.

It provides digital control and monitoring of the temperature.

It maintains fresh, cool air and controls the inside temperature. [5]

Disadvantages of an HVAC system:

The installation cost of an HVAC system is quite high.

An HVAC system needs a huge amount of water to provide good cooling.

In the case of damage or technical issues, recovery and maintenance are expensive and quite complex to deal with.

The system serves heating and ventilating purposes as well, so much of its capability goes unused in a data centre. [6]

Water cooling

Liquids are much more efficient at cooling a data centre than air.

Water is 3,400 times more efficient than air at removing heat. Earlier, there was the problem of leaks developing and flooding system mechanics: if a system developed a leak or crack in the plumbing, water would seep through and ruin the circuits of the computer systems. This can be solved by using rear-door heat-exchanging systems, which are fully self-contained; even if they develop leaks, there is no way for the water to reach electronic equipment. [7]
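The 3,400x figure can be sanity-checked by comparing the volumetric heat capacity (density times specific heat) of the two fluids, which measures how much heat each can soak up per cubic metre per kelvin. The property values below are standard room-temperature approximations:

```python
# Rough check of the "water removes ~3,400x more heat than air" claim,
# via volumetric heat capacity. Property values are room-temperature
# approximations, not measurements from the cited source.

WATER = {"density": 997.0, "specific_heat": 4181.0}  # kg/m^3, J/(kg*K)
AIR   = {"density": 1.204, "specific_heat": 1005.0}

def volumetric_heat_capacity(fluid):
    """Heat absorbed per cubic metre per kelvin of temperature rise."""
    return fluid["density"] * fluid["specific_heat"]

ratio = volumetric_heat_capacity(WATER) / volumetric_heat_capacity(AIR)
print(f"Water absorbs ~{ratio:,.0f}x more heat per unit volume than air")
```

The result lands in the mid-3,000s, consistent with the figure quoted above.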

Another method used is targeted water cooling. CPUs are fitted with metal pads coated with heat conductors like copper, or gold-plated copper for more effective heat absorption, replacing the traditional cooling fins. These metal pads have microchannels. Pure water (without any dissolved gases or salts) passes through the microchannels, taking in heat directly from the CPUs.

The systems currently run at negative pressure. This means that water is sucked around the system as opposed to being pumped. If the plumbing develops a leak, air is sucked in instead of water coming out. The system is monitored continuously so that administrators can take action once they’re informed of the leak. [8]

The advantages of water cooling are:

Water can absorb and remove more heat than air since it is a better conductor.

Power consumption is much lower than in air cooling.

It does not need a lot of space or large equipment.

There are a few disadvantages:

The racks for immersion servers can only be accessed from above, so the space which can be used is severely reduced.

Maintenance of the racks and cables can become messy due to contact with motor oils and water. It can be hard to clean.

The infrastructure to support liquid cooling is very large and expensive, and a huge supply of water must be available.

Special HDDs are needed which are sealed or designed to prevent leakage, since they must operate submerged in a liquid.

The retrofitting costs of modifying a data centre to use liquid cooling are very high. It is easier to build a data centre designed with liquid cooling in mind. [9]

Novelty of Innovation

Current data centres are built in densely populated, developed areas to ensure power availability and facilities for employees. Our data centre will be located in Nuuk, the capital of Greenland, which lies close to the Arctic region. The average temperature of Greenland is quite low throughout the year, which helps keep the data centre's interior cool: for nine months out of twelve the temperature remains under 0ºC, and for the remaining three it averages 5ºC, with around 80 mm of rainfall. Greenland's environmental conditions are harsh to survive in, but they can be turned to our advantage for cooling data centres.

Greenland has massive snowfall throughout the year, especially in winter, and it is full of glaciers, so ice and snow are freely available in the environment for use as a coolant in our data centre. Ice will be carried to the data centre and fed in through the raised floor as input; the melted ice will be collected as output from the data centre in reservoirs, from which it can be treated and reused.

Nowadays, Google is planning to run its data centres on 100% renewable energy to reduce electricity costs, simply by deploying devices such as wind turbines and solar panels. A one-time investment is needed to deploy and install these devices, and their operational cost is very low. The average wind speed in Greenland is 18 km/h, so we can use the wind to generate energy, which would be very beneficial for the company financially. Moreover, with some changes to the basic architecture of the data centre, these cool breezes can be used to counter the heat generated by its equipment. [10]
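The 18 km/h average wind speed above translates into a rough per-turbine output via the standard wind-power formula. The rotor size and power coefficient below are assumptions chosen for illustration; the quoted wind speed is the only figure taken from the text:

```python
import math

# Rough wind-energy estimate from the ~18 km/h average speed quoted above.
# Rotor radius and power coefficient are assumed values for illustration.

AIR_DENSITY = 1.225          # kg/m^3 at sea level
ROTOR_RADIUS_M = 20.0        # assumed turbine rotor radius
POWER_COEFF = 0.40           # assumed fraction of wind power captured
                             # (the Betz limit caps this at ~0.593)

wind_speed = 18 / 3.6        # 18 km/h -> 5 m/s
swept_area = math.pi * ROTOR_RADIUS_M ** 2

# Power carried by the wind through the rotor: P = 1/2 * rho * A * v^3
power_w = 0.5 * AIR_DENSITY * swept_area * wind_speed ** 3
captured_kw = power_w * POWER_COEFF / 1000

print(f"Wind power through rotor: {power_w / 1000:.0f} kW")
print(f"Estimated electrical output per turbine: {captured_kw:.0f} kW")
```

Because power scales with the cube of wind speed, even modest average speeds yield useful output when the wind blows steadily, as it does in coastal Greenland.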

Greenland is sometimes called the lonely planet: it is highly isolated from the rest of the world and peaceful, with no unrest and little chance of any uprising in the future, and it has a politically stable government. All these advantages make it a secure and unobtrusive location.

This kind of data centre is intended to be low-cost, efficient, minimalistic, and a step up from the data centres we are seeing today. [11]

Commercial Viability

The new technology and infrastructural changes described here will make the data centre cost-efficient to run.

This system avoids the extra cost of the piping and plumbing needed to run a water-cooled data centre, and there is no need to compact the ice brought in from outside, so the cost of running freezers is also mitigated. This slashes the running costs of the data centre, since we simply bring in the ice already present outside and leave it in an open environment to cool the place.

Implementation

A. Ice Cooling Mechanism

As shown in the image, the data centre has several racks containing the hardware required for storage. The entire cooling system is underground: ice and snow are brought in from outside, transported to the data centre underground, and stored in an ice cabin. The ice is placed on an inclined plane below the data racks, which sit on a raised floor. Due to the slanting plane, the ice moves downwards under gravity and reaches a wire mesh grill at the opposite end; in this way, ice blocks are allowed to accumulate on the plane until it reaches full capacity.

When the ice starts melting, it will cool the air around it. This air is allowed out upwards through the floor, which is porous to allow exchange of air. The hot air radiated by the data racks will rise upwards since it is lighter, and cool air from the underground ice will take its place, keeping them at optimal temperature. This continues until all the ice has melted and turned into water.

As the ice turns into water, it is drained out through the mesh grill. The grill leads to the drainage system of the data centre, where the water is collected in a reservoir. It is then led through underground piping back to the ice cabin, to be recycled and reused in the ammonia plant to make more ice.

When all the melted ice has been let out through the wire mesh, more ice is brought in from the cabin and allowed to accumulate on the inclined plane. The cold air from the ice will cool the data racks, and the entire process starts all over again.
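The rate at which the mechanism consumes ice follows from the latent heat of fusion: each kilogram of melting ice absorbs a fixed amount of heat, plus a little more as the meltwater warms on its way to the drain. The IT heat load and meltwater exit temperature below are assumed figures; the physical constants are standard:

```python
# How much ice does the inclined-plane mechanism consume? A sketch using
# the latent heat of fusion. The heat load and meltwater temperature
# are assumed figures for illustration.

LATENT_HEAT_FUSION = 334_000      # J/kg, ice -> water at 0 degrees C
WATER_SPECIFIC_HEAT = 4_181       # J/(kg*K)

heat_load_w = 100_000             # assumed 100 kW of rack heat
meltwater_exit_c = 10.0           # assumed temperature at the drain

# Each kilogram of ice absorbs its latent heat, then sensible heat as
# the meltwater warms from 0 C to the drain temperature.
joules_per_kg = LATENT_HEAT_FUSION + WATER_SPECIFIC_HEAT * meltwater_exit_c

kg_per_second = heat_load_w / joules_per_kg
tonnes_per_day = kg_per_second * 86_400 / 1000

print(f"Ice consumption: {kg_per_second:.2f} kg/s "
      f"(~{tonnes_per_day:.0f} tonnes per day)")
```

Under these assumptions a 100 kW load melts on the order of tens of tonnes of ice per day, which gives a sense of the resupply rate the ice cabin must sustain.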

B. Roof Cooling

The roof also plays an important role in cooling our design of the data centre through its air vents. As shown in the diagram, there are output vents in the roof, strategically placed to allow hot air from the data racks to escape, while cold air, being denser and heavier, sinks down to cool the data centre further.


Fig. 1. A diagram of the roof architecture of the data centre

A specific building plan is designed for the removal of hot air. A smooth para wall helps the hot air travel vertically; after striking the smooth para wall, the hot air blows sideways. Just above the para wall are two vertical hinged walls, which stop the air from moving further upward, and exhaust fans push some of the hot air out. The remaining air strikes the smooth para wall again and is removed by the exhaust fans.

The ventilation method uses exhaust fans to direct the hot air which is rising up from the data centre towards the multiple exhaust vents located on all four walls of the roof. Hot air exits the data centre and cool air remains inside. Cool air is heavier than hot air, so it sinks down to the data racks and keeps them cool as it absorbs the heat they generate. When it is hot enough, it will rise again to the roof, and cool air from there sinks down below. The hot air is once again expelled from the vents, continuing the cycle of heating and cooling.
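The exhaust fans in this loop can be sized from the sensible-heat balance for air: the heat carried away equals density times specific heat times volumetric flow times the temperature rise. The heat load and temperature rise below are assumed figures:

```python
# Sizing sketch for the exhaust-fan ventilation described above: the
# volume of air the roof vents must move to carry away a given heat load.
# Load and temperature rise are assumed figures.

AIR_DENSITY = 1.204        # kg/m^3
AIR_SPECIFIC_HEAT = 1005   # J/(kg*K)

heat_load_w = 100_000      # assumed rack heat load
delta_t = 10.0             # assumed rise from intake to exhaust air, K

# Q = rho * cp * V_dot * dT  ->  V_dot = Q / (rho * cp * dT)
flow_m3_s = heat_load_w / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t)
print(f"Required airflow: {flow_m3_s:.1f} m^3/s")
```

A smaller allowed temperature rise demands proportionally more airflow, which is why the design lets hot air stratify near the roof before it is expelled.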

Findings

In the ever-evolving world of technology there is always scope for improvement. Several issues influence the power consumed by cooling, and although efforts have been made to reduce the cooling cost, it remains higher than it should be. A great deal of research has continued over the years to reduce this cost, testing new methods and new hardware. As the number of data centres increases rapidly, cooling power consumption remains undesirably high. The new technology described above is cost-efficient and power-efficient, and will therefore reduce operational expenditure.

Conclusion

This paper has described a new methodology for cooling a data centre using ice, together with architectural design changes to the roof.

Acknowledgments

We would like to thank our parents and families for their unending support and belief in us; this project would not be complete without them. Special thanks go to our teachers for helping and guiding us throughout the project's development. Our sincere thanks go to them for their help and kindness.

References

[1] "Data Center TCO (total cost of ownership) - Ongoing Operations", Ongoing Operations, 2017. [Online]. Available: https://ongoingoperations.com/data-center-pricing-credit-unions/. [Accessed: 12-Apr-2017].

[2] P. Williams, "Data Centers -- What are the Costs of Ownership?", StorageCraft Technology Corporation, 2017. [Online]. Available: http://www.storagecraft.com/blog/data-centers-costs-ownership/. [Accessed: 12-Apr-2017].

[3] "How Does An Air Conditioner Work?", Energyquest.ca.gov, 2017. [Online]. Available: http://energyquest.ca.gov/how_it_works/air_conditioner.html. [Accessed: 12-Apr-2017].

[4] "Advantages of Precision Air Conditioners for Data Centers/Server Rooms", excITingIP.com, 2017. [Online]. Available: http://www.excitingip.com/3419/advantages-of-precision-air-conditioners-for-data-centersserver-rooms/. [Accessed: 12-Apr-2017].

[5] "Cisco Unified Computing System Site Planning Guide: Data Center Power and Cooling", Cisco.com, 2017. [Online]. Available: http://www.cisco.com/c/en/us/solutions/collateral/data-center-virtualization/unified-computing/white_paper_c11-680202.pdf. [Accessed: 12-Apr-2017].

[6] "Limitations/Disadvantages - HVAC Variety", Sites.google.com, 2017. [Online]. Available: https://sites.google.com/site/hvacvarietyradiantswamp/radiant-floor/limitations. [Accessed: 12-Apr-2017].

[7] "The Advantages of Liquid Cooling | Data Center Knowledge", Data Center Knowledge, 2017. [Online]. Available: http://www.datacenterknowledge.com/archives/2010/07/02/the-advantages-of-liquid-cooling/. [Accessed: 12-Apr-2017].

[8] "Water cooling vs. air cooling: The rise of water use in data centres", ComputerWeekly, 2017. [Online]. Available: http://www.computerweekly.com/tip/Water-cooling-vs-air-cooling-The-rise-of-water-use-in-data-centres. [Accessed: 12-Apr-2017].

[9] "What's Stopping Liquid Cooling?", The Data Center Journal, 2017. [Online]. Available: http://www.datacenterjournal.com/whats-stopping-liquid-cooling/. [Accessed: 12-Apr-2017].

[10] "We're set to reach 100% renewable energy - and it's just the beginning", Google, 2017. [Online]. Available: https://blog.google/topics/environment/100-percent-renewable-energy/. [Accessed: 13-Apr-2017].

[11] "Weather Averages for Nuuk, Greenland", Holiday-weather.com, 2017. [Online]. Available: http://www.holiday-weather.com/nuuk/averages/. [Accessed: 13-Apr-2017].

