The new Arizona Azure region doesn’t just use solar panels for energy; it uses solar panels whose manufacture is less carbon-intensive, built on the same kind of production line as flat-screen TVs and made with cadmium telluride – a compound of mining by-products that would otherwise end up in waste heaps.
Microsoft is offsetting energy use in its data centre with renewable energy from the 150MW Sun Streams 2 solar plant. Solar energy, even with batteries to extend availability into the evenings, isn’t yet ready to power a data centre around the clock, but it’s still a step forward.
The panels at Sun Streams 2 come from Arizona company First Solar, which presented at Microsoft’s Ignite conference in 2019 on how it builds its panels – and on the Azure services it uses to run its factories, uploading 100GB of data to Azure a day to track and optimise its manufacturing line using machine learning.
SEE: IT Data Center Green Energy Policy (TechRepublic Premium)
Making solar panels isn’t usually a particularly green process: it takes something like two to five days and involves high temperatures to melt down a polysilicon ingot to get the photovoltaic cell, plus a lot of water to cool and clean the panel as it’s made. First Solar starts with a sheet of glass, deposits thin layers of cadmium telluride semiconductor on it and scribes that with lasers, so it takes less than four hours to create a panel that needs only 1% of the semiconductor material and 24 times less water.
First Solar panels are also designed to be recycled; the company can melt the glass back down and re-use 90% of the panel, including all of the semiconductor material, which can go into a new solar panel.
The Arizona Azure region should also need less power for cooling. While it gets pretty warm in Arizona, it also gets quite cold, so for half the year the data centres can simply use outside air for cooling, switching to evaporative (adiabatic) cooling – which needs far less power than conventional chillers – for the rest of the year, reducing the amount of power and water needed.
It’s becoming more common in data centres, but Microsoft’s introduction to outside-air cooling came when it was building its Dublin Azure location. Some Azure employees, out for a drink to celebrate getting the cooling system working, were struck by the temperature shown on a large street thermometer: the same 55F they’d just spent two days getting the data centre air down to.
‘We could have just opened the door,’ they joked – and proceeded to find a way to do just that. It’s not quite as simple as opening all the windows, though: outside air needs to be filtered for pollen, insects and particulates, humidity that’s too high or too low has to be corrected, and the temperature needs to stay constant for the longest component life.
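To make the trade-off concrete, here’s a deliberately simplified sketch of how a cooling system might pick the cheapest mode that outside conditions allow. The thresholds and function name are invented for illustration; they are not Microsoft’s actual setpoints or control logic.

```python
# Illustrative sketch only: picking a data-centre cooling mode from outside
# conditions. All thresholds are hypothetical, not real Azure setpoints.

def choose_cooling_mode(outside_temp_f: float, humidity_pct: float) -> str:
    """Return the cheapest cooling mode the outside air allows."""
    humidity_ok = 20.0 <= humidity_pct <= 80.0  # filtered air must not be too damp or too dry
    if outside_temp_f <= 65.0 and humidity_ok:
        return "free-air"       # filtered outside air is cool enough on its own
    if humidity_pct < 60.0:
        return "evaporative"    # adiabatic cooling works well in dry heat, uses little power
    return "mechanical"         # fall back to conventional chillers

print(choose_cooling_mode(55.0, 50.0))  # a cool Dublin day -> free-air
print(choose_cooling_mode(95.0, 15.0))  # a hot, dry Arizona afternoon -> evaporative
```

The point of the hierarchy is simply that each fallback costs more power (and, for evaporative cooling, water) than the mode above it, which is why siting and climate matter so much to a data centre’s footprint.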
SEE: Cloud computing: Microsoft sets out new data storage options for European customers
The bigger picture is how Microsoft is learning to design data centres that are more efficient in their environment. That also includes the human side. No two data centres are alike, and neither is the weather that they have to cope with.
When Azure first put servers in shipping containers at its Quincy site in eastern Washington, Microsoft soon realised it needed to build a roof over them – because blowing snow built up against the walls in winter, making it hard for staff to get in and replace any failed hardware.