One of the unexpected delights of the heatwave was the sound of a Conservative transport secretary talking sense. Grant Shapps was on the Today programme on Tuesday morning explaining a basic principle of good engineering design: get the specifications right. When you’re creating a new piece of public infrastructure, you need to be able to specify the constraints under which the design is expected to function.
Shapps explained that the railway system over which he currently presides was designed to operate between temperatures of -10C and 35C. And, in an astute move to pre-empt a furious Daily Mail editorial about staunch British rail tracks surely being able to cope with temperatures a mere five degrees above their design limit, he pointed out that if the air temperature is 40C, the actual temperature of the rails might be twice that. They are, after all, made of steel and could conceivably buckle in the heat, which is why some lines had been closed that day.
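To put some rough numbers on why hot steel misbehaves, here is a minimal back-of-envelope sketch of linear thermal expansion. The expansion coefficient, rail length and temperature swing are illustrative textbook values of mine, not figures from the interview.

```python
# Back-of-envelope linear thermal expansion: dL = alpha * L * dT
# All figures are illustrative assumptions, not numbers from the Today interview.

ALPHA_STEEL = 1.2e-5   # expansion coefficient of steel, per degree C (textbook value)
LENGTH_M = 100.0       # an assumed 100 m stretch of continuously welded rail
DELTA_T = 40.0         # assumed rise above the temperature at which the rail is stress-free

expansion_m = ALPHA_STEEL * LENGTH_M * DELTA_T
print(f"Unconstrained expansion: {expansion_m * 1000:.0f} mm")  # ~48 mm

# Continuously welded rail can't expand freely, so that growth shows up instead as
# compressive force in the steel -- which is what buckles track on very hot days.
```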
Interestingly, the railway industry was not the only one that couldn’t take the heat. When the temperature reached 40.3C on Tuesday, datacentres operated by Google and Oracle had to be taken offline. According to The Register, “Selected [Google] machines were powered off to avoid long-term damage, causing some resources, services, and virtual machines to become unavailable, taking down unlucky websites and the like.” And at 3:41pm Oracle customers received an alert telling them: “As a result of unseasonal temperatures in the region, a subset of cooling infrastructure within the UK South (London) Data Centre experienced an issue. This led to a subset of our service infrastructure needing to be powered down to prevent uncontrolled hardware failures. This step has been taken with the intention of limiting the potential for any long term impact to our customers.”
None of this will come as a surprise to anyone who has been lucky enough to visit one of these centres. (Since the tech companies are exceedingly sensitive about them, invitations are rare.) Basically, they are huge, windowless metal sheds, often constructed in remote locations and surrounded by military-grade perimeter fencing. Inside are many thousands of stripped-down PCs (called servers) arranged in vertical racks.
Centres come in different sizes. The average one covers 100,000 sq ft. The biggest (predictably, in China) covers 6.3m sq ft. The number of servers in any given centre depends on its area and the tasks the servers perform – ranging from hosting websites (low power), through running JavaScript bots (mid-range), to graphics and video rendering or running machine-learning algorithms (high intensity). It’s been estimated, for example, that a 1m sq ft centre could hold anything from 800,000 to 6.3m machines in varying combinations.
Everything about a datacentre comes down to one thing: electrical power. It’s needed to run the servers – and to cool them, because they run hot. Imagine 800,000 powerful PCs running continuously and you’ll get the idea. Accordingly, air-conditioning of their oppressive interiors is a critical function. That’s why tech companies often try to site them in places where the electricity supply is cheap and stable, and where cooling is easier because ambient temperatures are lower.
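To see why cooling dominates the design, a rough back-of-envelope sketch helps. The per-server wattage and the cooling overhead (PUE) below are my own illustrative assumptions, not figures disclosed by Google or Oracle.

```python
# Rough estimate of the heat a very large datacentre has to shed.
# The per-server draw and PUE are illustrative assumptions, not operator figures.

SERVERS = 800_000         # the "imagine 800,000 powerful PCs" figure above
WATTS_PER_SERVER = 400    # assumed average draw; real servers vary widely
PUE = 1.5                 # assumed power usage effectiveness: total power / IT power

it_load_mw = SERVERS * WATTS_PER_SERVER / 1e6   # heat the servers themselves emit
total_draw_mw = it_load_mw * PUE                # including cooling and other overhead

print(f"IT load:    {it_load_mw:.0f} MW")    # ~320 MW of heat to remove, continuously
print(f"Total draw: {total_draw_mw:.0f} MW") # ~480 MW from the grid
```

Even if those assumptions are generous, the point stands: shedding hundreds of megawatts of continuous heat is the design problem, not a detail of it.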
In a way, datacentres (which were once called “server farms” until the tech industry decided that was too mundane a term) are the cathedrals of our networked world. But unlike cathedrals, which just have symbolic, aesthetic, cultural or religious significance, datacentres constitute a tangible, utilitarian necessity of the online world. Without them, after all, our smartphones would be just expensive paperweights. Whenever you post a photograph to Instagram, send a text message to a friend, or consult a weather app to see how hot it is in Doncaster, you’re interacting with a datacentre somewhere.
Which is why what happened to the Google and Oracle centres last week is interesting. It’s a reminder that railways are not the only part of our society’s critical infrastructure that is vulnerable to global heating. As Shapps was explaining the realities of metallic expansion to Radio 4 listeners, the question running through my mind was: what were the design parameters for the two centres that had to go offline on Tuesday? Were the engineers who specified them assuming that the sheds would never have to cope with more than 35C? If so, then they – and we – could be in trouble.
It’s easy to understand why railways constitute part of the critical infrastructure of a society: they’re tangible, intensely physical things. But the technology behind our smartphones seems, somehow, ethereal. The useful thing about datacentres is that they remind us that it’s not.
What I’ve been reading
I post, therefore I am
What are the challenges of teaching an “introduction to philosophy” course to the Instagram generation? Marie Snyder examines the subject in an interesting 3 Quarks Daily essay, On Tossing the Canon in a Cannon.
Medical bulletin
Donald McNeil’s Monkeypox: What You Actually Need to Know is a really informative piece on the Common Sense Substack.
Critical distance
If you feel in need of some perspective, an essay by Max Fisher in the New York Times, Is the World Really Falling Apart, or Does It Just Feel That Way?, explores trying to take the long view.