The explosive growth of data centres
As the need for large, secure premises grows, Associate Director Witold Nowakowski from Chapman Taylor’s Warsaw Studio explains the origins and future direction of data centres.
A data centre is a building, dedicated space within a building, or a group of buildings used to house computer systems and associated components, such as telecommunications and storage systems. Since IT operations are crucial for business continuity, the centres generally house redundant or backup components and infrastructure for power supply, data communication connections, environmental controls (e.g. air conditioning, fire suppression) and various security devices and systems.
The history of data centres
Data centres originated just after WWII. An enormous computer room housed the American ENIAC (Electronic Numerical Integrator and Computer), the first programmable, general-purpose electronic digital computer. ENIAC first ran on December 10, 1945, and was decommissioned a decade later, on October 2, 1955.
The computer served military purposes and was used mainly to calculate firing tables. The machine was heralded as a "giant brain" by the American press.
During the boom of the microcomputer industry in the early 1980s, users started to deploy computers everywhere, in many cases with little or no regard for operating requirements.
The explosion of data centres occurred during the last four years of the 20th century, associated with the so-called ‘dot com’ bubble.
Various companies required reliable connections to strengthen their presence on the web. Many of them started to build purpose-made facilities to provide capabilities such as crossover backup. At that point, they were known as Internet data centres.
Projected growth
Often hidden in plain sight, data centres are the backbone of our internet. They store, communicate, and transport the information we produce every single day. The more data we create, the more vital our data centres become.
Data centres are at the heart of the world’s growing technological focus and continue to experience physical transformations of their own. With over 175 zettabytes of data expected by 2025, these facilities will continue to play a vital role in the ingestion, computation, storage, and management of information.
The larger network providers are putting significant resources into upgrading their infrastructure. As a result, we will see a gigabit rollout, with enhanced network and node sites appearing everywhere to support 5G, revolutionising healthcare and the way we live, work and play.
Network providers are investing heavily in gigabit cities, which means that millions of users at home and work will have access to 1 Gigabit broadband speeds, allowing them to download Ultra High Definition 4K films at a dramatically higher rate and use multiple streaming devices at the same time. This opens up a world of possibilities, including cloud-based gaming, 8K streaming, remote health telemonitoring and advanced telepresence, which could enable consumers to go ‘virtual reality shopping’ or watch live broadcasts of holographic sports events.
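As a rough illustration of what 1 Gigabit speeds mean in practice, a back-of-the-envelope calculation (the film size below is a hypothetical figure, not from this article, and real-world speeds are reduced by protocol overhead and contention):

```python
def download_time_seconds(file_size_gb: float, link_speed_gbps: float) -> float:
    """Idealised download time: file size in gigabytes, link speed in gigabits/s."""
    file_size_gigabits = file_size_gb * 8  # 1 byte = 8 bits
    return file_size_gigabits / link_speed_gbps

# A 4K film of ~15 GB (hypothetical size) on a 1 Gbps gigabit line:
print(round(download_time_seconds(15, 1.0)))   # 120 seconds (two minutes)

# The same film on a 20 Mbps (0.02 Gbps) connection:
print(round(download_time_seconds(15, 0.02)))  # 6000 seconds (100 minutes)
```

The fifty-fold difference between the two results is what makes simultaneous multi-device streaming practical on gigabit connections.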
The market worth
Currently, the global data centre services market is estimated at $22B, while the global data centre construction market could be worth $57-60B by 2025.
The US sector was estimated to be worth £4.42 billion in 2020, while the market in China – the world’s second-largest economy – is projected to reach £4.18 billion by 2027.
Once a niche investment and inflexible asset for global enterprises, these facilities are now a cornerstone of the information economy, and $100 billion has poured into the asset class over the past decade.
Market share
The largest cloud platform providers – Amazon, Google and Microsoft – have become the most influential players in many markets, altering data centre sizing by a factor of 10. The 10-megawatt (MW) data centre that was impressive ten years ago pales in comparison to the 30-MW leases now signed with increasing regularity.
The biggest markets
Worldwide: US, China, Australia, Canada, Japan and Singapore.
Europe: London, Paris, Berlin, Milan, Zurich, Dublin and Warsaw.
Location
Access to fibre networks, the price of energy sources, and the surrounding environment all play a role in choosing where to build data centre facilities. Placing energy-hungry data centres next to cheap power sources makes them more affordable to run. This can also be a cleaner option. As emissions rise and big tech companies are increasingly criticised for using dirty energy to power data centres, clean sources are an important consideration.
In addition to proximity to cheap, clean energy sources, companies building data centres are also looking for cooler climates. Places near the Arctic Circle, like Northern Sweden, can allow data centres to save energy on cooling. An older Google facility in Hamina, Finland uses seawater from the Bay of Finland to cool the facility and reduce energy use.
Some organisations are experimenting with submerging their centres underwater to make cooling easier and more energy-efficient. The constant supply of naturally cool deep seawater transfers heat from the equipment into the surrounding ocean. Because a data centre can be placed off any coast, it can also be connected to clean energy, like Microsoft’s 40-foot-long Project Natick, which is powered by wind from the Orkney Islands grid.
Data centre company Verne Global has recently opened a campus in Iceland that taps into local geothermal and hydroelectric sources. It sits on a former NATO base, midway between Europe and North America, the world’s two largest data markets.
Connection to fibre
For many years, fibre networks have been a key factor in choosing where to build data centres. While data is often delivered to mobile devices via wireless networks or local Wi-Fi, that’s usually just the last part of the journey.
Most of the time, data travels to and from data storage facilities through fibre-optic cables, making fibre the nervous system of the internet. Fibre connects each centre to cellular antennas, in-home routers, and other data storage facilities.
Security
Security is another important consideration, and it is crucial for data centres that house sensitive information.
As an example, Norwegian financial giant DNB partnered with Green Mountain Data Centre. Green Mountain placed DNB’s data centre in its high-security facility: a converted bunker inside a mountain. Green Mountain claims that the mountain is entirely impenetrable “against all kinds of dangers, including intruders, terrorist attacks, volcanoes, storms, earthquakes, and crime”.
Another example is “Swiss Fort Knox”, located underneath the Swiss Alps, with a door camouflaged to look like a rock.
Due to the high-security nature of the servers at certain centres, there are plenty whose locations have never been revealed. For these data centres, location information can be wielded as a weapon.
The physical security of the data centre is paramount to protect customers’ data. There are usually several layers of security. Google facilities use six layers, for example.
Security layer one covers the property boundary, including signage and fencing. It is worth noting that anti-climb fencing can be equipped with fibre sensors, so the system can tell if somebody approaches or touches the fence.
Layer two includes the main entrance gate and security features such as thermal cameras, 24/7 guard patrols and a vehicle crash barrier designed to stop a fully loaded truck from crashing through the front entrance.
Layer three is building access. Once through the gate, there is a secure lobby where iris scanning is undertaken to authenticate each person, along with their ID.
Layer four is the security operations centre, or SOC, a hive of activity that monitors the data centre 24/7, 365 days a year. All cameras, scanners and fences are connected to it, so if anything out of the ordinary happens, security can pick it up.
Layer five is the data centre floor itself, which is very seldom visited, even by permanent staff. Only the technicians and engineers who need to maintain, upgrade or repair the equipment are ever allowed there.
Layer six is the so-called ‘crusher room’, where disks are erased and destroyed, and where the fewest people are allowed to enter.
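The six layers form a classic defence-in-depth arrangement: a person must clear every outer layer before reaching an inner one. A toy model makes this structure explicit (the layer names follow the article, but the clearance logic is purely illustrative, not a description of any real access-control system):

```python
# Toy model of the layered ("defence in depth") access control described above.
# Layer names follow the article; the clearance logic is illustrative only.

LAYERS = [
    "property boundary",           # layer 1: signage and smart fencing
    "main entrance gate",          # layer 2: cameras, guards, crash barrier
    "building access",             # layer 3: secure lobby, ID + iris scan
    "security operations centre",  # layer 4: 24/7 monitoring (SOC)
    "data centre floor",           # layer 5: technicians and engineers only
    "crusher room",                # layer 6: disk erasure and destruction
]

def deepest_layer_reached(clearances: set[str]) -> int:
    """Count how many consecutive layers a person can pass, starting from layer 1.

    A missing clearance at any layer blocks access to every layer inside it.
    """
    depth = 0
    for layer in LAYERS:
        if layer not in clearances:
            break
        depth += 1
    return depth

# A hypothetical technician cleared for layers 1-5 but not the crusher room:
technician = set(LAYERS[:5])
print(deepest_layer_reached(technician))  # 5
```

The key property the model captures is that clearance for an inner layer is useless without clearance for every layer outside it.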
There are also security testing programmes in place. One such programme hires companies to try to break into data centre sites from the outside, and others to try to break security protocols from the inside.
Edge data centres
Small, distributed data centres, called ‘edge data centres’, are being deployed to provide hyper-local storage and processing capacity at the edge of the network.
While cloud computing has traditionally served as a reliable and cost-effective means for connecting many devices to the internet, the continuous rise of the Internet of Things and mobile computing has put a strain on data centre networking bandwidth.
Edge computing technology is now emerging to offer an alternative solution to bigger facilities.
This involves placing computing resources closer to where data originates (i.e. motors, pumps, generators, or other sensors) — in other words, the ‘edge’. Doing so reduces the need to transfer data back and forth between centralised computing locations such as the cloud.
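The bandwidth saving comes from processing data where it is produced and forwarding only what matters. A minimal sketch of that idea, with hypothetical sensor values and threshold (not from any real system):

```python
# Minimal sketch of edge processing: filter sensor readings locally and
# forward only the interesting ones to a central data centre.
# The threshold and readings below are hypothetical, for illustration only.

THRESHOLD = 80.0  # e.g. a pump temperature limit in degrees C (assumed)

def filter_at_edge(readings: list[float]) -> list[float]:
    """Keep only readings above the threshold; the rest never leave the edge."""
    return [r for r in readings if r > THRESHOLD]

readings = [71.2, 69.8, 85.5, 70.1, 92.3, 68.0]
to_upload = filter_at_edge(readings)

print(to_upload)  # [85.5, 92.3]
print(f"{len(to_upload)} of {len(readings)} readings sent upstream")
```

Here only two of six readings cross the network, and the same principle scales: the more filtering, aggregation or inference happens at the edge, the less raw data has to travel to a centralised facility.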
While still nascent, the technology already provides a more efficient method of data processing for numerous use cases, including autonomous vehicles. Tesla cars, for example, have powerful onboard computers which allow for low-latency data processing (in near real-time) for data collected by the vehicle’s dozens of peripheral sensors. This provides the vehicle with the ability to make timely, autonomous driving decisions.
Edge data centres, which are typically the size of a shipping container, are placed at the base of cell towers or as close to the origin of the data as possible.
Mega data centres
On the other end of the spectrum are ‘mega data centres’. These contain at least 100,000 m2 of data centre space. These facilities are large enough to serve the needs of tens of thousands of organisations at once and benefit greatly from economies of scale.
While these mega data centres are expensive to build, the cost-per-square metre is far better than that of an average data centre.
One of Microsoft’s most recent investments is Project Osmium, located on 200 acres of land in West Des Moines, Iowa. It is expected to be completed in 2022 at a cost of $3.5B, and this mega data centre will total 300,000 m2 of space.
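Using the figures above, a quick sanity check of the build cost per square metre (a rough calculation only; real construction costs depend heavily on power capacity and fit-out, not just floor area):

```python
def cost_per_m2(total_cost_usd: float, area_m2: float) -> float:
    """Build cost divided by floor area, in USD per square metre."""
    return total_cost_usd / area_m2

# Project Osmium figures from the text: $3.5B for 300,000 m2 of space
print(round(cost_per_m2(3.5e9, 300_000)))  # 11667 USD per m2
```

Comparing this figure against a smaller facility's cost per square metre is how the economies-of-scale claim for mega data centres would be tested in practice.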
Designing a data centre
There must be a carefully considered, highly resilient infrastructure to ensure that service is maintained, enabling the business, operations and systems to operate as effectively as possible. Clients in this arena tend to be hands-on and are looking for total commitment from the design teams to deliver such projects.
Uninterrupted critical services are required as standard. This poses a unique challenge for the design teams. The design teams themselves are large and complex, typically consisting of architects, fire protection specialists, cooling systems engineers, power systems specialists, and security and ICT professionals, with significant input from sustainability consultants and CFD (Computational Fluid Dynamics) analysts.
Such projects are managed in a sophisticated BIM environment.
----
Witold is responsible for overseeing the implementation of our BIM strategy at our Warsaw studio and is an RICS-qualified BIM Manager. Get in touch via email: wnowakowski@chapmantaylor.com