The data center is the brain of a company and the place where its most critical processes run. Find out why data centers are necessary and – using SAP’s data center in St. Leon-Rot as an example – what they contain and how they are operated.

Large-scale computer systems have been around for a while, and many people are already familiar with the term data center. In the 1940s, computers were so large that individual rooms had to be specially set aside to house them. Even the steady miniaturization of the computer did not initially change this arrangement because the functional scope increased to such an extent that the systems still required the same amount of space. Even today, with individual PCs being much more powerful than any mainframe system from those days, every large-scale operation has complex IT infrastructures with a substantial amount of hardware – and they are still housed in properly outfitted rooms. Depending on their size, these are referred to as “server rooms” or “data centers.”

The Data Center in Pictures

  1. Early morning at SAP’s data center in Germany. Well protected.

  2. Rows of server racks for internal applications and the cloud.

  3. Cooling. Technology in generous proportions.

  4. Circulation pumps distribute coolant to the server rooms.

  5. Video cameras in the data center. Everything is monitored.

  6. Heat exchangers on the roof.

  7. Diesel generators for the emergency power supply.

  8. Highly connected: fiber optic cables.

  9. Control station for the diesel generators.

  10. All throughout the data center: complex technology.

Data centers are commonly run by large companies or government agencies. However, they are also increasingly used to provide fast-growing cloud services for private and business applications.

The basic characteristics are the same regardless of the size of the data center, because every company’s success invariably depends on smooth software operations – and those have to be safeguarded.

Computers, of course, require electricity, as well as protection from theft and the accidental or intentional manipulation of hardware. Put simply, one has to safeguard data centers against external influences and provide them with sufficient cooling. After all, there is a lot of powerful hardware sitting in one place.

In addition to these “hard” factors, one must also take into consideration organizational measures, such as periodic backups that ensure operability. As a rule, the more extensive and critical the hardware and software become, the more time and effort are required to provide optimal protection.

For that reason, a data center preferably consists of a well-constructed, sturdy building that houses servers, storage devices, cables, and a connection to the Internet. It also contains a large amount of equipment for power supply and cooling, and often automatic fire-extinguishing systems.

An indicator of the security level is provided by the “tier” rating defined in the ANSI/TIA-942 data center standard.

  1. SAP's data center guarantees an availability of 99.995%.

  2. Downtime for Tier 4 customers is less than one hour per year.

  3. Critical components are designed redundantly to ensure greater security.

During the design of the SAP data center, the Tier 4 requirements were used as guiding principles.
The key to success lies in the robust design of every individual component and especially in the redundancy of all critical components. This ensures that SAP can count on its “brain” at any time, and SAP customers can rely on the contractually guaranteed availability of cloud applications running in the data center.
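As a plausibility check, the guaranteed availability can be converted into a maximum annual downtime. The short sketch below uses only the 99.995% figure quoted above; the calculation itself is purely illustrative.

```python
# Convert a guaranteed availability percentage into maximum downtime per year.
# Illustrative sketch; 99.995% is the availability figure quoted above.

HOURS_PER_YEAR = 365.25 * 24  # average year length, including leap years

def max_downtime_minutes(availability_percent: float) -> float:
    """Return the maximum downtime per year (in minutes) for a given availability."""
    unavailability = 1 - availability_percent / 100
    return unavailability * HOURS_PER_YEAR * 60

if __name__ == "__main__":
    downtime = max_downtime_minutes(99.995)
    print(f"99.995% availability allows at most {downtime:.0f} minutes of downtime per year")
    # -> roughly 26 minutes, comfortably below the "less than one hour" Tier 4 figure
```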

Power supply

The data center is connected to two separate grid sectors operated by the local utility company. If one sector fails, the second ensures that power is still supplied.

In addition, the data center has 13 diesel generators, which are housed in a separate building. Together, they can produce a total of 29 megawatts, an output sufficient to cover the data center’s electricity demand in an emergency. The diesel engines are designed for continuous operation and are kept permanently preheated so that they can be started quickly in the event of an incident. An outage in just one of the external grid sectors is enough to start the generators automatically.

Both the local utility company and the diesel generators deliver electricity with a voltage of 20 kilovolts (kV), which is then transformed in the data center to 220 or 380 volts.

Within the data center, block batteries ensure that all running applications can continue to operate for 15 minutes. This backup bridges the gap between a total blackout at the utility company and the moment the diesel generators take over.
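To get a sense of what a 15-minute bridge implies for the battery system, the following sketch estimates the required stored energy. The load figure used here is a hypothetical assumption for illustration; the article only states the 15-minute bridging time.

```python
# Estimate the energy a battery bank needs to bridge a blackout until the
# diesel generators take over. The load value is a hypothetical assumption;
# the article only specifies the 15-minute bridging time.

def bridge_energy_mwh(load_mw: float, bridge_minutes: float = 15.0) -> float:
    """Energy (MWh) required to carry `load_mw` for `bridge_minutes`."""
    return load_mw * bridge_minutes / 60.0

if __name__ == "__main__":
    assumed_load_mw = 5.0  # hypothetical IT load, not a figure from the article
    energy = bridge_energy_mwh(assumed_load_mw)
    print(f"Bridging {assumed_load_mw} MW for 15 minutes requires about {energy:.2f} MWh of battery capacity")
    # -> 1.25 MWh under this assumption
```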

The uninterruptible power supply (UPS) also ensures that the power quality remains constant. It compensates for voltage and frequency fluctuations, thereby effectively protecting sensitive electronic components and systems.

Another feature of the data center is its redundantly designed power supply system. This makes it possible, for example, to carry out repairs on one supply network without having to shut down servers, databases, or electrical equipment.

Servers and storage units have several redundant power supply units, which transform the supply voltage from the two grid sectors into the operating voltage. This ensures that the failure of one or two power supply units does not cause any problems.


Cooling

All electronic components, and especially the processors, generate heat during operation. If this heat is not dissipated, the processor’s performance decreases and, in extreme cases, the component can fail. Cooling a data center is therefore essential, and because of the concentrated computing power, the costs of doing so are considerable.

For this reason, servers are installed in racks, which are essentially standardized shelving units. The racks are arranged in two rows facing each other, creating an aisle from which the front sides of the servers are accessible. The aisles are covered on top and closed off at the ends by doors. Cool air at a temperature of 24 to 26°C is blown in through holes in the floor, flows through the racks, and carries away the heat emitted by the servers.

Generally, a server room contains several such “enclosed” server rows. The warm air is removed from the server room by the air-conditioning system, which in turn has to get rid of the heat it absorbs. When the outside temperature is below 12 to 13°C, outside air can be used to dissipate this heat effectively.

At higher outside temperatures, the air-conditioning systems are cooled with water supplied by six turbo-cooling units. Not all of them are needed to cool the data center; some serve as reserve units. Should a cooling unit fail, the time until a backup unit is operational has to be bridged. To that end, 300,000 liters of ice-cold water (4°C) are available to absorb the heat from the air-conditioning systems during this period.
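A rough back-of-the-envelope estimate shows how such a water reserve translates into bridging time. Only the 300,000 liters at 4°C come from the description above; the heat load and the permissible temperature rise in the sketch below are hypothetical assumptions.

```python
# Estimate how long 300,000 liters of 4 degree C water can absorb a given heat load.
# Only the water volume and its temperature come from the article; the heat load
# and the allowable temperature rise are hypothetical assumptions.

WATER_HEAT_CAPACITY_KJ_PER_KG_K = 4.19  # specific heat of water
WATER_VOLUME_LITERS = 300_000           # reserve stated in the article (1 L ~ 1 kg)

def bridging_time_minutes(heat_load_mw: float, allowed_temp_rise_k: float) -> float:
    """Minutes the water reserve can absorb `heat_load_mw` before warming by `allowed_temp_rise_k`."""
    stored_energy_kj = WATER_VOLUME_LITERS * WATER_HEAT_CAPACITY_KJ_PER_KG_K * allowed_temp_rise_k
    heat_load_kw = heat_load_mw * 1000
    return stored_energy_kj / heat_load_kw / 60

if __name__ == "__main__":
    # Hypothetical: 10 MW of heat to absorb, water allowed to warm from 4 to 16 degrees C.
    minutes = bridging_time_minutes(heat_load_mw=10.0, allowed_temp_rise_k=12.0)
    print(f"The reserve covers roughly {minutes:.0f} minutes under these assumptions")
    # -> about 25 minutes, enough time to bring a backup cooling unit online
```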

The turbo-cooling units, in turn, also have to dissipate heat. Eighteen heat exchangers on the data center’s roof serve this purpose, releasing the heat into the ambient air.

At outside temperatures above 26°C, the heat exchangers are sprinkled with water to make heat dissipation more effective through evaporative cooling. The large amounts of water consumed in summer are supplied by a waterworks dedicated to the data center; the municipal water supply serves as a reserve and failsafe.

Controlled access

  1. Single-person access and mantrap systems ensure that only authorized individuals can enter the data center.

  2. Technicians can access rooms for facilities maintenance via separate entrances.

  3. Maintenance staff has further authorization to enter the server rooms.