Sustainable Information Technology Series – Server Rooms & Data Centers
The proper setup of a server room is certainly not at the top of
most small businesses' lists of mission-critical issues, although
perhaps it should be. Over the years, we have seen many server rooms
and data centers, some good and some not so good. (The worst
we’ve seen was sharing space with a men’s room!)
What most surprised us was seeing how some medium and large
businesses operate their server rooms and data centers. Some may
appear to be well designed and maintained, but the façade is
quickly stripped away when the power or HVAC fails. Even the
best-designed data centers that follow all best practices have
failed, despite all their built-in protections. And some have failed
repeatedly over the course of a few months.
Large Data Centers
You have probably seen a data center, either in person if you work
in high tech or, if not, as depicted in the movies. The archetypal
data center has a raised white floor, is temperature controlled with
a sophisticated fire and smoke detection system, moisture sensors,
controlled power with an uninterruptible power supply, and huge
bandwidth to the Internet. Of course, data centers also have very
heavy-duty security control systems with electronic key locks, 24x7
security guards, and surveillance cameras. This is because the data
center security mantra is well understood by IT professionals:
physical access = full access and full access = zero security.
Why is a raised white floor part of the ideal data center? A raised
floor performs three functions:
It prevents flood damage: Whether the flood is caused by external
environmental factors or something as simple as a broken pipe, a
raised floor is key to reducing equipment damage. In a minor flood,
where water doesn’t reach the equipment on the elevated floor, an
electronic short will trip the circuit breakers, preventing a major
incident. Of course, the floor will not prevent electrical or
equipment failure if the water gets too high. If your data center
gets hit by a tidal wave, you’re out of luck.
However, equipment sitting on the floor is still in danger.
Something as small as a quarter inch of water from something as
simple as an air conditioner leak can short it out before the
circuit breakers have a chance to trip.
It maintains a clean, organized environment: It is much
easier to keep the center organized and clean with a raised floor.
Think of all the connections needed to and from the equipment and
electronics in the room: power cords, telephone lines, and data
equipment wires and cables are in abundance.
With a raised floor, all these can be run under the floor and easily
accessed by removing the floor tiles. Cords run between the wall
plugs and the Uninterruptible Power Supply (UPS); power cords run
from the UPS to the equipment; and network cables run from all the
switches, routers, and telephone equipment, in addition to keyboard,
mouse, and video cables.
Imagine if you didn’t have a raised floor: wires would be exposed,
and anyone could trip over a cable coming out of a server. Imagine
this happening to the server that holds, for example, all of your
accounting information, causing a panicked recovery and ruining your
day.
It controls the flow of air: Air-conditioning can be run
under the equipment as well as over it, providing more options for
airflow management. Holes drilled in the tiles allow cool air to
come up wherever you want, just as it does from the vents in the
ceiling. Also, putting the equipment in a housing with fans on top
pulls cool air from the floor up toward the ceiling.
Airflow and A/C
Why is A/C so important? Heat is the enemy of electronic equipment:
the hotter it gets, the shorter its life. A server kept at 86
degrees may last only half or a third as long as one kept at 75
degrees. Operated in 90+ degree rooms, servers can fail in days or
weeks.
While the server specifications indicate a very broad range of
operating temperatures, operating at the extremes is not
recommended, and what is not indicated is how much a server’s life
is shortened as it approaches either end.
Consistency of temperature is important. By increasing and
decreasing the temperature, it is possible to pop the chips off the
circuit boards, crack the soldered connections and generally wreak
havoc with components. Think of thin bits of metal heated then
cooled over and over, and how likely they are to warp or break from
the repeated stress.
Therefore, data centers are kept in an air-conditioned environment,
preferably below 65 degrees. If the room is not full of equipment,
the center can become quite a bit colder, since the A/C is normally
planned for full capacity. In some cases, the thermostat can only
be set so low without affecting the airflow in the room. You can
have too much A/C: it can freeze the ducts and turn off before
humidity is removed from the air, so condensation can form on
components for a self-created flood or at minimum, moisture damage.
A/C output is measured in tons, and the heat output of equipment is
measured in British Thermal Units (BTU) per hour. Something to keep
in mind: 1 watt of power generates about 3.4 BTU/hour, and 1 ton of
A/C handles 12,000 BTU/hour, or roughly 3,500 watts of equipment.
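These conversions can be sketched as a quick sizing calculation. This is a simplified estimate that counts equipment heat only (no room-load factors), using the standard figures of about 3.412 BTU/hr per watt and 12,000 BTU/hr per ton of cooling:

```python
# Rough A/C sizing from equipment power draw (simplified estimate).
BTU_PER_WATT_HOUR = 3.412   # 1 watt of load produces ~3.412 BTU/hr of heat
BTU_PER_TON = 12_000        # 1 ton of A/C removes 12,000 BTU/hr


def equipment_heat_btu(watts):
    """Heat output in BTU/hr for a given equipment load in watts."""
    return watts * BTU_PER_WATT_HOUR


def tons_of_cooling(watts):
    """Tons of A/C needed to remove that heat."""
    return equipment_heat_btu(watts) / BTU_PER_TON


# A rack drawing 3,500 W needs about one ton of cooling.
print(round(tons_of_cooling(3500), 2))
```

In practice you would also add the room-load adjustments discussed later (windows, people, insulation) on top of this equipment-only figure.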
As well, you’ll notice that there are frequently no windows in data
centers since sunlight can play havoc with your A/C calculations.
Airflow matters as much as A/C cooling power. Data equipment does
not do well in uneven temperatures, and good airflow prevents hot
and cold spots. Consistent airflow also keeps the heat that the
equipment generates moving and dissipating, keeping the room
temperature even and consistent.
Large data centers primarily house data racks. Data racks are
measured in Rack Units (RU). A typical full data rack or enclosure
has 42 RU, each about 1¾ inch, making the rack a little over 6 feet
high. Rack-mounted equipment can improve organization by keeping
everything bolted down, making the equipment more stable. In some
cases, the racks have slide-out rails, making the equipment easier
to service.
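The rack height follows directly from the RU count; a minimal sketch of the arithmetic, using the 1.75-inch rack unit mentioned above:

```python
RU_HEIGHT_INCHES = 1.75   # one rack unit (RU) is 1.75 inches tall


def rack_height_inches(rack_units):
    """Total height in inches of a rack with the given number of rack units."""
    return rack_units * RU_HEIGHT_INCHES


# A full 42U rack comes to 73.5 inches, a little over 6 feet.
print(rack_height_inches(42))
```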
Very large data centers have conditioned power to the whole floor or
the whole building, with UPS and standby generators. Downtime is
costly, and sudden downtime is deadly.
Power ideally comes in from two totally different power grids,
either of which is able to handle the needs of the entire data
center. The amperage can be in the normal range of 15 or 20 amps, or
if the equipment is especially powerful, 30 to 100 amps with
circuits at 110 or 220 volts. Circuits can generally be stepped down
in amperage or voltage, but stepping up is much harder, since at
some point the building's supply simply runs out of capacity.
To get high-speed access, many computers require large throughput
connections to the Internet, but by dividing the cost as well as the
bandwidth, these connections can be justified. For example, a DS-3
line at about 45 Mb/s can be shared in a data center by 100
computers, allowing each a minimum of 0.45 Mb/s. The fewer the
computers, the more Mb/s each can have. Lines from T-1 to OC-255 can be
contracted for guaranteed uptime and bandwidth, and for most data
centers this is worthwhile.
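The sharing arithmetic above is straightforward to sketch. This is a simplified model that assumes the bandwidth is divided evenly, ignoring protocol overhead and contention:

```python
def bandwidth_per_computer(link_mbps, computers):
    """Even share of a link's bandwidth across a number of computers."""
    return link_mbps / computers


# A 45 Mb/s DS-3 shared by 100 computers guarantees each 0.45 Mb/s;
# halve the number of computers and each share doubles.
print(bandwidth_per_computer(45, 100))
print(bandwidth_per_computer(45, 50))
```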
Server Rooms for Small to Medium-sized Businesses
Many of our clients are small to medium-sized businesses that need
the features and benefits of a full-blown data center, but think
that they don’t have the space or the means. They’re wrong. Many
aspects of a large data center can be scaled down in cost-effective
ways. For example, a small room can serve as a small data center,
sometimes called a server or computer room.
Below are some critical elements:
Security: Security practices include locks on all doors and
access limited to key personnel. Security monitoring can be
inexpensively provided by a network camera (a camera with its own
IP address), and other sensors are surprisingly reasonable in price.
Power Requirements: If possible, get the office computers
wired on separate circuits. At minimum, don’t put your computer(s)
on highly variable power, for example, on the same circuit as a
refrigerator or elevator. Refrigerator condensers and elevators can
draw huge loads intermittently.
Get one or more Uninterruptible Power Supplies (UPS). We recommend
getting a model that runs the equipment off its battery, with the
wall plug keeping the battery charged. These are the most reliable
at providing predictable power, although not always with a clean
sine wave output. A UPS's sustainable watt capacity is typically
about 60% of its VA rating, so a 1000 Volt-Amp (VA, or 1 kVA) unit
will handle about 600 watts of load.
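The VA-to-watts rule of thumb can be sketched as a simple sizing check. The 0.6 power factor is the rough figure used in this article; check your UPS's documentation for its actual rating:

```python
POWER_FACTOR = 0.6   # typical VA-to-watts derating (an assumption; varies by model)


def usable_watts(rating_va):
    """Watts a UPS can sustain, given its VA rating."""
    return rating_va * POWER_FACTOR


def required_va(load_watts):
    """Minimum VA rating needed to carry a given load in watts."""
    return load_watts / POWER_FACTOR


# A 1,000 VA (1 kVA) unit handles about 600 watts of load.
print(usable_watts(1000))
# A 900-watt load calls for at least a 1,500 VA unit.
print(required_va(900))
```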
If you need to bring in an electrician to help with wiring and you
anticipate growth of your server room or data center, consider
bringing in multiple amperages for future needs. 20-amp outlets are
keyed for both 15- and 20-amp plugs, which will be adequate for most
computers. Some high-end servers, networking equipment, and SAN or
iSCSI-based storage arrays require 30 amps. Normally, unless you're
using enterprise-level equipment, more amperage is not needed.
Communication links: All modern server rooms require good
communication links, especially data communication links. Ideally,
the building where the server room is housed would accommodate more
than one type of data communication link, such as DSL, T1, wireless,
Ethernet, or fiber to the Internet.
Temperature controls: Adequate A/C and air circulation need
to be provided to the room. Because this is so critical, it may be
worthwhile to bring in an A/C consultant familiar with the needs of
small server rooms.
Some factors to consider are:
Add 4,000 BTUs if the room sits below an uninsulated ceiling or
roof (not recommended).
Add 1,500 BTUs for each window that receives significant daily
sunshine.
Add 1,500 BTUs if the room is above a boiler room or kitchen (if
it is in use).
Add 600 BTUs for each person in the room.
Subtract 1,000 BTUs if you are on the shaded side of a building.
The side of the building you are on can affect A/C
calculations; south or west sides of buildings get more afternoon
sun and therefore absorb more heat.
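The checklist above can be combined into a rough sizing helper. This is a sketch only; the `base_btu` starting figure is an assumption you would get from an A/C consultant or a room-size table, not from this article:

```python
def room_cooling_btu(base_btu, uninsulated_ceiling=False, sunny_windows=0,
                     above_boiler_or_kitchen=False, people=0, shaded_side=False):
    """Apply the article's rule-of-thumb adjustments to a base BTU figure."""
    btu = base_btu
    if uninsulated_ceiling:
        btu += 4000               # uninsulated ceiling or roof
    btu += 1500 * sunny_windows   # each window with significant daily sun
    if above_boiler_or_kitchen:
        btu += 1500               # room above an in-use boiler room or kitchen
    btu += 600 * people           # each person in the room
    if shaded_side:
        btu -= 1000               # shaded side of the building
    return btu


# Example: 8,000 BTU base, uninsulated roof, two sunny windows, one person.
print(room_cooling_btu(8000, uninsulated_ceiling=True, sunny_windows=2, people=1))
```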
Many commercial buildings turn off A/C on nights and weekends.
While commendable from a power usage perspective, this can play
havoc with your server room’s temperature. We have seen south-facing
server rooms that went up to 85 degrees at night and higher on
weekends in all but the coldest months.
Sometimes the only solution is to get dedicated A/C for your server
room. Small A/C units used for this purpose can work quite well. You
might consider getting one with a higher amp draw, since these
typically outperform units that run on 15- or 20-amp circuits. But also
remember that circulation is key. Putting the computer in a colder
area of the room is not as efficient as exhausting the warm air and
circulating the cool.
Physical Layout: It is best to have full access to your data
equipment or have it on wheels so it can be pulled out. Computers
have connections on the back and the front, and hardware maintenance
or upgrades require side panel removal.
Computers can be raised off the floor without a specially designed
white tile floor: either bolted to the floor on a full- or half-sized
rack, depending on your anticipated needs, or placed on wire shelving
that allows good air circulation around your equipment. Cable
management systems can be added to the rack or shelves for better
organization.
To bring data from your telephone closet (where the DSL, cable modem
or T-1 service is located) to the server room, all that is typically
required is an RJ-45-terminated Ethernet cable.
Good cable management is recommended. Tie or wrap cables together in
bundles to keep things neat and organized and to reduce the
likelihood of individual cables being pulled out accidentally. There
are many commercially available products to organize your cables.
Monitor & Management: There are network-reachable monitors
for fire, moisture and other security. In fact, in some cases the
same system you use for security and fire prevention can be modified
for network access.
Many server-class computers have management hardware and software
that can track internal temperatures. A small temperature sensor
attached to the network that can monitor external temperature can be
very cost-effective. Of course, it will not do much good without
monitoring software to track temperature changes and alert the
appropriate personnel to take action.
Many of our clients take advantage of our 24x7x365 real-time
monitoring and management. Some of the key benefits are:
Reduce short and long-term capital spending on
additional hardware and software which would be required to
provide the same level of SSOC services.
Reduce time spent on overall IT maintenance
issues. It has been estimated that maintenance issues can
consume 50% of total IT time.
Reduce errors made in performing IT maintenance
by taking advantage of the TNS procedural approach.
Reduce staff time spent after hours and on
weekends by taking advantage of the SSOC 24x7x365 operation.
Ability to have 24x7x365 real-time monitoring
and notification of your critical equipment and server room
temperature (-40°F to 212°F), humidity (5% to 95%), and
wetness (1% to 100%).
Ability to have 24x7x365 real-time corrective
action taken by our SSOC engineers, remotely or on-site.
Ability to patch your critical OS and
applications after hours and on weekends by our SSOC engineers
remotely or on-site.
Ability to restart your mission-critical
applications remotely or on-site.
Ability to have possible network intruders
tracked by our SSOC engineers, remotely or on-site.
Ability to provide up-to-date network
documentation (Green Book) for your review.
The Not-So-Good Data Center / Server Room
Some places we don’t recommend you keep your servers or data
equipment:
Outside, especially where it can get wet. Yes, we’ve actually seen a
company with its server and telephone equipment in a shed.
Under desks. If you are going to put your servers in the main office
space rather than a server room, small wire racks such as baker’s
racks can be effective at getting the equipment off the floor. Racks
with wheels are especially useful.
In an unsecured shared space. People can and do steal servers.
Remember: physical access = full access. Obviously more security is
better. A knowledgeable person with a flash drive can break into
your server or computer in less than a minute. The Internet, for
good or bad, brings easy access to knowledge, so it is not hard to
be a knowledgeable hacker anymore.
To sum up, the key elements you need to consider for
your server room or data center are:
Security
Power requirements
Communication links
Temperature controls
Physical layout
Monitoring & management systems
Regardless of the size of your server room, the
issues and solutions are similar. With a little planning, it is very
possible to scale the correct solution to each.
Jerome Ware Biography