What is the University Shared Data Centre?
The University Shared Data Centre (USDC) is the University's state-of-the-art,
resilient data centre, run by OUCS on behalf of the entire University.
It can house any IT equipment which is rack-mounted for departments
and colleges. Each can have their own secure space, networking and 24/7 access.
Examples of equipment currently located in the USDC include:
- SAN disk systems
- NAS storage arrays
- Network switches and firewalls
This equipment provides a wide variety of services to departments,
colleges and research groups, including:
- File servers
- Print servers
- Web sites
- Research computing clusters
- Large-scale core university services (such as the HFS)
Why use the University Shared Data Centre?
The USDC is designed to be resilient at all levels. It offers resilience features that would
be costly to implement in a small server room. It is designed to be an always-on facility
for housing vital IT services on which the University depends. Specific
resilience-related features of the USDC include:
- Dual electrical grid connections and transformers
- 2N Uninterruptible Power Supplies (UPS)
- N+1 Environmental Systems
- Water leak detection
- VESDA early warning fire detection system
- Localised high-pressure water-vapour fire suppression system
- Dual-path data centre network
- Per rack environmental monitoring
These protect against events such as a failure of the power coming into the building,
a network failure, a fire, a cut cable, a water leak or flood, and a failure of any cooling
equipment that could cause the computing equipment to overheat.
The USDC was designed with security in mind from the outset. Operated as a near
lights-out facility, it includes many technical measures to ensure all equipment
is securely housed, including:
- Biometric, proximity reader entrance system
- Anti-tailgating security portal
- 360-degree CCTV systems
- Proximity card access to racks
- Full audit trail at room-level of who entered/exited and when
- Full audit trail at rack level of who opened a rack and for how long
These measures ensure the facility can be operated unmanned whilst still allowing
24/7 access for those with equipment housed in the USDC.
The USDC was designed with energy-efficient technologies, so equipment located
within it benefits from reduced ongoing energy costs. The data centre incorporates:
- A holistic design: the data centre assists in heating and cooling the rest of the building
- Tri-generation technologies
- Cold aisle containment to maximise cooling efficiency
- Detailed energy usage monitoring
These technologies ensure that the USDC uses as little energy as possible to maintain the environment
for the IT equipment located within it.
In a traditional server room, total energy usage is typically between 2 and 2.5 times
the energy used directly by the IT equipment; this ratio is known as the Power Usage
Effectiveness (PUE). The additional energy goes into keeping the environment cool and
running any CCTV and security systems, and also includes the losses of any UPS systems,
which are typically only 80% to 90% efficient at low electrical loads. For comparison,
the USDC is designed to achieve a PUE of less than 1.5.
Locating equipment in the USDC could see energy savings (and the associated carbon and cost savings) of between 25% and 40% each year compared to locating the equipment in a small server room. Placing services onto the private cloud would reduce emissions further.
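As a back-of-the-envelope illustration of the figures above (the PUE values and the 25% to 40% savings range are the ones quoted; the IT load is hypothetical):

```python
# Total facility energy = PUE x energy used directly by the IT equipment.
def facility_energy_kwh(it_energy_kwh: float, pue: float) -> float:
    """Total energy the facility draws for a given IT load."""
    return it_energy_kwh * pue

it_load = 10_000.0  # hypothetical annual IT consumption, in kWh

small_room = facility_energy_kwh(it_load, 2.5)  # traditional server room (PUE 2.0-2.5)
usdc = facility_energy_kwh(it_load, 1.5)        # USDC design target (PUE < 1.5)

saving = 1 - usdc / small_room
print(f"Server room: {small_room:.0f} kWh; USDC: {usdc:.0f} kWh; saving: {saving:.0%}")
```

At a server-room PUE of 2.0 the same calculation gives a 25% saving, matching the lower end of the quoted range.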
How to make use of the USDC
The USDC provides a number of options depending on your exact requirements:
Step 1: How much space?
Rack-mounted equipment is measured in Rack Units of 1.75 inches in height, known as a "U". We offer three sizes of rack, plus per-U space, to accommodate varying amounts of equipment:
- Full rack – 40U, 32 C13 and 8 C19 power sockets
- Half Rack – 16U, 24 C13 power sockets
- Third Rack – 9U, 16 C13 power sockets
- Per-U – space in a shared rack for just one or two servers
All full, half and third racks are independent and permit 24/7 access only to your assigned staff.
The half and third racks operate with stable doors, and each area is segregated
from those above or below, with its own networking and power.
The cost of using the USDC is determined by the amount of space you need and how much power your equipment requires. For further details see our Pricing pages.
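To make the rack sizes concrete, a minimal sketch of the space arithmetic (rack capacities are the U counts listed above; the equipment list is hypothetical):

```python
# Usable space per rack size, in rack units (U); 1U = 1.75 inches.
RACK_CAPACITY_U = {"full": 40, "half": 16, "third": 9}
U_HEIGHT_INCHES = 1.75

def fits(rack: str, equipment_u: list[int]) -> bool:
    """True if the listed equipment (heights in U) fits the chosen rack size."""
    return sum(equipment_u) <= RACK_CAPACITY_U[rack]

# Hypothetical kit: two 2U servers, a 4U disk shelf and a 1U switch = 9U.
kit = [2, 2, 4, 1]
print(fits("third", kit))          # exactly fills a third rack
print(sum(kit) * U_HEIGHT_INCHES)  # 15.75 inches of rack space
```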
Step 2: Which Network Connection?
For each rack (or half rack, third rack or per-U server) there is a choice of network provision. Each can have:
- A single connection to the Campus network
- This network configuration makes the rack appear via the annexe connection on your local Frodo. Frodo port charges will not be levied for this, neither for the port in the USDC nor for the port at your local Frodo, even if you do not currently have any annexe buildings. One 1Gb connection will be provided to the rack, and all switching within the rack is your responsibility.
Equipment in the rack will appear on your subnet behind your firewall - your local LAN is extended into the USDC.
- A single connection to the Datacentre network
- Your rack is provided with a single 1Gb connection to the data centre network upon which you will be allocated a new small subnet (typically a /28 or /29). This subnet will be linked to your DNS suffix in the Interface for Hosts Update DNS tool. Switching and firewalling within the rack is your responsibility.
Equipment in the rack will appear on a new subnet.
- Multiple connections to the Datacentre network
- Your rack is provided with multiple 1Gb connections to the data centre network, upon which you will be allocated a new small subnet (typically a /28 or /29). This subnet will be linked to your DNS suffix in the Interface for Hosts Update DNS tool. There is no firewalling, and all switching is our responsibility. We can also offer private management/cluster heartbeat networks over the data centre infrastructure for those that require this. This option offers the highest resilience for equipment requiring an always-there connection; there is no single point of failure for equipment connected via multiple connections.
Equipment in the rack will appear on a new subnet and can have multiple connections for maximum resilience.
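To illustrate what a /28 or /29 allocation gives you, a quick sketch using Python's standard ipaddress module (the subnet shown is from the RFC 5737 documentation range, not a real USDC allocation):

```python
import ipaddress

# A /28 carries 16 addresses; the network and broadcast addresses are
# unusable, and one usable address typically serves as the gateway.
net = ipaddress.ip_network("192.0.2.0/28")
hosts = list(net.hosts())

print(net.num_addresses)         # 16 addresses in total
print(len(hosts))                # 14 usable addresses
print(hosts[0], "-", hosts[-1])  # 192.0.2.1 - 192.0.2.14
```

A /29 halves this to 8 addresses (6 usable), which is why these sizes suit racks holding a handful of servers.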
Step 3: Contact Us
Lastly, please email email@example.com to further discuss and refine your requirements, get a rack allocated, get connected and start taking advantage of the USDC.
Service Level Descriptions
The USDC co-location service is governed by the following service level descriptions.
For further information please see the Colocation information sheet, look at our FAQ or email firstname.lastname@example.org.