2. Infrastructure

2.1. Network Infrastructure

2.1.1. Connection to Janet

The previous annual report noted the introduction of our connection to SuperJANET 4 in March 2001. This was achieved through the Thames Valley network (TVN, comprising Oxford, Oxford Brookes and RAL), which ran at 622 Mb/second. In July 2002 this bandwidth was increased to 2.4 Gb/second, which gives us the potential to utilise the full 1 Gb/second connection bandwidth available from the TVN. As is illustrated by the graphs in Figures 1 and 2, these increases have allowed the rise in demand for network access to continue unabated.

Figure 1. Janet Monthly Inwards Traffic, MB/Day
Figure 2. Janet Monthly Outwards Traffic, MB/Day

2.1.2. University Backbone Network

The gigabit Ethernet backbone, installed in 2000-1, has proved extremely reliable in operation, and has accommodated the growth in demand, as illustrated in Figure 6, Gigabit Ethernet Daily Traffic (GB/day).
Figure 3. Nodes connected to Oxford's Backbone network
The basic configuration of the backbone network, a double star network consisting of two central switches and 10 edge switches, connecting to about 190 departmental and college subnetworks, remains unchanged. Figure 4 shows the geographical areas of Oxford covered by the network, and Figure 5 is a schematic of the connections within Oxford and to the Janet network. The number of nodes connected via the subnetworks continues to rise at a steady rate [Figure 3]. New applications, in particular those related to the Grid, are expected to bring even greater demands, and plans are already in hand to upgrade the backbone to 10 Gb/second in the latter part of 2003. This upgrade will allow individual units to connect at up to 1 Gb/second.
Figure 4. Geographical areas of Oxford covered by the backbone network
Figure 5. Schematic of backbone connections within Oxford and to Janet
Figure 6. Gigabit Ethernet Daily Traffic (GB/day)
Figure 7. Dial-up Calls (weekly figures)
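By way of illustration only, the double-star arrangement can be sketched in a few lines of Python (using the third-party networkx library; the switch and subnet names below are invented and do not reflect the production configuration). The sketch simply shows why the loss of one central switch does not partition the network.

```python
# Toy model of the double-star backbone: two central switches, ten edge
# switches, and ~190 departmental/college subnetworks. Names are invented.
import networkx as nx

backbone = nx.Graph()
cores = ["core-1", "core-2"]                          # two central switches
edge_switches = [f"edge-{i}" for i in range(1, 11)]   # ten edge switches

# Double star: every edge switch is linked to both central switches.
for core in cores:
    for sw in edge_switches:
        backbone.add_edge(core, sw)

# Attach roughly 190 subnetworks, spread across the edge switches.
for n in range(190):
    backbone.add_edge(f"subnet-{n}", edge_switches[n % len(edge_switches)])

# Redundancy check: removing one central switch leaves the network connected.
degraded = backbone.copy()
degraded.remove_node("core-1")
print(nx.is_connected(degraded))                      # expected: True
```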

2.1.3. Dial-Up Service

Although the dial-up service continues to be popular and heavily used, the fall in usage noted last year has continued this year [Figure 7]. This is probably because of the increased availability of broadband connections to the home, and the use of VPN connections through outside ISPs. The service will be kept under review, and line capacity reduced, if appropriate.

2.2. Network-based Services

Apart from the basic functions of the network described in section 2.1. Network Infrastructure above, a wide variety of key infrastructural network services are also provided by OUCS. These are used by all sectors of the University, some consciously and others invisibly in the background. These services include:-
  1. Domain Name Server (DNS)
  2. Email relay server
  3. Network time server
  4. Dynamic Host Configuration Protocol (DHCP) service
  5. Web cache
  6. Web servers
  7. Email POP and IMAP servers
  8. Network News server
  9. Mailing List server
  10. File backup and archive server (Hierarchical File Store) (HFS)
  11. Windows Internet name server
  12. Novell Directory Services (NDS) tree server
  13. Virtual Private Network (VPN) server
Most are based on PC equipment running the Linux operating system, with a few using Sun equipment running Solaris. The choice is determined by the requirements of the application software, though the PC/Linux solution is preferred (for cost and supplier-independence reasons) where feasible. All require, and exhibit, very high reliability.
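As a purely illustrative aside, the sketch below shows the sort of requests client machines make of two of the services listed above, the DNS service and the network time service, using only the Python standard library. The NTP hostname used here is a placeholder, not the address of the real Oxford service.

```python
# Illustrative client-side use of a DNS service and an NTP time service.
import socket
import struct
import time

def dns_lookup(name: str) -> str:
    """Resolve a hostname via the locally configured DNS servers."""
    return socket.gethostbyname(name)

def ntp_time(server: str = "ntp.example.ox.ac.uk") -> float:
    """Send a minimal NTP (RFC 5905) client request and return Unix time."""
    NTP_TO_UNIX = 2208988800                 # seconds between 1900 and 1970
    request = b"\x1b" + 47 * b"\0"           # LI=0, VN=3, Mode=3 (client)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(5)
        sock.sendto(request, (server, 123))
        reply, _ = sock.recvfrom(512)
    seconds = struct.unpack("!I", reply[40:44])[0]   # transmit timestamp
    return seconds - NTP_TO_UNIX

if __name__ == "__main__":
    print(dns_lookup("www.oucs.ox.ac.uk"))
    print(time.ctime(ntp_time()))            # placeholder server; may time out
```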
In addition to the above computer-based services, other key network services provided more directly by staff include:-
  1. Oxford Computer Emergency Response Team (OxCERT)
  2. Ethernet Maintenance service

Details of all these services can be found on the OUCS Web pages at www.oucs.ox.ac.uk/network/. Several of these services are discussed in more detail below.

2.2.1. Email services

2.2.1.1. Email Relay

The Email relay service handles the vast majority of the University's incoming and outgoing email, and inter-system email within the University. It directs email to the appropriate email server, performs address checks and rewrites addresses to the standard form (where the relevant department has requested this), handles distribution of multiple-recipient email, "spools" email intended for non-responding recipient systems, etc. The number of messages handled during the year approached 170,000/day on average [Figure 8], with the volume of traffic amounting to nearly 4,000 MB/day on average [Figure 9]. Both the number of messages and the size of each message rise markedly every year, and will continue to do so as the size and complexity of email attachments increase.

Figure 8. Daily Average Numbers Through Email Relay
Figure 9. Mean Daily MB Delivered by Email Relay
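The address-rewriting step can be pictured with a short sketch. This is not the relay's actual configuration; the domain mappings below are invented purely for illustration.

```python
# Toy address rewriting: map host-specific domains onto a department's
# standard address form, as requested by that department. Mappings invented.
REWRITE_MAP = {
    "mail.unit.ox.ac.uk": "unit.ox.ac.uk",
    "unit1.ox.ac.uk": "unit.ox.ac.uk",
}

def rewrite_address(address: str) -> str:
    """Rewrite user@host into the standard form if the domain is registered."""
    if "@" not in address:
        return address
    local_part, _, domain = address.partition("@")
    return f"{local_part}@{REWRITE_MAP.get(domain.lower(), domain)}"

assert rewrite_address("someone@mail.unit.ox.ac.uk") == "someone@unit.ox.ac.uk"
assert rewrite_address("someone@elsewhere.ac.uk") == "someone@elsewhere.ac.uk"
```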
2.2.1.2. Herald Email Server

The central email server, Herald, which has been running since 1998, offers a mail-store facility to all University members. The mail can be accessed by desktop mail clients using IMAP or POP, or by a dedicated web interface, WING. All new students are pre-registered with accounts on Herald. Demand on this service, as on every email service, increases inexorably, and the servers have been regularly upgraded and expanded to meet this demand. The discrete model chosen has proved to be as scalable as planned. It now supports more than 29,000 users, and regularly has over 3,000 users connected concurrently.
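For illustration, the sketch below shows the kind of IMAP session a desktop client carries out against a mail store such as Herald, using Python's standard imaplib module. The hostname and credentials are placeholders, not the real service details.

```python
# Count the messages in a user's INBOX over IMAP-over-SSL (port 993).
import imaplib

def count_inbox_messages(user: str, password: str,
                         host: str = "imap.example.ox.ac.uk") -> int:
    with imaplib.IMAP4_SSL(host) as conn:
        conn.login(user, password)
        status, data = conn.select("INBOX", readonly=True)
        if status != "OK":
            raise RuntimeError("could not select INBOX")
        return int(data[0])              # SELECT reports the message count

# Example call (placeholder credentials):
# print(count_inbox_messages("abcd1234", "secret"))
```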

2.2.2. General Web Servers

The central web servers hold the University's top-level pages, the OUCS web pages, many departmental and college pages, and those for many individual users. In all they support about 130 domain hierarchies, and pages for about 3,800 individuals. In total this amounts to about 500,000 files, occupying some 21 GB. The number of web accesses, the ‘hit rate’, is now about 6 million per week. During this year the service was split between two servers; response time remains excellent.

2.2.3. Hierarchical File Server

The Hierarchical File Server (HFS) service started in 1995 to provide large-scale central filestore services to the University community. The HFS runs IBM's Tivoli Storage Manager (TSM) software, formerly known as ADSM. The main services provided by the HFS are (1) a desktop and departmental/college server backup service and (2) a long-term data repository for University digital assets.

Since the original procurement of the HFS computer systems, a rolling programme of upgrades of the principal hardware components (IBM p-Series computers, an IBM 3494 Automated Tape Library and 3590E tape drives) has kept system capacity ahead of demand. Two TSM servers are currently hosted on the M80, one for the server backup service and the other for the long-term archive service. The H70 runs a single TSM server and provides the desktop backup service. All services have performed satisfactorily throughout the year. The main challenge has been keeping pace with the recent substantial growth in demand, particularly in the backup service. As in the previous year, the server service is showing the fastest rate of increase; both desktop backup and archive continue to grow steadily.
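The backup services work incrementally, sending only data that has changed since the previous backup. The sketch below is not TSM itself, merely an illustration of that principle using the Python standard library.

```python
# Illustrative incremental-backup selection: list files modified since the
# time of the previous backup. Real backup software also tracks deletions,
# permissions, retention policy, etc., which this toy example ignores.
import time
from pathlib import Path

def files_needing_backup(root: str, last_backup: float) -> list:
    """Return files under `root` whose modification time postdates last_backup."""
    return [p for p in Path(root).rglob("*")
            if p.is_file() and p.stat().st_mtime > last_backup]

if __name__ == "__main__":
    one_day_ago = time.time() - 24 * 3600
    for changed in files_needing_backup(".", one_day_ago):
        print(changed)
```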

Figure 10, Desktop Storage (TB), shows almost 50% growth in the desktop backup service over the year, measured in TB of data held on the server, broken down by division. Figure 11, Desktop Storage share (%), shows the proportion of desktop backup data, over the year, by division. Figure 12, Server Storage (TB), shows almost 90% growth over the year for the departmental/college server backup service; Figure 13, Server Storage share (%), shows the divisional proportions. There has been a steady 23% growth in archive, without any significant change in the overall distribution by division.
Figure 10. Desktop Storage (TB)
Figure 11. Desktop Storage share (%)
Figure 12. Server Storage (TB)
Figure 13. Server Storage share (%)
Figures 14, Backup & Archive (TB), and 15, Backup & Archive (Millions), give an indication of overall growth for the combined archive and backup services, showing growth in data held (TB) and in the number of files (millions). They are indicative of substantial rates of growth. Figure 16, Service share by service type, gives a proportional breakdown of HFS data into the three service types. Figure 17, Service share by division, gives an end-of-year divisional breakdown of all HFS data held. Figures 18, Desktop clients by platform, and 19, Server clients by platform, give a snapshot breakdown of the desktop and server backup service clients by operating system type. These breakdowns are an indicator of overall platform take-up across the University.
Figure 14. Backup & Archive (TB)
Figure 15. Backup & Archive (Millions)
Figure 16. Service share by service type
Figure 17. Service share by division
Figure 18. Desktop clients by platform
Figure 19. Server clients by platform

The HFS web pages have been completely re-written and amalgamated into the revitalised OUCS web pages. The TSM server and client software were upgraded from Version 4.1 to 4.2.

The contents of all primary tapes residing in the automated tape library were re-written onto new double-length media during 2002; the operation began in February and was completed by the end of the summer, freeing up many tape slots in the library and helping to cope with the all-round growth in demand for the service.

The site-wide HFS service continues to offer a highly reliable and available facility with great convenience and simplicity to the end user. Sustaining the level of service without any increase in HFS manpower or hardware resources was a very significant challenge. The HFS has been awarded significant HEFCE funding in the 2003/2004 period for major upgrades to the infrastructure, including the tape and robotics systems. The performance data collected during this period will be invaluable in planning the enhancements.

2.3. Central Services

2.3.1. Central Unix Service

Planning for the closedown of the remaining central Unix server, Ermine, began in this period, and the closure will be completed before the end of 2002-3. Ermine has run reliably and required very little attention, although there has been concern about the increasing level of security threat to equipment of this type.

2.3.2. Computer Room Operations

The computer room operations staff are responsible for all operational duties associated with the substantial computing and network equipment housed in the computer room, especially the Hierarchical File Server and its robot tape library. In addition, OUCS houses various major computers belonging to other University departments, including the main OLIS library server and the parallel computer complex run by the Oxford Supercomputing Centre. The computer room equipment is protected by a large Uninterruptible Power Supply, and multiple air-conditioning systems.

The reorganisation of OUCS front-desk service provision allowed operations staff to play a much greater role in user support, and OUCS continues to move towards greater integration of all its user-facing activities.

2.4. Security Services

The danger to the University of attack on its computers and network facilities cannot be overestimated. The threat of denial-of-service attacks, aimed at the internet as a whole or specifically at our networks, has received wide publicity, as has the damage caused by the computer infections usually known as viruses but encompassing many specialist forms of attack. The defence usually adopted in the commercial world, a very restrictive firewall policy, is not available to a University whose lifeblood is the range of communications it has to support with outsiders and with its own members working from elsewhere. The firewall that OUCS runs on the external network connection therefore has to be far more permissive than is desirable for tight security. The problem is greatly exacerbated by the wide range of computers connected to the network, and the differing standards to which these computers are supported.

The security team at OUCS is responsible for monitoring the networks for indications of compromise and for taking action as soon as a compromise is detected. It also has a general pre-emptive role: trying to be aware of, and block, routes for potential attack, and educating the vast number of systems administrators and systems owners around the University about the need for security precautions on their own computers. It has proved very successful at this, and our large and complex computer network, and the computers connected to it, have suffered very little disruption. However, every new exploit that comes to light reveals computers on the Oxford network that have not kept up with security precautions.

OUCS is assisted in this task by the OxCERT security team, a group of systems experts from OUCS and other departments which provides advice and assistance on issues relating to computer security. The OxCERT team has achieved standing and recognition within the international community, and is a full member of FIRST, the Forum of Incident Response and Security Teams. During this year, OUCS's Security Manager was Vice-chair of the FIRST Steering Committee.
