3. Network-based Services

Apart from the basic functions of the network described in section 2 above, a wide variety of key infrastructural network services are also provided by OUCS. These are used by all sectors of the University, sometimes consciously, but often invisibly in the background. These services include:
  1. Domain Name System (DNS) server
  2. Email relay server
  3. Network time server
  4. Dynamic Host Configuration Protocol (DHCP) service
  5. Web cache (discussed above)
  6. Web server
  7. Email POP and IMAP servers
  8. Network News server
  9. Mailing List server
  10. File backup and archive server (Hierarchical File Store, HFS)
  11. Windows Internet Name Service (WINS) server
  12. Novell Directory Services (NDS) tree server
Most are based on PC equipment running Linux, with a few using Sun equipment running Solaris. The choice is determined by the requirements of the application software, though the PC/Linux solution is preferred (for cost and supplier-independence reasons) where feasible. All require, and exhibit, very high reliability.
In addition to the above computer-based services, other key network services provided more directly by staff include:
  1. Oxford University Computer Emergency Response Team (OxCERT)
  2. Ethernet Maintenance service

Details of all these services can be found on the OUCS Web pages at www.oucs.ox.ac.uk. Several of these services are discussed in more detail below.

3.1. Email Services

3.1.1. Email Relay

The Email Relay service handles all the University's incoming and outgoing email (with the exception of a couple of departments which still operate independently). It also handles all inter-system email within the University. It directs email to the appropriate email server, performs address checks, rewrites addresses to the standard form (where the relevant department has requested this), handles distribution of multiple-recipient email, `spools' email for non-responding recipient systems, and so on. The number of messages handled during the year approached 120,000/day on average [figure 21], with the volume of traffic amounting to over 2,000 Mbyte/day on average [figure 22]. Both the number of messages and the size of each message continue to rise inexorably [figure 23].
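The address rewriting mentioned above can be sketched as follows. This is an illustrative sketch only, not the actual relay's configuration (in practice such rewriting is expressed as rules in the mail transfer agent itself); the domain names and mapping below are hypothetical examples.

```python
# Sketch of relay-style address rewriting: mapping host-specific
# addresses to a department's standard published form.
# The domains below are hypothetical, not real OUCS configuration.

REWRITE_RULES = {
    # host-specific domain         -> standard published form
    "mailhost.physics.ox.ac.uk": "physics.ox.ac.uk",
    "server1.chem.ox.ac.uk": "chem.ox.ac.uk",
}

def rewrite(address: str) -> str:
    """Rewrite an address to the standard form, if a rule exists."""
    local, _, domain = address.partition("@")
    return f"{local}@{REWRITE_RULES.get(domain, domain)}"

print(rewrite("jsmith@mailhost.physics.ox.ac.uk"))  # jsmith@physics.ox.ac.uk
print(rewrite("jdoe@unknown.ox.ac.uk"))             # unchanged
```

A real relay applies such rules only for departments that have opted in, and on header as well as envelope addresses.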
Some email cannot be delivered immediately and must be stored until the receiving systems are ready to accept it, but the proportion delayed in this way has steadily declined over the years [figure 24]. Some indication of the sources and destinations of email messages can be gleaned from figures 25 & 26.

3.1.2. Herald Email Server

There are now 10,000 users on Herald. Performance and availability remain excellent. The IMAP/POP service had two outages: one for a scheduled reboot of the cluster nodes, to avoid a problem that occurs in that vintage of Linux kernel when 497 days' uptime is reached; the other when the memory and disk controller of one IMAP server both failed and the node needed replacing with a warm spare. There was also one outage of WING (the web-mail gateway) on a separate occasion. The amount of disk store available was increased to allow a larger allocation to staff users.
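The 497-day figure is consistent with the well-known wrap-around of the 32-bit jiffies counter in Linux kernels of that era (assuming, as was then the default, a 100 Hz timer tick); a quick check of the arithmetic:

```python
# The 497-day uptime problem in 2.x-era Linux kernels is commonly
# attributed to the 32-bit jiffies counter wrapping: at the then
# default tick rate of 100 Hz, 2**32 ticks elapse after roughly
# 497 days.
HZ = 100                       # timer interrupts per second
SECONDS_PER_DAY = 86_400

wrap_days = 2**32 / (HZ * SECONDS_PER_DAY)
print(f"{wrap_days:.1f} days")   # ~497.1 days
```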

3.1.3. Mailing List Service

The Mailing List service is provided by a dedicated PC running Linux. It uses the public domain Majordomo software and manages over 700 lists. Support and encouragement for the national Mailbase service (which in total handles 2,300 mailing lists) is also provided.

3.2. News Service

The news service is provided by the server news.ox.ac.uk. Two minor updates of the server software were completed during the year. There was no unscheduled downtime.
The news feed is received from the JANET Usenet News service (http://www.ja.net/usenet/). The primary server is at Rutherford and the secondary one at ULCC. This service continues to be extremely reliable. The amount of news received by news.ox.ac.uk has stayed relatively constant, with an average of 168,000 messages being received per day [figures 27 & 28]; this is approximately half a gigabyte in volume. The average number of client connections to the server was 7,829 per day, but peaked at over 11,500 per day during term [figure 29]. At the end of July 2000 the server was carrying about 5,200 newsgroups in 50 different hierarchies.

news.ox.ac.uk continues to provide a news feed to other sites, with NAG and OUP being the main recipients.

Figure 1. News Activity by Month

3.3. Web Server

By July 2000, the central web server was acting as Web server for 108 domains, mostly colleges and departments within the University, with addresses of the form `www.department.ox.ac.uk'. Weekly hits rose to a fairly stable average of 3 million, peaking at 5.6 million in one week [figure 32]. It holds the home pages of 1,800 users, and 325,000 web pages in all, totalling 10 Gbytes. Of that, 14,000 pages totalling 300 Mbytes make up the OUCS Web hierarchy [figure 31].

This system has been up continuously for 462 days (since the last hardware upgrade) with no outages, scheduled or unscheduled.

3.4. Backup and Archive File Server

The Hierarchical File Server (HFS) project was started in 1995 to provide large-scale filestore services to the University community. The HFS runs the ADSM software (recently renamed Tivoli Storage Manager) from Tivoli, a wholly owned subsidiary of IBM. The two main services provided by the HFS are (1) a site-wide backup service [figures 33 & 34] and (2) a long-term data repository service for University assets.
Since the original procurement of the HFS computer systems, we have instituted a rolling programme of upgrades to the principal hardware components (IBM RS/6000 computers, an IBM 3494 Automated Tape Library (ATL) and 3590E tape drives), which has kept the system capacity ahead of demand. The RS6000/H70 installed in July 1999 (reported on last time) to take over the entire backup service, for desktop systems as well as departmental and college servers, has performed very well throughout the year.
The improved performance of the H70, coupled with the gradual introduction of the Gigabit Ethernet backbone earlier this year, has seen the backup load grow significantly. Whilst not saturated in performance terms, the ADSM database on the H70 was seen to grow at a rate which could not be sustained on a single server in the medium term. The database records information about every file backed up: more than 120 million files were recorded by summer 2000.
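ADSM's per-file records are what make its incremental backups possible: each run transfers only files that are new or changed since the last run, at the cost of a database entry for every file ever backed up. A minimal sketch of the idea (this illustrates the principle only; it is not ADSM's actual database schema):

```python
# Minimal sketch of an incremental-backup catalogue: one record per
# file, so each run sends only new or changed files. This is why the
# catalogue grows with every file ever backed up; it is NOT ADSM's
# actual schema, just an illustration of the principle.
from typing import NamedTuple

class Record(NamedTuple):
    size: int
    mtime: float

catalogue: dict[str, Record] = {}   # path -> last backed-up state

def files_to_send(current: dict[str, Record]) -> list[str]:
    """Return paths that are new or have changed since the last run."""
    changed = [p for p, rec in current.items()
               if catalogue.get(p) != rec]
    catalogue.update(current)       # remember what was backed up
    return changed

run1 = {"/etc/passwd": Record(1200, 1.0), "/home/a": Record(50, 2.0)}
print(files_to_send(run1))          # first run: every file is new
run2 = {"/etc/passwd": Record(1200, 1.0), "/home/a": Record(60, 3.0)}
print(files_to_send(run2))          # only the changed file is sent
```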

Analysis of the workload showed that the `desktop' service and the `departmental/college server' service were approximately equal, so in July 2000 an RS6000/M80 was installed to take over the `server' workload, with the `desktop' service remaining on the H70. The intention was also for the M80 to take over running the ADSM Archive/HSM server, transferring it from one of the original R40s [figures 35 & 36].

The 3590E tape drives, installed in the early summer of 1999, have settled into service after some teething problems with the 3590E microcode. They have performed very well, are faster than the previous generation of technology, and have double the capacity of the 3590s they replaced (20 vs 10 GB each). Over the winter, all 3590 media were rewritten to the 3590E double-density specification; this involved processing around 6,000 tapes, which hold multiple copies of all files held on the ADSM servers. The operation freed up many slots in the ATL, giving capacity for service growth. All the original tapes were reused; the small percentage which failed the rewrite at the higher specification (around 150 tapes) were replaced. No data were lost.
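The capacity gain from the rewrite follows from simple arithmetic, using the figures quoted above (roughly 6,000 tapes, 10 GB per tape before and 20 GB after):

```python
# Rough capacity arithmetic for the 3590 -> 3590E media rewrite,
# using the approximate figures quoted in the text.
TAPES = 6000
OLD_GB, NEW_GB = 10, 20

before = TAPES * OLD_GB / 1000   # total library capacity in TB before
after = TAPES * NEW_GB / 1000    # after rewriting at double density
print(f"{before:.0f} TB -> {after:.0f} TB")
```

Equivalently, the same data now occupies roughly half as many tapes, which is what freed the ATL slots for growth.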

The site-wide backup service continues to offer a highly reliable and available facility with great convenience and simplicity to the end user.

The archival repository holdings grew steadily in terms of new projects and data stored, with the major holding continuing to be the Celtic and Mediaeval Manuscripts pilot. This five-year project, which came to a close this summer, had archived 4.5 TB of data comprising more than 140,000 valuable images.

Oxford hosted an ADSM Symposium in September 1999 which attracted about 170 attendees, mostly from Europe but including users from USA and Australia as well as ADSM developers.
