[CC(88)17][CCC(88)40] (Ref Nos CHE/24, CHE/24/3)

COMPUTING COUNCIL

CENTRAL COMPUTING COMMITTEE

Annual Report of the Computing Service, 29 June 1987 - 24 July 1988


Introduction
Service and Software Changes
User Liaison
Computing in the Arts
Performance and Reliability
Appendix A. Staff in post - July 1988
Appendix B. Current User Guides

Table 1. Machine Usage
Table 2. Reliability and MTBF
Table 3. ULCC Usage
Table 4. UMRCC Usage


Introduction

Last year's report ended with the formal procurement process successfully concluded, the Convex C1-XP2 system installed and offering a trial service, and the new VAX cluster shortly to be delivered. The principal topic of this year's report is the commissioning and development of the new services.

The VAX cluster was delivered on schedule during the last few days of June 1987. After a short delay while essential electrical work was completed, the hardware was commissioned by Digital engineers during July, and a VMS system to provide the required user image was developed by Digital and our own staff working in collaboration. As anticipated, this took a considerable amount of work since the cluster was one of the largest commissioned by Digital up to that time.

Cluster interfaces and Ethernet connections had earlier been installed in the two existing VAX 11/785s, VAX1 and VAX2, to allow them to be incorporated into the cluster when the latter was installed. VAX3, the smaller 11/780, remains in service attached to the cluster via Ethernet, but not a full member of the cluster.

The system was handed over to the University on 11th August, and a further two weeks' work was carried out by Service staff testing local and third-party software on the cluster. On 24th August the VAX1 usernames and their associated files were transferred across to the cluster filestore and given access to the cluster service; the process took a working day. The transfer of VAX2 users followed three days later. The cluster service was also then made available to users of the ICL 2988: they were able to transfer their own ICL 2988 files directly to the cluster filestore, using the existing links from the 2988 to VAX1 and VAX2, and to begin converting their work immediately.

There were, inevitably, a few minor problems in transferring the users and their workload, and a few teething troubles with the cluster service itself. On the whole, however, the transition was quite smooth and the benefit of continuity of the VAX/VMS environment has been generally welcomed.

The frequency of hardware and software incidents in the early months of the cluster service was perhaps greater than would be expected of proven hardware from a major manufacturer; more is said on this below. It has nevertheless been noteworthy that the resilience of an appropriately configured cluster is a real advantage in minimising the loss of service to users. Even when one (or even two) of the four 8000-series processors has been out of action, sometimes for longer than we would wish, the service to users has continued to run, albeit with a reduction in the processor capacity available.

The cluster service was thus in operation and reasonably stable for the start of the academic year. During Michaelmas Term the Convex was connected to the Service Ethernet, and commissioning of new graphical output devices was put in hand. Arguably the most important development during this period was the work done to assist the bulk of the ICL 2988 users to transfer to the cluster. Software was developed to allow recovery of files from the 2988 file archive directly onto the VAX cluster. Users were encouraged to move their files to the cluster in good time for the closedown of the 2988, and many responded to this advice. During the last two weeks of the ICL service, however, a major file copying exercise took place in which all character files remaining in the 2988 filestore were transferred to the VAX cluster, and archived there using the VAX file archive system.

The ICL 2988 was finally switched off on 18th December, not without a few regrets at the severing of what has been a long association with ICL and several generations of their equipment.

Since Christmas the picture has been one of consolidation. The ICL equipment has been removed, and it is once again possible to move freely around the machine room. Plans are in hand to reorganise parts of the user area, to provide more and better facilities to users and an improved working environment.

A new arrival in the machine room, additional to the Computing Service systems, is an IBM 9370 Model 90 configuration which is to be the platform for the next phase of the University's library automation programme, using IBM's DOBIS/LIBIS software. Although most aspects of the system management remain the responsibility of the Libraries Board, the Service is accommodating the hardware and providing communications access both via the terminal network and, in due course, via Ethernet to the VAX cluster and thence to JANET.

Looking ahead, a Sun 4/110 and three Sun 3/60 systems have been ordered to provide a general-purpose Unix service and some advanced graphics workstations for the User Area.

Finally, it is fitting to conclude this introduction by welcoming two new University working parties set up by the Computing Council. The first of these was to examine the General Board's staffing and financial baselines for the Computing Service, in the light of the new services which have been successfully implemented and the ever-increasing demands from users for new facilities, especially in support for microcomputers and other aspects of distributed computing. Its remit was subsequently extended to include a wider review of all academic computing, including distributed computing in departments and other institutions.

The second working party is known as the Information Technology Working Group, and its brief is to advise the Computing Council on the need for developments in information services (including administrative and library services), computing services and teaching media. Its membership reflects the breadth of these terms of reference. The group has a demanding task, but one which is extremely important in helping the University to make progress through the present uncertain and difficult times.


Service and Software Changes

VAX

Two main aims were pursued when the service on the VAX cluster was set up. These were firstly that as far as possible a standard VMS environment should be offered to users, and secondly that the cluster should present the image of a single mainframe, with the user needing to know little or nothing about the individual cluster machines.

All applications software is licensed on all four main machines (the 8800s VAXA and VAXB, and the 8700s VAXC and VAXD). The terminal network is set up to allocate a user logging in to the machine with the smallest number of interactive sessions at the time. Generic batch queues, preserving the names familiar to users of the previous VAXs, share the batch jobs between execution queues on the four machines.
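
The login allocation rule is simple enough to sketch. The following Python fragment is purely illustrative (it is not the Service's software, and the node names and session counts are invented); it merely shows the "fewest interactive sessions" choice described above.

```python
# Illustrative sketch only: choosing a login node by the "fewest interactive
# sessions" rule described above. Node names and counts are invented.

def pick_login_node(sessions):
    """Return the name of the node with the fewest interactive sessions."""
    return min(sessions, key=sessions.get)

if __name__ == "__main__":
    # Hypothetical snapshot of the four main cluster machines.
    sessions = {"VAXA": 41, "VAXB": 37, "VAXC": 44, "VAXD": 38}
    print(pick_login_node(sessions))  # -> VAXB
```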

Various problems were found with the cluster software which made the single image less than perfect. Some, such as the lack of a common cluster accounting file, could be hidden from the users by locally written procedures. The main deficiencies visible to users were the lack of cluster-wide commands for system status information, the number of queues which had to be scanned to locate a job, and the fact that outgoing file transfers were machine-specific. The last problem means that a user queuing a file transfer (including mail transfers) must discover and remember the machine from which the transfer originated, and then log in again to that same machine, if the status of the file transfer is to be checked at a later time. Apart from this the networking functions have been made transparent, and the cluster presents a single image to JANET.

These relatively minor difficulties should not be allowed to obscure the success of the transfer to the new service. For each of the two older service VAXs, the transfer of usernames and filestore was accomplished within a single day. Most users, logging in after this day's interval, will have been able to work exactly as before, noticing only that the machine was faster. ICL 2988 users were allowed three months of parallel running in which to transfer their work, transferring files at will over a high-speed link to the cluster, before the ICL service was finally withdrawn. The 2988 archive database was installed on the cluster, and commands provided to retrieve files archived on the 2988 into the cluster filestore. Finally, all character files remaining on the 2988 at the end of parallel running were archived on the cluster in VAX format for later retrieval if desired.

The withdrawal of the 2988 brought the end of the HASP service to the National Centres. By that time a reliable JTMP service had been developed on the cluster, and this is now the only route for job transfer to remote sites. JTMP access to ULCC and the Rutherford Laboratory has been generally satisfactory. The MVS/Roscoe JTMP software at UMRCC caused numerous problems to the relatively small number of UMRCC users; these difficulties appear to have been overcome by changing to a different version of JTMP running under VM/CMS at UMRCC.

It was possible to install on the cluster many of the packages formerly offered only on the ICL 2988 - as well, of course, as the packages already forming part of the previous VAX service. New software mounted falls principally into three categories: database software, tape handling, and graphics.

Database provision was effected by mounting INGRES, one of the leading relational packages, and BASIS, a text retrieval package. The MEDIA tape management system was introduced to provide tape security and control, functions which in VMS are primitive or non-existent, and the ARCHIVE 2000 package to provide a user-driven archive facility somewhat similar in function to the 2988 archive.

In the case of graphics, GHOST-80 has been retained only on one of the 11/785s (now renamed VAXG), for which it was already licensed. To replace it, UNIRAS and SIMPLEPLOT have been mounted on all cluster processors. Device drivers for all the new output devices (Calcomp 1044, Calcomp 5825 and Hewlett Packard 7550) have been written and tested. All package graphics have been converted to use libraries other than GHOST-80, and it is likely that GHOST-80 (and GINO, which is little used) will be withdrawn next year.

Convex

The Convex service has made slow progress since its introduction; expectations of major gains in utility have several times been frustrated. It had been hoped to set up a VMS-like environment using the COVUE shell, and provide an easy route for job submission from the VAX cluster using COVUENet. In the event COVUE has been found to be seriously flawed, so much so that the current version is almost unusable. Major errors found and reported when the software was installed at Christmas are still outstanding in July, and Convex offer no hope of improvement for another six months or so. The decision has therefore been taken to concentrate documentation and user support on the native Unix interface.

COVUENet, when delivered, was found to lack much of the functionality needed in our environment, in particular transparent file access to the cluster filestore from Convex jobs. A partial solution has been found by mounting the NFS file access software on the cluster.

The Convex is nevertheless providing a valuable service to its user population, most of whom are using the Unix interface. The continued uncertainty over the COVUE products, which has made difficult both the choice of user interface and the planning of a stable service, is for this reason all the more unfortunate.

Microcomputer Support

Demand from users continues to exceed by a wide margin the support which the Service is able to provide. Resources were particularly strained during periods when one of the two full-time support posts was vacant. Happily the section was restored to full strength in April and has remained so since.

Although so far the constraints on the Service budget have prevented the appointment of more full-time microcomputer support staff, a significant part of the time of the User Services Group is now directed to supporting applications packages on micros and to teaching courses in their use. A fuller description of this activity is given in a later section.


User Liaison

Allocations

The allocation and budgeting scheme was implemented on the VAX cluster with the incorporation of features drawn from both the former VAX scheme and the ICL 2988 service. New software written by the Service now allows more flexible use of VAX filestore, in that group managers can overallocate within their overall group budget, and individual users can also run overdrafts on their normal allocation for limited periods. Processor time and filestore on the Convex have, for the time being, been allocated by Service staff.
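
The effect of the scheme can be sketched as follows. This minimal Python fragment is illustrative only (the field names, units and figures are assumptions, not the Service's accounting software); it shows a charge being allowed when the group as a whole stays within its budget and the user stays within his or her allocation plus any temporary overdraft.

```python
# Minimal sketch (not the Service's accounting software): a charge is allowed
# if the group stays within its overall budget and the user stays within his
# or her personal allocation plus any temporary overdraft. Figures invented.

from dataclasses import dataclass
from typing import List

@dataclass
class User:
    allocation: float       # normal personal allocation (CPU units)
    used: float              # units consumed so far
    overdraft: float = 0.0   # temporary overdraft permitted for a limited period

@dataclass
class Group:
    budget: float            # overall group budget (CPU units)
    users: List[User]

def may_charge(group: Group, user: User, cost: float) -> bool:
    group_used = sum(u.used for u in group.users)
    within_group = group_used + cost <= group.budget
    within_user = user.used + cost <= user.allocation + user.overdraft
    return within_group and within_user

if __name__ == "__main__":
    u1 = User(allocation=100.0, used=95.0, overdraft=20.0)
    u2 = User(allocation=50.0, used=10.0)
    grp = Group(budget=160.0, users=[u1, u2])
    print(may_charge(grp, u1, 10.0))  # True: overdraft covers it, group within budget
    print(may_charge(grp, u1, 60.0))  # False: would exceed the group budget
```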

Queen Elizabeth House, Pathology and the Computing Teaching Centre have been added to the list of allocation groups.

Details of all projects, users and usage statistics are now maintained in an INGRES database. This facilitates checking the status of all projects, and greatly improves the efficiency of administering changes in registration of users. Group managers are provided with access to the records of their own groups. In the future the database may be used to generate additional information, such as a username directory, which could be made available to all users.
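
As an illustration of the kind of listing such a registration database makes straightforward, the fragment below builds a tiny username directory grouped by allocation group. The schema, the sample records and the use of SQLite as a stand-in are all assumptions made for the example; the Service's records are actually held in INGRES.

```python
# Illustrative only: a toy registration table and a "username directory" query.
# SQLite stands in for INGRES; the schema and data are invented.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE registrations (username TEXT, full_name TEXT, grp TEXT)")
conn.executemany(
    "INSERT INTO registrations VALUES (?, ?, ?)",
    [("chem01", "A N Other", "Chemistry"),
     ("path02", "J Bloggs", "Pathology"),
     ("qeh03",  "M Example", "Queen Elizabeth House")])

# Directory listing, ordered by group and then by name.
for grp, username, name in conn.execute(
        "SELECT grp, username, full_name FROM registrations "
        "ORDER BY grp, full_name"):
    print(f"{grp:<24} {username:<10} {name}")
```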

Courses and Seminars

The year has seen a major increase both in the number of different courses and in their frequency. Introductory courses are now run on a regular basis for microcomputer users as well as mainframe users. The staple mainframe topic is Getting Started on VAX/VMS, which is normally teamed with a Getting Started course on one of the two most commonly-used editors, EDT and ECCE.

These mainframe courses have been in existence for some time. The microcomputer introductory courses, on the other hand, have only settled into their present form this year. They are:

All other courses are classified as "Special Topics" and include more advanced VMS facilities, statistical packages, databases, graphics, text processing and file and mail transfer over JANET from the VAX. Special topics for microcomputer users include directories and batch files in MS-DOS, six special modules on WordPerfect, and an overview of statistical packages on microcomputers.

With the planned departure of the ICL 2988 and the installation of the VAX cluster, a conversion course for 2988 users was run regularly from August to December.

A booklet describing the different courses we offer was published in November and distributed to all departments and colleges.

During the year the small lecture room has been used to capacity and some courses have been fully booked up to a month in advance. It is clear that the new courses are addressing a real need. A second lecture room is being converted and equipped, which will allow larger classes to be run.

In all, almost 400 course modules were given during the year, to over 3000 participants. It is an interesting reflection on the changing role of the Service that almost 1800 participants attended microcomputer courses, compared with about 1300 on the mainframe courses.

Documentation

With the change to new services the opportunity was taken to reorganise the classification of user guides. A list of those current is given in Appendix B. It is noteworthy that even after discontinuing the ICL VME service there are still 55 user guides in use.

National Centre Computing

The peer review scheme for the national supercomputer services came into effect on 1st August 1987. Oxford users secured peer-reviewed time (the so-called Class 1 projects) at the three centres as follows:

ULCC     Cray 1-S        1 project     40 hours
UMRCC    Cyber 205       2 projects    250 hours
RAL      Cray X-MP/48    9 projects    9600 hours

(The large figure on the Rutherford Appleton Laboratory Cray X-MP includes 8400 hours allocated to a large collaborative Atmospheric Physics project, in which Oxford is one of the institutions taking part.)

For smaller projects (Class 3), the University was awarded 100 hours of Cray 1 time at ULCC and 30 hours of Cyber 205 time at UMRCC. This time has been divided between some 25 projects. In addition, the Amdahl 5890 at ULCC is used to a minor extent for scalar work. Detailed figures for Class 3 usage are given in Tables 3 and 4.


Computing in the Arts

Lasercomp

In the last report the question of replacing the Mark 1 Lasercomp was noted. The issues were again discussed with users, both at the autumn meeting of external representatives and at a special meeting of Oxford users. A proposal was made by a user that the Service should try to secure a second-hand (but more up-to-date) Lasercomp. This would provide continuity of Monotype software and fonts for the time being, without preempting too large a share of the funding which the Computer Board might devote to national typesetting services. A breathing space could thus be obtained while possible alternative output standards, such as PostScript, were evaluated.

Monotype were in fact able to locate a suitable machine, and the Computer Board agreed to fund the purchase of a second-hand Lasercomp Mark 2 70i system at a considerable saving on the new price. The new Lasercomp was delivered in December. It was hoped to offer pagination software on the new system, in addition to continuity of the Lasercheck user interface with which all current users are familiar. This software system has proved unexpectedly difficult to establish, and at the time of writing the new Lasercomp is not yet in full production service. Fortunately the Mark 1 system continues to give good service and to be heavily used.

It is pleasing to report that the external advisor's post was filled in September 1987, after a period during which the typesetting service had been rather short-handed.

Kurzweil Data Entry Machine

The service continues to run satisfactorily. After a visit to Kurzweil in July 1988 it was learnt that the company are discontinuing the manufacture and sale of the larger intelligent scanners, in order to concentrate on the desktop scanner market. The desktop machines are however not trainable in the same way, and appear intrinsically less suitable for multifont work and unusual scripts. The current model Kurzweil 4000 systems are being offered to customers on special terms, and consideration is being given to making an application to the Computer Board to purchase one of these.

Oxford Concordance Program

The notable event this year has been the release both of Version 2 of the mainframe software and of Micro-OCP. Publication and distribution of the latter are being handled by Oxford University Press. Take-up of the microcomputer version has been encouraging, but it is perhaps surprising that only some 70 sites have upgraded from Version 1 to Version 2 of the mainframe software in the first few months of its availability. This may possibly be a consequence of the existence of the micro version; many research workers in the humanities now doubtless prefer to carry out their work on microcomputers provided that the necessary software tools can be obtained.

Oxford Text Archive

During the year 544 texts were issued, and 63 new texts were deposited. The recent acquisitions include new versions of Collins' English Dictionary (in the form of a set of Prolog rules) and of the CATSS Septuagint database; the novels of Jane Austen edited by J. Burrows; the Edinburgh Associative Thesaurus; and a number of Tudor and late Mediaeval texts.

The British Library-funded Text Archive Assistant took up her post in January and embarked on an extensive survey of current and former archive users, as well as carrying out significant research into the legal and financial problems involved in running the Archive. An interim report on her activities was presented at the recent International ALLC Conference in Israel.

Standards for the description of machine readable texts were discussed extensively in collaboration with the ESRC Data Archive and a Dutch research group.

Computers in Teaching Initiative

The University was successful in securing funding under this initiative for a project involving the Service together with four arts faculties. The purpose is to develop software for searching and analysing literary texts which form part of the undergraduate syllabuses in Literae Humaniores (Classics), English, Modern Languages and Oriental Studies. The staff came into post in September 1987, and the project has made satisfactory progress all through the year. A microcomputer-based user interface has been developed to allow students to access OCP and BASIS on the VAX cluster in a user-friendly manner, and a substantial number of the required texts have been set up for analysis using OCP, BASIS or both. The system uses the Duke Toolkit to display non-standard alphabets on the screen, and has been successfully tested with classical Greek.

Pilot courses were run during Trinity Term for Modern Languages (Italian) and Literae Humaniores. It is hoped to use the software more extensively during the coming year and to evaluate its effectiveness in a range of courses.

General

The courses on literary and linguistic computing (primarily for Modern Languages graduate students), and the computer in text analysis, were given once each. Staff have as usual given lectures and seminars elsewhere, and participated in a number of international activities. One deserving of mention is an international initiative to recommend standards for the encoding of computer-readable literary texts.


Performance and Reliability

VAX

The new VAX machines passed their acceptance tests with few problems, apart from some system hang-ups caused by hardware and software faults associated with the PRO380 operating console systems. The problem was circumvented by not using these devices, and operating the machines from directly connected terminals instead. Various problems with the PRO380 consoles have recurred throughout the year, and it is by no means certain that all have been cleared.

Performance during the latter half of 1987 was generally good, although marred by memory and processor faults on VAXB and VAXD. March and April of 1988 saw a succession of faults on VAXA which led to Digital changing a number of boards, some several times. The problem was eventually traced to a backplane fault.

In general the resilience of the cluster configuration enabled the user service to continue through these faults, although on some occasions failing processors appeared to cause others to crash. Concern was expressed to Digital about the lack of consideration shown for the continuing cluster service by engineers attending a fault which affected only one processor.

Various cases of slow running were investigated; some appeared to be alleviated by tuning. A more intractable problem lay in the handling of the queue file. The system is configured with some 40 generic and specific batch queues (to allow different sizes of jobs on the five cluster processors) and about 25 device queues to handle the various printers, plotters, the typesetter etc. A very large number (hundreds) of entries can accumulate, especially of batch jobs to be run overnight and of output waiting to be spooled after a weekend. This large-scale use of the queue file seemed to be far beyond Digital's worst expectations, and inefficiencies in the queue handling software meant that at busy times the response to commands to queue a file or check the status of a queue entry would stretch to several minutes. Means of avoidance suggested by Digital seemed to have little applicability to a large system. After considerable pressure Digital eventually found a software patch which has given some relief; a full solution is promised in the next release of VMS.

Convex

The Convex system has been extremely reliable throughout this period, although there were some disc problems in October and November.

Environment

There have been a number of fairly minor air-conditioning faults, some of which, especially at weekends, have caused a noticeable amount of downtime.

Usage

Usage of the VAX Cluster built up rapidly, and the main machines were running at full capacity for most of Hilary and Trinity terms. It should be noted however that in general around 90% of the CPU power consumed (though not of the number of jobs processed) is by batch work, and half of this is "free" work; i.e. jobs of low priority run outside the allocation scheme.

The Convex system is used to between 60% and 90% of capacity, although most of this is work for a few users with particularly large-scale requirements.


Appendix A. Staff in post on 1 June 1988


                                M Murphy
Director                        A G Robiette
Deputy Director                 C E Phelps
Director's Secretary            A R Beer
 
Development Group
Group Manager                   A R Gay
Software Section Manager        J R Douglas
Hardware Section Manager        G W Litchfield
Group Secretary                 K S W Tomlinson
Programming Staff               M D Austen
                                J A Burnell-Higgs
                                C Curran
                                K A Lewis
                                M K Malik
                                D A Miles
                                D W Rischmiller
                                J T Thomason
                                R F Treweek
                                S E Treweek
Network Controller              G B Lescott
Technical Staff                 A D Kew
                                A G Marlow

User Services Group
Group Manager                   L Hayes
User Liaison Manager            C Bateman
Computing in the Arts           S M Hockey
Group Secretary                 R L Turner
Programming Staff               L D Burnard
                                E M Crutch
                                G Edwards
                                C M Griffin
                                P Griffiths
                                R L Hutchings
                                J M R Martin
                                L C Munro
                                W Phillips
                                R L D Rees
                                D J Rossiter
                                P S Salotti
                                E W Taylor
KDEM Service                    G E Cooper
                                A E Holl
                                G A Jackson
                                A B Sabin
Library Assistant               M E Franks
Budgets Office                  C Windridge
Receptionists                   C M Dale
                                M J Smith
Technical Typist                E C Hussey

Operations Group
Operations Manager              R F Hufton
Microsystem Support             S E Evans
                                A E Lawrence
                                P G Higginbotham
                                N Rudgewick-Brown
Assistant Operations Manager    R I Saxton
Shift Supervisors               D C Hastings
                                B H Martin
Operations Staff                M E J Brighton
                                S Dass
                                G A Fegan
                                A R Knight
                                A F Martin
                                R D McKechnie
                                P Parker
                                A M Rumble
                                D R Saxton
                                K Snidvongs
Operations Assistant            N Smith

Administration
Administration Officer          D Clarke
Administrative Assistant        H J McNab
                                L A Mills
Print Unit                      A C Hunter
                                D Peters
General and Cleaning Staff      J Bunce
                                D W F Cantell
                                J Mann
                                J M Towner
                                W J Towner


Appendix B. Current User Guides

a1.1/1 Introduction to the Computing Service

a2.1/0 Glossary of VAX/VMS Terms (A4.2/3)

a3.1/1 Budgeting, Accounting and Scheduling on VAX/VMS

b1.1/1 Introduction to MS-DOS on the Zenith Microcomputer

b1.2/1 Further Use of MS-DOS on IBM-Compatible Microcomputers

b2.1/1 Kermit on IBM-compatible Microcomputers

b2.2/0 Kermit on BBC Microcomputers (M5.11/2)

b2.4/0 Kermit on CP/M Machines (M6.4/3)

b3.1/1 Flexible-Disk Conversion Service

b3.1/1a Flexible-Disk Conversion Service (amendment)

b3.2/1 PDP-11 Magnetic Media Conversion Service

b3.3/1 Accessing Public-Domain Software from the VAX

b4.1/1 Getting Started with WordPerfect

b4.2/1 Further Use of WordPerfect

c1.1/0 Using the Convex (draft)

d1.1/1 Getting Started on VAX/VMS

d1.2/1 Further Use of VAX/VMS

e1.1/1 Getting Started with the Editor EDT

e2.1/0 Getting Started with the Editor ECCE (G1.2/3)

e2.2/0 Creating & Editing Files, using ECCE (G1.1/2)

f1.2/0 SIR: Scientific Information Retrieval (V3.3/1)

f2.1/1 BMDP Statistical Software

f2.2/1 Minitab: An Interactive Statistics Package

f2.3/0 PSTAT: Princeton Statistical Program (V3.4/1)

f2.4/1 SAS: Statistical Analysis System

f3.1/1 CLUSTAN: Cluster Analysis Package

f3.2/1 GENSTAT: General Statistical Program

f3.3/1 GLIM: Generalised Linear Interactive Modelling

f3.4/1 MDS(X): Multidimensional Scaling Programs

f3.5/1 TSP: Time Series Processor

g1.2/0 Dacoll M249 Monochrome Graphics Terminals (M5.5/3)

g1.3/0 Datatype X5A Colour Graphics Terminals (M5.10/3)

g1.4/0 Hewlett Packard 7475 Plotter (M5.13/2)

g1.5/0 Calcomp 1012 Plotter (M5.6/3)

g4.1/1 ASPEX: Automated Surface Perspectives

g4.2/1 GIMMS: General-Purpose Geographic Processing System

g4.3/1 SYMAP: Lineprinter Mapping

h1.1/0 NAG Library (G5.1/2)

h2.1/1 FACSIMILE for Differential Equations

h2.2/0 MACSYMA: An Algebraic Manipulation Package (M2.5/2)

h2.3/0 Maple (M2.10/1)

h2.5/0 REDUCE (M2.6/1)

h2.6/0 SIMAN (M2.7/1)

h3.2/1 OCP: Oxford Concordance Program

h4.1/1 LASERCHECK at OUCS

i1.1/1 VAX Cluster File Archive

i1.2/0 (2988) File Archive (Draft)

i2.1/0 Digital VT220 Terminals (M5.14/1)

i2.2/0 Trident TT100 Brown Terminals (M5.2/4)

i3.2/0 EPS 2700 Laser Printer (G2.5/1)

i4.1/0 Magnetic Tape on the VAX (G2.4/1)

i4.1/0a Facilities for Reading non-VAX Files on the VAX (M4.5/1)

i4.2/0 Standards for Magnetic Tape Transfer (M4.2/1)

r1.1/1 VAX Network Commands

r2.1/0 Using the PAD (F3.1/2)


Table 1. Machine Usage

VAX
4 Weeks Ending   Interactive Jobs   Interactive CPU Hours   Batch Jobs   Batch CPU Hours
26/07/87 26165 319.81 6397 851.45
23/08/87 23301 275.46 6665 707.29
20/09/87 Changeover to VAX Cluster - No figures available
18/10/87 39518 225.26 11366 1333.54
15/11/87 42520 242.65 11932 1882.97
13/12/87 50559 351.08 17515 2980.64
10/01/88 26493 148.57 13712 3161.34
07/02/88 49495 323.26 28772 2957.81
06/03/88 48344 324.10 38235 2962.68
03/04/88 48880 328.52 32070 3378.84
01/05/88 47397 312.83 30390 3725.75
29/05/88 56822 400.88 34101 3517.06
26/06/88 55345 359.84 29329 3612.14
24/07/88 48590 332.97 43841 3451.28


Table 2. Reliability and Mean Time Between Failures

                   VAX1                  VAX2                  VAX3
4-Weeks Ending     Rel (%)   MTBF (hrs)  Rel (%)   MTBF (hrs)  Rel (%)   MTBF (hrs)
26/07/87            94.41      336.00     95.09      224.00     92.86      134.40
23/08/87            97.21       67.20     95.61       61.09     98.21      134.40

                   VAXA                  VAXB                  VAXC                  VAXD
4-Weeks Ending     Rel (%)   MTBF (hrs)  Rel (%)   MTBF (hrs)  Rel (%)   MTBF (hrs)  Rel (%)   MTBF (hrs)
18/10/87            99.40      134.40     98.37       84.00     99.45      112.00     80.42       61.09
15/11/87            99.78      672.00     99.29      134.40     99.46      224.00     91.62      336.00
13/12/87            98.21      336.00     98.27      672.00     98.27      672.00     97.96      336.00
10/01/88            95.75      336.00     95.80      672.00     95.80      672.00     95.61      168.00
07/02/88            99.36      224.00     99.90      672.00     99.77      672.00     99.79      672.00
06/03/88            94.50      134.40     93.54       84.00     92.88      112.00     94.49      134.40
03/04/88            76.61       48.00     98.96      168.00     99.78      224.00     99.91      336.00
01/05/88            89.40      112.00     99.18      134.40     99.27      224.00     99.20      224.00
29/05/88            96.32      672.00     96.51      672.00     95.96      336.00     95.51      672.00
26/06/88            99.56      336.00     99.83      672.00     98.04      112.00    100.00         -
24/07/88           100.00         -      100.00         -       99.24      336.00    100.00         -
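
Note: the MTBF figures appear to be derived by dividing the 672 hours in each four-week period by the number of recorded failures, so that, for example, two failures give 672/2 = 336.00 hours and five give 672/5 = 134.40 hours; where a machine ran through the whole period without a recorded failure no MTBF is quoted.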


Table 3. ULCC Usage

4-Weeks Amdahl Cray-1S (COS1M) Cray-1S (COS2M)
Ending Jobs %Jobs Units %Units Jobs %Jobs Units %Units Jobs %Jobs Units %Units
26/07/87 1554 2.2 3673 0.6 418 3.4 17943 4.0 264 3.4 51320 7.8
23/08/87 1338 1.6 2040 0.2 163 1.6 16283 3.6 194 2.2 27227 3.9
20/09/87 1004 1.4 2684 0.3 464 3.9 4422 1.0 159 1.6 31769 4.3
18/10/87 627 1.1 1369 0.3 439 3.4 11929 3.3 162 1.8 16574 2.6
15/11/87 382 0.8 1323 0.3 956 7.0 32595 8.1 52 0.5 1289 0.2
13/12/87 1163 1.6 2417 0.3 1116 9.7 37712 10.2 186 1.6 21603 3.6
10/01/88 561 1.2 577 0.2 959 14.3 106856 27.4 52 0.7 15554 2.2
07/02/88 1038 1.3 1071 0.3 1007 7.8 54168 12.8 637 5.6 22642 3.7
06/03/88 1274 1.4 951 0.2 949 6.8 34820 8.2 377 2.8 29401 4.5
03/04/88 923 1.2 1216 0.4 950 7.3 52106 12.0 453 4.2 50988 7.6
01/05/88 1077 1.6 2018 0.7 1186 10.3 76044 18.7 321 3.3 24652 3.9
29/05/88 1175 1.6 1110 0.3 2443 14.8 101307 26.6 278 2.3 22005 3.5
26/06/88 743 1.1 1100 0.2 1103 9.1 36562 9.6 151 1.6 16600 2.3
24/07/88 585 0.8 1289 0.3 1371 11.0 47517 11.8 88 0.8 7444 1.0


Table 4. UMRCC Usage

4-Weeks CDC 7600 Cyber 176 Cyber 205
Ending Jobs %Jobs Units %Units Jobs %Jobs Units %Units Jobs %Jobs Units %Units
27/07/87 - - - - 309 7.2 104700 13.9 205 4.4 108912 7.2
24/08/87 - - - - 21 0.2 3467 0.5 21 1.2 21343 2.1
21/09/87 - - - - 32 0.2 14934 1.8 25 1.2 20265 1.3
19/10/87 - - - - 134 0.9 28476 3.7 33 1.0 5571 0.4
16/11/87 - - - - 110 0.8 16458 2.4 34 1.0 8786 0.5
14/12/87 - - - - 48 0.4 6827 0.8 8 0.2 3372 0.2
11/01/88 - - - - - - - - 10 0.6 4842 0.4
08/02/88 - - - - - - - - 113 3.2 17129 0.5
07/03/88 - - - - - - - - 18 0.5 7207 0.5
04/04/88 - - - - - - - - 15 0.4 6732 0.5
02/05/88 - - - - - - - - 13 0.4 6192 0.4
30/05/88 - - - - - - - - 20 0.5 9520 0.5
27/06/88 - - - - - - - - 18 0.5 7851 0.5
27/07/88 - - - - - - - - 14 0.3 6610 0.3

Note: The Cyber 176 was withdrawn from service at the end of December 1987. The service on the CDC 7600 was withdrawn at the end of July 1988. Oxford did not purchase any time on the CDC 7600 for the period 1987/88. The Cyber 205 time is used by one user (Dr J.P. Jakubovics of Metallurgy) for a Class 1 project.
