reprinted from COMPUTER magazine

Computing in Higher Education: The Athena Experience

Ed Balkovich, Digital Equipment Corporation
Steven Lerman, Massachusetts Institute of Technology
Richard P. Parmelee, IBM

Copyright © 1985 The Institute of Electrical and Electronics Engineers, Inc. Reprinted with permission from COMPUTER, 10662 Los Vaqueros Circle, Los Alamitos, CA 90720


Project Athena at MIT is an experiment to explore the potential uses of advanced computer technology in the university curriculum. About 60 different educational development projects, spanning virtually all of MIT's academic departments, are already in progress.

Computing has been widely discussed as having the potential to radically change higher education.1 The advent of low-cost, high-performance personal computers has been touted by some as the savior of the American university system and condemned by others as the beginning (or even the final stage) of its downfall.

With the veritable blitz of media attention that has been focused on the educational uses of personal computing over the past few years, it has become increasingly difficult to realistically assess the actual changes this new technology is likely to accomplish. Part of this difficulty has resulted from the largely technology-driven frame of reference in which computation has been viewed; universities have often simply accepted the specific technologies of current hardware and software before asking how they might use them.

In this article the use of computation in higher education is approached from an entirely different perspective. We begin by speculating on how computer technology, in its broad sense, might be of actual use in the curriculum, and we do not overly concern ourselves with the question of whether those uses are practical within the context of current technology. In particular, we seek to identify areas where current educational methods have observable deficiencies that might be alleviated by the use of appropriate software/hardware combinations.

1. See The President's Report, 1983-1984, by D. Bok, Harvard University, Cambridge, Massachusetts, March 1985, for a recent and cogent contribution to this debate.

After presenting some pedagogically useful paradigms concerning the use of computation in the curriculum, we then proceed to develop a model of computation that could sustain those potential uses. Finally, we examine how the wide-scale implementation of this model might in turn influence the university as an institution.

The ideas presented here are derived from our work on Project Athena at the Massachusetts Institute of Technology (MIT). Introduced in May 1983, Project Athena is an experiment to explore the potential uses of advanced computer technology in the university curriculum. MIT has enlisted the assistance of the Digital Equipment Corporation, the International Business Machines Corporation, and other corporate sponsors, each working independently with MIT. These corporate sponsors are providing (as grants to MIT) the hardware, software, equipment maintenance, and some of the technical staff. The university itself is raising $20 million to support the needed software, operations, and staff. By the end of the project, MIT will have installed a network of about 2000 high-performance graphics workstations. More importantly, about half of MIT's $20 million budget has been made available to faculty members for the creation of new applications software to be used in MIT's own curriculum. About 60 different educational development projects, spanning virtually all of MIT's academic departments and collectively offering a wealth of ideas about how computation can be used in the curriculum, are already in progress under the auspices of Project Athena.2

112 ACM/IEEE-CS Joint Issue © 1985 ACM 0001-0782/85/0110-01WS00.75 COMPUTER

Potential educational uses of computation

In an academic setting, computer systems could probably be used in five distinct ways. First, the original use of computers as an integral part of teaching and research in computer science and electrical engineering will undoubtedly continue. Second, the current administrative use of computers for financial management and other record keeping will also remain. Third, the fairly recent availability of generic software for text processing, spreadsheet analysis, personal database management, etc., has already become widespread within universities. Fourth, the use of computers for electronic mail and related forms of asynchronous communication is still in its early stages. And, fifth, what will probably be the most controversial use is that of the computer as an integral part of the instructional process. Focusing on the fifth point, we will explore in this section some possible future academic uses of computing.

2. See Faculty/Student Projects, Project Athena, MIT, Cambridge, Massachusetts, 1985, for more detailed descriptions of these projects.

The computer as a simulator of complex systems. One of the most common requests in faculty members' proposals to Project Athena is to use the computer as a simulator for physical or social phenomena that are too complex to be understood from mathematical formulations. The subject areas involved are as diverse as orbital mechanics, supply/demand equilibriums in economic systems, turbulent flows of fluids, electromagnetic fields around charged objects, and the analysis of policies on releasing water from dams into major river basins.

In each of these areas, students have usually learned mathematical formulations that model the phenomena under study. Many students, however, find it difficult to translate the abstract, mathematical formulation into a real, intuitive understanding that they can internalize and use in subsequent courses. Although all students should understand the formal mathematics, not all can exercise the model on problems of meaningful size, either because of the computational effort or because the answers are too hard to portray. In many cases, faculty confine examples to problems that are "solvable" on the blackboard, and students' assignments are limited to similar, often trivial, examples. Appropriately constructed software, running on hardware with the capacity to solve and to portray larger scale (and consequently more realistic) applications of the mathematical formulations, allows students to test their intuition against the predictions of formal models. The student can be asked to pose a variety of "What if?" questions about problems sufficiently rich that not even the qualitative answers may be obvious. A measure of the success of the user interface and presentation software, if not of the underlying simulation code itself, is the extent to which the student is drawn into this "What if?" sequence. It should be fun! To the extent that this is successful, it brings to the curricula a powerful new mode of learning.
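To make the "What if?" loop concrete, consider a minimal present-day sketch in Python. This is not Athena software; the linear supply and demand curves and all coefficients are illustrative assumptions. The student changes one parameter of a market model and immediately sees the new equilibrium.

```python
# Illustrative "What if?" simulation: equilibrium of linear supply and demand.
# Supply:  q = a + b * p   (quantity supplied rises with price)
# Demand:  q = c - d * p   (quantity demanded falls with price)
# The equilibrium price solves a + b*p = c - d*p, so p = (c - a) / (b + d).

def equilibrium(a, b, c, d):
    """Return (price, quantity) where supply meets demand."""
    p = (c - a) / (b + d)
    q = a + b * p
    return p, q

# Baseline market.
p0, q0 = equilibrium(a=2.0, b=3.0, c=20.0, d=1.0)

# "What if demand increases at every price?" -- the student raises the
# demand intercept and immediately sees the qualitative effect.
p1, q1 = equilibrium(a=2.0, b=3.0, c=40.0, d=1.0)

print(f"baseline:     price={p0:.2f}, quantity={q0:.2f}")
print(f"demand shock: price={p1:.2f}, quantity={q1:.2f}")
```

A presentation layer would plot the two curves and let the student drag them; the point of the sketch is only that the core model is a few lines, and the pedagogy lives in the interaction around it.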

The obvious hazard of simulation is that the results presented may be artifacts of flaws in the model, in its evaluation, or even in the presentation itself. Making simulation more central to the educational process carries as a concomitant the requirement to give it some greater emphasis in the curricula. Simulation is now emerging as a potential form of scientific inquiry, paralleling the two more traditional modes of theory and experimentation. If this admittedly controversial trend continues, the computer will undoubtedly become a central element for teaching students how to devise simulations as part of the scientific method.

The computer as a laboratory instrument. The creation of meaningful laboratory experiences, particularly at the undergraduate level, has often been a frustrating task for universities. High-quality laboratories are expensive, and many universities have been unable to recapitalize them as technology has advanced.

November 1985 113


Most students can recall the disappointing experience of recording data in a lab subject, only to find that the results bore virtually no relationship to the theory taught in class. The cause of this frustration with laboratory experiences may go back to the unrealistic way in which we have gone about creating such courses. First, real insight from experimentation is rarely derived from a single experiment; students should have a wide range of experimental data available when seeking to draw inferences or to validate a theory. Second, the student in the lab often does not have the opportunity to correct flaws in experimental technique because the data are acquired in a single, brief lab session that does not provide an immediate means of checking data as they are collected. Also, the particular theory that the lab is intended to illustrate is often too disconnected (in the logical rather than physical sense) from the experiment the student actually conducts. Finally, experiments that yield truly significant results often require data to be subjected to sophisticated analysis techniques in order to reduce data, filter noise, etc., before meaningful inferences can be drawn.

Instead of the traditional student labs, let us consider an alternative in which data are acquired digitally by a computer. The computer allows the data to be displayed, compared directly with predictions from theory, postprocessed, stored for subsequent analysis, shared with data from students running the same or related experiments, and compared with data acquired from more sophisticated experimental apparatus. The most obvious benefit of this approach is that many of the unnecessarily tedious tasks associated with doing the lab exercise can be eliminated, allowing the student to focus on the physical phenomena under study. More subtle is the possibility for students to run their own experiments, with data being subject to analysis and comparison with standardized results right in the lab. Students could be asked to analyze not only their own experimental results (which are often too limited for meaningful inferences to be drawn), but also the data of other students' experiments, which would collectively span a wider range of interesting conditions. The predictions of alternative theories taught in class could be directly compared with experimental results, permitting students more insight into flaws in the design of the experiment and its relationship to the application of a particular theory. Thus, the range of validity of a particular model or experiment could be readily explored. With the ability to move data easily from lab, to classroom, to residence (an integral part of such a system), more careful and detailed analysis and reporting would result.

A potential hazard of using the computer as a virtual laboratory is that it could displace actual experimentation.
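A present-day sketch of this compare-with-theory idea follows. The pendulum experiment, the pooled measurements, and the function names are our illustrative assumptions, not an Athena system; the point is that a lab computer can show each measurement against the theoretical prediction as the data come in.

```python
import math

# Hypothetical pooled data: (pendulum length in m, measured period in s)
# from several students' runs. The values are invented for illustration.
pooled = [(0.25, 1.02), (0.50, 1.43), (0.75, 1.75), (1.00, 2.00), (1.50, 2.47)]

def predicted_period(length_m, g=9.81):
    """Simple-pendulum theory: T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

# Compare each measurement with theory as it would appear "right in the lab".
for length, measured in pooled:
    theory = predicted_period(length)
    residual = measured - theory
    print(f"L={length:.2f} m  measured={measured:.2f} s  "
          f"theory={theory:.2f} s  residual={residual:+.2f} s")
```

Because the pooled list spans a wider range of lengths than any one student would collect in a session, systematic deviations from the small-angle theory become visible in the residuals rather than being lost in a single noisy point.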

This type of laboratory experience would have the potential of shifting lab subjects away from an emphasis on obsolete experimental techniques to one where the spirit of scientific discovery is learned.

The computer as a virtual laboratory. The experimental laboratory remains an important pedagogical device for allowing students to explore the relationship between theory and experimentation, but there are many fields of inquiry where those same benefits cannot be gained at all. For example, experiments on nuclear-power-plant operations are obviously too hazardous to allow students to discover key operational trade-offs through direct experience, and many situations in physics involve phenomena that occur only at velocities approaching the speed of light. In such instances, scientists have traditionally invoked the concept of a "gedanken" experiment: having students think through a conceptually feasible but physically impractical experiment. With the aid of appropriately constructed software, such "experiments" would become less abstract, and students would gain more from the experimental situations.

The distinction between "virtual laboratory" and "simulator" paradigms rests on how the programs are intended to be used. For example, a program that evaluates population levels of certain insects as a function of climate, insecticides, etc., may lie at the heart of both a simulator and a virtual laboratory. The simulator covers this program with a layer of software that permits, even encourages, the student to change and redisplay variables. The virtual laboratory uses a layer that encourages, or even requires, systematic identification of key parameters, their range and sensitivity.
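The layering distinction can be sketched in a few lines of modern Python. The insect-population model below is a made-up toy, not an Athena program; what matters is that the same core function sits under two different pedagogical layers.

```python
# One core model, two pedagogical "layers". The logistic-style toy model
# and its constants are invented for illustration.

def population(climate, insecticide):
    """Toy model: population level as a function of two factors."""
    return max(0.0, 1000.0 * climate * (1.0 - insecticide))

def simulator_layer(**params):
    """Simulator: evaluate and redisplay whatever the student sets."""
    level = population(**params)
    print(f"population under {params}: {level:.0f}")
    return level

def virtual_lab_layer(param, values, fixed):
    """Virtual lab: require a systematic sweep of one key parameter."""
    results = [(v, population(**{**fixed, param: v})) for v in values]
    for v, level in results:
        print(f"{param}={v:.2f} -> population {level:.0f}")
    return results

simulator_layer(climate=0.8, insecticide=0.3)             # free exploration
virtual_lab_layer("insecticide", [0.0, 0.25, 0.5, 0.75],  # structured study
                  fixed={"climate": 0.8})
```

The simulator layer invites free play; the virtual-lab layer forces the student to identify a parameter, choose a range, and observe sensitivity, which is exactly the difference the text draws.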

A potential hazard of the use of the computer as a virtual laboratory is that it could displace actual experimentation in the curriculum. In our view, this would represent a serious misuse of computation. A virtual laboratory, however, may make actual lab experiences more productive. For example, students might review laboratory techniques in a virtual laboratory first, so that they would be better prepared for the actual lab experiments later.


The computer as a tutor. Most of the early applications of computers in education (beyond using them to teach about computers themselves) focused on the computer as a teacher. Programs that presented material, asked questions, and branched appropriately were by far the most common educational applications, and the experience with such programs was not particularly encouraging. However, recent advances in artificial intelligence have opened up the possibility of using computers as "expert tutors." The key distinction between this form of software and early computer-aided instruction (CAI) is that in an expert tutor the student remains the primary agent in the student-computer dialogue. With the early CAI software, the student tended to be passive, with the computer controlling much of the interaction. In an expert tutor, the student acts as the primary problem solver, and the computer offers advice on request.

Consider as an example an Athena project in civil engineering that is building a tutoring program for introductory-level structural analysis courses. The program develops typical problems for students to solve, but the student is then left to formulate an appropriate set of equations describing the equilibrium of forces and moments. The software can analyze the system of equations and diagnose common errors on request. For example, students often forget to impose a convention on algebraic signs of forces to represent whether they are up or down. On request, the program can offer increasingly detailed comments about what is wrong with the student's proposed solution.
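In the same spirit, a toy diagnostic might check a student's vertical-equilibrium equation and recognize the missing-sign-convention error just described. This is our sketch, not the Athena program; the hint texts and tolerance are invented.

```python
# Toy structural-analysis diagnostic. Forces are signed scalars:
# up is positive, down is negative.

def check_equilibrium(forces, tol=1e-9):
    """Return a hint string for a student's vertical-equilibrium equation."""
    if abs(sum(forces)) < tol:
        return "Equilibrium satisfied."
    # Common error: writing every force as a positive magnitude.
    if all(f >= 0 for f in forces):
        return ("Forces do not balance. Hint: did you impose a sign "
                "convention to distinguish upward from downward forces?")
    return "Forces do not balance. Re-examine your free-body diagram."

# A 100 N load carried by two 50 N support reactions:
print(check_equilibrium([50.0, 50.0, -100.0]))  # correct sign convention
print(check_equilibrium([50.0, 50.0, 100.0]))   # magnitudes only
```

A real expert tutor would, as the text notes, need a general inference capability and a catalog of observed student errors; the sketch shows only the shape of the diagnosis, escalating from a generic message to a targeted hint.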

The creation of this type of program requires much more sophistication than did the early CAI software. With expert-tutor software, a general inference capability for the class of problems under study must be developed, and a careful study of common errors made by students has to be undertaken. The software needs to work with the mathematical symbols that are required to represent the subject under study. In the case of engineering and science, this will often require algebraic manipulation of mathematical expressions. Subjects such as foreign languages will require computer interfaces that can cope with natural languages.3 However, there are still unresolved questions about how best to construct expert-tutor software and whether the state of the art in knowledge-based systems is at the point where useful expert tutors can be created for complicated subject areas.

A critical advantage of computer-generated images over static textbook pictures is that students can manipulate them.

The computer as a textbook. In engineering and the sciences, there are often established conventions for graphical representation of information. For example, in chemistry there is a well-defined notational system used to display the composition and geometry of molecules. Textbooks rely on these abstractions to display information about two- and three-dimensional geometry. However, for many students, the translation of these iconic pictures into the mental models of the actual objects they are intended to represent can be difficult. A critical advantage of computer-generated images over static textbook pictures is that students can manipulate them; that is, rotate, compress, or edit a particular display. Such graphical tools place the students in control of the representation of material and free them from being locked into a single set of predefined pictures.

3. MACSYMA, a program developed by Joel Moses at MIT, illustrates that software to support mathematical manipulation can in fact be constructed. Progress in natural-language processing has been far slower. Within Project Athena, work is underway on the development of software capable of natural-language recognition in five foreign languages at the level of a first- or second-year university student.

The importance of allowing students to control and manipulate graphical representations is perhaps most clear in computer-aided design (CAD) applications. The need for architects and engineers to rearrange components of designs easily has been crucial to the success of CAD in industry, and it is reasonable to suppose that the same benefits would extend to learning the design process itself. Another advantage lies in the use of animation, particularly in helping students visualize what happens in time-dependent systems involving moving objects, vibration, or the evolution of complex social systems (such as the national economy). Finally, the order in which the components of a picture are drawn can often provide students with as much insight as does the picture itself. For example, when displaying the magnetic field around a set of charged objects, it is important to display lines of symmetry and to exploit such symmetry in the solution. Computer-generated displays that first draw in one quadrant and then reflect that drawing into a symmetric quadrant do more than illustrate the field under study; they also help the student gain insight into more general problem-solving methods.

The computer as a blackboard. Most of the limitations of textbooks apply to blackboards as well. Although students do benefit from watching how images are constructed on a blackboard, they may also have to bear the burden of the instructor's limited artistic abilities. Moreover, many complex images are very difficult to draw, and once displayed on the blackboard, they are difficult to change in a clear way. Large-screen, high-resolution projection is a significantly better medium, particularly in large lectures. In addition, the computer-generated graphics used in class can easily be made available to students for further study outside of class.

The computer as a special-purpose learning environment. Certain academic disciplines require resources that, for most students, are unavailable. For example, student composers of orchestral music ideally should (and rarely do) have access to a complete symphony orchestra to hear their own compositions. The computer, however, can supply the foundation for a specialized environment that provides for input on a clavier keyboard and output from sound-synthesis hardware, with high-resolution graphics to display musical notation. This form of "composer's workbench" would offer the student a powerful tool for exploring music and would provide a qualitatively different experience from more conventional approaches.

The computer as a communications medium. Universities are, to a large degree, in the communications business. Written reports, if not central to a technical curriculum, are important. The lab report is the student's prose description of the theory (including math symbols) and the lab experience (including graphs and charts). Computer-supported processing of intermixed text, symbols, and graphics, though possible, is still too awkward for student use for project reports. Thus the current handwritten, cut-and-paste approach necessary to prepare a lab report is a high barrier to "What if?" revisions. The payoff for text-processing support for these technical reports will be at least as high as that already observed for simple text. The result is not just better reports for equal or less effort; it is that greater academic emphasis can be placed on the quality of the report itself, allowing it a more central place in the curricula.

With personal-computer access, the library can enhance some of its present services and provide entirely new ones as well.

The role of the library in the curricula will also shift. Currently the library is viewed as a place rather than as a set of services. Students go there, when it is open, to search the literature. With personal-computer access, the library can enhance some of its present services and provide entirely new ones as well. An Athena project by the library to allow computerized access to its catalogs is the first step in this direction. As more material becomes machine readable (for example, videodisc storage of whole journals can be on line), students will not only search for relevant material, but will obtain copies as well. By making the library more accessible, scholarship and the role of the library in the curricula can receive greater emphasis.

In addition to making libraries more accessible, computers will deliver entirely new services. Personalized newspapers culled from news services are already available at MIT. An Athena-supported project to provide campuswide access to the course catalog and to students' evaluations of courses will certainly evolve into a rich environment containing course outlines and sample problem sets. Finally, electronic mail and bulletin boards will help develop a sense of community.

The computer as a mediator. Linked computers would offer the possibility of using multiplayer games to teach in subject areas (such as economics, logistics, and political science) where the true complexity of the system under study arises from the interactions of numerous individuals. Such computer-mediated simulations have been used either as totally self-contained activities or as a preliminary procedure to a further classroom-based exercise.

Obviously, computer-based communication in mediated games has a quality different from direct, face-to-face contact. In some situations this makes computer-mediated gaming a somewhat artificial experience. However, there are instances where it is important to be able to exert control over the information that can be transmitted to actors in a game. For example, students can be taught how changes in the ability of mock superpowers to communicate during hypothetical crises can affect likely outcomes. By regulating how often information can be exchanged or by altering the allowed content of messages, various possible forms of communication can be simulated. An analogous situation exists in business simulations where legal restrictions that may affect market outcomes are imposed on discussions between competing businesses.
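A present-day sketch of such mediation might place a small rule-enforcing relay between players. The per-turn message quota, the banned-term filter, and all names here are our illustrative assumptions, not any Athena system.

```python
# Sketch of computer mediation: the machine sits between players and
# enforces rules about what may be said and how often.

class Mediator:
    def __init__(self, max_messages_per_turn, banned_terms):
        self.quota = max_messages_per_turn
        self.banned = banned_terms
        self.sent = {}

    def relay(self, sender, text):
        """Return the delivered message, or None if the rules block it."""
        if self.sent.get(sender, 0) >= self.quota:
            return None                      # exchange-rate limit reached
        if any(term in text.lower() for term in self.banned):
            return None                      # disallowed content
        self.sent[sender] = self.sent.get(sender, 0) + 1
        return f"{sender}: {text}"

# A business game where price discussion between competitors is illegal:
m = Mediator(max_messages_per_turn=1, banned_terms=["price"])
print(m.relay("firm_a", "We are entering the eastern market."))  # delivered
print(m.relay("firm_a", "One more announcement..."))             # over quota
print(m.relay("firm_b", "Let's fix the price."))                 # blocked
```

Varying the quota or the filter is exactly the experimental knob the text describes: the instructor simulates different communication regimes by changing the mediator's rules, not the players.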

The computer as recreation. It would be incomplete to think of computers at a university only in terms of their use in the curriculum. For many students, the computers provide a rich and varied source of extracurricular activities. In addition to games, computers support the activities of many campus clubs and individuals. For example, a club at MIT is using Athena's computation resources to develop a simulated mission of the space shuttle. Another club, the Student Information Processing Board, has been given access to the UNIX source code and a machine to experiment with, and routinely makes significant modifications in both the kernel and utilities.

The computational environment for education

Now that we have examined some possible future academic uses of computing, we will turn our discussion to the computing environment that would be needed to sustain these uses. Our point of view is a practical one. Although the cost of some components of campus computing systems is expected to decline, computing will never be free. This is particularly apparent when some of the indirect costs of computing, such as the faculty and student time required to develop, maintain, and administer academic software, are taken into consideration.

We believe that the envisioned computing environment can be realized in a few years. At the moment, however, MIT and the vendors involved in Project Athena are subsidizing the resources needed to experiment with the computing model. We do expect that the computing environment will be self-sustaining by the end of the experiment.

Assumptions. In projecting future computing needs, we have assumed that computing will be available for use in a variety of courses spanning many disciplines and that it will be incorporated into many settings on a campus. We also expect that the strategies for capitalizing computing will be flexible. They could range from personal computers purchased by individual members of the academic community to computing plants owned entirely by the institution. MIT, like many other campuses, will most likely use computers acquired both by individuals and by the university. In any case, academic computing must be cost-effective in order to sustain itself; it has to be designed and delivered in a way that takes into account the "true" costs of development and operation. Finally, computing needs to preserve individual and institutional investments in the presence of rapidly changing technology.

Implications. Ubiquity. If computing is to become an integral part of the educational landscape, it must be treated as a utility. A time-sharing system whose response degrades in midafternoon must be regarded as a failure mode of a public utility in much the same way that a "brownout" is viewed as a failure mode of an electrical utility. Campus computing facilities need to be predictable. We believe that this can be achieved by systems that dedicate one or more computers to a single user for a session.

Time-sharing systems do not scale to support an entire campus. Even networks of time-sharing systems fail to provide predictable responses for computing environments that emphasize computationally intensive applications or interactive graphics. Because personal computers lack the information-sharing and communications features of time-sharing systems, they are an inadequate solution to the problem unless they are set up as part of a larger computer network supporting information sharing. Of course, there must be a sufficient number of individual computers available in order to avoid shifting queueing delays from the CPU of a time-sharing system to the keyboards of a network-based system. What we do not yet know is how many workstations are needed to support extensive educational uses of computing.

Ownership. While future educational computer support will most likely be provided by a mixture of individually owned and institutionally owned computers, institutions will own and operate communications networks that link private and public computers. The acquisition and operation of such networks are rapidly becoming the responsibility of traditional computing centers. The data network will be a basic utility of the campus, and like any other utility, it may require its subscribers to pay a one-time installation charge or a recurring service charge.

Computers connected to the network can be divided into two categories: service computers and workstations. Service computers will provide generic services (e.g., filing or authentication) to one or more client computers, and they may include special-purpose or high-performance hardware. This category of computing will be owned and operated by institutions. Labor-intensive services (e.g., archival storage or a highly reliable filing system) and services involving consumables (e.g., a print service) will probably be offered on a fee-for-service basis. This style of operation will emphasize the need for authentication in network-based educational computing systems.
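The fee-for-service model presupposes that a service computer can verify who is asking before metering a consumable. As a present-day illustration only (the token scheme, the fee, and all names are our invented assumptions, not anything from Project Athena), a print service might authenticate and bill like this:

```python
import hmac
import hashlib

SECRET = b"campus-authentication-key"   # shared with the campus auth service

def token_for(user):
    """Issue a token binding a username to the campus secret."""
    return hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()

def print_service(user, token, pages, ledger, fee_per_page=0.05):
    """Authenticate, then charge the user's account for the pages printed."""
    if not hmac.compare_digest(token, token_for(user)):
        return "rejected: authentication failed"
    ledger[user] = ledger.get(user, 0.0) + pages * fee_per_page
    return f"printed {pages} pages; {user} owes ${ledger[user]:.2f}"

ledger = {}
tok = token_for("student17")
print(print_service("student17", tok, 12, ledger))      # authenticated
print(print_service("student17", "bogus", 1, ledger))   # rejected
```

The design point is the one the text makes: once services are metered, every request must carry a verifiable identity, so authentication becomes a basic network service rather than an afterthought.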

November 1986 117

Page 8: Computing in Higher Education: The Athena Experience · Computing in Higher Education: The Athena Experience Ed Balkovich, Digital Equipment Corporation Steven Lerman, Massachusetts

Students are mobile, and education takes place in many locations on a university campus. Although students will use computers in a variety of settings and for a variety of reasons, we do not envision them making extensive use of portable computers. Portable computers do not now have, and are not likely to have, certain features needed for future educational applications (e.g., high-resolution bit-mapped graphics). We also expect that computers will be installed in specialized facilities (e.g., laboratory computers) and that students and faculty will be using one or more workstations in their day-to-day routine. If the training required for the operation of different workstations varies significantly, it could be a barrier to their effective use.

Both individuals and institutions will most likely own workstations, and in each case, a workstation will be operated by one individual per session. Whereas privately owned workstations will be located in residences or in offices, institutionally owned workstations will be located in the public facilities of universities (e.g., laboratories, classrooms, libraries, and other public work areas) and will be available to any legitimate user.

Cost-effectiveness. The cost-effectiveness of educational computing must be judged from several standpoints. Computing will continue to add value to education, but the acquisition of the computing power needed to improve the quality of education will require significant and direct institutional and personal investments.

There are equally large indirect costs. For example, the training required to use computing in a subject (e.g., foreign languages) represents an educational overhead. If computing is used extensively, universities will have to limit the amount of training time. If not, the training costs could reduce the amount of time available for presenting course materials. Thus, this indirect cost must be carefully managed so that it does not detract from the primary purpose of education.

In the course of a four-year education, significant improvements in computer technology are likely to occur. Student ownership of computers is one way for universities to continually upgrade their computing facilities with each freshman class. However, this too must be carefully managed to ensure an adequate spectrum of facilities to meet educational objectives and to avoid introducing serious incompatibilities that would render older systems unusable.

Since educational software represents a tremendous institutional investment that must be preserved in the presence of technological change, institutions must make certain that the software they create and acquire can tolerate changes in technology. The technology available at a given price will improve with time, and new students will naturally wish to buy the most current technology. It should also be possible to move existing educational software to new hardware with minimal effort.

Different types of hardware may be provided by a single vendor—representing valid price-performance trade-offs. For example, student-owned computers may be significantly different from the university-owned computers used in a CAD laboratory, and educational software must be able to tolerate such differences. Ideally, it should be possible for students at other workstations to use (in a limited fashion) materials prepared in a CAD lab. For example, a student might use a private workstation to prepare a lab report that would include graphics generated earlier in the lab.

Sometimes the cultural orientation or business policies of individual institutions may lead to multivendor solutions. Computing supplied by multiple vendors presents universities with the same challenges to their investment in educational software as do diverse workstations owned by both students and the university.

These observations suggest that, in order to be cost-effective, educational uses of computing must anticipate changing technology. Applications have to be built on higher level abstractions that hide the underlying technology. Ideally, these abstractions evolve slowly and can be reimplemented on new technological bases to preserve existing educational software. Such abstractions can be provided by a single vendor—for example, through a well-integrated set of operating systems, communications protocols, and layered products that span the vendor's hardware products. Alternatively, the abstractions might be generic—for example, an operating system, network protocols, and layered products that span multiple vendors' products. If every user interface to educational computing presented a computing environment based on the same set of abstractions, training costs could be reduced. In addition, if the choice of abstractions were to be widely accepted by other universities, this approach would also encourage the exchange of educational software.
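The idea of building educational software on a slowly evolving abstraction that can be reimplemented for each new hardware generation can be sketched as follows. This is an illustrative sketch in modern Python; the interface and the vendor backend names are hypothetical, not part of Athena's actual design.

```python
from abc import ABC, abstractmethod

class Display(ABC):
    """The stable abstraction that educational applications are written against."""
    @abstractmethod
    def draw_text(self, x: int, y: int, text: str) -> None: ...

class Vendor1Display(Display):
    """Backend for one (hypothetical) vendor's hardware."""
    def __init__(self):
        self.ops = []
    def draw_text(self, x, y, text):
        self.ops.append(("v1-blit", x, y, text))

class Vendor2Display(Display):
    """Backend for a different vendor; the application never sees the difference."""
    def __init__(self):
        self.ops = []
    def draw_text(self, x, y, text):
        self.ops.append(("v2-raster", x, y, text))

def plot_title(display: Display) -> None:
    """An 'educational application' written once against the abstraction."""
    display.draw_text(0, 0, "Field lines of a point charge")

# The same application code runs unchanged on either backend.
for backend in (Vendor1Display(), Vendor2Display()):
    plot_title(backend)
```

When the underlying hardware changes, only a new backend is written; the applications themselves are preserved, which is exactly the investment-protection argument the text makes.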

Athena's computational environment

The first two years of Project Athena were supported by a campus-wide network of time-shared computers. This system was designed to be deployed quickly and used to support the initial development of academic software by faculty and students. It simulated many of the features we expected to include in the computing environment provided by workstations.

We are now entering the second phase of the project, which is adding public and private workstations to the network and converting existing time-shared computers to service facilities. The conversion should result in a system with the following general features.

The system. Athena will provide its users with a system of individual computers linked by a network. Users may own a computer (a private workstation) or may use public facilities (public workstations). All workstations will be dedicated to one individual per session and will be designed to deliver the computing power needed for interaction and graphics. The network will offer access to other computers operated by MIT and will provide services to all workstations.

The owner of a private workstation may assume that the machine is never used by anyone else. A public workstation will be dedicated to a single individual per session; however, users will need to assume that others use it as well. Public workstations are "serially reusable" resources, and it will be possible for a user to "reset" a public workstation to a known state and to easily reconstruct information "cached" in its local mass storage.

Members of the campus community will be able to use any public workstation or privately owned equipment, and they will not be limited to just one computer. The system will provide facilities that will make it possible to share information and to access data and programs from any computer. For example, service computers will provide access to system libraries (containing system software needed to reconstruct information cached in a workstation's local mass storage), to class libraries (containing course-specific data and programs), and to user lockers (containing user-specific programs and data).
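One way a public workstation could reconstruct information cached in its local mass storage is to reconcile the local cache against a manifest published by the system library's service computer. The sketch below is a hypothetical illustration; the manifest format and function names are our own, not Athena's.

```python
def reconstruct_cache(local, manifest, fetch):
    """Bring a workstation's local cache in line with a server-side manifest.

    `local`    : dict mapping cached file name -> version stamp on this machine
    `manifest` : dict mapping file name -> current version stamp on the server
    `fetch`    : callable that retrieves one file by name from the service computer
    Returns the list of files that had to be (re)fetched.
    """
    fetched = []
    for name, version in manifest.items():
        if local.get(name) != version:   # missing locally, or stale
            fetch(name)
            local[name] = version
            fetched.append(name)
    for name in list(local):             # purge files dropped from the library
        if name not in manifest:
            del local[name]
    return fetched
```

Because the cache can always be rebuilt this way, a user can safely "reset" a public workstation to a known state without losing anything that matters.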

The characteristics of workstations are motivated by their applications, and Project Athena assumes that advanced workstation technology will enable significant advances in the application of computers to education. It also assumes that the cost of such advanced technology will be reduced with time. The hardware used in the project is limited to a few types of computers and is provided by multiple vendors. The general features of an Athena workstation include

• a memory from 1 to 16 Mbytes with memory-management features that can be used to communicate among independent address spaces and to effect dynamic linking in a large address space (required to construct software such as a LISP-based tutor);

• a high-performance processor capable of executing at least one million instructions per second (to explore realistic problems requiring complex calculations);

• a high-bandwidth, bit-mapped display (to provide high-quality interactive graphics);

• an "open" design that can be extended to include peripherals such as keyboards, sound-generation devices, video input/output, data acquisition, and control devices (required by specific laboratory applications and special-purpose environments such as a composer's workbench).

The network, which is representative of networks being developed at other universities, is implemented with multiple technologies and is based on a high-speed backbone network. Existing campus facilities have access to high-speed, local-area networks connected to the backbone network via special-purpose gateways. Future facilities are likely to use both high-speed, local-area networks and lower speed, wide-area networks connected to the backbone network.

Coherence. Project Athena must minimize unnecessary differences and limit training costs in order to sustain itself. Academic applications need to share common code and build on one another. It ought to be possible to make only a one-time investment in training that is applicable to all uses of the system, or to make the use of applications and system software self-evident. It must also be possible to run most applications and to access information using any public or private workstation.

A key software module is the operating system, which insulates the user and applications from changes in technology and differences in the hardware supplied by multiple vendors. Other major software modules include discipline-specific software (e.g., software that illustrates electromagnetic fields) and general-purpose software (e.g., word processing). These modules define three interfaces that must be coherent:

• system interface—the interface between the operating system and applications;

• application interface—the interface between applications;

• user interface—the interface between the user and the applications and operating system.

November 1985 119


These coherent interfaces provide the base of abstractions needed to preserve the investment in educational software and limit training costs.

The computing environment, which includes operating-system functions and general-purpose applications and libraries, is the same on all workstations. Project Athena uses a single operating system and a single communication protocol family to implement its computing environment. The operating system is UNIX, which provides the foundation needed to port all applications to all types of workstations. In addition, the same suite of general-purpose software (e.g., an editor, a word processor, a graphics library) is available at every workstation. UNIX hides the underlying computer technology from the user, limits training costs, and provides a framework for writing (ex)portable applications that can be made to work with different types of workstations. At the moment, the training required to use UNIX and the general-purpose software is significant. However, it is a one-time investment that we believe will be reduced with experience and with refinements to the system.

The TCP/IP protocols are used to implement the same "virtual network" for all network technologies used in the project. The virtual network hides technology-specific features. A uniform virtual network is needed to provide common communication functions that span multiple vendors' computers. With careful planning, the TCP/IP protocols could be replaced by another protocol suite with similar functions.

The investment required for the development of new academic applications can be reduced in several ways. One approach is to look for "vehicles" other than one-of-a-kind programs for delivering educational materials. Some examples of existing vehicles are spreadsheets or application generators for drill-and-practice materials. Another approach is to support methods and techniques that would reduce the effort required to create new software. An example of this would be a library of routines for 3-D graphics. The former approach requires considerable innovation and is best left to exploration by faculty and students; the latter can be systematically attacked by defining and implementing a coherent interface between applications.

A working hypothesis of Project Athena is that most scientific and engineering applications can usefully interact only when employing a small number (20-30) of data types (e.g., graphs, arrays, tables). The application interface can be defined by implementing common representations of these data types and mechanisms for manipulating them. This motivates our goal of keeping the number of "supported" programming languages small, because it is practical to provide such implementations for only a small number of languages.

If the number of ideas in the application interface is small and concise, then the programming interface of an application is easier to understand. If an application's use of the concepts in the interface is complete, then programs can be combined in standard ways. A program that takes advantage of such an interface is more likely to be used with other programs. For example, a word-processing program should be able to incorporate the tables or graphs produced by another program performing an analysis of laboratory data. The ability to combine applications should also lead to interdisciplinary efforts. For example, programs that model thermodynamic systems could be used with a variety of discipline-specific applications.
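The word-processor example can be made concrete. Below is a minimal sketch, in modern Python with hypothetical names, of a common representation for one shared data type, a table, that lets an analysis program and a document formatter interoperate without either knowing about the other:

```python
from dataclasses import dataclass

@dataclass
class Table:
    """One of the small set of shared data types (graphs, arrays, tables).
    Any program that produces or consumes this representation can be combined
    with any other such program."""
    headers: list
    rows: list

def analyze(samples):
    """A lab-analysis program: produces the shared Table type."""
    mean = sum(samples) / len(samples)
    return Table(headers=["n", "mean"], rows=[[len(samples), mean]])

def typeset(table):
    """A 'word processor' that can incorporate any Table, whatever produced it."""
    lines = ["\t".join(table.headers)]
    for row in table.rows:
        lines.append("\t".join(str(v) for v in row))
    return "\n".join(lines)

# The two programs share only the Table representation.
report = typeset(analyze([1.0, 2.0, 3.0]))
```

The point of the sketch is the coupling: the two programs agree only on the common data type, which is why a small, well-defined set of such types is enough to make many applications composable.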

Experience with the system has demonstrated that few applications in Athena's existing computing environment have coherent interfaces. Improvements in this area represent a major technical thrust of the project and should have a major impact on the use of computing at MIT.

New technology (i.e., bit-mapped displays and computers dedicated to a single user) and the potential for the comprehensive use of computing motivate the need to refine user interfaces. A "standard" framework for the user interface, common to all applications (including the operating system), can reduce the amount of training needed to use applications. New applications will emphasize interaction and graphics. If code for user interfaces can be generalized, it will reduce the amount of new code required to develop an application.

There are two ways of thinking about a standard framework for user interfaces. The first is to require that all applications use a single framework. The interface may be complex enough to require training, but because there is only one interface, training becomes a one-time investment. The alternative is to allow user interfaces to differ, but to require that all interfaces be self-explanatory.

The major risk of any standard framework for user interfaces is that, if applied prematurely or blindly, it suppresses innovation. Concepts employed in the design of a user interface span diverse areas of research (e.g., human factors and graphics), and there is little agreement on what contributes to a good user interface. Although Project Athena is addressing the issues of user interfaces, they remain difficult issues, and we have modest expectations for success.

Ownership and operation. The campus network is owned and operated by MIT. Although the network's growth has been somewhat ad hoc, it has followed the general model of subnetworks linked by a "spine" network. The network has been subsidized by Project Athena (its largest client). A long-term plan for financing the network's operation and expansion is under discussion and will be based on operational experience with Project Athena.

Service computers are also owned and operated by MIT. As Athena's computational environment has stabilized, part of the operational support for its computers has been transferred to a centrally administered staff. The staff is responsible for providing computational support to a larger community consisting of the instructional, research, and administrative units of the university.

Project Athena has subsidized the acquisition of workstations. While public workstations have been provided by the project, private ownership can only be simulated by making workstations available to a subset of the campus community that uses them on an "experimental" basis.

In both cases, the user is expected to play an important role in operating the workstation. Users are responsible for managing the information they create (using removable media or network file services). They are also responsible for reporting failures and supporting repair activities (e.g., providing access to a private workstation). The existing computing infrastructure of MIT is now being expanded to include retail outlets for computers and facilities and services for repairing workstations.

Open issues

The three most critical and as yet unresolved issues that will have to be addressed by universities seeking to implement the proposed model of computation are information privacy, software licensing, and paying the costs of such a system.

The problem of maintaining the privacy of information on a network designed to encourage information sharing represents a formidable challenge. The characteristics of computer systems that make sharing easy are often precisely those that make protection of information difficult. Such problems can extend beyond protecting students and the university from the most obvious forms of abuse, such as widespread and largely undetectable cheating. As large distributed systems evolve, there will be a strong temptation for faculty and administrators to use the mail and other communications facilities to transmit potentially sensitive information such as student grades, staff evaluations, and salary-related data. Over time there will be strong reason to link MIT administrative machines into the Athena environment. This will present the opportunity of having a security audit by a group outside the academic computing environment. That is, the administrative side does not now connect its computers to the campus net and will not unless its data-security requirements can be met.

The resolution of these dilemmas will require a three-pronged approach. First, to the extent that is feasible, distributed systems must be made more secure without forfeiting the benefits of information sharing. This will probably require both the encryption of network traffic and secure ways of distributing keys. It is unlikely that these technologies will be cost-effective until encryption becomes a standard option on high-volume, low-cost network interfaces. Second, users will have to be made aware of the potential hazards of sending unencrypted messages, of leaving unencrypted data on public workstations between sessions, of using easily decrypted passwords, and generally of placing information—which simply should not be there—in the computer system. The third and most difficult task is to create a strong sense of ethics about information privacy within an academic community. It is evident from our experiences in Project Athena that there is no clear consensus about appropriate standards regarding the use of information within the university. Students who would not even consider invading someone else's paper files have less clear standards about looking at that same individual's electronic files. Universities may have to stress that "electronic" privacy has the same status as other forms of privacy, and they will have to work with students to make sure that members of the community have a common understanding of what constitutes appropriate behavior.
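On the second point, the hazard of easily recovered passwords, the now-standard remedy is to store only a salted one-way hash, so that a stolen password file does not reveal the passwords themselves. A minimal sketch, using modern library routines that did not exist at the time of the article:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted, deliberately slow one-way hash.
    Store (salt, digest); never store the password itself."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password, salt, digest):
    """Recompute the hash and compare in constant time to resist timing attacks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)
```

The salt defeats precomputed dictionaries, and the iteration count makes brute-force guessing expensive, which directly addresses the "easily decrypted passwords" the authors warn about.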

The problem of licensing software for large systems of networked workstations is an emerging concern for the software industry. Within Project Athena, we have struggled to develop site licenses that offer reasonable compensation to third-party software providers and that adequately protect their code, while still keeping costs reasonable and permitting use of the network for software distribution. For public workstations, strategies such as requiring "key disks" or special hardware keys to run software are almost impossible to manage. A more reasonable strategy is to have a network-based installation procedure that tailors the program so that it will run only on the machine to which it is downloaded.

Even with this type of arrangement, there is a question of what students may be allowed to take with them on graduation. If software is site-licensed to the university, does that license require graduating students to return copies to the university? Alternatively, does the license provide students with the right to take software with them, perhaps by exercising an option to buy at some predetermined fee? If alumni have access to the university-based software, will the university provide a mechanism for them to obtain updated copies as the software is revised? All parties want a resolution of these issues, but their uncertainty as to where the software market is going is great, and for many the risk of false steps is very high. At present the best that can be achieved is to maintain a good dialogue between producers, distributors, and consumers. This is clearly a role that MIT generally, and Project Athena in particular, can fulfill. To that end, Project Athena* hosted a roundtable at MIT's Endicott House in November 1984, and "it was clear that the parties representing the various academic, commercial, and government sectors needed to be brought together to gain a better understanding of the issues involved and the perspectives from which each came." (Proceedings published by the MIT Industrial Liaison Program; the preceding quote is from this report.)

*See Commercialization of Educational Software, Project Athena, MIT, Cambridge, Massachusetts, 1985, for a survey of the discussion. The conference was jointly sponsored by Project Athena and the Alfred P. Sloan Foundation.

The final issues are economic. Widespread use of computers is far more likely to increase the total cost of education than to decrease it—the payoff of computers is improvement in the quality of education and not in reduced cost. Universities and their students will be faced with the need to pay for a major addition to the capital plant and a service staff to support the computational model we have described. It is quite possible that the total costs of the computing will exceed a university's expenditures for its libraries.

A logical division of the total cost would be for the university to treat the network and the back-end computers as a direct expense, with students paying for private workstations, either directly or through an identifiable increase in tuition. Financial-aid packages will also have to reflect this increased cost of an education, further straining what in most universities is already an overtaxed set of resources.

Computer centers will undoubtedly move from their current status as vendors of CPU cycles to operators of a combination of a retail outlet (selling private workstations) and a utility (providing networking and specialized computational services delivered to workstations over that network). Some services may be treated as part of an operating overhead; others may operate on a fee-for-service basis. For example, network use could be treated as a part of the general university overhead, whereas printing charges might be based on usage. Some institutions may choose to adopt a modified version of the proposed computational environment. For example, it would be possible for a university to provide only public workstations and to charge students an appropriate usage fee. Although this approach would restrict the availability of computers and might result in contention for the limited set of public workstations, it would still allow for much higher utilization of workstations, thereby reducing the cost per user.

An often-discussed means of financing some of the costs of a computation-intensive environment is the sale of university-produced educational software, but it seems unlikely that this will be a major revenue source. The market for educational software is small, and most of the institutional purchases are for generic software products rather than for subject-specific software. The total market is also highly fragmented, making it difficult for any one subject-specific software product to sell widely. In addition, university faculty further fragment the market by customizing their courses to reflect their own interests and the abilities of their students. Finally, mature distribution channels for such software are nonexistent.

Perhaps a more realistic source of financing is the university's alumni. Students may well find that certain network-based services such as library search, electronic mail, or archival service are of such value to them that they will be willing to pay for the services beyond their college years. In addition to providing a direct service to alumni, there would likely be enormous benefits for the alma mater in keeping alumni an active part of the extended community. New software could be made available to the alumni, providing a way for them to maintain their personal computing environment and at the same time reinforcing their ties to the university.

Even with such revenue sources, it is inevitable that some universities will lack the financial resources to sustain computation-intensive environments like those being proposed at MIT, Carnegie-Mellon, Brown, and various large, private universities. For these schools, the importance of this approach is not the extent of computing that it provides (e.g., a computer in every room); rather, it is the low cost of incremental addition. Our relatively heavy emphasis on creating a "single" system out of diverse and heterogeneous hardware ensures that the important part—the computer the student uses—can be added at little more than its unit cost.

The Carnegie-Mellon Project

Activity similar to Project Athena is currently under way at Carnegie-Mellon University, a technically sophisticated institution located on a geographically compact campus. These attributes, in conjunction with a supportive administration, have given rise to a project to produce a campus-wide integrated personal computer environment.

Three interacting groups are working on CMU's computing future. The Computation Center has already deployed over 3000 personal computers for various purposes, the Information Technology Center is developing the software for a future system, and the Center for the Design of Educational Computing is facilitating the creation of educational software.

The Information Technology Center. ITC's computer system, named Andrew after Andrew Carnegie and Andrew Mellon, is the prototype for Carnegie-Mellon's main campus computing system. It runs on the UNIX operating system. The two major components of Andrew are the communications system and the workstation operating environment. The ITC is sponsored by IBM, has been in operation since January of 1983, has expended about 70 person-years of effort, and has generated over 400,000 lines of program code.

The communications system. The model for the communications system is simply a file system like that found on a conventional time-sharing system. What is interesting is its size and internal structure: It is expected to have between 5000 and 10,000 active users, to hold millions of files, and to be implemented by a 10,000-plug high-speed network and hundreds of dedicated computers called file servers.

The first version of the network and file system was brought up at the end of 1984. In cooperation with the Computation Center and several academic departments, a campus-wide network of local area networks linked with fiber-optic cables and routing computers was created. Over 400 computers owned by various groups are connected to this network, all using the same communications protocol. Of these machines, about 100 run Andrew software and are supported by seven file servers. There are about 250 registered Andrew users. All users can send and receive electronic mail as part of the general campus mail network.

The workstation environment. Workstation efforts have been focused almost exclusively on the creation of general tools for exploiting a bit-map display and mouse. The building block for all applications is the window manager, which virtualizes the display screen, dividing it into a number of rectangular areas whose size and shape are under the control of the user. An extensive subroutine library provides the means to structure multifont, editable text inside a window.

The Computation Center. The CMU Computation Center has distributed over 3000 IBM PCs and Apple Macintoshes. At the end of 1984, the ITC began to deploy over 50 workstations to the university community, mostly for educational software development. This required a large set of activities beyond software development and buttressing the existing campus network: routine methods for registering users, user and programmer documentation, a hotline for system problems, and user consulting via electronic mail. Each member of the ITC technical staff served as a liaison to one of the groups using Andrew.

After an agreement with IBM allowing the distribution of the current version of the Andrew workstation software was secured, the software was publicized in UNIX circles and to the ICEC consortium.

The Center for the Design of Educational Computing. The purpose of CDEC, which is supported by a variety of sources, is to facilitate the creation of software to support undergraduate education. A few demonstration programs have been written to assess the potential of the Andrew system for graphical illustration of physical concepts: a game illustrating the properties of acceleration, an orbit-tracing program, an optics workbench, a general mathematical function plotter, an automated aid to organizing layouts of information within windows, and an animation editor.

Implementation of the MicroTutor language, designed by Bruce and Judy Sherwood, is also well under way. This is the fifth version of the Tutor language begun at the University of Illinois in 1967. Over 10,000 hours of course material have been developed under various versions of Tutor, and while it is not likely that CMU will import much of that material, the existence of this proven tool will greatly enhance the development and importation of new software.

Finally, a large team from CDEC and the nascent Philosophy Department has embarked on a major effort in logic-teaching programs. The suite of logic software will be developed in both PC and Andrew versions.

Campus communications. Extensive time has been spent consulting with the campus community on all aspects of educational computing, from basic strategy to coping with programming bugs. Over three dozen campus educational software projects were funded, most targeted for Andrew. Software developers using Andrew meet weekly with various members of the ITC and CDEC staffs; there is a newsletter, a seminar series, and a catalog of educational software developed at CMU. The three dozen entries are limited to projects that have produced running software and to brief one- or two-page descriptions.

The CDEC Software Library has a full-time librarian, who is also on the FIPSE Technology Study Group, a conduit to educational computing and software database projects nationally. A database host will be chosen for the information being gathered, and collaborative opportunities with Michigan's Library Science School for research into classification and retrieval schemes for software are being explored.

Software publication. CMU has explored the possibilities for publication of educational software and will concentrate on a few high-profile, high-quality packages that run on IBM PCs and Apple Macintoshes—logic and calculus software. There is also interest in packages developed for music, operations research, civil engineering, psychology, chemistry, and physics.

Edward Balkovich is a consulting engineer at Digital Equipment Corporation. As a staff member of the External Research Program, he is responsible for DEC's activities in Project Athena. His technical interests include distributed systems and operating systems.

Balkovich earned a BA in mathematics from the University of California, Berkeley, in 1968, and an MS and PhD in electrical engineering and computer science from the University of California, Santa Barbara, in 1971 and 1976, respectively. He is a member of both ACM and IEEE.

Steven Lerman is currently the director of Project Athena at the Massachusetts Institute of Technology, where he has been on the faculty since 1975. A professor of civil engineering, he has taught courses on computer algorithms, with specific applications to engineering and the sciences. He is the author of numerous technical articles and a coauthor of a forthcoming book on the statistical analysis of transportation demand.

Lerman received his bachelor's, master's, and PhD degrees from MIT in 1972, 1973, and 1975, respectively. His undergraduate degree is in civil engineering, and both graduate degrees are in the area of transportation systems.

Richard Parmelee is project manager for IBM's participation in Project Athena at MIT. Prior to that he was a staff member at the IBM Cambridge Scientific Center, where, among other things, he worked on virtual machine systems. His research interests also include file systems. Before joining IBM, he worked at MIT as a research associate. Parmelee holds a BS from the University of Illinois (1959) and an MS (1961) and PhD (1966) from MIT in mechanical engineering.

Questions about this article can be directed to the authors at Project Athena, Building E40, Massachusetts Institute of Technology, Cambridge, MA 02139.
