
FRONT PAGE

European Research Consortium for Informatics and Mathematics
www.ercim.org

CONTENTS

Joint ERCIM Actions 2

Special Theme: Environmental Modelling 8

Research and Development 42

Events 48

In Brief 51

Number 34 July 1998

SPECIAL: Environmental Modelling 8

Dennis Tsichritzis, Chairman of the Board at GMD, the German National Research Center for Information Technology, and President of ERCIM: New Times, New Challenges.

Next Issue:

Advanced Databases and Metadata

This August I am stepping down as ERCIM President after two terms. ERCIM has grown, with more members. The relationships between the Institutions are deeper and we see many new projects. We avoided creating a heavy central administration; our office in Paris is lean and effective. Finally, the membership fees have dropped twice and we are running a surplus. In short, we have achieved a European dream.

We live, however, in interesting times. We have achieved much but there are new challenges facing all of us.

The integration challenge: Basic and Applied Research cannot run separately in our area. Every idea should have a chance to be relevant, and the applications push our concepts to the limit. Interleaving Basic and Applied Research, structurally and dynamically, is a challenge.

The impact challenge: Our area is booming. There is so much activity and so many diverse interests that it is difficult to make our mark. We represent a very small portion of total R&D, most of which runs in the private sector. We need a great effort to be visible and credible.

The innovation challenge: There is a demand pull from the marketplace for technological innovation. We need to position ourselves in the innovation chain, which ultimately delivers companies, jobs and wealth for our society.

The qualification challenge: Our area expands so fast that it lacks qualified people. We need to provide lifelong-learning solutions, both for people being drawn in from other areas and for our own people, who need constant retooling.

The acceleration challenge: Everything regarding our area is moving extremely fast. All our processes, from starting projects, repositioning and technology transfer to phasing out activities, need to be constantly accelerated.

The versatility challenge: Our area is becoming a key component in many new activities combining disciplines, eg Bioinformatics, Bioinformation, etc. We need to be versatile to work together with other disciplines and to defend our share in the new developments.

All these new challenges should not derail us from our core business, which is to provide quality research in Information Technology. If you think that all this is hardly possible, then think of what was achieved in the last thirty years. At the time it was not only considered impossible; nobody could even dream of it.


Dennis Tsichritzis


JOINT ERCIM ACTIONS


ERCIM Executive Committee 1995 – 1998
by Christos Nikolaou and Constantine Stephanidis

From September 1995 to August 1998, Christos Nikolaou chaired the ERCIM Executive Committee, succeeding Bob Hopgood, the first chairman of the committee. During this period, Constantine Stephanidis served as the ERCIM Executive Committee Secretary.

ERCIM is now a well-known actor in the European and international Information Technology scene. In the past three years, five new European institutes have joined ERCIM, expanding its geographical coverage and its scientific scope. But there was also the withdrawal of AEDIMA, the Spanish consortium in ERCIM. It is very much hoped and widely expected that a Spanish representation will soon re-join ERCIM.

ERCIM has established co-operation activities at an international level, has acted as a consultant to the European Commission, to international organisations and to national/local governments, and has contributed to the strengthening of European R&D in the Information Technology field. This role should be maintained and encouraged.

The ERCIM Executive Committee has initiated a number of actions to improve the organisational structure and strengthen the co-operation between the member institutes. Along this line, specific responsibilities have been assigned to members of the Executive Committee, resulting in a more co-ordinated and team-oriented operation.

The ERCIM Working Groups and the ERCIM Office have emerged as engines of ERCIM growth through competitive bidding and successful funding of ERCIM proposals. There are now fourteen working groups (versus six in 1995) that cover a wide scientific area, organising conferences and workshops, establishing standard co-operation procedures and dissemination activities, and giving birth to collaborative projects between the participating institutes. Both the ERCIM Office and the Working Groups should maintain and strengthen their momentum. In particular, the ERCIM Working Groups should be strongly encouraged to seek ERCIM co-ordinated projects more aggressively.

The ERCIM overall budget has increased, rising from 1.5 MECU in 1995 to 3.3 MECU in 1998. The projection for this calendar year is that ERCIM will attract over 2 MECU in funding from the European Commission, versus 600 KECU in 1995, and show a carry-forward of 750 KECU in 1998.

ERCIM’s visibility has grown significantly through the exploitation of the Web, the improvement of ERCIM publications, and the involvement of ERCIM in international bodies (such as the W3C and VRML Consortia) and as a consultant in EC-sponsored or co-ordinated technical and policy committees and working groups.

An ERCIM Internet domain has been set up (ercim.org), providing the electronic door to the activities and operation of ERCIM, while restricted web sites have been established to support the internal activities and co-operation between the ERCIM members. The ERCIM Web server has been re-structured and now offers a wide range of information concerning working groups, the digital library initiative, workshop sponsorship, the fellowship programme, the Cor Baayen Award, ERCIM co-ordinated projects, ERCIM events, and ERCIM publications (ERCIM News, technical reports, workshop proceedings, strategic reports, ERCIM members’ libraries). ERCIM News (both in paper and electronic form) has proved to be a very successful tool for the dissemination of information about ERCIM research activities, views and policies.

The commitment to the provision of an ERCIM Virtual Laboratory was pursued, but not fulfilled. Lack of funding and inadequate European research networking infrastructure have frustrated ERCIM efforts to become a Pan-European Virtual Laboratory, where new technologies, such as the electronic marketplace, digital libraries, distance learning, ‘collaboratories’, and virtual working spaces can be tested, piloted and seeded. In the next three-year period, ERCIM should become a more vocal and ardent advocate of the need for a decisive upgrade of the European networking infrastructure.

The significant progress made during the last three years was only possible because of the team spirit that prevailed in the Executive Committee and the constant encouragement and support given to us by the ERCIM Board of Directors and especially by ERCIM’s President, Dennis Tsichritzis. We would like to acknowledge the active support of the Editorial Board through ERCIM News, and the ERCIM European Liaison Officers Group for being a vital link of ERCIM to Brussels. We would also like to thank the ERCIM Office for their continuous efforts and commitment in the face of a growing workload. Finally, we wish our two colleagues, Eckart Bierdümpel and Gilbert Kalb, who take over the chairmanship and secretariat of the Executive Committee respectively, every success in their new roles and exciting mission.

Christos Nikolaou

Constantine Stephanidis


ERCIM Meetings at GMD
by Siegfried Münch and Eckart Bierdümpel

Slovakia is the fifteenth member of ERCIM. On 29 May 1998, the Slovak Research Consortium for Informatics and Mathematics (SRCIM) joined the European consortium of national informatics research institutes. The Director of the Department of Computer Science of Comenius University, Bratislava, Prof. Dr. Branislav Rovan, and the Directors of the ERCIM member institutes signed the membership agreement during this year’s spring meeting of ERCIM, held from 25-29 May 1998 at GMD’s headquarters in Sankt Augustin. About 100 computer scientists participated in this meeting.

Three workshops held on 26 May discussed the following subjects, which are important to the development of common ERCIM research focuses:

• health information technology

• fluid mechanics

• database technology.

The Executive Committee met on 27 May. They prepared the 10th anniversary of ERCIM and discussed the recruitment of additional personnel for the ERCIM Central Office, the reports on working group activities, the fellowship programme and the acceptance of new members. Another subject of discussion was the election of a new chairman, since the chairmanship of Christos Nikolaou, FORTH, will end by August 1998. Eckart Bierdümpel from GMD was suggested as candidate for the chairmanship and Seppo Linnainmaa from VTT as candidate for the vice-chairmanship.

On 28 May, GMD presented some of its activities. Martin Reiser, head of the Institute for Media Communication, gave an overview of current projects. The institute combines the key technologies of the multimedia and telecommunication areas to develop innovative applications for the fast-growing market of electronic media and the Internet.

Projects of the System Design Technology Institute were presented by Thomas Christaller, eg in the fields of robotics and artificial intelligence. Thomas Lengauer reported on the activities of the Institute for Algorithms and Scientific Computing to develop methods and tools for the computer simulation of complex applications in natural science. His presentation focused on activities in the field of molecular bioinformatics.

A tour of the institutes enabled the visitors to gain insight into four activities: Cyberstage and the Virtual Studio, GMD’s Net Lab, the support of cardiologic diagnosis by enabling systems and 3D ultrasound, and new results of the development of shared workspaces in the World Wide Web as basic support for cooperative work.

The ERCIM meetings ended with a meeting of the Directors on 29 May. The subjects of discussion were the 1997 management report of the ERCIM office and the consolidated budget. The outgoing chairman of the Executive Committee, Christos Nikolaou, presented the report on the past six months. In addition, the Committee discussed legal problems and possible activities to be planned for the 10th anniversary of ERCIM in 1999.

As successor of Dennis Tsichritzis from GMD, who was ERCIM’s President for four years, Gerard van Oortmerssen, CWI, was elected ERCIM President. Stelios Orphanoudakis from FORTH and Pekka Silvennoinen from VTT were elected Vice-Presidents. The Executive Committee’s suggestions for the election of the chairman and vice-chairman were accepted.

The next ERCIM meetings will be held in the first week of November in Amsterdam.

■Please contact:

Siegfried Münch – GMD
Tel: +49 2241 14 2303
E-mail: [email protected]

Eckart Bierdümpel – GMD
Tel: +49 2241 14 2256
E-mail: [email protected]

Participants of the meeting of ERCIM’s Board of Directors. Left to right: Pekka Silvennoinen, VTT; Gordon Walker, CLRC; Eldfrid Ovstedal, SINTEF Telecom and Informatics; László Monostori, SZTAKI; Rainer Berling, SICS; Gerard van Oortmerssen, CWI; Jiří Wiedermann, CRCIM; Carl August Zehnder, SGFI; Branislav Rovan, SRCIM; Bernard Larrouturou, INRIA; Dennis Tsichritzis, GMD; Stelios Orphanoudakis, FORTH.


New ERCIM Working Group Fluid Mechanics established
by Ullrich Becker-Lemgau

The kick-off meeting of the new ERCIM working group on Fluid Mechanics was held at GMD, Sankt Augustin, on 26 May 1998. It was hosted by the Institute for Algorithms and Scientific Computing of GMD.

So far, four ERCIM institutes participate in the new working group:

• CWI, Computational Fluid Dynamics Group

• FORTH, Institute of Applied and Computational Mathematics

• INRIA, research team M3N – Multi-Models and Numerical Methods

• GMD, Institute for Algorithms and Scientific Computing.

Two main research areas in the context of Computational Fluid Dynamics were chosen to form the focus of the working group:

• multidisciplinary applications (coupling)

• shallow water equations (numerics, solvers, applications).

Multi-disciplinary applications in particular were seen as having much potential for the near future. In nearly every area of application covered by the participating institutes, the need for tools and numerical algorithms to solve coupled problems is increasing. The main focus of the working group will be the definition of a joint project and the organisation of a workshop as a result of this project. Research groups of other ERCIM institutes that are interested in joining the working group are requested to contact the chairman. For more information, see: http://www.gmd.de/SCAI/scicomp/ERCIM/fluid-dynamics

■Please contact:

Ullrich Becker-Lemgau – GMD, Working Group chairman
Tel: +49 2241 14 2369
E-mail: [email protected]

Third ERCIM Environmental Modelling Group Workshop
by Thomas Lux

The third workshop of the ERCIM Environmental Modelling Group on Air Pollution Modelling and Measuring Processes was held in Madrid, Spain, on 20-21 April 1998. The workshop was hosted by the Environmental Software and Modelling Group of the Computer Science School of the Universidad Politécnica de Madrid (UPM). The workshop chairman was Roberto San José (UPM).

The lectures and discussions at the workshop focussed especially on the importance of the linkage between the modelling processes and the measuring tools. Measurements play two significant roles in air pollution modelling. First, the use of measured meteorological, emission and background concentration data as input for air quality simulations obviously influences the quality of the simulation results, especially for air quality forecasts. Second, measurement data are one of the basic preconditions for the validation of air quality models. As highlights of the programme, invited lectures were given by representatives of the local environmental authorities of the State of Madrid and the State of Brandenburg (Germany). The participation of these invited speakers was supported by the ERCIM Office.

A special item of the workshop programme was a round table discussion about the further activities of the ERCIM Working Group Environmental Modelling, led by the working group chairman Achim Sydow. Nikolaos Kampanis from FORTH, Institute of Applied and Computational Mathematics (IACM), agreed to organize the next workshop at his institute in Heraklion, Crete, Greece. The proposed date of the 4th workshop is between mid-October and mid-November this year. Proposed topics, as discussed in Madrid, are the validation of environmental models, model comparison, data mining, ecological/economic impact analysis and problems of sustainable development.

Moreover, the submission of future project proposals was raised, and the preparation of ERCIM News issue number 34 with the special theme Environmental Modelling was discussed. The programme was wound up by a tour of the Computer Science School at UPM, and by a common dinner in a typical Madrid restaurant. Detailed information about the workshop programme and the participants can be found at http://artico.lma.fi.upm.es/ercim.html

■Please contact:

Thomas Lux – GMD
Tel: +49 30 6392 1820
E-mail: [email protected]

Participants of the ERCIM workshop on Air Pollution Modelling and Measuring Processes.


First ERCIM Health and Information Technology Workshop
by Vesa Pakarinen

The first workshop of the ERCIM Working Group Health and Information Technology (HIT) was held on 26 May during the ERCIM week at GMD in Sankt Augustin, Germany. The participants came from six different ERCIM member organisations: CLRC, FORTH, GMD, INRIA, SGFI and VTT. In addition, INESC had sent a written contribution.

The invited speaker, Peter Fatelnig from the European Commission (Directorate-General XIII: Telecommunications, Information Market and Exploitation of Research; Telematics Applications for Health), gave information about the background and status of the activities on the preparation of the 5th Framework Programme. The introductions and discussions at the workshop also took note of the Report of the Strategic Requirements Board (Health Telematics), which is available at http://www.ehto.be/ht_projects/report_board/index.html

The purpose of the workshop was to exchange ideas for possible co-operation on future project proposals among ERCIM members. It was also decided that ERCIM members may contact each other directly about areas of mutual interest. However, the chairman should also be informed, in order to avoid overlapping proposals. All the introductions and the programme itself are linked through the ERCIM pages and can be seen at http://www.vtt.fi/tte/tre/projects/ercim/workshop.htm

In the planned Thematic Programme Creating a User-Friendly Information Society (TP2), one of the key actions is Systems and Services for the Citizen. The aim of this key action is to provide users with easier access at the lowest cost to quality general-interest services and to boost the industry providing these services. Concerning the priorities of health and ageing, disability and other special needs, the objectives are:

• new generation computerised clinical systems

• advanced telemedicine services

• health network applications to support health professionals, continuity of care and health service management

• intelligent systems and services allowing citizens to assume greater participation and responsibility for their own health.

RTD priorities for professional health care include:

• systems enhancing the ability of health care professionals for prevention, diagnosis, care and rehabilitation, eg intelligent systems for non-invasive diagnosis and therapy, intelligent medical assistants, advanced medical imaging, advanced telemedicine applications, ‘virtual hospitals’ offering single point-of-entry services, high-speed secure networks and applications for linking hospitals, laboratories, pharmacies, primary care and social centres for continuity of care, health service workflow management and re-engineering, new generation electronic health records and cards for sophisticated health data objects

• systems for personal health monitoring and fixed or portable prevention systems, including advanced sensors, transducers and micro-systems

• personal medical advisors for supervision of prevention and treatment

• tele-systems and applications for supporting care in all contexts

• user-friendly and certified information systems for supporting health education and health awareness of citizens.

Information is available at www.ehto.be and at www.cordis.lu/fifth/src/docs.htm

The chairman wishes to invite more ERCIM colleagues to this work; please spread the word and kindly ask people to subscribe by email.

■Please contact:

Vesa Pakarinen – VTT
Tel: +358 3 316 3358
E-mail: [email protected]

Third ERCIM FMICS International Workshop
by Diego Latella

The third ERCIM International Workshop on Formal Methods for Industrial Critical Systems (FMICS) was held at CWI, Amsterdam, on 25-26 May 1998.

The success of the meeting was evidenced by the lively discussions and exchanges of opinion between the approximately 40 participants from both academia and industry. Four invited speakers gave the following talks:

• Wan J. Fokkink – Univ. of Wales Swansea, United Kingdom: ‘Verification of Interlockings: from Control Tables to Ladder Logic Diagrams’

• Gea Kolk – Holland Railconsult, The Netherlands: ‘Formal Methods: Possibilities and Difficulties in a Railway Environment from a User Perspective’

• Hubert Garavel – INRIA Rhône-Alpes, France: ‘Towards a Second Generation of FDT – Rationale for the Design of E-LOTOS’

• Rance Cleaveland – North Carolina State University, USA: ‘Verifying Active Structural Control Systems: a Case Study in Formal Analysis’.

14 additional presentations were given on various aspects of formal methods and their application to case studies.


During the meeting a report was delivered on the status of the Working Group:

• a special issue of the Kluwer AP journal Formal Methods in System Design, dedicated to the 1st FMICS workshop (Oxford, March 1996), has been published (Vol. 12, No. 2)

• a special issue of the Springer journal Formal Aspects of Computing, dedicated to the second FMICS workshop (Cesena, July 1997), is now in preparation

• a special issue of the Elsevier journal Science of Computer Programming on the application of formal methods in industry, which contains contributions from Working Group members, is also forthcoming

• an ERCIM fellowship in the field has been granted at INRIA and CWI.

The Proceedings of FMICS ’98 were published by CWI (ISBN 90 6196 480 6), while a selection of the papers presented during the workshop will be published in a special issue of Formal Aspects of Computing.

GMD-FOKUS and CNR have both offered to organize FMICS’99. A decision will be taken soon.

For more information on the FMICS Working Group of ERCIM, see: http://fdt.cnuce.cnr.it/~latella/FMICS/WgDescription.html

■Please contact:

Diego Latella – CNUCE-CNR, FMICS-WG Chair
E-mail: [email protected]

Jan Friso Groote – CWI, FMICS ’98 PC Chair
E-mail: [email protected]

ERCIM Working Group on Programming Language Technologies
by Neil Jones

The inaugural meeting of this brand-new ERCIM Working Group will be held in Pisa, Italy, on Saturday, 19 September, at the same place as, and just after, the cluster of conferences SAS, PLILP and ALP. All interested ERCIM members are encouraged to attend. If you plan to bring a guest, please contact the workshop organiser (see address below).

Topics to be discussed at the first meeting will include projects, proposals, summer schools, internal mobility and fellows, and future workshops.

We have high expectations of strong interactions, partly because we already know many professional colleagues within the ERCIM partners from other contexts. Further, we look forward to new and fruitful contacts in our area, which we feel will become a locus of significant activity within ERCIM.

Possible topics include, but are not limited to:

• Compiling and implementation: program analysis, abstract interpretation, program control flow and data flow analysis.

• Model-specific issues concerning: functional, logic, object-oriented, parallel, and distributed programming.

• Program manipulation techniques: program transformation, partial evaluation, program synthesis.

• Programs as data objects: metaprogramming, incremental computation, correctness of compilers, interpreters, instrumenters, etc.

• Tools, techniques, and representations: program and code representations, proof-carrying code, tools for program manipulation including analysis, prototyping and debugging.

A preliminary WWW page can be found at http://www.diku.dk/users/neil/PL/

■Please contact:

Neil Jones – DANIT/DIKU
Tel: +45 35 32 14 10
E-mail: [email protected]

Eleventh ERCIM Database Research Group Workshop: Metadata for Web Databases
by Brian J Read

The latest in the series of ERCIM Database Research Group workshops was held at GMD Birlinghoven Castle near Bonn on 26 May 1998. The topic ‘Metadata for Web Databases’ attracted significant interest among members of the working group, resulting in a very full day of presentations and discussion. Karl Aberer of GMD-IPSI organised and chaired the workshop. There were sixteen participants from ten institutes.

In the context of Web data management, database systems are mostly used in an isolated way as data sinks or sources. Data management services that exploit and support the connectivity of the Web require the interaction and co-operation of different data management components on the Web. To enable this, the Web needs to be equipped with the metadata on structure and behaviour of Web data that these components require. Thus the workshop was intended to address such questions as the extraction, modelling and querying of metadata, so adding semantics to the use of web data.


Keith Jeffery (CLRC-RAL) introduced the workshop topic by presenting an overview of the nature of metadata in databases, distinguishing its various purposes, and classifying it into three main kinds: schema, navigational and associative. Capturing metadata from the web presents problems, as virtual pages generated from database queries are invisible to the large web crawlers. The limitations of HTML, and indeed XML, in managing metadata were discussed in this and several subsequent talks.
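To make the distinction between the three kinds concrete, here is a hypothetical illustration for a single web resource; the resource, field names and values below are invented for illustration and are not taken from the talk.

    # Illustration of the three kinds of metadata distinguished in the talk,
    # for one hypothetical web resource (all names and values are invented).
    schema_metadata = {            # structure of the underlying data
        "fields": {"station": "string", "date": "date", "no2_ugm3": "float"},
    }
    navigational_metadata = {      # where the resource lives and how it links
        "url": "http://example.org/air-quality/1998/madrid",
        "links": ["http://example.org/air-quality/1998"],
    }
    associative_metadata = {       # what it is about, for discovery and quality
        "subject": ["air pollution", "Madrid"],
        "creator": "Example Environment Agency",
        "last_verified": "1998-05-26",
    }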

Yannis Stavrakas (NTU-Athens) expanded on the nature of metadata for web-based information systems. He distinguished three perspectives corresponding to the atomic level (information within a page or document), the local level (the structure of a site and links between documents), and the global information space of the whole web.

Terje Brasethvik (IDI/NTNU-Trondheim), currently in Paris, described his work with Arne Sølvberg on a Referent Model of Documents classified by semantic metadata. In this approach to sharing information on the web, they are developing a modelling language and editor to capture the meaning of documents.

Giuseppe Sindoni (Rome III University), currently visiting RAL, presented work from Paolo Atzeni’s Rome group on a logical model for metadata in web bases. Their Araneus Data Model with the Penelope language embeds the schema within HTML. Turning to XML is potentially attractive, but that too has limitations for data modelling.

Three research projects were covered in the afternoon session. Menzo Windhouwer (CWI) described the work with Martin Kersten on the Acoi project. This is developing a feature detector engine to classify multimedia objects, especially images. The Acoi web robot has already stored in a database details extracted from over two hundred thousand images.

Thomas Klement (GMD-IPSI) spoke about the ICE (Information Catalogue Environment) project. This concerns metadata for multidimensional categorisation and navigation support on multimedia documents. It includes an interesting use of dynamic menus to explore hypercube structures stored in an object-relational database.

The last presentation was from Donatella Castelli (CNR-Pisa) about supporting retrieval by “relation among documents” in the ERCIM Technical Reference Digital Library (ETRDL), based on the Dienst system and the Dublin Core. This provided an interesting discussion on the possible semantics of a relationship defined between documents.

The workshop concluded with a lively panel and discussion session on the future research direction of EDRG and also its role in the EC Fifth Framework Programme. A relevant component of the latter is “Creating a User Friendly Information Society”, especially Key Actions relating to application domains.

This suggested that future workshops might be targeted towards an application area (such as transport, environment or health) instead of a technical topic. CWI emphasised semantic indexing of the web, in particular by involving the end user, in an ambitious research agenda and cautioned against being too much influenced by funding considerations.

The workshop papers will be published in the ERCIM reports series (http://www.ercim.org/publication/workshop_reports.html).

■Please contact:

Brian J Read – CLRC
Tel: +44 1 235 44 6492
E-mail: [email protected]

If you wish to subscribe to ERCIM News, or if you know of a colleague who would like to receive regular copies of ERCIM News, please fill in this form and we will add you/him/her to the mailing list.

Name:

Organisation:

Address:

E-mail:

send this form to: ERCIM OFFICE, Domaine de Voluceau, Rocquencourt

BP 105, F-78153 Le Chesnay Cedex

(Data from this form will be held on a computer database. By giving your email address, you allow ERCIM to send you email. For subscribers from the UK: The Council for the Central Laboratory of the Research Councils is registered under the Data Protection Act and is authorised to hold details from this form on a computer database for any use which complies with the requirements of the Act.)

You can also subscribe to ERCIM News and order back copies by filling out the form at the ERCIM web site at http://www.ercim.org/publication/Ercim_News/


ENVIRONMENTAL MODELLING

Environmental Modelling and Simulation Research
by Achim Sydow

Information technology has played an increasing role in the planning and controlling of environmental issues at different scales and within various time spans. It has been a long journey from the establishment of ecology as a science dealing with the relations of organisms to their environment (Haeckel, 1866) and the definition of the concept of ecosystems (Tansley, 1935), to the development of our field of computational ecology. Rapid developments in information technology made the establishment of computational ecology possible and continue to accompany our research endeavours.

Environmental systems, consisting of geophysical and geochemical elements, abiotic factor complexes (atmosphere, hydrosphere, pedosphere) and biotic elements (growth processes, population dynamics), are genuinely complex systems. Information technology has succeeded in developing adequate tools for modelling, simulation, planning and decision support for environmental protection. As a result, education aimed toward transmitting an understanding of environmental systems is nowadays unthinkable without the use of computer techniques.

Considerable progress has been made in a number of different research areas:

• in theoretical areas, the use of High Performance Computing simulation has brought spectacular results in systems dynamics (evolution strategies, logistic growth, chaos research)

• in climate research, the long-term analysis of global change

• in economics and ecology, considerations of sustainable development

• in mathematics, the development of powerful algorithms for integration and decomposition methods for parallelization.

CONTENTS

• Environmental Modelling and Simulation Research by Achim Sydow 8
• Data Management in Climate Research by Kerstin Kleese 9
• Performance Prediction Studies and Adaptive Refinements for Local Area Weather Models by Kläre Cassirer, Reinhold Hess, Wolfgang Joppich and Hermann Mierendorff 10
• A Data Management and Visualization System for Coastal Zone Management of the Mediterranean by Catherine Houstis 11
• A High Resolution Integrated Forecast System for the Mediterranean Sea by Andrea Bargagli, Roberto Iacono, Sergio Nicastro, Paolo Michele Ruti and Franco Valentinotti 12
• IDOPT: Identification and Optimization for the Geosciences by Jacques Blum and François Le Dimet 14
• Environmental Modelling at IACM-FORTH: Aiming at the Future by Nikolaos Kampanis 15
• Implementing a High Resolution Numerical Model of the Strait of Gibraltar by Gian Maria Sannino, Roberto Conversano and Vincenzo Artale 16
• Water Quality Modelling in the Channel Network of Venice by Lucia Zampato, Georg Umgiesser, Martina Bocci and Giovanni Coffaro 17
• Numerical Algorithms for Air and Surface Water Modelling by Jan Verwer 19
• Modelling of the Environmental Impact of Main Airborne Pollutants Producers by František Hrdlička and Pavel Slavík 20
• Dynamic Models for Smog Analysis by Achim Sydow, Thomas Lux, Peter Mieth, Matthias Schmidt and Steffen Unger 21
• Development of an Integrated Environmental Monitoring and Simulation System by Peter Mieth, Matthias L. Jugel, Steffen Unger and Bertram Walter 22
• Air Quality Modelling for the Madrid Region by Roberto San José 23
• Computing the Radiative Balance in Mountainous Areas: Modelling, Simulation and Visualization by Jean-Luc Borel, Yves Durand, François Sillion and Christophe Le Gal 24
• Environmental Risk Management by Steffen Unger, Irene Gerharz, Matthias L. Jugel, Peter Mieth and Susanne Wottrich 26
• Integrated Traffic and Air Quality Simulation by Matthias Schmidt, Birgit Kwella, Heiko Lehmann, Christian Motschke and Ralf-Peter Schäfer 27
• Land Use and Wind Estimation as Inputs for Air Pollution Modelling by Dominique Bereziat, Jean-Paul Berroir, Sonia Bouzidi, Isabelle Herlin and Hussein Yahia 28
• Data Mining applied to Air Pollution by Brian J Read 29
• Operational Air Pollution Modelling by Tiziano Tirabassi 30
• Sustainable Development and Integrated Assessment by Achim Sydow, Helge Rosé and Waltraud Rufeger 32
• The Modelling and Control of Living Resources by Jean-Luc Gouzé 33
• ECHO and NARSYS – An Integrated Interactive Acoustic Modeler and Sound Renderer for Virtual Environments by Nicolas Tsingos and Jean-Dominique Gascuel 34
• STOY – An Information System for Environmental Noise-Planning by Ole Martin Winnem and Hans Inge Myrhaug 35
• Partial Differential Equations in Porous Media Research by Hans van Duijn 36
• Descartes System – Interactive Intelligent Cartography in Internet by Gennady Andrienko and Natalia Andrienko 37
• Travelling Waves in Nonlinear Diffusion-Convection-Reaction Processes by Róbert Kersner 38
• ESIMEAU – Water Resources Management and Modelling in Semi-Arid Areas by Fadi El Dabaghi 39
• ERCIM Working Group Environmental Modelling by Achim Sydow 41


Interdisciplinary co-operation has benefited from computer networking. Parallel computation greatly assists in the efficient analysis of large-scale and complex environmental systems. Weather and ozone forecasts are based on parallel computation. Visualization techniques allow comprehensive overviews and thus assist decision support. Simulation gives numerical insight into the behaviour of complex environmental systems. Intelligent information technologies (neural nets, evolution strategies, expert systems) support modelling when relevant background structures from the natural sciences are unavailable. These information technologies can also assist data mining. Algorithms for optimization and poly-optimization provide a helpful aid for dealing with conflict situations which can evolve between the areas of ecology, economics and the needs of society.

The articles in this issue of ERCIM News reflect the wide range of research and applications conducted within the ERCIM community in the field of environmental modelling and simulation:

• analysis of regional systems
• climate research
• air pollution modelling
• water pollution modelling
• modelling of natural resources
• modelling of traffic systems
• methods for modelling and optimization
• mathematical methods, partial differential equations
• simulation tools
• environmental risk management
• data management
• data mining
• visualization
• intelligent cartography.

The ERCIM working group Environmental Modelling provides a platform for the discussion of results in this area and promotes further research. In addition to national research programmes, the European Commission is actively supporting this research.

■Please contact:

Achim Sydow – GMD, Working Group chairman
Tel: +49 30 6392 1813
E-mail: [email protected]

Data Management in Climate Research
by Kerstin Kleese

Parallel environmental models are currently some of the most demanding codes we have. They push the machines available to their limit and are still in need of more resources. However, the bottleneck in running these codes is not so much the code performance as the data handling strategies employed. The High Performance Computing Initiative Centre at CLRC, Daresbury Laboratory (UK), is setting up a new project to investigate this important issue. Besides analysing existing solutions, the project will also look for new, more flexible and portable approaches.

The UK environmental research community relies heavily on High Performance Computing (HPC) to facilitate its studies. Fortunately the environmental model codes have a large potential for parallelization, which has been well explored by numerous research groups in the UK (eg UGAMP and OCCAM). Still, increased resolutions, longer runs or larger ensembles have a significant impact on the HPC resources required and can easily overwhelm current systems. Often data handling has the most dramatic influence on model run times, or determines whether a model can be run at all. Thus optimal data handling is vital for this type of application. Unfortunately the problems do not stop with the end of a successful model run. These codes produce vast amounts of data which have to be archived for future analyses. Current data storage systems leave something to be desired in speed and ease-of-use. The situation for data retrieval is even worse. Tedious searching for archived data, long waiting times and no selective extraction possibilities are common problems for modern scientists. Sometimes it is faster to run a model again instead of retrieving data from a previous run.

It is already clear that the data requirements of the community will increase even more over the coming years. Big centres like the European Centre for Medium-Range Weather Forecasts expect the volume of their data archive to double every 18 months. New machine architectures allow potentially larger models to be run. Data handling presents a severe bottleneck for today’s science; without new strategies it might prevent future progress.

First steps

For our project it was decided to take a holistic approach, identifying four main areas of interest:

• data handling within the model codes

• file access during run time (on different platforms)

• data archival

• data retrieval.

Although all these areas have been investigated separately, and some interesting in-house solutions exist, little has been done to offer a portable solution that can be easily adapted to the actual requirements of different sites.

The first step is to analyse the current situation. Research results of leading scientific groups concerning data handling strategies within model codes have been examined. A lot of work has been done in this area over the past years, and we can certainly benefit from that. We would like to compare the different approaches, trying to find similarities, differences and tendencies that could be useful for the community. Secondly, a list is in preparation covering the different file access mechanisms during run time on various systems. This gives a clear overview of what is available, how fast it is, and what the user can do to make the most of it. This information will serve as a base for further investigations. In connection with vendors and other sites we have started to gather more information about the data archival and retrieval systems that are in use today. The clear message so far is that there is a desperate need for more intelligent solutions.

Future

Our project will continue to analyse existing solutions. It will test which data handling strategies within model codes are best for which type of application. We will frequently investigate new machines or relevant changes to existing system architectures. A collaboration with a leading systems house has just started, to determine which off-the-shelf products could be used to provide more flexible data archival and retrieval mechanisms for scientific data.

■Please contact:

Kerstin Kleese – CLRC
Tel: +44 1 925 60 3207
E-mail: [email protected]

Performance Prediction Studies and Adaptive Refinements for Local Area Weather Models
by Kläre Cassirer, Reinhold Hess, Wolfgang Joppich and Hermann Mierendorff

The new generation of numerical weather prediction models is designed for parallel machines. Since the development of such a code can take several years, the increase in computer power during this time frame has to be considered in the design. But there is not only an increase in computational power; research on new and faster algorithms also takes place, in order to predict the weather even more accurately and reliably in the future. Current activities within the theme cluster METEO of the GMD Institute for Algorithms and Scientific Computing concentrate on performance prediction studies for meteorological models and algorithmic research on local adaptive refinements.

Numerical weather forecasting and climate prediction require enormous computing power to achieve reliable results. Six of the twenty-six most powerful computer systems in the world are dedicated to weather research (November 1997). All of them are parallel architectures, with the number of nodes ranging between 64 and 840. In order to exploit this high computing power, existing codes have to be adapted to parallel computers, and new parallel algorithms are developed.

After the year 2001 the operational Local Model (LM) of the Deutscher Wetterdienst (DWD) will run at a horizontal resolution of approximately 3 km mesh size, with 50 vertical layers and with time steps of only 10 seconds. Since a one-day prediction on about 800x800x50 grid points has to run within half an hour of real time in the operational mode, enormous computing power is demanded.
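A rough back-of-the-envelope check shows why. The grid size, time step and wall-clock budget below are taken from the article; the operation count per grid point and step is an assumed, illustrative figure, not the study's actual cost model:

    # Rough estimate of the sustained compute rate needed for the operational LM.
    # Grid size, time step and wall-clock budget are from the article;
    # OPS_PER_POINT_STEP is an assumed, illustrative value for a full NWP model.
    grid_points = 800 * 800 * 50          # about 32 million points
    time_steps = 24 * 3600 // 10          # one forecast day at 10 s steps = 8640
    wall_clock = 30 * 60                  # half an hour of real time, in seconds
    OPS_PER_POINT_STEP = 5e4              # assumed floating point ops per point and step

    sustained_flops = grid_points * time_steps * OPS_PER_POINT_STEP / wall_clock
    print(f"required sustained rate: {sustained_flops / 1e12:.1f} TFlop/s")
    # With this assumption the result is on the order of 10 TFlop/s, consistent
    # with the 1024-node, 8-10 GFlops-per-node machine quoted later in the article.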

Currently no available machine in the world can fulfill these computational requirements, and direct run-time measurements for the LM at the operational resolution are not possible. Therefore, a study was initiated to predict the performance of the LM and to define specifications of adequate parallel systems.

The computational complexity was modelled with series of LM runs on an IBM SP2 with up to 32 nodes, using different spatial and temporal resolutions and different numbers of processors. In a least-squares approximation, the coefficients of the approximation functions were determined. With calibration runs of the LM at lower resolution, the model could be scaled to various existing machines. These run-time measurements are presented in Figure 1. Note that different parts of the LM (dynamics and physics) scale in different ways; a model for each individual part was therefore set up.

Figure 1: Time measurements for calibration runs.
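The article does not reproduce the approximation functions used in the study; the following minimal sketch only illustrates the idea of fitting run-time coefficients by least squares from a handful of invented calibration measurements:

    # Minimal sketch of calibrating a run-time model from measured runs
    # (the ansatz and the measurements below are illustrative; the study's
    # actual approximation functions are not reproduced in the article).
    import numpy as np

    # Hypothetical measurements: (grid points, processors, measured seconds per step)
    runs = [
        (100_000, 4, 1.9), (100_000, 8, 1.0), (200_000, 8, 2.1),
        (200_000, 16, 1.1), (400_000, 16, 2.2), (400_000, 32, 1.2),
    ]

    # Ansatz: t(N, P) = a * N / P + b * sqrt(N / P) + c
    # (computation term, halo-exchange-like term, fixed overhead)
    A = np.array([[n / p, np.sqrt(n / p), 1.0] for n, p, _ in runs])
    t = np.array([sec for _, _, sec in runs])
    coeffs, *_ = np.linalg.lstsq(A, t, rcond=None)
    a, b, c = coeffs

    def predict(n_points, n_procs):
        """Extrapolate seconds per time step to a larger grid or machine."""
        return a * n_points / n_procs + b * np.sqrt(n_points / n_procs) + c

    print(predict(800 * 800 * 50, 1024))   # extrapolation to the operational case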

The communication requirements were determined by a structural analysis of the LM. The communications, ie the number and sizes of the messages to be exchanged, were parameterized by the size of the global grid and its partitioning. For existing machines the communication model is based on communication benchmarks. However, standard benchmarks measuring ping-pong, etc., were not fully sufficient, and a special communication benchmark measuring the particular data exchange of the LM had to be implemented. Hypothetical architectures can be modelled by assumed characteristics for computation and communication, such as sustained flop rate, latency and bandwidth.
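As a hedged illustration of such a parameterization, the sketch below estimates per-step halo-exchange traffic for a simple 2D decomposition of the horizontal grid with a one-cell halo; the real LM exchange pattern is more involved and is not reproduced here:

    # Estimate per-step halo-exchange traffic for a 2D decomposition of the
    # horizontal grid (assumed pattern: one-cell halo, exchange with four
    # neighbours; the real LM exchange is more involved).
    def halo_traffic(nx, ny, nz, px, py, n_fields, bytes_per_value=8, halo=1):
        """Return (messages per process per step, bytes sent per process per step)."""
        local_nx = nx // px
        local_ny = ny // py
        east_west = 2 * halo * local_ny * nz      # values crossing the two x-faces
        north_south = 2 * halo * local_nx * nz    # values crossing the two y-faces
        values = (east_west + north_south) * n_fields
        return 4, values * bytes_per_value

    msgs, nbytes = halo_traffic(nx=800, ny=800, nz=50, px=32, py=32, n_fields=10)
    print(msgs, "messages,", nbytes / 1e6, "MB per process per step")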

As a result of the study, the predicted computational demands are tremendous. Indeed, no currently available machine meets the requirements mentioned above. A hypothetical parallel computer would require at least 1024 nodes with 8-10 GFlops each and a network with about 250 MB/s bandwidth. However, it can be assumed that in the year 2001 machines of this kind will be available on the market.

From these requirements for an operational system it becomes obvious that algorithmic improvement of meteorological models is very important. Since new numerical models also have to run on parallel systems, parallelism becomes a very important factor for modern algorithms, besides numerical efficiency.

A promising idea is to apply dynamically adaptive local refinements to numerical weather simulations. The computational costs could be reduced substantially if high resolution is provided only where it is necessary (eg weather fronts, strong low-pressure areas). Calm regions could be calculated with a coarser mesh. However, since the weather situation changes during the simulation, the refinement areas have to be adapted in time.

For the Shallow Water Equations, which form the dynamical core of numerical weather prediction, a parallel local model with dynamically adaptive local refinements has been developed and implemented. On a structured global grid, refinement areas are composed of adjacent rectangular patches, which are aligned to the global grid with refinement ratio 1:2. With a suitable mathematical criterion, the refinement areas are dynamically adapted to the calculated solution.
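The article does not state which refinement criterion is used; as a sketch of the kind of test involved, the following assumes a simple gradient indicator on the solution field and collects flagged cells into grid-aligned rectangular blocks:

    # Sketch of flagging refinement regions on a structured global grid:
    # cells whose solution gradient exceeds a threshold are collected into
    # rectangular blocks aligned to the grid (to be refined with ratio 1:2).
    # The gradient indicator is an assumed, illustrative criterion.
    import numpy as np

    def refinement_patches(h, threshold, block=4):
        """Return slices of grid-aligned blocks that need refinement."""
        gy, gx = np.gradient(h)
        flagged = np.hypot(gx, gy) > threshold
        patches = []
        ny, nx = h.shape
        for j in range(0, ny, block):
            for i in range(0, nx, block):
                if flagged[j:j + block, i:i + block].any():
                    patches.append((slice(j, j + block), slice(i, i + block)))
        return patches

    h = np.random.rand(64, 64)   # stand-in for a shallow-water height field
    for sy, sx in refinement_patches(h, threshold=0.4):
        pass                     # each patch would be re-solved on a grid refined by 2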

Major problems for parallel computers with distributed memory are the organization of data and the dynamic load distribution. Asynchronous, non-blocking communication is used in order to combine adaptivity and parallelism as well as possible in this approach.

■Please contact:

Reinhold Hess – GMD
Tel: +49 2241 14 2331
E-mail: [email protected]

A Data Management and Visualization System for Coastal Zone Management of the Mediterranean
by Catherine Houstis

The Mediterranean coastal regions have been, and continue to be, under threat from over-exploitation, resulting in environmental degradation, most notably visible as marine pollution. THETIS is a 30-month research project, starting in June 1998, that will establish a data management and data visualisation system for supporting Coastal Zone Management (CZM) for the Mediterranean Sea.

CZM is a methodology for the holistic management of all coastal resources, with the ultimate aim of promoting sustainable development of the coastal zones. CZM recognises that pollution problems transcend political boundaries, and so to be effective CZM requires the integration of multinational data collection, data management and data visualisation across many scientific disciplines, such as marine biology, oceanography, chemistry and engineering.

The THETIS project seeks to address the frequent requirement of scientists, engineers and decision-makers to access, process and subsequently visualise data collected and stored in different formats and held at different locations. The need exists for tools that enable the integration of these data, together with their associated data models, interpretation models, and visualisation environments. The objective of the THETIS project is to build an advanced, integrated, interoperable system for transparent access and visualisation of data repositories, via the Internet and the Web. The system will be a working prototype, which will demonstrate its ability to respond to users, such as scientists and local authorities, that use scientific information for decision making.

The technologies used in this project are Web tools, digital library indexing/search services, over-the-Web visualisation, 2D and 3D visualisation tools, Geographical Information Systems, and the Virtual Reality Modelling Language.

The expected benefits of the THETIS system for the users of the CZM application, the citizens and the European industries are very important. Integrated and easy access to information creates new services, and thus new job markets. Integrated access to data repositories and their interactive use for CZM lead to better coastal zone preservation, with immediate impact for the European citizen.

Moreover, companies will be able to use the system to sell their information and create new integrated services for their customers. Consulting firms can make use of the THETIS system and tools to support their business. Finally, environmental policies concerning the use of coastal resources can be monitored via the integrated information support and easy access of THETIS.

Figure: The components of the THETIS system architecture.

As shown in the figure, clients submit simple or complex queries via the Web interface (Web browser) to the system. User requests invoke various services, such as metadata, index and search, to locate the objects which match the user query. We assume that documents are stored in DIENST servers, and metadata services are provided to access them. The documents could be research papers written by scientists studying the coastal properties of the Mediterranean Sea. Images could be indexed via relational or object-oriented databases in various formats. It is possible that the information about a region could be dispersed across several databases. This means that the information system has to index and search across the various databases to obtain the corresponding information. It is possible that information could be replicated across the databases. For this service, distributed search queries (via the Web) are sent to the various databases to obtain the objects. Metadata services describing the GIS objects are used to index/search for the appropriate GIS objects (multi-dimensional data and images). Distributed search agents collect/transform the various information objects and present the user with a composite result object.
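The article describes this query flow only at the architectural level; the sketch below shows one way a distributed search agent could fan a query out to several repositories and merge the answers into a composite result. The repository functions and record fields are hypothetical placeholders, not THETIS interfaces:

    # Sketch of a distributed search agent: fan a query out to several
    # repositories in parallel and merge the answers into a composite result.
    # The repository functions and the 'id' field are hypothetical placeholders.
    from concurrent.futures import ThreadPoolExecutor

    def search_dienst(query):       # document repository (placeholder)
        return [{"id": "doc-1", "source": "dienst", "title": "Coastal survey"}]

    def search_gis(query):          # GIS metadata service (placeholder)
        return [{"id": "gis-7", "source": "gis", "title": "Bathymetry, Cretan Sea"}]

    def composite_search(query, repositories):
        """Query all repositories concurrently and de-duplicate by object id."""
        with ThreadPoolExecutor() as pool:
            result_lists = list(pool.map(lambda repo: repo(query), repositories))
        merged = {}
        for results in result_lists:
            for record in results:
                merged.setdefault(record["id"], record)
        return list(merged.values())

    print(composite_search("pollution Crete", [search_dienst, search_gis]))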

The participants of the project are ICS-FORTH, ERCIM, INRIA, CNR, the Institute of Marine Biology of Crete (Greece), AEROSPATIALE (France), IACM-FORTH (Greece), the University of Crete, HR Wallingford (United Kingdom), and the RECORMED Network (France).

■Please contact:

Catherine Houstis – ICS-FORTH
Tel: +30 81 39 17 29
E-mail: [email protected]

A High Resolution Integrated Forecast System for the Mediterranean Sea
by Andrea Bargagli, Roberto Iacono, Sergio Nicastro, Paolo Michele Ruti and Franco Valentinotti

An integrated atmosphere-ocean forecast system, covering the whole Mediterranean basin, is being developed at ENEA (Italian National Agency for New Technologies, Energy and the Environment). The project, sponsored by the Department of Technical Services of the Italian Government, has two main goals: the high resolution forecast of the state of the Mediterranean Sea (in terms of surface wind, wave height and direction) and the prediction of flooding events in the city of Venice. The operational forecast system, which will be completed by mid-1999, will provide a useful tool for Civil Protection and, possibly, for other institutions involved in environmental surveying, coastal protection, navigation control and harbour management.

The integrated atmosphere-ocean forecast system for the Mediterranean, which is now being developed by the group for Atmosphere-Ocean Dynamics at ENEA, is shown in Figure 1. Predictions of the amplitude, frequency and direction of the sea waves are obtained from a spectral wave model, while a two-dimensional grid-point model of the Adriatic Sea circulation, coupled with a shallow water, finite element model of the Lagoon, is used to predict the sea level in Venice. Both the wave and the ocean models are forced by the surface wind stress computed, at very high resolution (10 km), by a 3D meteorological Limited Area Model (LAM) running over the whole Mediterranean basin.

To satisfy the operational requirements (ie, a few days of forecast in a few hours), the meteorological model, which is by far the most demanding in terms of the computational effort required, was implemented on APE100/Quadrics, a massively parallel computer with a peak power of 25 billion floating point operations per second in the maximal 512-processor configuration.

In the operational configuration, an intermediate resolution run of the atmospheric model is performed, with a grid spacing of 30 km. This run takes initial and boundary data, at a 60 km resolution, from the global weather forecast produced every 6 hours by ECMWF (European Centre for Medium-Range Weather Forecasts) and provides the input data for the high resolution run of the LAM.

The wave model, which describes the evolution of the wave spectrum by solving the wave energy transfer equation, runs over the whole Mediterranean Sea at a constant resolution of about 30 km. On the other hand, the ocean model, computing the sea elevation at the entrances of the Lagoon, uses a variable resolution grid covering the Adriatic Sea, with grid size decreasing when approaching Venice.

Finally, a finite element, very high resolution model of the interior of the Lagoon is used to predict the water level in Venice. The mesh used by this model consists of about 7500 elements, with a spatial resolution varying from 1 km along the mud flats down to 40 m inside the channels.

Several experiments, performed to test the system performance, showed good agreement with observed data. As an example, we show the forecast of an extreme surge which took place during November 1996, due to a strong cyclonic circulation over the Western Mediterranean region. A maximum sea surface elevation in Venice of about 1 m, very close to the measured one, was predicted for the 18th, at 6.00 am. In Figure 2, we show the corresponding wind field, as predicted by the meteorological model. A depression, initially located over Spain, has moved towards the East, causing the pressure gradient to align with the main axis of the Adriatic Sea, and therefore contributing to the sea level set-up in the northern part of the basin. Surface wind is driven by the surrounding orography and is directed along the same axis, blowing from the South-East.

Partners in the project are FISBAT-CNR of Bologna, the University of Padua and ISDGM-CNR, Venice.

■Please contact:

Andrea Bargagli – ENEA Rome
Tel: +39 06 3048 6416
E-mail: [email protected]

Figure 1: The flow-chart of the integrated operational system.

Figure 2: Simulation of the surface wind (m/s) field: zoom over the Adriatic basin.


IDOPT: Identification and Optimization for the Geosciences
by Jacques Blum and François Le Dimet

IDOPT is a joint research team of INRIA, CNRS, Université Joseph Fourier and the Institut National Polytechnique of Grenoble. IDOPT is oriented towards the problems of identification and optimization of systems and their applications in physics and the geosciences.

Geophysical fluids pose some special problems which should be taken into account when they are modelled to predict the evolution of flows, in the atmosphere as in the ocean. Geophysical flows are governed by nonlinear laws and may exhibit turbulent behaviour; it is therefore difficult to predict their evolution. It is also difficult to validate models against experimental data. Finally, the dimensions of the numerical models are huge: the operational prediction model used by the European Centre for Medium-Range Weather Forecasts in Reading, UK, for example, has several million degrees of freedom. Efficient real-time prediction therefore requires fast numerical methods.

The directions of research which have been developed by IDOPT since 1994 are adaptive grid modelling in oceanography and data assimilation in meteorology and oceanography.

Adaptive Grid Modelling in Oceanography

The problem of spatial resolution is a key point in ocean modelling, in both climatic and operational oceanography. From a climatic point of view, the simulation of the ocean's general circulation requires a grid size of typically 10-20 km, since many energetic phenomena occur at scales of a few tens of kilometres or less. For operational oceanography, there is presently a strong demand for very high resolution local models. In this context, we are currently trying to exploit the multi-scale aspect of the ocean circulation to reduce the cost of numerical simulations. A package is being developed which gives any finite difference ocean model the capability of adaptive gridding and local zooming. This is performed via the management of a hierarchy of grids of different resolutions, dynamically interacting together. An interesting point is that the model is seen as a black box and thus does not have to be modified. This approach is currently being tested in different ocean models.
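The idea of a grid hierarchy interacting with a black-box model can be illustrated with a toy example. The sketch below is not the package described above: the "model" is a one-dimensional diffusion step, the zoom region, refinement ratio and coupling strategy are all invented, and it only shows the general pattern of a parent grid feeding boundary data to a local fine grid and receiving feedback.

```python
# Illustrative sketch only: a two-level grid hierarchy in which a coarse model,
# treated as a black box via its step() function, is locally refined over a
# zoom region. The "model" here is a toy 1D explicit diffusion update.
import numpy as np

def step(field, dx, dt, nu=1.0):
    """Black-box model step: one explicit diffusion update (toy stand-in)."""
    new = field.copy()
    new[1:-1] += nu * dt / dx**2 * (field[2:] - 2 * field[1:-1] + field[:-2])
    return new

def refine(coarse, i0, i1, ratio):
    """Interpolate a coarse-grid slice onto a finer local grid (the 'zoom')."""
    x_coarse = np.arange(i0, i1 + 1)
    x_fine = np.linspace(i0, i1, (i1 - i0) * ratio + 1)
    return np.interp(x_fine, x_coarse, coarse[i0:i1 + 1])

# Coarse grid over the whole domain, fine grid only over a region of interest.
coarse = np.exp(-0.1 * (np.arange(100) - 50.0) ** 2)
dx, dt, ratio, (i0, i1) = 1.0, 0.2, 3, (40, 60)
fine = refine(coarse, i0, i1, ratio)

for _ in range(10):
    coarse = step(coarse, dx, dt)                 # parent grid advances
    fine[0], fine[-1] = coarse[i0], coarse[i1]    # boundary data from parent
    for _ in range(ratio**2):                     # child grid sub-cycles
        fine = step(fine, dx / ratio, dt / ratio**2)
    coarse[i0:i1 + 1] = fine[::ratio]             # feedback onto parent grid
```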

Data Assimilation in Meteorology and Oceanography

The concept of data assimilation emerged at the end of the 1970s in the meteorological community. It was a response to the need for analyzed fields that agree with the dynamics of the atmosphere and that take into account the various sources of information, which are heterogeneous in nature, quality and density. Since then two main classes of methods have been considered:

Kalman-type methods: These methods are founded on the theory of estimation. They provide a sequential estimation of the field, ie an estimation that is updated with the arrival of new observations. While these methods fit the requirements of data assimilation perfectly, their weakness is their computational cost: the covariance matrix has to be computed and stored. Since the dimensionality of the most recent meteorological models (ECMWF) is around 6 million variables, this is clearly beyond the range of high performance computing, even if we extrapolate to the next generation of computers. Therefore most of the recent research has been devoted to reduced forms of Kalman filtering.
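The sequential update that makes the covariance storage so expensive is the standard Kalman forecast/analysis cycle. The sketch below is a textbook toy version on a tiny state, not IDOPT's code: the dynamics M, observation operator H and noise covariances are random placeholders, and the full matrices P are kept explicitly, which is precisely what becomes impossible at 10^6 variables.

```python
# Illustrative sketch only: one Kalman forecast/analysis cycle on a toy system.
import numpy as np

def kalman_step(x, P, M, Q, y, H, R):
    """Propagate the state estimate, then blend in the new observations y."""
    # Forecast: propagate state and its error covariance.
    x_f = M @ x
    P_f = M @ P @ M.T + Q
    # Analysis: the Kalman gain weights observations against the forecast.
    S = H @ P_f @ H.T + R
    K = P_f @ H.T @ np.linalg.inv(S)
    x_a = x_f + K @ (y - H @ x_f)
    P_a = (np.eye(len(x)) - K @ H) @ P_f
    return x_a, P_a

n, m = 4, 2                       # tiny state and observation dimensions
rng = np.random.default_rng(0)
M = np.eye(n) + 0.01 * rng.standard_normal((n, n))   # toy dynamics
H = rng.standard_normal((m, n))                      # toy observation operator
x, P = np.zeros(n), np.eye(n)
Q, R = 0.01 * np.eye(n), 0.1 * np.eye(m)

for _ in range(5):                # assimilate observations as they arrive
    y = rng.standard_normal(m)    # stand-in for real measurements
    x, P = kalman_step(x, P, M, Q, y, H, R)
print(x)
```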

Variational methods: The basic idea is to consider the degrees of freedom of the model (initial and/or boundary conditions) as a control variable and to close the model through a variational principle measuring the distance between a model solution and the observations. These methods use standard, robust optimization algorithms. They rely on tools, such as the adjoint model, which are also useful for other purposes such as sensitivity analysis. From the computational point of view, their high cost remains manageable, especially on the most recent generation of computers. These methods will become operational at NCEP (USA), at Météo-France and at the Météorologie Canadienne in the near future; they have been operational at the European Centre for Medium-Range Weather Forecasts since October 1997.

Snapshot of the surface circulation in the North Atlantic in May 1997, computed by a numerical model assimilating Topex/Poseidon satellite data (source: LMC and LEGI, Grenoble).
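A minimal variational example makes the idea concrete. In the sketch below, under invented toy assumptions, the control variable is the initial condition of a tiny linear model and the cost function measures the misfit to synthetic observations; the gradient is written out explicitly because the problem is small, whereas in operational systems it is supplied by the adjoint model mentioned above.

```python
# Illustrative sketch only: a toy variational assimilation problem. The control
# variable is the initial condition x0 of a tiny linear model; the cost function
# is J(x0) = 1/2 * sum_k ||M^k x0 - y_k||^2. All data are synthetic.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, n_steps = 3, 10
M = np.eye(n) + 0.05 * rng.standard_normal((n, n))   # toy linear dynamics
x_true = rng.standard_normal(n)

obs, x = [], x_true.copy()
for _ in range(n_steps):                              # synthetic noisy observations
    x = M @ x
    obs.append(x + 0.01 * rng.standard_normal(n))

def cost_and_grad(x0):
    """Return J(x0) and its gradient (the 'adjoint' of each M^k applied to the misfit)."""
    J, grad = 0.0, np.zeros(n)
    x, Mk = x0.copy(), np.eye(n)
    for y in obs:
        Mk = M @ Mk                  # M^k
        x = M @ x                    # model trajectory from x0
        r = x - y                    # misfit at step k
        J += 0.5 * r @ r
        grad += Mk.T @ r
    return J, grad

result = minimize(cost_and_grad, np.zeros(n), jac=True, method="L-BFGS-B")
print("recovered initial condition:", result.x)
print("true initial condition:     ", x_true)
```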

Future Developments

A general framework for the research carried out in IDOPT is the improvement of the numerical techniques used in the geosciences. The adaptive mesh method will be used in projects of operational oceanography. The efficiency of the data assimilation algorithms has to be improved: for the Kalman-type methods the size of the state should be decreased to make the computation realistic. The deterministic methods should be able to estimate the impact of uncertainties on the prediction; this can be done by adding an extra term modelling the model error. A major difficulty in the use of control methods comes from the necessity of using an adjoint model, which makes it possible to compute the gradient. The derivation of an adjoint model for very large numerical models (several hundred thousand statements) is a costly task in terms of manpower. The use of automatic differentiation tools, such as ODYSSEE (INRIA Sophia Antipolis), could be a major step forward for these methods.

Co-operation

The recent development of advanced data assimilation techniques has been carried out through a fruitful co-operation between research groups in meteorology (LMD, LA, Météo-France), oceanography (LEGI) and applied mathematics. This field of research is progressing both through the modelling of geophysical fluids and through the development of the mathematics. Co-operation is therefore more necessary than ever to achieve significant results. It is clear that mathematicians cannot develop meteorological or oceanic models, nor can geophysicists investigate the theory of deterministic or stochastic optimal control.

■Please contact:

Jacques Blum or François Le Dimet – LMC/IMAG
Tel: +33 4 7651 4860 or +33 4 7651 4675
E-mail: [email protected], [email protected]

Environmental Modelling at IACM-FORTH: Aiming at the Future
by Nikolaos Kampanis

Scientific computing at the Institute of Applied and Computational Mathematics (IACM) of FORTH has focused on the development of mathematical and numerical techniques for solving the partial differential equations which describe various physical phenomena and technological processes, and on their computer implementation in user-friendly integrated systems supporting the pre- and post-processing of scientific computations, Geographical Information System applications, parallelization, visualization and decision making techniques.

Over the years a large amount of expertise has been developed on mathematical and computational methods for wave propagation problems, especially in the area of direct and inverse problems of underwater acoustics. For the direct problem, several computational models for predicting the acoustic field in the sea have been developed and validated. The models are based on finite element and finite difference discretizations of the Helmholtz equation and its paraxial approximations, and use state-of-the-art mesh generation and numerical linear algebra techniques. For the inverse problem, several techniques have been proposed and tested for the purpose of identifying oceanographic and geoacoustic parameters. These techniques couple on-site measurements with computational prediction models.
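As a pointer to what a finite difference discretization of the Helmholtz equation looks like in its simplest form, the sketch below solves a one-dimensional toy version with Dirichlet boundaries. It is only an illustration under invented parameters, not one of the IACM underwater acoustics models, which are two- and three-dimensional and use far more sophisticated discretizations.

```python
# Illustrative sketch only: a finite difference solve of the 1D Helmholtz
# equation u'' + k^2 u = f with u(0) = u(L) = 0, as a toy stand-in for the
# far more elaborate underwater acoustics models described above.
import numpy as np

def helmholtz_1d(k, f, length=1.0, n=200):
    """Solve u'' + k^2 u = f(x) on [0, length] with homogeneous Dirichlet BCs."""
    x = np.linspace(0.0, length, n)
    h = x[1] - x[0]
    # Tridiagonal operator: centred second difference plus the k^2 term.
    A = (np.diag(np.full(n - 2, -2.0 / h**2 + k**2))
         + np.diag(np.full(n - 3, 1.0 / h**2), 1)
         + np.diag(np.full(n - 3, 1.0 / h**2), -1))
    rhs = f(x[1:-1])
    u = np.zeros(n)
    u[1:-1] = np.linalg.solve(A, rhs)
    return x, u

# Example: a point-like source in the middle of the domain.
x, u = helmholtz_1d(k=40.0, f=lambda x: np.exp(-2000.0 * (x - 0.5) ** 2))
print(u.max(), u.min())
```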

IACM is now in the process of expanding its involvement in environmental modelling. IACM is located in Heraklion, on the island of Crete in the southern part of Greece, an area whose economy is mainly based on local small-to-medium size industry, agriculture and tourism. The challenge of preserving environmental quality in the course of future development is of paramount importance. We intend to relate more of our research to the study of regional environmental problems and to support the decision making processes and the regional planning of the local authorities, with which we have long-established close cooperation.

There are several components of potential environmental modelling activities in Crete. One is the problem of the dispersion of water-transported pollutants. For example, the fluid waste from local olive oil processing plants (Crete has a large olive production and provides significant quantities of processed olive oil to the domestic and international markets) has to be treated efficiently, since it may be carried away by rain or rivers. Eutrophication phenomena due to the contamination of water basins by chemical fertilizers, and pollution of the sea, eg in the south of Crete where extensive areas are occupied by greenhouses, may destroy usable water supplies and the coastal zone. Other kinds of solid and fluid waste, due to the growth of the near-shore population and extensive tourist development very close to the coastline over tens of kilometres, are disposed of in the sea. Hence, regional planning decisions need to be supported, eg on the optimal location of industrial, touristic and other environment-affecting installations, especially when these are close to the coast. Therefore the study of coastal circulation, including local currents and surface motion, is important.

IACM currently participates in two projects related to water resources management. Both are in collaboration with INRIA, Rocquencourt, and with scientists and institutions from several countries of North Africa and the Middle East. The first, funded by the Greek-French integrated action PLATON '98 programme, concerns the numerical simulation and prevention of water reservoir eutrophication. The other, funded by the European Union DGIII ESPRIT/INCO-DC programme, has as its general theme flood modelling with the use of HPCN and GIS, and aims at providing tools to support the design of policies for a rational use of water supplies.

Due to its geographic position and regional climate conditions, Crete has long periods of strong north and south winds, which in some places are reinforced by the complex mountain terrain. The exploitation of wind and water wave energy resources is therefore of major importance, and Crete has been designated by the Greek government as a privileged region for the development and use of renewable energy sources. No less important is the exploitation of solar energy, due to the very long cloudless periods. The installation of wind farms is advancing at several places in Crete, and the collection of field measurements and short-term weather forecasting will be of major importance for their optimal operational deployment. Agriculture will also benefit from such activities, which will also aid the controlled and localized use of pesticides.

All this leads us to the conclusion that the monitoring of the atmospheric boundary layer inland and in coastal zones, as well as the simulation and forecasting of atmospheric flows on a regional basis using computational models, will be very important. IACM has taken several steps to establish collaboration with scientists in Europe and in the US for these purposes.

■Please contact:

Nikolaos Kampanis – IACM-FORTH
Tel: +30 81 39 17 80
E-mail: [email protected]

Implementing a High Resolution Numerical Model of the Strait of Gibraltar
by Gianmaria Sannino, Roberto Conversano and Vincenzo Artale

The Mediterranean Sea is a basin in which a wide range of oceanic processes and interactions of global interest occur. Intense evaporation over the Mediterranean Sea produces dense, salty water that outflows through the Strait of Gibraltar and can have an impact on the general circulation of the global ocean, particularly on the formation of deep water in the North Atlantic. One effect of the outflow from the Mediterranean is to make the Atlantic Ocean saltier than either the Pacific or Indian Oceans.

The Strait of Gibraltar controls the entire fresh water budget of the Mediterranean Sea, and for this reason there has been both experimental and theoretical interest in the physics of the Strait and in monitoring the heat and salt fluxes at the sill. The existence of this strait makes the Mediterranean Sea one of the few regions in the world's oceans where the advective heat and salt flow is known with sufficient accuracy to permit testing of the available data sets, the bulk formulae and the model results for the surface heat flux. In recent years, in the context of the III and IV Frameworks, the European Community has financed a number of projects aimed at improving knowledge of the Mediterranean circulation from both numerical and experimental points of view.

In particular, one task of the MTP-II-MAST Programme (1996-1999) deals with the implementation of a high resolution numerical model of the Strait of Gibraltar and its parameterization in the Ocean General Circulation Model.

Motivations

Since the beginning of the century the Strait of Gibraltar has been a favourite place for oceanographic studies. The dominant oceanographic feature of the Strait is a two-layer inverse estuarine circulation that is driven by an excess of evaporation over precipitation and river discharge into the Mediterranean. In addition, recent field experiments in the Strait have kindled a renewed interest in two-layer hydraulics and its role in controlling the mass transport between the Mediterranean and the Atlantic.

As a result of the effects of this hydraulic control, two possible states have been hypothesised for the Mediterranean Sea/Strait of Gibraltar system: 1) over-mixed, with maximal exchange, a minimum difference in salinity between the two layers and a shallow, supercritical inflow at the eastern end of the Strait; or 2) not over-mixed, with sub-maximal exchange, a larger salinity difference and a thicker, sub-critical inflow.
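In two-layer hydraulics the distinction between sub-critical and supercritical flow is commonly expressed through the composite Froude number G^2 = F1^2 + F2^2, with Fi^2 = ui^2 / (g' hi). The sketch below simply evaluates this standard criterion for invented layer velocities, thicknesses and densities; it is an illustration of the concept, not part of the model described in this article.

```python
# Illustrative sketch only: the composite Froude number G^2 = F1^2 + F2^2 used
# in two-layer hydraulics to classify an exchange flow as sub-critical
# (G^2 < 1) or supercritical (G^2 > 1). All numbers are invented.
def composite_froude(u1, h1, u2, h2, rho1, rho2, g=9.81):
    """Layer 1 = upper (Atlantic inflow), layer 2 = lower (Mediterranean outflow)."""
    g_prime = g * (rho2 - rho1) / rho2          # reduced gravity
    F1_sq = u1**2 / (g_prime * h1)
    F2_sq = u2**2 / (g_prime * h2)
    return F1_sq + F2_sq

G_sq = composite_froude(u1=1.0, h1=80.0, u2=-0.8, h2=200.0,
                        rho1=1027.0, rho2=1029.0)
print("G^2 =", round(G_sq, 2), "->",
      "supercritical" if G_sq > 1 else "sub-critical")
```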

The Alboran Sea circulation is dominated by the Atlantic water entering the Alboran sub-basin as a shallow buoyant stream that curves southward most of the time. It then forms a large anticyclonic gyre, which may extend throughout the whole width of the sub-basin.

From satellite observations the gyre seems to show some variability, generating a second, cyclonic gyre up to the main stream. The anticyclonic/cyclonic structure at the end of the Strait may be related to the geometry of the basin. Observations have also revealed the appearance of a second anticyclone coinciding with the Almeria-Oran front. The variability of this second anticyclonic gyre is more pronounced than that observed in the western side of the basin. These two to three gyres are considered to be one of the permanent features of the Western Mediterranean circulation and participate in the formation of the MAW (Mediterranean Atlantic Water). The precise criteria for the anticyclonic/cyclonic features of the Alboran Sea are still unknown.

The numerical simulation that we show in the figures refers to the evolution of the Atlantic water into the Alboran basin and the outflow of the Mediterranean Sea into the North Atlantic circulation. Figure 1 shows the Alboran gyre circulation and the Mediterranean outflow at depth. From a first analysis of these results we note a good agreement with the observations. However, more experiments and analyses are needed to evaluate whether the resolution of the model is sufficient to describe these phenomena. In particular, we hope that our numerical experiments will answer a number of questions, such as:

• Is there a permanent hydraulic control site in the Strait of Gibraltar and, if so, how does it depend on seasonal variations in the Mediterranean basin?

• What are the principal forcing effects (tide, topography, ...) that can generate the variability of the ocean current along the coast of Portugal and the bifurcation of the Mediterranean outflow at Cape Saint Vincent?

• What are the physical mechanisms that generate gyre variability in the Alboran circulation?

The numerical simulations were performed using POM (Princeton Ocean Model), a primitive equation numerical model. The model was initialized with climatological salinity and temperature, without wind stress forcing. The boundary conditions are computed by Newtonian relaxation terms at the boundary domains of the model, using monthly temperature and salinity climatological data. The model runs on a Digital 4100 parallel platform or on a CRAY J90.
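The Newtonian relaxation (nudging) towards climatology mentioned above amounts to adding a term -(T - T_clim)/tau in a boundary zone. The sketch below shows this in its simplest form on an invented toy field; the field, climatology, relaxation time scale and zone width are all placeholders, not the POM configuration used here.

```python
# Illustrative sketch only: a Newtonian relaxation (nudging) term of the kind
# used to restore a model's boundary values towards monthly climatology.
import numpy as np

def relax_boundary(field, climatology, dt, tau, width=5):
    """Nudge the outermost `width` columns towards climatology:
    dT/dt = -(T - T_clim) / tau, applied only in the boundary zone."""
    nudged = field.copy()
    for edge in (slice(0, width), slice(-width, None)):
        nudged[:, edge] -= dt / tau * (field[:, edge] - climatology[:, edge])
    return nudged

ny, nx = 50, 100
temperature = 15.0 + np.random.default_rng(2).standard_normal((ny, nx))
monthly_clim = np.full((ny, nx), 15.0)
temperature = relax_boundary(temperature, monthly_clim,
                             dt=3600.0, tau=10 * 86400.0)
```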

For more information, see the project Web site: http://www.cetiis.fr/mtp/mater

■Please contact:

Vincenzo Artale – ENEA
Tel: +39 06 3048 3096
E-mail: [email protected]

Water Quality Modelling in the Channel Network of Venice
by Lucia Zampato, Georg Umgiesser, Martina Bocci and Giovanni Coffaro

In Venice a new activity has started, aimed at the hydrodynamic and water quality modelling of the city's inner channels. This project is considered important because of the problems the city of Venice has with pollution of its water and sediments. A two-dimensional finite element hydrodynamic model is used to simulate the behaviour of the lagoon surrounding the city; the modelling system WASP is used to describe the hydrodynamics of the inner channels and the quality of the water. A team of scientists from the Italian National Research Council (CNR) and the University of Padua carries out the modelling activity within the framework of the 'Inner Channels of Venice' project, which began in 1994. The project is coordinated by UNESCO's UVO-ROSTE Office in Venice and financed by the Italian Ministry for Scientific Research.

The city of Venice is uniquely structured, since it consists of small islands surrounded by a complicated network of channels that communicate with the lagoon. Even today, this channel system constitutes the principal means of communication for people, goods and public services.

These channels are also the collectors of pollutants, the most important of which are sewage waters. The halt in the dredging of channels between the end of the 1960s and 1994 has worsened this problem and makes the pollution of water and sediments a very pressing problem for the city. Inquiry into the possible solutions to the problem of cleaning the channels is the general objective of the project 'Inner Channels of Venice'. The concluding part of this project was started at the beginning of 1998 and is expected to terminate at the end of 1998.

Figure 1: Velocity field of the Atlantic inflow.

Figure 2: Velocity field of the Mediterranean outflow.

Main research subjects in the project include:

a) the quality of water and sediments

b) the health of the inner channels

c) water quality modelling.

Topics a) and b) are addressed through field measurements and laboratory analyses; they concern hydrodynamic quantities (velocity, water level), ecological parameters (temperature, salinity, pH, dissolved oxygen, nitrogen and phosphorus compounds and others), and biological indicators (virological parameters and pathogenic bacteria).

The activities in which the authors are directly involved concern the application of the water quality model (WASP) to the channel network of Venice. The possibility of simulating the hydrodynamic behaviour of the channels, and the evolution of biological parameters under different tidal regimes, meteorological conditions and morphological configurations, makes an important contribution to this study. The research, co-ordinated by a team of international experts, is carried out in Venice by a group of local scientists.

Modelling Activities

The aim of the modelling activity is to obtain a description, and therefore a better understanding, of the hydrodynamic and eutrophic processes in the inner channels of the city of Venice. The activities planned are:

• the implementation of a 1-D hydrodynamic model of the channel network and its coupling with a 2-D hydrodynamic model of the lagoon of Venice, which provides the boundary conditions around the city

• the application of a water quality model to the inner channels.

To accomplish these tasks the following models will be used:

• a 2-D finite element model of the Venice Lagoon (Umgiesser, 1986): the grid of the model is made up of 4359 nodes and 7842 triangular elements describing the varying bathymetry. Forcing functions of the model are the height of the sea level at the three lagoon inlets, the wind blowing over the lagoon and the discharges from rivers into the lagoon. The grid of the model will be modified in the area around Venice to allow the coupling with the 1-D model of the inner channels.

• The WASP-EPA model will be used to simulate the water circulation and water quality of the channel network. DYNHYD, the hydrodynamic module of WASP, will be used to describe the water circulation; the model is based on a link-node approach and provides water velocity and water level values. The model will be calibrated by means of the Manning coefficient, using data on water level and velocity recorded during the field campaigns (a sketch of the Manning relation is given after this list). The results of DYNHYD will be used in simulating the water quality. EUTRO, the water quality module of WASP, will be used to model the dynamics of the following water quality parameters: dissolved oxygen; organic carbon; nitrogen (as nitrate and ammonia); phosphorus (ortho-phosphate); chlorophyll-a; and suspended matter (modelled by means of the TOXI sub-model). Nitrogen, phosphorus and suspended matter loading in the channels will be computed from data on the resident population and tourists, from water consumption data and from specific conversion factors. Boundary conditions for the water quality will be extracted from the available water quality data in the area of the lagoon surrounding Venice. Initial conditions will be set according to field data gathered in the framework of this project and considering other data from the UNESCO database.
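Calibration "by means of the Manning coefficient" refers to tuning the bottom friction in the channel flow law. The sketch below evaluates the standard Manning relation v = (1/n) R^(2/3) S^(1/2) for an invented rectangular channel; it only illustrates how the roughness coefficient n controls the computed velocity and is not part of the DYNHYD code.

```python
# Illustrative sketch only: the Manning relation v = (1/n) * R^(2/3) * S^(1/2)
# underlying friction calibration in link-node channel models. The channel
# geometry and coefficients below are invented toy values (SI units).
def manning_velocity(n, depth, width, slope):
    """Mean velocity (m/s) in a rectangular channel of given depth and width."""
    area = depth * width
    wetted_perimeter = width + 2.0 * depth
    hydraulic_radius = area / wetted_perimeter
    return (1.0 / n) * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

for n in (0.02, 0.03, 0.04):       # try a few roughness coefficients
    v = manning_velocity(n, depth=2.0, width=10.0, slope=1e-5)
    print(f"n = {n:.2f}  ->  v = {v:.3f} m/s")
```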

Conclusions

The model will be calibrated by comparison with data already available or collected in the field measurements during 1998. Future activities are planned in the framework of the project concerning the dredging of channels carried out by the municipality of Venice: modelling the hydrodynamics and water quality in the channels is an effective tool for better planning of the excavation works and for predicting their consequences on the water circulation and environmental health conditions. For further information, see: http://www.isdgm.ve.cnr.it/~georg/

■Please contact:

Georg Umgiesser – ISDGM-CNR
Tel: +39 041 521 6875
E-mail: [email protected]

Figure 2: The channel network that has been adopted for the 1-D model.

Figure 1: A map of the city of Venice with the main channels.


Numerical Algorithms for Air and Surface Water Quality Modelling
by Jan Verwer

Many environmental problems, such as damage to the biosphere, local air pollution, the spread of harmful substances in the water, and global climatic changes, cannot be studied by experimentation. Hence, mathematical models and computer simulations are used as the appropriate means to gain more insight. CWI has been active in this field since the early 1990s and co-operates with several organizations within and outside The Netherlands. Significant parts of the research are carried out within the TASC project (Transport Applications and Scientific Computing) and are supported by the national HPCN programme and the Cray University Research Grants Programme.

The mathematical models consist of a set of partial differential equations (PDEs) describing the system's evolution in time. By discretizing the time and space variables occurring in the PDEs on a grid, a numerical model is obtained, which is then solved on a computer. Due to the rapid progress in high performance computing, this numerical approach is nowadays many times cheaper than building traditional scale models. In addition, many processes, such as the spread of pollutants, defy accurate simulation by means of scale models.

Solving such realistic time-dependent PDEs numerically is very computation-intensive. Discretization in the space coordinates produces a formidable set of ordinary differential equations (ODEs) in which only time occurs as a continuous variable. This set of equations is then solved step-wise in time by means of a numerical integration formula. The choice of the spatial discretization and the integration formula depends on the stability of the evolution process, the desired accuracy of the approximation and the efficiency of the entire computation process.
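This "discretize in space first, then integrate the ODEs in time" strategy is the classical method of lines. The sketch below applies it to a one-dimensional advection-diffusion-reaction equation with invented parameters and a generic stiff ODE solver; it only illustrates the structure of the approach, not the tailored algorithms developed at CWI.

```python
# Illustrative sketch only: the method of lines for the 1D equation
# c_t = -u c_x + D c_xx - k c. Space is discretized first, giving dc/dt = f(c),
# which is then integrated in time with a standard stiff solver.
import numpy as np
from scipy.integrate import solve_ivp

nx, length = 200, 1.0
dx = length / nx
x = np.linspace(0.5 * dx, length - 0.5 * dx, nx)
u, D, k = 1.0, 1e-3, 0.5            # advection speed, diffusion, decay rate

def rhs(t, c):
    """Semi-discrete right-hand side: upwind advection, centred diffusion, decay."""
    c_ext = np.concatenate(([c[-1]], c, [c[0]]))      # periodic boundaries
    adv = -u * (c_ext[1:-1] - c_ext[:-2]) / dx        # first-order upwind (u > 0)
    dif = D * (c_ext[2:] - 2 * c_ext[1:-1] + c_ext[:-2]) / dx**2
    return adv + dif - k * c

c0 = np.exp(-200.0 * (x - 0.3) ** 2)                  # initial pollutant blob
sol = solve_ivp(rhs, (0.0, 0.5), c0, method="BDF", rtol=1e-6, atol=1e-9)
print("mass at t = 0.5:", sol.y[:, -1].sum() * dx)
```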

CWI carries out research jointly with the national Organization for Applied Research TNO to develop a regional 3D long-term ozone simulation model, replacing the current LOTOS model used by TNO. CWI contributes the design of the mathematical model for a hybrid (terrain-following and pressure-based) coordinate system and, in particular, of tailored numerical algorithms and implementations on parallel computers and supercomputers.

In a similar project, CIRK, CWI has developed numerical algorithms for use in 3D models describing global changes in the troposphere's chemical composition due to the long-term spread of air-polluting substances, taking into account exchange with the stratosphere. The work included stiff chemistry integration and a factorization approach within the Rosenbrock method, as well as the validation of various advection schemes in a real-life radon experiment, using analyzed wind fields from ECMWF. New work in this field centres around the existing TM3 model. Other recently started research in air quality modelling concerns the application of operator splitting and sparse-grid methods to advection-diffusion-reaction problems.
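Operator splitting, mentioned above, advances the transport and the (often stiff) chemistry separately within each time step. The sketch below shows the pattern in its simplest Strang-splitting form on a toy problem with a single linearly decaying species; the advection scheme, decay rate and step sizes are invented and the sketch is not CWI's splitting code.

```python
# Illustrative sketch only: Strang operator splitting for one time step of an
# advection + stiff-reaction problem. Transport and chemistry are advanced
# separately; everything here is a toy.
import numpy as np

def advect(c, u, dx, dt):
    """Full step of first-order upwind advection (periodic domain, u > 0)."""
    return c - u * dt / dx * (c - np.roll(c, 1))

def react(c, k, dt):
    """Exact solution of the stiff linear decay dc/dt = -k c over dt."""
    return c * np.exp(-k * dt)

def strang_step(c, u, k, dx, dt):
    """Strang splitting: half chemistry, full transport, half chemistry."""
    c = react(c, k, 0.5 * dt)
    c = advect(c, u, dx, dt)
    return react(c, k, 0.5 * dt)

nx = 100
dx, dt, u, k = 1.0 / nx, 0.004, 1.0, 50.0
c = np.exp(-300.0 * (np.linspace(0, 1, nx) - 0.5) ** 2)
for _ in range(100):
    c = strang_step(c, u, k, dx, dt)
```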

CWI's present contributions to water quality modelling are based on a previously developed efficient computing model for the 3D shallow-water equations (SWEM). Here a parallel, vectorized flow model was implemented on a Cray supercomputer. In SWEM a transport model is coupled to a hydrodynamic model which generates the velocity field for the transport model. Starting from the SWEM model, CWI designs parallel numerical methods for the simulation of water pollution (calamitous releases), the marine ecosystem, dispersion of river water, sediment transport, etc. In connection with this, a 3D transport model was studied, in particular the iterative solution of the equations resulting from their implicit time-discretization, and domain decomposition with domains of varying grid resolutions. Part of the research is carried out in the EU programme MAST within the MMARIE project (Modelling of Marine Eco Systems), with applications to pollution on the Continental Shelf caused by sediment transport from the Rhine and other major European rivers.

For more information see:

• http://dbs.cwi.nl/cwwwi/owa/cwwwi.print_projects?ID=2 (air)

• http://dbs.cwi.nl/cwwwi/owa/cwwwi.print_projects?ID=4 (water)

■Please contact:

Jan Verwer – CWI
Tel: +31 20 592 4095
E-mail: [email protected]

The solution of realistic 3D models for atmospheric pollution, in the form of PDEs of the advection-diffusion-reaction type containing 20-100 chemical species, poses challenging computational and numerical problems. (Adapted from: Atmosphere, Climate & Change, by T.E. Graedel and P.J. Crutzen, Freeman & Co., 1995.)


Modelling of the Environmental Impact of Main Airborne Pollutant Producers
by František Hrdlička and Pavel Slavík

The vast majority of the electricity produced in the Czech Republic is generated in power stations that burn brown coal. This type of coal contains elements (such as sulphur) that have a very negative impact on the environment. During smog alert situations it is necessary to reduce emissions from power stations to an acceptable level, either by lowering the output of power stations in the region or by using high quality coal in some power stations.

The question of which power stations should be regulated is a critical one. Such a decision is based on data from the network of monitoring stations that measure pollution and on data obtained from the national meteorological institute.

The problem is that these data are not interpreted in a wider context, which leads to situations in which it is impossible to determine the power station that contributes most to the level of pollution in the critical area. The solution to this problem is to develop a mathematical model that can simulate the distribution of pollutants in a given region under given meteorological conditions. In this way it is possible to determine, for each individual power station, its contribution to the pollution in the investigated (critical) area. It then becomes possible to identify the power station with the largest impact, and the corresponding measures can be taken (either lowering its output or switching it to higher quality coal). In this context the output-lowering process can be optimized, since it is governed by economic as well as ecological criteria. Therefore a software system has been developed that describes the emission distribution in the atmosphere. The mathematical model used is based on the statistical theory of diffusion, namely the Briggs and Gifford theory of diffusion. This theory belongs to the class of theories called Gaussian models, as the distribution of gas concentration is controlled by a suitably modified Gaussian distribution.
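The essence of such Gaussian models is the plume formula for the concentration downwind of a point source. The sketch below evaluates the standard ground-level Gaussian plume expression with ground reflection; the dispersion coefficients are represented by simple power laws standing in for the stability-dependent Briggs/Gifford curves, and all emission rates, stack heights and distances are invented, so it is an illustration of the principle rather than the system described here.

```python
# Illustrative sketch only: a ground-level Gaussian plume concentration from a
# single elevated point source. sigma_y and sigma_z are placeholder power laws,
# not the real stability-dependent dispersion curves; all numbers are toys.
import numpy as np

def gaussian_plume(Q, u, H, x, y, a=0.08, b=0.06):
    """Ground-level concentration (z = 0) downwind of a stack.
    Q: emission rate (g/s), u: wind speed (m/s), H: effective stack height (m),
    x: downwind distance (m), y: crosswind distance (m)."""
    sigma_y = a * x ** 0.9          # placeholder dispersion curves
    sigma_z = b * x ** 0.85
    return (Q / (2.0 * np.pi * u * sigma_y * sigma_z)
            * np.exp(-y**2 / (2.0 * sigma_y**2))
            * 2.0 * np.exp(-H**2 / (2.0 * sigma_z**2)))   # ground reflection

# Contribution of three hypothetical power stations at one receptor point.
stations = [  # (Q g/s, effective stack height m, downwind dist m, crosswind m)
    (500.0, 200.0, 15000.0, 500.0),
    (300.0, 150.0, 8000.0, 1200.0),
    (800.0, 250.0, 30000.0, 300.0),
]
for i, (Q, H, x, y) in enumerate(stations, 1):
    print(f"station {i}: {gaussian_plume(Q, 5.0, H, x, y):.3e} g/m^3")
```

Summing such contributions over all sources at each receptor is what allows the share of each individual power station in the total pollution to be separated out.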

The model has been implemented by means of the IRIS Explorer system on Silicon Graphics computers. An important problem to be solved was the user-friendliness of the system. The results of a simulation take the form of a map with a colour representation of the pollution level. The input also has a graphical form, in which the input values are set by means of direct manipulation techniques. This concerns, for example, the setting of parameters for each individual power station or the setting of the meteorological conditions in the region. It is also possible to click on any location on the map, and the information about the current level of pollution, together with the share of each individual power station, is displayed in graphical form. Moreover, it is possible to set parameters that define an animation sequence representing the dynamical change of some parameters of the process investigated.

The model has been evaluated by comparing simulated data with real data. The real data were influenced by sources of pollution other than the power stations investigated, which affected the match between the model and reality. Nevertheless there was a good match with respect to the influence of each individual power station on the level of pollution in the region investigated.

The system developed is a pilot system for the given application, and there is much potential for further development. Future work could lead to a model in which the data obtained from the network of monitoring stations are interpreted by means of information obtained from the model developed. It would be possible to employ artificial intelligence methods to support a learning process in the system, which could help to eliminate the influence of local pollution sources. The model developed could also be used to simulate gas distribution during ecological disasters such as accidents in chemical plants or in nuclear power stations. It would then be possible to model and evaluate the various measures that could be taken in the case of a disaster.

■Please contact:

František Hrdlička – Department of Thermal and Nuclear Power Plants, Czech Technical University, Prague
Tel: +420 2 2435 2535
E-mail: [email protected]

Visualization of air pollution caused by three power stations.


Dynamic Models for Smog Analysis
by Achim Sydow, Thomas Lux, Peter Mieth, Matthias Schmidt and Steffen Unger

The project DYMOS (Dynamic Models for Smog Analysis), carried out at the GMD Institute for Computer Architecture and Software Technology, Berlin, is concerned with the development and pilot application of simulation systems and models for air pollution and vehicular traffic, running on High Performance Computing platforms. The aim is to support users in governmental administrations and industry in operative decision making as well as long-term planning, to serve as a scientific basis for national or international negotiations, and to enable predictions (air pollutant concentrations, traffic density, etc.) for informing the public (WWW, TV, radio).

The DYMOS system is a parallel air pollution simulation system for mesoscale applications. DYMOS consists of different meteorology/transport models for different application purposes, including an air chemistry model for the calculation of photochemical oxidants such as ozone.

The core of the model system is formed by a hydrostatic mesoscale Eulerian model with a low vertical resolution for fast operational forecast tasks (an enhanced version of REWIMET/REgional WInd Model Encompassing Transport) and a non-hydrostatic mesoscale Eulerian model with a high vertical resolution and complex parameterization facilities (an enhanced version of GESIMA/GEesthacht SImulation Model of the Atmosphere). In addition, some Eulerian and Lagrangian transport models are included within DYMOS. The air chemistry model CBM-IV (Carbon Bond Mechanism) deals with 34 species in 82 reaction equations for simulating the photochemical processes in the lower atmosphere.

Using the DYMOS system, analyses can be performed regarding winter smog (high concentrations of inert pollutants), summer smog (high concentrations of ozone and other photochemical oxidants), and single components (eg heavy metals, benzene, radioactive or antigenic substances). In addition to providing a dynamic emission source for the smog forecast simulations, traffic flow modelling has become a research field within DYMOS in its own right. Emphasis is put on the analysis of critical states in urban traffic systems. The basic research carried out in DYMOS has laid a foundation for further, more applied projects.

DYMOS Applications

Various case studies of summer smog conditions in urban areas have been carried out using the DYMOS system. The Department of the Environment of the Berlin state government and the Ministry for the Environment of the state of Brandenburg commissioned summer smog analyses based on the results of a measuring campaign carried out in July 1994. Greenpeace commissioned an analysis of the influence of traffic emissions in Munich on the ozone concentration in the Munich area; the analysis was performed for a typical mid-summer day in 1994. Within the subproject PATRIC (Parallel Petri-Net Simulation for Traffic Control in Conurbations), the DYMOS system coupled with a Petri-net-based traffic flow model was used to analyze the traffic-induced air pollution of the City of Budapest.

In order to inform the public about ozone concentrations, the DYMOS system is currently in its test phase for the operational daily forecast of near-surface ozone concentrations in the Berlin-Brandenburg region. In cooperation with the Department of the Environment of the Berlin state government, the German Federal Environmental Agency and Inforadio Berlin, the predicted concentrations are presented as raster images for defined times of day and as an MPEG movie (see figure) for the whole day, and can be found on the WWW (http://www.first.gmd.de/ozon/).

The research activities within DYMOS have given rise to a number of direct subprojects and three autonomous European cooperation projects based on DYMOS: ECOSIM (Ecological and environmental monitoring and simulation system for management decision support in urban areas, see page 22), HITERM (High-Performance Computing and Networking for Technological and Environmental Risk Management, see page 26) and SIMTRAP (HPCN Simulation of Traffic Induced Air Pollution Using Parallel Computing in a Distributed Network, see page 27). More information about the DYMOS project is available at http://www.first.gmd.de/applications/proj/dymos_more.html

■Please contact:

Achim Sydow – GMD
Tel: +49 30 6392 1813
E-mail: [email protected]

Snapshot of a movie for the dynamic visualisation (helicopter flight) of near-surface ozone concentrations in the Berlin region.


Development of an Integrated Environmental Monitoring and Simulation System
by Peter Mieth, Matthias L. Jugel, Steffen Unger and Bertram Walter

The project ECOSIM (Ecological and environmental monitoring and simulation system for management decision support in urban areas) will provide a prototype for an integrated environmental monitoring and simulation system. Through a distributed client/server architecture it integrates data acquisition from an on-site environmental monitoring network, data analysis and visualization tools, a range of numerical simulation models of different complexity to assess the air, marine water and groundwater quality after releases of pollutants, and an embedded expert system. ECOSIM is a European cooperation project funded by the European Commission within the Telematics Applications Programme.

The ECOSIM system builds on a set of sophisticated, state-of-the-art simulation models (eg air chemistry models). It exploits existing knowledge in environmental management by connecting these models with an advanced Geographical Information System (GIS), on-line monitoring networks, local databases, and an embedded expert system.

This allows, for example, a direct link between the models, up-to-date measurements of current meteorological conditions, and actual pollutant background concentrations. ECOSIM is also able to consider different environmental domains such as air, surface water and coastal water to ensure an analysis as complete as possible.

The following simulation models are presently implemented in the ECOSIM system:

• the non-hydrostatic, prognostic, grid-based (Eulerian) mesoscale 3D wind field model MEMO

• the dispersion and air chemistry part of the DYMOS system (Dynamic Models for Smog Analysis), responsible for the advective transport, turbulent diffusion and surface uptake of the air pollutants

• the mesoscale ozone forecast model REGOZON

• the modular, finite-difference based 3D ground water model MODFLOW-96

• the prognostic 3D ocean model POM, responsible for the temperature, velocity and salinity distribution in shallow water as well as in deep ocean domains.

The ECOSIM system integrates data acquisition and monitoring systems, GIS, and dynamic simulation models in a flexible client-server architecture. In general, the different model components run at different locations on different computers. They are connected via the Internet and controlled by a main server located at the user's office. The simulation results can be visualized on the server as well as on any remote client's platform.

ECOSIM Applications

Besides the GMD Institute for Computer Architecture and Software Technology, Berlin, ECOSIM involves participants from Austria, Greece, Italy and Poland. ECOSIM will be developed and tested in close collaboration with three city authorities with differing environmental problems and state-of-the-art approaches to environmental management. ECOSIM will be used within the city of Berlin, Germany, to provide an improved method for modelling the level of ozone, which depends mainly on traffic emissions.

For the city of Athens, Greece, similar problems will be addressed, as well as the interplay between surface and ground water and coastal pollution. ECOSIM will also be applied in a more limited manner in the city of Gdansk, Poland.

More information about ECOSIM is available at http://www.first.gmd.de/applications/proj/ecosim_more.html

■Please contact:

Peter Mieth – GMD
Tel: +49 30 6392 1819
E-mail: [email protected]

ECOSIM user interface.


Air Quality Modelling for the Madrid Region
by Roberto San José

The Environmental Software and Modelling Group of the Computer Science School of the Universidad Politecnica de Madrid (UPM) is conducting research in the field of air quality modelling. The main modelling research activity is focused on the development of prognostic air quality models comprising different modules. The final purpose is to develop a research code which is able to forecast the three-dimensional air concentrations over urban and regional environments in the short range, one to seven days.

The integrated air quality modelling system (ANA) will include a high quality land use classification, detailed anthropogenic and biogenic emissions over the test domain, high resolution altitude values, external meteorological information for boundary conditions, initial meteorological soundings, initial surface pollution information, and the use of updated meteorological and pollution information in on-line mode.

To date, the group has developed an operational version of the ANA model which is called EMMA. This work is carried out in the framework of the EMMA project supported by the European Union Telematics Applications Programme.

The EMMA model includes the EMMA-R part, which forecasts the primary and secondary pollutants over the region in which the urban area is embedded, and the EMMA-U part, which includes a high spatial and temporal resolution model for the urban area. Both models run simultaneously; the EMMA-U model requires the information from EMMA-R. This version of EMMA is the so-called nested EMMA model. A test version of the EMMA model has been used by the Madrid Community and the Madrid City Council since 1997.

The group is also involved in different DGXII projects of the European Union which include complex and extensive field campaigns. These campaigns have been carried out in the Madrid Community and focus on the determination of the SO2, NOx, NH3 and O3 deposition fluxes. The NH3 Amanda gradient system allows the determination of bi-directional net emission/deposition fluxes. The fast ozone sonde and the sonic anemometer allow the eddy correlation fluxes to be measured directly. The rest of the instrumentation applies the indirect gradient method for measuring the net deposition fluxes.

Current projects are focused on embedding satellite data, surface meteorological information, meteorological soundings and regional and metropolitan air monitoring networks in the 'spin-up' period of prognostic air quality models such as ANA or EMMA. The importance of the data assimilation module has been clearly stated in many different scientific contributions in the past. The group is working intensively on this important part in order to improve the predictions significantly.

The group is particularly interested in operational versions of the air quality models (ANA, EMMA, etc.) which use the information provided by those instruments (satellites, soundings, pollution networks, etc.). The ACS scheme (Analysis Correction Scheme from the UK Meteorological Office) is about to be incorporated into the model system. In addition, the MM5 model (PSU/NCAR) is also in use in the laboratory. The advantage of using a combined MM5/ANA-EMMA version is clear, since the period over which the model must be run in order to produce 24-48 hour forecasts is between 120 and 168 hours.

The need for meteorological information from areas up to about 2000 km away from the domain where the air quality model is run is completely covered by using this combined version, since ANA is not applicable over domains greater than 400-500 km. This version should be operational at the end of this year. In addition, the laboratory is working with the Urban Airshed Model (UAM) from the EPA (Environmental Protection Agency, USA), which is intended to be linked to MM5 in order to carry out simulations over Europe and to produce proper boundary conditions for the ANA model. The laboratory has also developed a Lagrangian Particle Model (LAGMO), which will be used to predict particulate matter over urban and regional domains and is to be incorporated into the ANA system in the near future. Finally, the laboratory is incorporating into the ANA air quality system a Gaussian model based on the ISCT3 (EPA, USA). The user, through the tcl/tk visualization system, will be able to select the type of simulation he prefers, and for specific domains it will be possible to run several thousands of Gaussian models, which will produce detailed maps at the sub-grid scale of the Eulerian approach for traffic or industrial applications. The user will also be able to select the LAGMO model for selected sub-grids.

EMMA Domain Topography.

The possibility of having a detailed prediction over very fine areas (by using the LAGMO and GAUSSMO models for these specific areas) is an important advance for this type of model. Finally, WWW capabilities are being integrated into the ANA air quality system in order to use up-to-date meteorological on-line data in quasi-real-time processes.

■Please contact:

Roberto San José – Universidad Politecnica de Madrid
Tel: +34 91 336 7465
E-mail: [email protected]

Computing the Radiative Balance in Mountainous Areas: Modelling, Simulation and Visualization
by Jean-Luc Borel, Yves Durand, François Sillion and Christophe Le Gal

In mountain climatology and plant ecology, the estimation of the thermal action of the relief and the calculation of the radiative budget of the slopes constitute an unavoidable prerequisite. Two image synthesis techniques have been used for the computation of the radiative balance in mountainous areas, namely the painter's algorithm and the radiosity model. This approach makes it possible to take into account, on a fine scale, the complexity of the interactions between the incident solar radiation (direct but also diffuse) on the one hand and the local topography on the other.

The research project presented here is part of the development of a high-resolution regional climate model. One of the main aims is the characterization and quantification of the influence on the interface fluxes of the land surface properties, especially those related to the vegetation cover (albedo, roughness length, emissivity, evapotranspiration, ...). In mountainous areas, however, altitude, aspect and slope greatly influence all the climatic parameters. Mountains, owing to the obstacle they present to solar radiation, may significantly modify the energy fluxes. Two problems have been tackled: first, the impact on the direct solar radiation of the shadowing induced by a topographic mask effect, resulting in a so-called cast shadow; then the reflection phenomena between opposite mountain sides, particularly when the ground has a high albedo, for example when it is covered with snow (Figure 1).

From a methodological point of view, noting that the distribution of light energy is one of the most frequently addressed problems in image synthesis, we deliberately chose to explore the possibilities afforded by the techniques classically used in this field. This approach is therefore rather different from the topographically based solar radiation models already available. The project has led to a transdisciplinary collaboration between a plant ecologist, a mountain climate specialist and an image synthesis specialist. The research work was carried out over the last year by a computing engineer, in the framework of a university degree (DEA IVR, INPG-UJF, Grenoble).

The Cluses-Chamonix area, in the northern part of the French Western Alps (Haute Savoie), has been chosen as the study area (55.5 km x 20 km). The spatial resolution of the digital elevation model (DEM) for this study area is 75 m x 75 m. Concerning the meteorological data, a database has been set up for this study area with the output values of the SAFRAN analysis system, a numerical tool developed by the Centre d'Etudes de la Neige of Météo France. SAFRAN has provided, since 1 August 1981, an hourly estimation, above the ground surface, of the values of nine meteorological parameters for each 300 m altitudinal section and for seven aspects; this includes, for example, the two components of the incoming shortwave solar radiation: the direct incident solar radiation and the diffuse incident solar radiation. Considering the key role played by the snow in the distribution, composition and structure of the different vegetation communities in mountainous areas at mid and high altitudes, a physically-based snow cover simulation model has been developed. Moreover, snow cover is highly sensitive to climate variations; this sensitivity to changes in the radiative budget has been used to test the efficiency of the various algorithms selected. The computation of the radiative budget at each point of a mountain massif slope first of all requires taking into account the impact of the topographic masks and of the cast shadows induced by the surrounding relief on the direct incident solar radiation. The different available algorithms (ray tracing, ray casting, floating horizon, Z-buffer and the painter's algorithm) have been systematically tested according to several criteria such as, for example, computer capacities, computing costs and display possibilities.

A preliminary study performed with a classical ray tracing method for a day chosen at random (9 March 1985) shows that the shadowing may cause a reduction of the received direct incident radiation ranging between 20 and 60%! Owing to its small cost, the computation and the visualization of the interception by the surrounding relief of the direct incident solar radiation is now performed with the so-called inverted version of the painter's algorithm. The painter's algorithm is a depth-sort algorithm which displays the polygons in order of decreasing distance from the observer, from back to front. After implementation of the inverted painter's algorithm (increasing distance from the viewpoint), a simulation of a yearly cycle, namely 1990, has been performed. It follows that the integration of the orographic effects in the computation of the radiative balance in mountainous regions may involve, in shaded areas, the appearance and/or the conservation of a snow cover. The impact of cast shadows on the space-time variability of the snow cover clearly appears at low and middle altitudes, while their influence at high altitude remains generally small. These consequences are illustrated by a simulation of the snow cover state for the Cluses-Chamonix study area on 18 January 1990 at 12.00 (Figure 2).
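To make the notion of a cast shadow on a DEM concrete, the sketch below uses a simple ray-marching horizon test: a cell is shadowed if the terrain along the direction towards the sun rises above the line of sight to the sun. This is not the inverted painter's algorithm used in the project (which is the cheaper, display-oriented method described above); the DEM, cell size and sun angles are invented toy values.

```python
# Illustrative sketch only: deciding whether each DEM cell lies in a cast
# shadow by marching towards the sun and comparing the terrain elevation with
# the height of the sun ray. Toy DEM and sun position; not the project's code.
import numpy as np

def cast_shadow_mask(dem, cell_size, sun_azimuth_deg, sun_elevation_deg):
    """Return a boolean array, True where the cell is shadowed by the relief."""
    az = np.radians(sun_azimuth_deg)
    tan_elev = np.tan(np.radians(sun_elevation_deg))
    step = np.array([np.cos(az), np.sin(az)])     # horizontal step towards the sun
    ny, nx = dem.shape
    shadow = np.zeros_like(dem, dtype=bool)
    for i in range(ny):
        for j in range(nx):
            pos, dist = np.array([i, j], dtype=float), 0.0
            while True:
                pos += step
                dist += cell_size
                ii, jj = int(round(pos[0])), int(round(pos[1]))
                if not (0 <= ii < ny and 0 <= jj < nx):
                    break                         # left the DEM: no blocker found
                if dem[ii, jj] > dem[i, j] + dist * tan_elev:
                    shadow[i, j] = True           # relief blocks the sun ray
                    break
    return shadow

dem = 1000.0 + 500.0 * np.sin(np.linspace(0, 3 * np.pi, 60))[None, :] * np.ones((40, 1))
mask = cast_shadow_mask(dem, cell_size=75.0,
                        sun_azimuth_deg=200.0, sun_elevation_deg=15.0)
print("shadowed fraction:", mask.mean())
```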

For the computation of the reflection phenomena between opposite mountain sides, the radiosity method has been used. Radiosity models are derived from thermal models and are based on the notion of a global energy balance, taking into account the incident light energy, the scattered light energy and the absorbed light energy. They provide an accurate treatment of the light reflections between objects in a view-independent way. These algorithms are based on the conservation of light energy in a closed environment. The use of the radiosity equation relies on the computation of the form factors between surfaces; the form factor may be defined as a coefficient of geometric coupling. As a first step, so as to estimate the possible energy exchanges, a simplified scale model of an enclosed valley chosen in the south-eastern part of the Cluses-Chamonix study area was created. After computation of the form factors between the two opposite mountain slopes, the simulation shows that taking into account the energy exchange between mountain sides may increase the energy balance by about 40% for ground without snow, and even up to 150% for snow-covered ground! The results of the simulation for 18 January 1990 at 12.00 are shown in Figure 3. Incidentally, knowledge of the form factors, and consequently of the visible sky proportion for each facet (the so-called sky-view factor), also allows the interception by the relief of the diffuse incident irradiance to be evaluated. The differences, positive and negative, involved by the integration of the relief effects in the computation of the radiative balance for the study area are summarized in Figure 4.
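The radiosity balance the paragraph describes can be written B_i = E_i + rho_i * sum_j F_ij B_j, where E_i is the directly received irradiance, rho_i the albedo and F_ij the form factor. The sketch below iterates this equation for just two facing slopes; the form factors, direct irradiances and albedos are invented toy numbers, so it only illustrates how a high (snow) albedo amplifies the exchange, not the project's actual scale model.

```python
# Illustrative sketch only: the radiosity balance B_i = E_i + rho_i * sum_j F_ij B_j
# for two facing slopes, solved by simple fixed-point iteration. All values are toys.
import numpy as np

def solve_radiosity(E, rho, F, n_iter=200):
    """Iterate the radiosity equation for a small set of surfaces."""
    B = E.copy()
    for _ in range(n_iter):
        B = E + rho * (F @ B)     # direct term plus reflected light
    return B

E = np.array([300.0, 80.0])       # direct solar irradiance on each slope (W/m^2)
F = np.array([[0.0, 0.3],         # form factor slope 1 -> slope 2 (toy value)
              [0.3, 0.0]])        # form factor slope 2 -> slope 1 (toy value)

for albedo, label in [(0.2, "bare ground"), (0.8, "snow-covered")]:
    rho = np.full(2, albedo)
    B = solve_radiosity(E, rho, F)
    gain = (B - E) / E * 100.0
    print(f"{label}: radiative gain on each slope = {gain.round(1)} %")
```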

The main purpose of this research projectwas methodological. So as to assess asaccurately as possible the impact of thetopography on the radiative budget inmountainous areas, it seemed interesting

Figure 1: Topographic effects on solar radiation fluxes.

Figure 2: Simulation of the state of the snow cover for the Cluses-Chamonix study area on 18 January 1990 at 12.00, including shadowing; in white, shaded areas with a “snow gain”.

Figure 3: Simulation of the state of the snow cover for the Cluses-Chamonix study area on 18 January 1990 at 12.00, including shadowing and energy exchanges between opposite slopes; in black, areas with a “snow loss”.

Figure 4: Simulation of the state of the snow cover for the Cluses-Chamonix study area on 18 January 1990 at 12.00, including shadowing and energy exchanges between opposite slopes; in white, shaded areas with a “snow gain”; in black, areas with a “snow loss”.

In order to assess as accurately as possible the impact of the topography on the radiative budget in mountainous areas, it seemed interesting to explore the possibilities afforded by the techniques used in image synthesis. It was then advisable to establish the relevance and to test the efficiency of the various proposed algorithms. The numerous simulations and visualizations performed for the Cluses-Chamonix study area with the digital elevation model and the snow model did not show obvious errors or inconsistencies. This preliminary research was a necessary first stage. The short-term goal is to integrate the shadowing model into a high-resolution regional climate model so as to estimate more precisely the amount of direct irradiance for each surface location (pixel or group of pixels); in the medium term, the radiosity model will be integrated for the calculation of the energy exchanges between opposite slopes. In parallel, complementary simulations will be performed to characterize the impact of the topography on the duration and intensity of the direct radiation fluxes according to the different aspects and over several time scales (diurnal cycle, seasonal cycle, ...).

■Please contact:

Jean-Luc Borel – Centre de Biologie Alpine, Université Joseph Fourier
Tel: +33 4 7663 5624
E-mail: [email protected]

Yves Durand – Centre d’Etudes de la Neige, Météo France
Tel: +33 4 7663 7902
E-mail: [email protected]

François Sillion – GRAVIR-IMAG
Tel: +33 4 7651 4354
E-mail: [email protected]

Christophe Le Gal – GRAVIR-IMAG, INRIA Rhône-Alpes
Tel: +33 4 7661 5213
E-mail: [email protected]

Environmental Risk Management
by Steffen Unger, Irene Gerharz, Matthias L. Jugel, Peter Mieth and Susanne Wottrich

Within the project HITERM (High-Performance Computing and Networking for Technological and Environmental Risk Management), a new generation of interactive risk assessment and management systems is being developed. Based on HPCN, the system will contain tools for the simulation of the accidental release of hazardous substances and the adaptive routing of hazardous transports. HITERM is a European co-operation project funded by the European Commission within the ESPRIT Programme.

The aim of the HITERM project is to reach better-than-real-time performance for the simulation of the accidental release of hazardous substances into the atmosphere, ground, or surface water. The HITERM system will support the user in the fields of emergency management, emergency planning and emergency training, both for transportation accidents and for (Seveso-class) chemical process and storage plant accidents. Integrating environmental risk criteria and road information, the system will be able to simulate accident scenarios along the preferred route and a number of alternative routes, and to determine the optimal routing (dynamic adaptive routing).
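
Dynamic adaptive routing of this kind can be pictured as a shortest-path search over a road graph whose edge weights combine travel time with an environmental risk penalty. The Python sketch below is purely illustrative; the network, the weights and the weighting factor are hypothetical and are not taken from HITERM.

    import heapq

    def safest_route(graph, start, goal, risk_weight=10.0):
        """Dijkstra search on edges labelled (travel_time, risk)."""
        queue, settled = [(0.0, start, [start])], {}
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == goal:
                return cost, path
            if settled.get(node, float("inf")) <= cost:
                continue
            settled[node] = cost
            for nxt, (time, risk) in graph.get(node, {}).items():
                heapq.heappush(queue, (cost + time + risk_weight * risk, nxt, path + [nxt]))
        return None

    # hypothetical network: node -> {neighbour: (travel_time_min, risk_index)}
    roads = {
        "depot": {"village": (10, 0.8), "bypass": (18, 0.1)},
        "village": {"plant": (12, 0.5)},
        "bypass": {"plant": (9, 0.2)},
    }
    print(safest_route(roads, "depot", "plant"))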

The HITERM simulation system is designed to meet the increasing demand for real-time information in environmental risk management. It can be used as a decision tool or as a training tool, simulating and analysing different risk scenarios and thereby improving the efficiency of emergency planning. Moreover, the dynamic routing of hazardous goods provides an additional safety feature for the environment.

The HITERM System

The HITERM system is based on a distributed client-server architecture, using standard http protocols. It contains:

• real-time data acquisition systems such as transport telematic systems, satellite imagery, weather radar, observation stations including hand-held data acquisition systems, and video input

HITERM user interface.

• HPCN simulation models, eg atmospheric dispersion models

• specialised expertise in the form of multi-criteria optimization systems, rule-based expert systems, or neural nets

• multi-media clients, including local and networked X Windows servers, http clients, mobile clients and hand-held computers.

As the amount of information generated by the simulation system is considerable, its visualization tools are based on sophisticated HPC methods.

Uncertainty Analysis

The field of environmental modelling involves considerable uncertainties, both in terms of model and parameter uncertainty and concerning the input data. For that reason, sensitivity and uncertainty strategies, in the form of Monte-Carlo methods or methods based on Automatic Differentiation, have to be integrated into the system.
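
As a minimal illustration of the Monte-Carlo idea (not of the actual HITERM implementation), the sketch below propagates uncertainty in two hypothetical inputs of a very crude dispersion estimate and reports the spread of the predicted concentration:

    import numpy as np

    def downwind_concentration(release_rate, wind_speed, sigma_y=50.0, sigma_z=20.0):
        """Crude ground-level, centre-line Gaussian plume estimate (g/m^3)."""
        return release_rate / (np.pi * wind_speed * sigma_y * sigma_z)

    rng = np.random.default_rng(42)
    n = 10_000
    release = rng.normal(loc=5.0, scale=1.0, size=n)   # uncertain source term (g/s)
    wind = rng.uniform(low=1.0, high=6.0, size=n)      # uncertain wind speed (m/s)
    conc = downwind_concentration(release, wind)

    print("mean concentration:", conc.mean())
    print("5th-95th percentile:", np.percentile(conc, [5, 95]))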

HITERM Applications

Besides the GMD Institute for Computer Architecture and Software Technology, Berlin, HITERM involves participants from Austria, Italy, Portugal and Switzerland. Among the planned case studies to test and demonstrate the capabilities of the system are:

• dynamic adaptive routing (Portugal): transportation of hazardous cargo (petrol and diesel) by a vehicle fleet

• transportation accident simulation (Gotthard alpine corridor, Switzerland): release of a hazardous substance into the atmosphere or into the ground water

• process plant accident (region of Ponte San Pietro, Northern Italy): leaking or spilling of toxic or flammable substances into the environment.

More information about the HITERM project is available at http://www.first.gmd.de/applications/proj/hiterm_more.html

■Please contact:

Steffen Unger – GMD
Tel: +49 30 6392 1816
E-mail: [email protected]

Integrated Traffic and Air Quality Simulation
by Matthias Schmidt, Birgit Kwella, Heiko Lehmann, Christian Motschke and Ralf-Peter Schäfer

The aim of the project SIMTRAP (HPCN Simulation of Traffic Induced Air Pollution Using Parallel Computing in a Distributed Network) is to create an integrated simulation system for modelling both the traffic and the air quality, particularly for medium-term planning. The system will be able to address most issues of the transport-environment link, such as traffic volumes, demand profile, and fleet composition from the traffic point of view, and (industrial) sources of pollutants, orography of the model area, land use, and meteorological conditions from the air pollution side. SIMTRAP is a European cooperation project funded by the European Commission within the ESPRIT Programme.

The link between vehicular traffic and air pollution is well established. For example, road traffic accounts for more than 70% of the overall nitrogen oxide emissions and more than 55% of the overall volatile organic compound emissions in the Berlin region. For this reason, authorities in urban areas need support in managing road traffic so as to avoid exceeding pollution limits during hot summer periods. To date, much of the modelling work concerning transport and air pollution has concentrated on long-term strategies, where the effects on air quality may be expressed indirectly through energy use and emission reduction.

The increasing demand on the travel system and the tightening of many environmental standards, however, require a stronger focus on developing medium-term strategies, eg strategies for the closure of streets due to adverse meteorological conditions or major reconstruction works, or guidance during special events. For developing medium-term strategies, the demand for integration between the traffic models and the air quality models is highest. Integrated dynamic tools are required to assess alternative strategies and to select the most appropriate one.

The SIMTRAP System

SIMTRAP centres around two well-established core components: the mesoscopic dynamic traffic model DYNEMO (Dynamic Net Model) and the air pollution model system DYMOS (Dynamic Models for Smog Analysis).

Due to the complexity of the simulated processes, both modules are being implemented in a remote HPCN environment. The simulation results will be visualized in a local 3D Geographical Information System with built-in functionality for decision support. Communication is realized using existing computer networks and standard protocols.

The Models Used in SIMTRAP

DYNEMO is a simulation tool for both urban and rural road networks. It has already been used to simulate large parts of the German motorway network.

SIMTRAP System Structure.

Regarding the movement of vehicles, DYNEMO is a mesoscopic model in the sense that it combines properties of both microscopic and macroscopic models. As in microscopic models, the unit of traffic flow is the individual vehicle. The movement of the vehicles, however, follows the approach of macroscopic models and is governed by the average traffic density on the link on which they travel.
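
A toy version of this mesoscopic idea (individual vehicles, but speeds taken from a link-level density-speed relation) can be sketched as follows; the linear Greenshields speed-density relation and all parameter values are assumptions for illustration and are not taken from DYNEMO.

    # Each vehicle is tracked individually (microscopic bookkeeping), but its
    # speed derives from the average density of its link (macroscopic rule).
    V_FREE = 120.0   # free-flow speed, km/h (assumed)
    K_JAM = 150.0    # jam density, vehicles/km (assumed)

    def link_speed(n_vehicles, link_length_km):
        density = n_vehicles / link_length_km
        return max(V_FREE * (1.0 - density / K_JAM), 5.0)  # Greenshields relation

    def advance(positions, link_length_km, dt_h):
        """Move every vehicle on the link for one time step of dt_h hours."""
        v = link_speed(len(positions), link_length_km)
        return [min(x + v * dt_h, link_length_km) for x in positions]

    vehicles = [0.0, 0.2, 0.5, 1.1]          # positions (km) on a 5 km link
    for _ in range(10):                       # ten 1-minute steps
        vehicles = advance(vehicles, link_length_km=5.0, dt_h=1.0 / 60.0)
    print(vehicles)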

DYMOS is a simulation system for analysing the generation, dispersion, and chemical transformation of gaseous air pollutants and different aerosols. The model is well suited to reproducing the most frequently occurring smog situations, in particular summer smog, which is mainly caused by traffic emissions.

SIMTRAP Applications

Besides the GMD Institute for Computer Architecture and Software Technology, Berlin, SIMTRAP involves participants from Austria, Italy and the Netherlands. In order to demonstrate the functionality of the system and to validate the simulation results obtained, the SIMTRAP prototype will be applied by potential users at four test sites: Berlin, Vienna, Milan and the Randstad area in the Netherlands.

All test sites differ considerably in size, geographical conditions, and the traffic network considered, and are therefore well suited to testing the complexity of the SIMTRAP system.

More information about the HITERM project is available at http://www.first.gmd.de/applications/proj/hiterm_more.html

■Please contact:

Matthias Schmidt
Tel: +49 30 6392 1815
E-mail: [email protected]

Land Use and Wind Estimation as Inputs for Air Pollution Modelling
by Dominique Bereziat, Jean-Paul Berroir, Sonia Bouzidi, Isabelle Herlin and Hussein Yahia

This presentation investigates the use of remote sensing image processing techniques to estimate input data for air quality models. These models actually use a lot of input data, whose collection can be inaccurate or costly. Thanks to their high spatial and temporal resolutions and their rich spectral content, remote sensing data can be analysed to estimate some input data in an objective and accurate way. This research is led at INRIA (AIR project) in collaboration with the GMD Institute for Computer Architecture and Software Technology (FIRST), Berlin, within the ERCIM working group Environmental Modelling.

Air quality models require a lot of different input data. The collection of these data can be inaccurate or costly and is one of the major limitations to the operational use of air quality models. Remote sensing data have a great potential to enhance input data collection thanks to their high spatial and temporal resolutions and rich spectral content. Remote sensing data can be analysed to estimate input data in an objective and accurate way.

Input data required by air quality models can be classified into three main branches:

• site documentation data give information about the site’s orography and land use; remote sensing techniques can be used to produce these data automatically, thus making the adaptation of a model to a new site easier

• model initialisation, boundary conditions and synoptic data include cloud cover, wind profiles and soil temperature; these data have to be known very accurately when running a forecast or a simulation; they are, however, estimated from a sparse network of ground stations, and the use of remote sensing data for their estimation has not been evaluated yet

• model parameterisation data include the values of the various physical parameters involved in air quality modelling, for instance surface characteristics (temperature diffusivity, emissivity, albedo, aerodynamic roughness length). These parameters are most often assigned a constant value according to the land cover, regardless of seasonal changes and spatial variability. Here again, satellite data can be used to achieve an objective parameter estimation.

In the following, we briefly detail three examples showing the ability of image processing techniques to estimate input data for air quality models: land use spatio-temporal mapping using meso-scale and high resolution data; wind estimation by optical flow techniques; and cloud cover estimation using deformable particle systems for cloud tracking.

Land use mapping has already been widely assessed, and some techniques, such as maximum likelihood classification, now belong to the state of the art.

Figure 1: Example of land use identification on the basis of its temporal behavior.

However, these techniques produce a static land use map and do not take into account the effect of seasonal changes. Consequently, it is not possible to estimate varying surface parameters. For instance, the vegetation index (NDVI) evolves with the seasons, and this parameter greatly influences surface characteristics such as the aerodynamic roughness length. Meso-scale optical sensors (NOAA-AVHRR or VEGETATION) provide interesting surface measurements on a daily basis, but with a coarse spatial resolution (1 km); each pixel therefore contains different types of land cover. Pixel unmixing is a technique for estimating the daily reflectance of each type of land cover, provided the land use is known at high spatial resolution on some selected learning sites. An interesting result is that it is possible to map land use on the basis of the temporal behavior of the land cover. For this purpose, we have developed a data fusion scheme between high resolution (SPOT-XS) and meso-scale (NOAA-AVHRR) sensors to map land use with both high spatial and high temporal resolution. See an example in figure 1.
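
The unmixing step can be pictured as a linear least-squares problem: each coarse pixel's reflectance is modelled as the land-cover fractions (known from the high-resolution map) multiplied by the unknown per-class reflectances. The Python sketch below is a hypothetical illustration of that idea, not the project's actual processing chain.

    import numpy as np

    rng = np.random.default_rng(1)

    # Known land-cover fractions of 200 coarse (1 km) pixels over 3 classes,
    # eg forest / crops / urban, derived from a high-resolution map.
    fractions = rng.dirichlet(alpha=[1.0, 1.0, 1.0], size=200)

    # "True" daily reflectance of each class (unknown in practice) and the
    # observed coarse-pixel reflectances with some sensor noise.
    true_reflectance = np.array([0.05, 0.25, 0.15])
    observed = fractions @ true_reflectance + rng.normal(0.0, 0.01, size=200)

    # Pixel unmixing: recover the per-class reflectances by least squares.
    estimated, *_ = np.linalg.lstsq(fractions, observed, rcond=None)
    print("estimated class reflectances:", estimated)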

Wind can be estimated using meteorological sensors (eg METEOSAT) that deliver images every 30 minutes, thus making it possible to capture the apparent air displacement. The most popular technique for analysing motion in image sequences is called optical flow. It is based on the hypothesis that the intensity of a moving pixel remains constant over time. Expressed in a Euler-Lagrange framework, this hypothesis makes it possible to design a numerical scheme that yields dense velocity (or wind) fields. This assumption is, however, unrealistic when analysing cloud displacement. A specific model has been developed for that purpose: we state that the volume of the air column under the clouds remains constant over short periods.

This assumption leads to a new equation close to the optical flow constraint, with an additional term that models the variation of the clouds’ area. The results are better than those obtained using optical flow, since the clouds’ expansion is correctly estimated (see figure 2).
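
For reference, the classical brightness-constancy (optical flow) constraint for an image I(x, y, t) and a velocity field (u, v) reads as below; the second line is only a plausible sketch of how an area/volume term may enter (a continuity-equation form), not the exact formulation used by the authors:

    \frac{\partial I}{\partial x}\,u + \frac{\partial I}{\partial y}\,v + \frac{\partial I}{\partial t} = 0

    \frac{\partial I}{\partial x}\,u + \frac{\partial I}{\partial y}\,v + \frac{\partial I}{\partial t} + I\left(\frac{\partial u}{\partial x} + \frac{\partial v}{\partial y}\right) = 0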

Deformable structure tracking, when applied to clouds, is a technique that makes it possible to estimate cloud cover with good accuracy. The difficulty is that clouds usually undergo very important deformations during their motion; they can even split or merge. This causes most classical tracking methods to fail. We have developed an adapted deformable model, based on a particle system controlling a level set, that models the cloud’s boundary. This model is governed by an internal energy, controlling the elastic behavior of cloud motion, and an external one, controlling the adequacy between the model and the image data. This model is able to track clouds even in the case of topological changes (split and merge) and even if the motion is fast (see figure 3).

As shown by the three examples above, image processing techniques applied to dynamic remotely sensed data can be successfully used to estimate relevant input data for air quality models. Future perspectives concern the evaluation of these techniques with regard to the gain in model accuracy and model automation.

■Please contact:

Isabelle Herlin – INRIA
Tel: +33 1 3963 5621
E-mail: [email protected]

Data Mining applied to Air Pollution
by Brian J Read

An understanding of the behaviour of air pollution is needed to predict it and then to guide action to ameliorate it. Calculations with dynamical models are based on the relevant physics and chemistry. To help with the design and validation of such models, a complementary approach is described here. It examines data on air quality empirically by data mining, using in particular machine learning techniques, aiming for a better understanding of the phenomenon and a more direct interpretation of the data.

The work of the Database Group at CLRC has long concentrated on the practical application of data management technology. The emphasis is on helping users at the laboratory and externally to exploit the value in data. Implementing databases and providing easy access is the basis for this, supplemented by data exploration and decision support tools. More recently, interest has extended to data mining, or more fully Knowledge Discovery in Databases (KDD). This may be defined as “the non-trivial extraction of implicit, previously unknown and potentially useful knowledge from data”. Data mining is just the discovery stage of the whole KDD process.

Figure 2: Wind estimation with the volume conservation method.

Figure 3: Level set model of a breaking cloud.

Indeed, most of the work in practice lies in the preparatory stages of data selection and data cleaning. Extensive data exploration is essential if the data mining is to yield intelligent results.

Data mining is multi-disciplinary: it covers expert systems, database technology, statistics, machine learning (“AI”) and data visualisation. It goes beyond directed querying of a database (eg by using SQL) by looking for hypotheses or questions rather than detailed answers. Most interest is in mining commercial data, for example credit profiling or market basket analysis. However, it is starting to be used in scientific applications too. CLRC, as a leading research laboratory, has masses of data. Thus there is the motivation to see how data mining techniques might supplement the more traditional scientific analysis in formulating and testing hypotheses. Of specific interest are the induction of rules and neural net models. Considering environmental data, the measurement and possibly the control of air pollution is increasingly topical. In applying the KDD process, our objectives are two-fold:

• to improve our understanding of the relevant factors and their relationships, including the possible discovery of non-obvious features in the data that may suggest better formulations of the physical models

• to induce models solely from the data so that dynamical simulations might be compared with them, and so that they may also have utility in themselves, offering (short-term) predictive power.

The investigation uses urban air quality measurements taken hourly in the City of Cambridge (UK). These are especially useful since simultaneous weather data from the same location are also available. The objectives are, for example, to look for and interpret possible correlations between each pollutant (NO, NO2, NOx, CO, O3 and PM10) and (a) the other pollutants, (b) the weather (wind strength and direction, temperature, relative humidity and radiance), looking in particular for lags – that is, one attribute seeming to affect another with a delay of perhaps hours or days. Other factors are possible: clearly noticeable, for example, is lower NOx on Sundays owing to less traffic.

The initial analysis concentrated on the daily maxima of the pollutants. This simplifies the problem, the results providing a guide for a later full analysis. The peak values were further expressed as bands (eg low, medium and high). The bands are directly related to standards or targets recommended by the Expert Panel on Air Quality Standards (EPAQS) that the public can appreciate. The two principal machine learning techniques used are neural networks and the induction of rules and decision trees. Expressing their predictions as band values makes the results of such rules and models easier to understand.
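
The banding step can be illustrated with a few lines of Python; the threshold values below are purely hypothetical placeholders, not the EPAQS standards.

    import numpy as np

    def band(daily_max, low_limit, high_limit):
        """Map a daily pollutant maximum onto the bands low / medium / high."""
        if daily_max < low_limit:
            return "low"
        if daily_max < high_limit:
            return "medium"
        return "high"

    # hypothetical hourly ozone record for one week -> daily maxima -> bands
    hourly = np.abs(np.random.default_rng(7).normal(40, 15, size=7 * 24))
    daily_max = hourly.reshape(7, 24).max(axis=1)
    print([band(x, low_limit=50.0, high_limit=90.0) for x in daily_max])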

Work so far supports the common experience in data mining that most of the effort goes into data preparation and exploration. The data must be cleaned to allow for missing and bad measurements. Detailed examination leads to transforming the data into more effective forms. The modelling process is very iterative, using statistics and visualisation to guide the strategy. The temporal dimension, with its lagged correlations, adds significantly to the search space for the most relevant parameters.

More extensive investigation is needed to establish under what circumstances data mining might be as effective as dynamical modelling. (For instance, urban air quality varies greatly from street to street depending on buildings and traffic.) A feature of data mining is that it can short-circuit the post-interpretation of the output of numerical simulations by directly predicting the probability of exceeding pollution thresholds. More generally, data mining analysis might offer a reference model in the validation of simulation calculations.

■Please contact:

Brian J Read – CLRC
Tel: +44 1235 44 6492
E-mail: [email protected]

Operational Air Pollution Modelling
by Tiziano Tirabassi

Air quality management and protection require a knowledge of the state of the environment encompassing both cognitive and interpretative elements. The monitoring network, together with the inventory of emission sources, is of fundamental importance for constructing the cognitive framework, but neglects the interpretative one. Efficient air pollution management must have at its disposal interpretative tools that are able to extrapolate in space and time the values measured where the analysers are located. Improvements to the atmosphere, however, can only be obtained through plans that reduce emissions, ie by means of instruments (such as air pollution models) able to link the causes of pollution (source emissions) to the related effects (air pollution concentrations). We describe a set of air pollution models developed at the Istituto per lo Studio dei Fenomeni Fisici e Chimici della Bassa e Alta Atmosfera (FISBAT-CNR).

In practice, most estimates of the dispersion of air pollution from continuous point sources are based on the Gaussian approach. A basic assumption of this approach is that the plume is dispersed by homogeneous turbulence. However, due to the presence of the ground, turbulence is usually not homogeneous in the vertical direction.
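
For orientation, the classical Gaussian plume estimate for a continuous point source of strength Q at effective height H (including reflection at the ground) can be written down in a few lines. The sketch and its parameter values are generic textbook material, not one of the FISBAT codes:

    import numpy as np

    def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
        """Concentration (g/m^3) downwind of a continuous point source.

        q: emission rate (g/s), u: wind speed (m/s), h: effective source
        height (m), sigma_y / sigma_z: lateral and vertical dispersion
        parameters (m) at the downwind distance of interest.
        """
        lateral = np.exp(-y**2 / (2 * sigma_y**2))
        vertical = (np.exp(-(z - h) ** 2 / (2 * sigma_z**2))
                    + np.exp(-(z + h) ** 2 / (2 * sigma_z**2)))
        return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # hypothetical stack: 10 g/s, 3 m/s wind, receptor on the plume axis at ground level
    print(gaussian_plume(q=10.0, u=3.0, y=0.0, z=0.0, h=50.0, sigma_y=80.0, sigma_z=40.0))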

Various organisations world-wide are currently introducing new advanced modelling techniques based on the results of recent research on the meteorological state of the boundary layer. These advanced modelling techniques contain algorithms for calculating the main factors that determine air pollution diffusion in terms of the fundamental parameters of the atmospheric boundary layer, such as the Monin-Obukhov length scale and the friction velocity.

Experimental work and modelling efforts have attempted to parameterize the surface fluxes of momentum, heat and moisture in terms of routinely measured meteorological parameters.

Model Codes

Within this framework, several model codes employing a non-Gaussian analytical solution of the advection-diffusion equation have been developed at FISBAT-CNR, Bologna:

• KAPPAG – This model uses an analytical solution based on vertical profiles of wind and diffusion coefficients that are power functions of height. The model can handle multiple sources and multiple receptors, simulating time-varying conditions in which each time interval (eg 1 hour) is treated as a stationary case. The model output is a statistical summary of the concentrations computed at each receptor, during each time step, and due to each source. Partial and total concentrations are computed for hourly and multi-hour averages. Highest and second-highest values are also evaluated.

• KAPPAG-LT – This is the climatological version of KAPPAG, insofar as it produces seasonal and/or annual mean concentrations.

• CISP – This screening model provides a method for estimating maximum ground-level concentrations from a single point source as a function of turbulence intensity and wind speed. It is designed for the low-cost, detailed screening of point sources in order to determine maximum one-hour concentrations and is regarded as a useful tool for screening analysis, being a relatively simple estimation technique that provides a conservative estimate of the air quality impact.

• VIM – This is a screening model for estimating maximum ground-level concentrations as a function of turbulence intensity, wind speed and wind direction in an area with many emission sources. It is considered a useful tool for screening analysis, as it constitutes a relatively simple evaluation technique that provides a conservative estimate of the air quality impact of a specific multi-source area, and a model for the evaluation of the maximum ground concentrations produced by many emission sources.

• MAOC – This is a model for the evaluation of pollutant concentrations in complex orography. The simulation of the terrain-induced distortion of flow streamlines is accounted for by modifying the effective plume height.

• VHDM – This practical model evaluates ground-level concentrations from elevated sources, utilising a Fickian-type formula in which the source height is a simple function of the vertical profiles of wind velocity and eddy diffusivity. The model accepts experimental profiles of the above parameters, as well as the theoretical profiles proposed in the scientific literature, such as the vertical profiles of the wind and eddy diffusion coefficients predicted by similarity theory. In the latter case, the model can be applied routinely using as input simple ground-level meteorological data acquired by an automatic network.

• M4PUFF – A puff model in which the pollutant concentration in puffs is described through the first four moments of its spatial distribution. The model is based on a general technique for solving the K-equation using the truncated Gram-Charlier expansion of the concentration field and a finite set of equations for the corresponding moments. The type A Gram-Charlier expansion is a classical method for approximating a given distribution with moments of any order, basically consisting of a truncated expansion in terms of Hermite functions whose coefficients are chosen so as to reproduce the sequence of moments of the distribution up to a given order.

• SPM – This is a practical model based on similarity theory for the dispersion of skewed puffs. It utilises approximate solutions proposed for the dispersion of a cloud of passive contaminant released from an instantaneous source near the ground; particular importance is accorded to describing the interaction between wind shear and vertical diffusion and the process that transforms shear-produced skewness into diffusive variance in the wind direction.

The SPM model has been used in a project between Italy and Brazil to investigate the dispersion of radionuclides from the Brazilian Navy’s industrial installation ‘Centro Experimental ARAMAR’ (CEA), located in Iperó, in a rural region of the State of São Paulo.

The performance of the models has been assessed successfully in cases of both ground-level sources, using data from the Prairie Grass experiment, and elevated sources, using data from the EPRI experiment at the Kincaid power plant and from the Copenhagen data set. In addition, performance in the case of complex terrain has been evaluated using wind tunnel data from the Fluid Modelling Facility of the US Environmental Protection Agency, Research Triangle Park (North Carolina).

■Please contact:

Tiziano Tirabassi – FISBAT (CNR)
Tel: +39 051 63 99 601
E-mail: [email protected]

Sustainable Development and Integrated Assessment
by Achim Sydow, Helge Rosé and Waltraud Rufeger

A new research project on sustainable development was initiated this year by the Hermann von Helmholtz Association of German Research Centers. In the course of this project, interactive and integrated simulation tools for modelling complex environmental systems will be developed at the GMD Institute for Computer Architecture and Software Technology. The integrated assessment approach of modelling complex systems at different levels of a hierarchy is introduced as a possible tool to address questions connected with the problem of sustainable development and global change.

With the third millennium approaching, sustainable development is becoming an important concept for investigating different possible scenarios of our future: is it possible to realise a development that meets the needs of the present as well as those of future generations? This problem is closely related to the natural resilience and buffer capacity of the biosphere in responding to the impact of human development, ie the question of global change.

The progress of human development is becoming increasingly dependent on the environment and may be restricted by its future deterioration. The problem of global change is complex in nature, represented by various interactions that operate on different spatial-temporal scales. Addressing these issues demands an integrative consideration of all relevant interactions between humans and the environment.

The reductionistic approach has failed to provide an adequate analysis of complex, large-scale global phenomena. A more promising route seems to be a more holistic, integrated approach, based on a systems-oriented analysis, which concentrates on the interactions and feedback mechanisms between the different subsystems.

Since the 1970s there has been a growing interest in an integrated approach to the problem of global change, called integrated assessment. In general, integrated assessment can be defined as an interdisciplinary process of combining, interpreting and communicating knowledge from diverse scientific disciplines in such a way that the whole cause-effect chain of a problem can be evaluated from a synoptic perspective with two characteristics: it should have added value compared to single-discipline assessment, and it should provide useful information to decision-makers.

Integrated assessment is an iterative, continuing process in which, on the one hand, integrated insights from the scientific community are communicated to the decision-making community and, on the other hand, experiences and learning effects from decision-makers form the input for scientific assessment. In Europe, integrated assessment has its origins in population-environment, ecological and acidification research. In North America, attention was mainly concentrated on economic modelling. In recent years, integrated assessment models have increasingly focused on climate change and sustainable development.

The main questions under consideration today are: human health management and population growth, the management of fossil fuels and renewable energy resources, the safeguarding of food and fresh water, and the stability of the biogeochemical cycles with respect to human perturbations.

To model these complex systems, the Pressure-State-Impact-Response concept may be used as an organizing principle for achieving a plausible division of the cause-effect chains into subsystems. The Pressure System represents the social, economic and ecological driving forces underlying the pressure on the human and environmental system.

The State System represents physical, chemical and biological changes in the state of the biosphere, as well as changes in human population and resources/capital. The Impact System represents social, economic and ecological impacts resulting from human and/or natural disturbance. The Response System represents human intervention in response to ecological and societal impacts.

Modern integrated assessment approaches use a hierarchic structure to overcome the complexity problem of modelling. At the lower level of aggregation, economy-energy models operate in multi-year time steps with national or regional political boundaries. Their regional and distributed output data may be used in theme-specific models like RAINS (Regional Acidification Information and Simulation) or IMAGE (Integrated Model to Assess the Greenhouse Effect) at the next level of aggregation. The results of their characteristic simulations provide a guideline for setting up metamodels, which establish the building blocks for integrated assessment models, eg the TARGETS system (Tool to Assess Regional and Global Environmental and Health Targets for Sustainability) of the National Institute for Public Health and the Environment (RIVM), The Netherlands, at the highest level of aggregation.

This hierarchic system of models describes the problem at different levels of aggregation and integration. It therefore provides a flexible framework of simulation tools which can help to answer the multifaceted questions regarding the understanding and management of complex environmental systems. Based on this concept, the GMD Institute for Computer Architecture and Software Technology will develop an interactive and distributed modelling system intended to tackle questions connected with the problem of sustainable development.

■Please contact:

Achim Sydow – GMD
Tel: +49 30 6392 1813
E-mail: [email protected]

The Modelling and Control of Living Resources
by Jean-Luc Gouzé

The aim of the new research team COMORE (Modelling and Control of Renewable Resources) at INRIA Sophia-Antipolis is to apply methods from control theory (feedback control, estimation, identification, optimal control, game theory) and from the theory of dynamic systems to the mathematical modelling of exploited living resources (renewable resources) and their management. COMORE is a joint project with the CNRS Villefranche-sur-Mer. The main research themes are ‘dynamics and control of fishery and aquaculture’, ‘modelling and control of bioreactors’, ‘modelling and control of phytoplankton growth’, ‘modelling of forest dynamics’ and ‘mathematics of biological modelling’.

COMORE is interested in the mathematical modelling of biological systems, more particularly of ecosystems subject to human action (the framework is thus that of renewable resources). It is now clear that it is important to know how to model and regulate the exploitation of these resources by man. Our conceptual framework is that of control theory: a system, described by state variables, with inputs (actions on the system) and outputs (the measurements available on the system). In our case, the system will be an ecosystem, described by a mathematical model (generally a differential equation). Its variables will be, for example, the number or the density of populations. The inputs could be the actions exerted on the ecosystem: eg human action (fishing effort, introduction of food, etc) or the action of an external factor (pollution, light, etc). The outputs will be some product that can be collected from this ecosystem (harvest, capture, production of a biochemical product, etc), or some measurements (the total number of individuals, for example).

This approach begins with the mathematical modelling of the system. This stage is fundamental and difficult, because one does not have rigorous laws as in physics. It is then necessary to study the properties of this mathematical system, which often has some particularities. Let us take a simple example: in reality, the variables are positive because they are populations; is the same true in the mathematical system?

One seeks to study the qualitative behavior of the system: the existence of equilibria, their stability, the existence of periodic solutions, and so on. These qualitative questions are fundamental because they tell us whether or not the model is viable (the model does not predict the extinction of any species, etc). Specific problems are posed by the biological origin of the models: functions or parameters are uncertain, or unknown; what can we then say about the behavior of the model?

It is necessary to develop new techniques to study these problems. In the same way, the strong structure of the models makes it possible to define classes of systems for which adapted techniques can be developed: for example the well-known Lotka-Volterra models in dimension n, describing the interactions between n species.

Finally, we consider problems of regulation (how to keep a variable at a given level) and observation (how to estimate the state variables from the outputs). The biological origin of the model imposes new constraints on these classical problems.

Modelling of a Chemostat – Growth of Marine Plankton

We work in collaboration with the Station Zoologique of Villefranche-sur-Mer (CNRS), which has developed a chemostat (a small bioreactor in which algae or cells grow on a substrate) that is fully automated and managed by computer; this system is well suited to the application of methods from control theory. Our current work consists of studying and validating growth models for plankton in a variable environment (light, food, etc). The growth of plankton is the basis of all production of organic matter in the oceans (fish, etc); however, the existing traditional models (Monod, Droop) are often unsatisfactory. We seek to obtain models that are valid during the transient stages, away from equilibrium.
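
As a point of reference, the classical Monod chemostat model mentioned above couples substrate s and biomass x through two ordinary differential equations. The short Python sketch below integrates it with a simple Euler scheme; all parameter values are hypothetical.

    import numpy as np

    def monod_chemostat(s0=5.0, x0=0.1, s_in=10.0, d=0.2,
                        mu_max=1.0, k_s=2.0, yield_coef=0.5,
                        dt=0.01, t_end=50.0):
        """Euler integration of the Monod chemostat:
            ds/dt = D (s_in - s) - mu(s) x / Y
            dx/dt = (mu(s) - D) x,   with mu(s) = mu_max s / (k_s + s)
        """
        s, x = s0, x0
        for _ in np.arange(0.0, t_end, dt):
            mu = mu_max * s / (k_s + s)
            ds = d * (s_in - s) - mu * x / yield_coef
            dx = (mu - d) * x
            s, x = s + dt * ds, x + dt * dx
        return s, x

    print("substrate and biomass near steady state:", monod_chemostat())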

Growth of phytoplankton in the chemostat (CNRS, Villefranche-sur-Mer, Laboratoire d’Ecologie du Plancton Marin).

A fundamental problem is that of the validation, or invalidation, of these models: how can one accept, with a certain precision, a model by comparing it with noisy experimental data? The traditional approach, which consists of identifying the parameters of the model by minimizing a criterion of deviation between the outputs of the model and the data, is often inefficient. We are developing new methods that are more pertinent for biologists.

Modelling of a Bioreactor – Observation and Control

Naturally, the topics above lead to the more general problem of the modelling and control of a bioreactor. Bioreactors are of growing importance in many domains related to the human environment: food (fermentation), pharmaceuticals (production of medicines), waste treatment (waste water treatment), etc. Let us mention the problem of observation in these systems, where measurements are rare and expensive: a software sensor, or observer, is a differential system which asymptotically estimates the (unmeasured) variables of the model. It is based on the model and on the outputs of the process.
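
A classical example of such a software sensor for the chemostat sketched earlier is the asymptotic observer: the auxiliary variable z = Y s + x obeys dz/dt = D (Y s_in - z) regardless of the (uncertain) growth rate, so the unmeasured biomass can be reconstructed from the measured substrate as x_hat = z - Y s. The Python sketch below illustrates this idea on simulated data; the parameters are again hypothetical and this is not COMORE's actual software.

    # plant parameters (hypothetical); the growth rate mu is treated as unknown
    D, S_IN, YIELD = 0.2, 10.0, 0.5
    MU_MAX, K_S = 1.0, 2.0
    DT = 0.01

    def step_plant(s, x):
        mu = MU_MAX * s / (K_S + s)            # unknown to the observer
        s_next = s + DT * (D * (S_IN - s) - mu * x / YIELD)
        x_next = x + DT * ((mu - D) * x)
        return s_next, x_next

    s, x = 5.0, 0.1          # true state
    z_hat = YIELD * s        # deliberately wrong initial guess (as if x were 0)
    for _ in range(int(60.0 / DT)):
        s, x = step_plant(s, x)
        # observer uses only D, S_IN, YIELD and the measured substrate s
        z_hat += DT * D * (YIELD * S_IN - z_hat)
    x_hat = z_hat - YIELD * s
    print("true biomass:", round(x, 3), " estimated biomass:", round(x_hat, 3))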

Modelling of Exploited Systems (fishing, forests, etc)

The scale of the problems changes here; data are rare and noisy. We consider some important methodological problems: how to model the stock-recruitment relationship of fish (the relationship between the number of fertile adults and the eggs they produce), and how to optimize the exploitation of fisheries or forests with respect to given criteria.

For more information on COMORE, see http://www.inria.fr/Equipes/COMORE-eng.html

■Please contact:

Jean-Luc Gouzé – INRIA
Tel: +33 4 9238 7875
E-mail: [email protected]

ECHO and NARSYS – An Acoustic Modeler and Sound Renderer
by Nicolas Tsingos and Jean-Dominique Gascuel

Computer graphics simulations are now widely used in the field of environmental modelling, for example to evaluate the visual impact of an architectural project on its environment and interactively change its design. Realistic sound simulation is equally important for environmental modelling. At iMAGIS, a joint project of INRIA, CNRS, Joseph Fourier University and the Institut National Polytechnique of Grenoble, we are currently developing an integrated interactive acoustic modelling and sound rendering system for virtual environments. The aim of the system is to provide an interactive simulation of global sound propagation in a given environment and an integrated sound/computer graphics rendering to obtain computer-simulated movies of the environment with realistic and coherent soundtracks.

Computer graphics and acoustics research have both given rise to a variety of models which are often based on the same geometrical formalisms. However, despite the growing development of multi-modal simulations, there is a lack of modelling systems offering integrated design of the visual and acoustic properties of dynamic virtual environments and combined computer rendering of image and sound. Our system attempts to fill this gap by reading in a 3D computer model of an environment and providing tools to specify the positions and properties of virtual sound sources, microphones and acoustic materials, later used for the sound simulation.

Simulating sound wave propagation is achieved by the ECHO module using an original hierarchical radiant exchange method similar to the radiosity technique used for lighting simulations. The method accounts for the temporal aspect of sound propagation and for both diffuse and mirror (specular) reflections, which are of primary importance. We have also developed original techniques to model sound diffraction phenomena. The hierarchical aspect of our approach allows the complexity and realism of the simulation to be tuned depending on the application and the computing resources, eg a coarser model is used when less computing power is available. Thus, the solution can be updated at interactive rates after each modification of the environment.

Example of a simulation in a concert hall (the Opera de La Bastille in Paris).

The sound simulation is completely integrated in an animation module, used to specify the movement of sound sources, microphones, cameras or any object in the environment over time.

The second part of the system is a sound rendering engine, NARSYS (Natural Audio Rendering SYStem), which is used to filter rough sound samples with the solution obtained from the ECHO module in order to produce a soundtrack with realistic sound reverberation, attenuation, Doppler shifting, multi-channel audio output, etc. Images rendered from camera viewpoints and sound received by microphones can then be combined to create a realistic animated movie with a perfectly coherent soundtrack.

The flexibility of the system makes it well suited for a variety of applications, from generating soundtracks for computer animation movies or interactive sound walkthroughs in virtual environments, to the interactive acoustic design of concert halls or environmental acoustic planning. A collaboration with the acoustics department of the Centre Scientifique et Technique du Batiment (CSTB) in Grenoble has been established to validate our simulation technique on a variety of environments, including several concert and opera halls such as the Opera de la Bastille in Paris. Further information can be found at http://www-imagis.imag.fr/~Nicolas.Tsingos/

■Please contact:

Nicolas Tsingos or Jean-Dominique Gascuel – iMAGIS/GRAVIR-IMAG
Tel: +33 4 7663 5576
E-mail: [email protected], [email protected]

STOY – An Information System for Environmental Noise Planning
by Ole Martin Winnem and Hans Inge Myrhaug

The new international airport at Gardermoen forced the Royal Norwegian Army to relocate its activities to Rena. This move into a new area, with farmers and small villages on both sides of the training area, put major constraints on what training can be performed there. SFT (the Norwegian Pollution Control Authority) has issued rules for the training regarding the time of activity, the noise at the nearest buildings and the area of training. The area where the new training fields are located has cold winters with stationary air down in the valley. This can give a temperature difference of 20°C between the training area in the mountains and the valley below, where the civilian population lives.

Listeners are affected by the time of day when the noise is heard. During the day they can accept a lot of noise, in the evening they accept less, and at night they are very sensitive to noise. The kind of information they are given in advance regarding noisy activities also influences their attitude to the noisemaker. If they are given no information, they tend to become more frightened when loud and warlike noises occur. On the other hand, if a noisy activity is announced in advance as an airplane show, they can accept almost any kind of noise.

The kind of weather and the time of the year are also important factors. During winter, people are much more indoors and thus isolated from the noise. During summer, and particularly when the weather is nice, people are outdoors and therefore much more sensitive.

Under these conditions, the military have to optimise their activity with regard to noise.

The Monitoring and Planning System

STOY (noise) is a project for logging, monitoring, identifying, computing and planning noise.

The system is based on seven components: the User Interface component, the Monitor component, the Estimation component, the Planning component, the Database component, the Logging component and the Noise recognition component.

The military training area. The identified points are measurement stations for noise, temperature and wind.

The Logging component retrieves data from seven stationary and one mobile measurement station for noise and weather conditions, and then sends them to the Database component. The Database component stores the logged data, the calculated data, the recognised data and the data from the Planning component. The Noise recognition component retrieves the logged data from the database and identifies the noise source. The Monitoring component is the day-to-day monitoring system for the user. The Estimation component estimates the noise propagation for a given noise source using a mathematical model that takes terrain formations into account. The Planning component is the support system for activity planning. It has two subparts, one for the generation of plans and one for the critique of plans. The Planning component uses knowledge-intensive case-based reasoning and rules for evaluating the plans.

Other Application Areas

Armed forces all over the world face more and more restrictions on their training. This creates a market for noise-planning systems that help them plan their activities within the restriction thresholds.

Major airports are facing new restrictions on how much noise they can generate, and the airline companies face restrictions on how much noise they may create at a particular airport. Both have a planning problem regarding which type of plane can be used to maximise the number of passengers they can transport.

Industrial noise has become a problem all over the world. An example is fields for the testing of aircraft engines, where the company must plan the activity with consideration for the civilian population in the area.

■Please contact:

Ole Martin Winnem – SINTEF Telecom and Informatics
Tel: +47 73 59 29 93
E-mail: [email protected]

Partial Differential Equations in Porous Media Research
by Hans van Duijn

Mathematical modelling of flow and transport through porous media plays an important role in environmental studies as well as in reservoir engineering. Applications include the spread of pollutants from a landfill through the soil system and of oil spills in the subsurface, the intrusion of salt seawater in coastal aquifers, and new methods for enhanced oil recovery and underground gas storage. At CWI several such problems are studied, ranging from very applied to theoretical research.

Although already intensively studied in the past, transport phenomena in porous media still yield new and challenging mathematical problems in the field of nonlinear partial differential equations (PDEs), free boundary problems, and homogenization procedures. Moreover, detailed numerical studies are necessary to improve the prediction capacity of the related models. CWI’s research focuses on the mathematical modelling of transport processes in the subsurface, and on the qualitative analysis and numerical study of the governing PDEs.

In the field of PDE research, particular attention was given to systems consisting of a convection-diffusion equation coupled with an ordinary differential equation. A special case, modelling salt uptake by mangroves, involves non-local convection. In a collaboration with the University of Bonn, the interface between fresh and salt groundwater in the presence of wells is studied. The interface appears as a free boundary in an elliptic problem in which, depending on the pumping rate of the wells, a cusp singularity develops. In the case of heterogeneous media the interface was studied numerically, using a moving-mesh Finite Element Method. Knowledge of the behaviour of the interface is important, eg, to estimate how far seawater intrudes into aquifers in the Dutch coastal area when drinking water is pumped out. This research was highlighted, for example, at a tele-conference ‘Mathematics and environment: problems related to water’, an initiative of the European Mathematical Society, which was held last December simultaneously at three locations: Amsterdam, Madrid, and Venice.
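
Schematically, such a coupled system has the generic form below, where u is a transported (dissolved) quantity, v an immobile (eg adsorbed) quantity, q the specific discharge and D the dispersion coefficient; this is only an illustrative prototype, not the specific model studied at CWI:

    \frac{\partial u}{\partial t} + \nabla \cdot (\mathbf{q}\, u) = \nabla \cdot (D\, \nabla u) - f(u, v), \qquad \frac{\partial v}{\partial t} = f(u, v)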

The study of density-driven flow in porous media concentrates on brine transport problems related to high-level radioactive waste disposal in salt domes. High salt concentrations give rise to non-linear transport phenomena such as enhanced flow due to volume (compressibility) effects and the reduction of hydrodynamical dispersion due to gravity forces. A non-linear dispersion theory for this case, proposed by S.M. Hassanizadeh (Delft University of Technology), was verified using experimental data from the Technical University of Berlin.

The pressure in active gas reservoirs in the Netherlands is becoming insufficient to meet demand during the winter period.

A fingering pattern of fresh/salt fingers in a porous medium (light color = salt, dark = fresh).

Therefore Gas Unie/NAM (Nederlandse Aardolie Maatschappij) intends to store gas in depleted gas reservoirs which are no longer in production. These reservoirs act as a buffer to meet peak demand. In an ongoing project sponsored by NAM, CWI studies gas injection into a reservoir, in order to understand and quantify the mixing (diffusion/dispersion) of injected gas with residual gas in old reservoirs, and develops a numerical code to predict this mixing.

Another project concerns soil remediation techniques. Organic contaminants may be removed from the soil either by pumping methods or by injecting air (air sparging), which enhances biodegradation and volatilization. The corresponding flow of groundwater, organic contaminant and air is described using multi-phase flow models. For air injection into groundwater in a horizontally layered medium, accurate numerical solutions of the full transient two-phase flow equations were found, and an almost explicit solution for the steady-state air flow just below a less permeable soil layer was derived. To model the pumping of a lens of light organic liquid from an aquifer, multi-phase seepage face conditions were applied at the well boundary. For two different geometries of the lens, similarity solutions provided good approximations of the removal rate and of the location of the remaining contaminant as a function of time.

Finally, in a recently started project, PDEs with higher-order mixed derivatives are studied. Such equations arise in models for unsaturated groundwater flow, taking into account dynamic capillary pressure.

See also: http://dbs.cwi.nl/cwwwi/owa/cwwwi.print_projects?ID=5

■Please contact:

Hans van Duijn – CWI
Tel: +31 20 592 4208
E-mail: [email protected]

Descartes System – Interactive Intelligent Cartography on the Internet
by Gennady Andrienko and Natalia Andrienko

Descartes assists users in the exploration of spatially referenced statistical data, ie economic, demographic or ecological data related to geographical locations. It incorporates knowledge about thematic cartography to automatically generate thematic maps that correctly visualize selected data. It also supports the further, interactive exploration of these maps. Descartes was formerly called IRIS (see ‘IRIS generates Thematic Maps’ by Hans Voss in ERCIM News No. 27, October 1996, page 13).

Current Geographic Information Systems (GIS) specifically support the analysis and manipulation of geometric information. However, they are weak regarding their support for visualizing spatially referenced statistical data. Major shortcomings are:

• Users bear full responsibility for the correct selection of presentation techniques. Consequently, they must have a proficient knowledge of the principles of cartographic presentation, have to spend considerable time and effort on map production, and cannot concentrate entirely on their task of analysis and the problems to be solved.

• GIS are typically complex and difficult to operate; significant training is required.

• GIS do not sufficiently support the further exploitation and exploration of the produced maps, ie their use in data analysis.

Descartes overcomes these shortcomings. It provides intelligent mapping support and a wide spectrum of functions for interactive visual data analysis.

Intelligent Mapping in Descartes

Descartes incorporates knowledge of thematic cartography in the form of generic, domain-independent rules. To choose automatically the adequate presentation techniques for data selected by the user, Descartes takes into account data characteristics such as the types of data fields (numeric, categorical, or logical), value ranges, and relations among the fields, such as comparability, hierarchical inclusion, or their covering a meaningful whole (eg male and female cover the set of all persons).
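
Such mapping rules can be thought of as condition-action pairs over data characteristics. The toy Python sketch below illustrates the flavour of that approach; the specific rules and field descriptors are invented for illustration and are not Descartes’ actual knowledge base.

    def choose_presentation(fields):
        """Pick a thematic-map technique from the characteristics of the selected fields.

        `fields` is a list of dicts like {"name": ..., "type": "numeric" |
        "categorical" | "logical", "comparable": bool}; the rules are illustrative.
        """
        if len(fields) == 1 and fields[0]["type"] == "numeric":
            return "choropleth map (graduated colours)"
        if len(fields) == 1 and fields[0]["type"] in ("categorical", "logical"):
            return "qualitative colouring"
        if all(f["type"] == "numeric" for f in fields):
            if all(f.get("comparable", False) for f in fields):
                return "bar map (comparable bars per area)"
            return "multiple choropleth maps"
        return "table display (no suitable map technique)"

    selection = [{"name": "male", "type": "numeric", "comparable": True},
                 {"name": "female", "type": "numeric", "comparable": True}]
    print(choose_presentation(selection))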

Interactive Visual Data Analysis

Maps on computer screens should not be mere reproductions of paper maps; rather, they should be dynamic and react to the user’s actions. Descartes provides various tools to interactively manipulate the maps so as to make them more expressive and to exploit the potential of specific visualization techniques for data exploration.

An example is the visual comparison technique. Figure 1 demonstrates this technique applied to a bar map that represents the values of a numeric attribute by the heights of bars. The left map enables visual comparison with an interactively selected number, 19 (the median). The bars show how the attribute values for the countries differ from 19: the higher the bar, the greater the difference. Whether a difference is positive or negative is seen from the bar orientation and color. The map is redrawn in real time as the user changes the reference value for the visual comparison, for example by dragging the slider along the number line (see the upper left corner of the figure).

Dynamic Linking and Highlighting

All data displays produced by the system are linked. As the mouse moves through a map or a supplementary non-cartographic presentation, the system highlights the objects touched by the cursor in all currently visible displays. For example, in the figure below a reader can see three linked displays: two maps and a presentation with a dot plot and a “box-and-whiskers” plot (upper left) that summarizes the variation of the values of the percentage of children in the population. The mouse cursor has been positioned on the third dot from the left in the dot plot. As a result, the countries with the corresponding value of the attribute ‘percentage of children’ are highlighted in both maps.

Figure: Various cartographic presentation methods for age categories.

Applications

Descartes has been applied to several sets of statistical data, eg data about European countries, the City of Bonn, and census data for Leicester. Some demonstrators can be accessed and operated remotely at http://allanon.gmd.de/and/java/iris/ A set of thematic maps for the Rhein-Sieg area produced with Descartes is available at http://allanon.gmd.de/and/java/showmap/

Current State and Further Development

Descartes is implemented in Java and is available as a stand-alone program running on personal computers under Windows 95/NT, and as an application based on Java applets running under any Java-enabled Internet browser. Descartes has now been fully operational for more than two years. In September 1996, it was included in the Top 1% web applets and Top 10 web applets lists by the independent Java Applet Rating Service.

A commercial version of Descartes is available from the GMD spin-off company DIALOGIS Software & Services GmbH. Descartes has been integrated with the data mining workbench KEPLER, which is also available from DIALOGIS. Further development of Descartes will be carried out within the CommonGIS project, which was submitted to the Esprit thematic call “Information Access and Interfaces” and accepted for funding.

■Please contact:

Gennady and Natalia Andrienko – GMD
Tel: +49 2241 14 2329
E-mail: [email protected]

Travelling Waves in Nonlinear Diffusion-Convection-Reaction Processes
by Róbert Kersner

Special solutions play an important role in the study of nonlinear partial differential equations (PDEs) arising in mathematical biology. Confronted with a mathematical model in the form of an initial or a boundary value problem for a partial differential equation or a system of such equations, the foremost desire of a practitioner is to solve the problem explicitly. If little theory is available and no explicit solutions are obtainable, the ensuing attack is generally to identify circumstances under which the complexity of the problem may be reduced. The study of these problems belongs to the main research directions of the Applied Mathematics Laboratory at SZTAKI.

In cooperation with the University of Twente, a technique has been developed for determining whether or not a nonlinear, possibly degenerate parabolic partial differential equation admits a travelling-wave (TW) solution, ie a solution of the form f(x-ct), which keeps a fixed shape f and moves with constant speed c. This technique is also suitable for investigating the properties of such travelling waves.
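To make the idea concrete with the simplest example (the classical Fisher equation, quoted here as a textbook illustration rather than as the exact model studied in the project), substituting the travelling-wave ansatz into the PDE reduces it to an ordinary differential equation for the wave profile:

\[ u_t = D\,u_{xx} + r\,u(1-u), \qquad u(x,t) = f(\xi), \quad \xi = x - ct, \]

\[ \Rightarrow \quad D f'' + c f' + r f(1-f) = 0, \qquad f(-\infty) = 1, \quad f(+\infty) = 0, \]

so the existence of a monotonic travelling wave connecting the two states becomes a question about a boundary value problem for an ODE.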

In an ecological context, the different terms in such partial differential equations represent the birth-death process, diffusion and convection. Typical examples are the Fisher or logistic equation, which is the archetypical deterministic model for the spread of an advantageous gene in a population of diploid individuals, the Nagumo equation, which models the transmission of electrical pulses in a nerve axon, and the Richards (or nonlinear Fokker-Planck) equation. Analogous systems of PDEs were proposed as models for the chemical basis of morphogenesis by Turing in 1952.

The principal contention of our method is that the study of the travelling wave solutions of the partial differential equations above is equivalent to the study of singular Volterra-type nonlinear integral equations, in the following sense: the partial differential equation has a monotonic travelling wave solution connecting two states with a given speed if and only if the integral equation has a non-negative solution which is zero at these states.

For the Fisher-type equations a typical result is that travelling wave solutions exist only for speeds not smaller than a minimal speed, and have the form shown in Figure 1. The minimal speed is an unknown of the problem and has great practical importance: in many cases, after some time the general process moves with this velocity.
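For orientation, in the classical Fisher–KPP case the minimal speed is known explicitly – a textbook result quoted here only to show what such an estimate looks like:

\[ c_{\min} = 2\sqrt{rD}. \]

For degenerate or more general equations no such closed formula is available, which is why good estimates of the minimal speed are of practical value.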

It may happen that a travelling wave solution has a sharp front (see Figure 2). This is the case, for example, for the generalised Fisher-type model in which the density dependence of the diffusion coefficient is power-like. The travelling wave solutions with non-minimal speed have no sharp front (see Figure 1), but the unique wave with minimal speed does (see Figure 2). Note that this travelling wave has an attractor property: after some time, all solutions of the original partial differential equation with initial data from a given class take the form and the speed of this wave. Using the above-mentioned integral equation method, such results can be obtained almost algorithmically, together with good estimates of the minimal speed.

Figure 1: The travelling wave solutions exist only for speeds exceeding the minimal speed.

Figure 2: A travelling wave solution with a sharp front.

The technique we have developed provides an alternative to phase-plane analysis and has several advantages over this classical approach. Drawing on results from the theory of integral equations, the user can easily identify circumstances which guarantee the existence of the required waves and can decide whether or not the wave has a sharp front. Models with the same characteristics arise in several scientific fields, for example heat transfer, combustion, reaction chemistry and plasma physics.

■Please contact:

Róbert Kersner – SZTAKI
Tel: +36 1 4665 783
E-mail: [email protected]

ESIMEAU – Water Resources Management and Modelling in Semi-Arid Areas
by Fadi El Dabaghi

The main objective of the ESIMEAU project is to build the infrastructure of an integrated system to be used as a single decision support system for the management of water-related issues. Such an open work platform will embody numerical models and simulators operating on spatial geographic databases, as well as data acquisition and data management tools. The direct beneficiaries are spatial planners and decision makers, who will find in such a system efficient operational facilities for the preparation and the installation of planning and/or emergency measures directly related to water – both as a resource and as a natural hazard – for semi-arid areas.

The main actions to be addressed are:

• development or integration of tools for the management, analysis and treatment of heterogeneous data (acquired in situ or through satellite images), taking into account the semi-arid climate characteristics (variability, heterogeneity, size)

• modelling and analysis of rapid phenomena such as floods and solid transport, or slow ones such as eutrophication, erosion and climate change, and the development of appropriate numerical algorithms for better, more realistic simulations that reproduce the real physical situation of the phenomena under investigation

• integration of these conventionally separately treated application domains into a single application through intelligent interfaces within an appropriate and user-friendly framework.



System Architecture

The architecture of the system to be developed will consist of three levels containing a number of modules. Modules of a general type (such as graphics, database management systems (DBMS), geographic information systems (GIS), shells and simulators) will constitute part of the backbone of the system. The other, specific modules, which take care of particular physical applications (hydrology, hydraulics/hydrodynamics, eutrophication, etc.), will be developed independently. The numerical modules deal with the mechanical treatment of eutrophication, while only a state of the art on the physical modelling of flood/release risk management in watersheds will be delivered. The three levels of the system are:

• at the top level, the ESIMEAU integrator will allow the end user, without prior or deep knowledge of computer technology, to access and use in a user-friendly environment all the operational modules: data consultation, treatment and visualisation, numerical simulation, and post-processing (graphs, iso-curves, thematic maps, etc.)

• at the intermediate level are specific modules for data treatment and numerical simulation/prediction

• at the lower level are located the data access, consultation, modification and visualisation interfaces.

For building the ESIMEAU architecture, general, conventional, standard tools will be used: Geographical Information Systems (GIS) such as ArcInfo, ArcView and Idrisi; Database Management Systems (DBMS) such as Dbase, FoxPro, Access and Oracle; shells such as CLIPS, KAPPA and KEE; simulators such as Powersim, Slam and Cellular Automata based systems; and integrators such as ArcView, KAPPA and, to a lesser extent, Geonamica.

Representation of Spatial Dynamics at Different Geographical Scales

Many environments will generally have at least two strongly linked components: one for macro-level, macroscopic, long-range and large-scale processes (processes at the global level of the region studied), and a second for processes operating at the micro-level, short range and micro-scale (processes at the level of small sub-sections of the region studied). Models at both levels are tightly integrated and exchange information back and forth through a number of state variables specially conceived for this purpose. The system will be supplemented with all the facilities required to be used as a stand-alone application.

The system will be designed and adapted taking into consideration the specifications of its main end-users, ie planners, policy makers and model builders. The integration of models operating at different geographical and temporal scales is important to ESIMEAU. A built-in simulation language and associated modelling tools will ease the implementation and integration of the selected models. A linkage to commercial GIS packages will facilitate the storage and manipulation of geographical data as well as their integration into a single multi-level model.

Figure: Structure of an integrated model representing spatial dynamics at different geographical scales, with linkages between Macro-level, Micro-level and GIS components.

ESIMEAU is supported by the European Commission's INCO DC programme. It started in December 1997 and has a duration of three years. Partners are: INRIA; ORSTOM, French National Institute of Scientific Research for Development through Co-operation; FORTH; RIKS, Research Institute for Knowledge Systems, The Netherlands; Ecole Nationale Polytechnique, Algiers, Algeria; Ecole Supérieure d'Ingénieurs de Beyrouth, Lebanon; EUCLID, Lebanon; Ecole Mohammadia d'Ingénieurs, Morocco; Office National d'Eau potable, Morocco; Centre International des Technologies de l'Environnement de Tunis, Tunisia. The project is administered by ERCIM.

■Please contact:

Fadi El Dabaghi – INRIA
Tel: +33 1 3963 5343
E-mail: [email protected]

ERCIM Working Group Environmental Modelling
by Achim Sydow

The purpose of the ERCIM Working Group Environmental Modelling is to build and maintain a network of ERCIM researchers in the field of environmental modelling and simulation. The working group is open to anybody affiliated with an ERCIM institute. Since its establishment in 1995, the working group has undertaken a variety of activities in order to promote the co-operation and joint research activities of groups working in the field of environmental modelling and simulation.

To date, the ERCIM Working Group Environmental Modelling has carried out three workshops. The fourth workshop is already in its preparation phase.

On 11 October 1996, the kick-off meeting of the Working Group took place at INRIA Rocquencourt. Points of discussion were the enlargement of the Working Group by finding new partners and the definition of certain research topics for future projects.

The Working Group held its first workshop, on Air Pollution Modelling, in Berlin, Germany, on 7-8 April 1997. The workshop was hosted by the GMD Institute for Computer Architecture and Software Technology Berlin. The workshop chairman was Achim Sydow (GMD). In total, 28 participants from 10 European countries attended the workshop and listened to 18 lectures presenting recent research results in the field of air pollution modelling and simulation.

The second workshop of the Working Group, on Remote Sensing for Air Pollution Analysis, was held in Strasbourg on 11 September 1997. It was part of the International Conference Umweltinformatik '97 – Informatique pour l'environnement '97. The workshop chairman was Isabelle Herlin (INRIA). The lectures and discussions at the workshop focussed on the use of data derived from remote sensing for air pollution analysis and forecast.

The Working Group carried out its third workshop, on Air Pollution Modelling and Measuring Processes, on 20-21 April 1998 in Madrid, Spain (see article on page 4). The workshop was hosted by the Environmental Software and Modelling Group of the Computer Science School at the Universidad Politecnica de Madrid (UPM). The workshop chairman was Roberto San José (UPM). As highlights of the programme, invited lectures were given by representatives of the local environmental authorities of the State of Madrid and the State of Brandenburg (Germany). The participation of these invited speakers was supported by the ERCIM Office.

Submission of Joint Project Proposals

The first project activity of the Working Group was the submission of the project proposal STEPUP to the TMR Programme of the European Commission on 15 June 1995. Further activities of the Working Group led to the preparation of a second project proposal: DECAIR – Development of an Earth Observation Data Converter with Application to Air Quality Forecast, which has been submitted to the European Commission Programme Environment and Climate. Under the coordination of the ERCIM Office, Working Group members from Germany, France, Spain and the United Kingdom have participated in the project.

Further activities of the Working Group have led to the preparation of a third project proposal: PANDORE – Pollution Analysis and Modelling with Databases and Observations from Remote Sensing, which has been submitted to the European Union-India Economic Cross Cultural Programme. Under the coordination of the ERCIM Office, Working Group members from Germany, France and the United Kingdom, together with a partner from India, have participated in this project.

ERCIM Fellowship

An applicant has been accepted for the first ERCIM Fellowship within the framework of the Working Group. He started his fellowship with the Working Group member team at INRIA Rocquencourt on 1 June 1998, and after 9 months he will continue at the GMD Institute for Computer Architecture and Software Technology.

To date, members from nine research teams of ERCIM institutions in seven European countries are collaborating within the Working Group. More information about the ERCIM Working Group Environmental Modelling is available at http://www.first.gmd.de/applications/em.html

■Please contact:

Achim Sydow – GMD
Working Group chairman
Tel: +49 30 6392 1813
E-mail: [email protected]


RESEARCH AND DEVELOPMENT


Task Parallelism in an HPF Framework
by Raffaele Perego and Salvatore Orlando

High Performance Fortran (HPF) is the first standard data-parallel language. It hides most low-level programming details and allows programmers to concentrate on the high-level exploitation of data parallelism. The compiler automatically manages data distribution and generates the explicit interprocessor communications and synchronizations needed to coordinate the parallel execution. Unfortunately, many important parallel applications do not fit a pure data-parallel model and can be implemented more efficiently by exploiting a mixture of task and data parallelism. At CNUCE-CNR in Pisa, in collaboration with the University of Venice, we have developed COLThpf, a run-time support for the coordination of concurrent and communicating HPF tasks. COLThpf provides suitable mechanisms for starting distinct HPF data-parallel tasks on disjoint groups of processors, along with optimized primitives for inter-task communication, where the data to be exchanged may be distributed among the processors according to user-specified HPF directives.

COLThpf was primarily conceived for use by a compiler of a high-level coordination language to structure a set of data-parallel HPF tasks according to popular task-parallel paradigms such as pipelines and processor farms. The coordination language supplies a set of constructs, each corresponding to a specific form of parallelism, and allows the constructs to be hierarchically composed to express more complicated patterns of interaction among the HPF tasks. Our group is currently involved in the implementation of the compiler for the proposed high-level coordination language. This work will not begin from scratch: an optimizing compiler for SKIE (SKeleton Integrated Environment), a similar coordination language, has already been implemented within the PQE2000 national project (see ERCIM News No. 24). We only have to extend the compiler to support HPF tasks and provide suitable performance models which will allow the compiler to perform HPF program transformations and optimizations. A graphic tool aimed at helping programmers structure their parallel applications by hierarchically composing primitive forms of parallelism is also under development.

The figure shows the structure of an application obtained by composing three data-parallel tasks according to a pipeline structure, where the first and the last tasks of the pipeline produce and consume, respectively, a stream of data. Note that Task 2 has been replicated, thus exploiting a processor farm structure within the original pipeline. In this case, besides computing their own jobs, Task 1 dispatches the various elements of the output stream to the replicas of Task 2, while Task 3 collects the elements received from the replicas of Task 2. The communication and coordination code is, however, produced by the compiler, while programmers only have to supply the high-level composition of the tasks and, for each task, the HPF code which transforms the input data stream into the output one. The data type of the channels connecting each pair of interacting tasks is also shown in the figure. For example, Task 2 receives an input stream whose elements are pairs composed of an INTEGER and an NxN array of REALs. Of course, the same data types are associated with the output stream elements of Task 1. Programmers can specify any HPF distribution for the data transmitted over communication channels. The calls to the appropriate primitives which actually perform the communications are generated by the compiler. Note that communication between two data-parallel tasks requires, in the general case, P*N point-to-point communications if P and N are the numbers of processors executing the source and destination tasks, respectively. The communication primitives provided by our run-time system are, however, highly optimized: messages are aggregated and vectorized, and the communication schedules, which are computed only once, are reused at each actual communication.
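The pipeline-with-farm structure just described can be sketched, purely for illustration, with ordinary Java threads and blocking queues; this is not the COLThpf API (whose tasks are data-parallel HPF programs), only the coordination pattern it implements:

import java.util.concurrent.*;

public class PipelineFarmSketch {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<int[]> toWorkers = new LinkedBlockingQueue<>();
        BlockingQueue<int[]> toCollector = new LinkedBlockingQueue<>();
        final int replicas = 3, streamLength = 10;

        // Task 1: produces the stream and dispatches its elements to the replicas of Task 2.
        Thread producer = new Thread(() -> {
            for (int i = 0; i < streamLength; i++) toWorkers.add(new int[]{i, i * i});
            for (int r = 0; r < replicas; r++) toWorkers.add(new int[0]);  // end-of-stream marks
        });

        // Task 2 (replicated): each replica transforms stream elements independently.
        Thread[] workers = new Thread[replicas];
        for (int r = 0; r < replicas; r++) {
            workers[r] = new Thread(() -> {
                try {
                    while (true) {
                        int[] item = toWorkers.take();
                        if (item.length == 0) break;                       // end of stream
                        toCollector.add(new int[]{item[0], item[1] + 1});
                    }
                } catch (InterruptedException ignored) { }
            });
        }

        // Task 3: collects the elements received from the replicas of Task 2.
        Thread collector = new Thread(() -> {
            try {
                for (int i = 0; i < streamLength; i++) {
                    int[] item = toCollector.take();
                    System.out.println(item[0] + " -> " + item[1]);
                }
            } catch (InterruptedException ignored) { }
        });

        producer.start();
        for (Thread w : workers) w.start();
        collector.start();
        collector.join();
    }
}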

COLThpf is currently implemented on top of Adaptor, a public-domain HPF compilation system developed at GMD-SCAI by the group led by Thomas Brandes, with whom a fruitful collaboration has also begun. The preliminary experimental results demonstrate the advantage of exploiting a mixture of task and data parallelism in many important parallel applications. In particular, we implemented high-level vision and signal processing applications using COLThpf, and in many cases we found that exploiting parallelism at different levels significantly increases the efficiency achieved.

■Please contact:

Raffaele Perego – CNUCE-CNR
Tel: +39 050 593 253
E-mail: [email protected]

Salvatore Orlando – University of Venice
Tel: +39 041 29 08 428
E-mail: [email protected]

High-level description and associated task graph of a parallel application obtained by composing three data-parallel tasks according to a pipeline structure, where the second stage has been replicated by hierarchically composing the pipeline with a farm structure.


Development Tools for Distributed Services
by Marc Born and Linda Strick

The increasing complexity of distributed telecommunication services has led to a requirement for powerful methods and tools to support the design and specification process. The GMD Institute for Open Communication Systems has developed a methodology covering the whole development life-cycle. The design process is supported by the Y.SCE tool (Y.Service Creation Environment), developed by GMD, and the commercially available TAU toolset from Telelogic. Both tools are connected to form a tool chain which has been successfully used in several national and international projects for the development of distributed services. This article gives a brief overview of the methodology, the tools and further work.

The methodology developed by GMD scientists provides notations and guidelines to cover all stages of the development life-cycle for arbitrary (telecommunication) services. It describes all activities that are necessary to define a distributed service and is a complete description of both the stages and the steps involved in the analysis, design and implementation planes.

The Reference Model for Open Distributed Processing (RM-ODP) [ODP] has been used as a general framework for the tool-supported design methodology, as it provides object-oriented concepts and principles for structuring the system design. The design process itself is not a pure top-down approach, but an iterative usage of each of the stages, from an abstract level down to the detailed specification and implementation. Repetition of steps is needed if errors are detected either by validation on the design plane or by testing the implementation.

Different languages and techniques are used to support the specification of the models. These techniques include OMT (Object Modelling Technique), TINA ODL (Telecommunication Information Networking Architecture / Object Definition Language) and SDL [Z.100] (Specification and Description Language).

In order to get acceptance for new software development methods, it is not enough to propose the method from a theoretical perspective only. Tools have to be provided to enable the usage of the method.

For the Analysis Plane, the Y.SCE tool provides graphical notations for defining the Enterprise Model. A Use Case Model and an Organisational Model can be defined, which are stored in an information repository. For the behaviour, high-level Message Sequence Charts (MSCs) can be captured using the TAU toolset from Telelogic. Both tools are connected so that MSCs can be attached to the Use Cases and navigation between them is possible.

To support the Design & Validation Plane, the Y.SCE tool provides a graphical notation for the Object Description Language (ODL) and is able to import and export textual ODL files. To support CORBA (Common Object Request Broker Architecture) based systems, IDL (Interface Definition Language) files can also be imported and exported. The ODL information is stored in the repository. It is possible to establish links between the Use Cases and ODL entities and to navigate along these links. In order to describe the functional behaviour, the Y.SCE tool generates SDL skeletons which can be imported into the TAU tool, where the behaviour information is added. The TAU tool provides functionality for simulation and validation of the specification models defined in the different planes.

To support the Implementation and Testing Plane, automatic generation of implementation language code (currently C and C++) is possible, either directly from Y.SCE or from TAU. Depending on the behaviour model, the generated code is more or less abstract.

In order to test whether the final implementation conforms to its specification, the ITEX tool from Telelogic, which is also part of the TAU toolset, is used for the derivation of test cases. To bridge the gap between abstract test suites and executable test cases, the GMD Institute for Open Communication Systems has developed an additional tool to perform testing in a distributed CORBA environment.

The application of the described methodology to the design of distributed services ensures a higher quality of the resulting service and a shorter time to market. Further work will be done to support the automated deployment of the services to appropriate middleware by tools.

■Please contact:

Marc Born – GMD
Tel: +49 30 3463 7235
E-mail: [email protected]

Linda Strick – GMD
Tel: +49 30 3463 7224
E-mail: [email protected]

Notations used in the methodology.


The Use of Wavelets in Seismics and Geophysics
by Nico Temme

Wavelets are wave patterns of small size that can be used to analyze rapid changes in a signal, sharp contrasts in an image, and structures at different scales. The wavelet method, developed in the eighties and now provided with a sound mathematical basis, has become a powerful tool in the fields of data compression, noise suppression and the processing of images and signals. At CWI, wavelets are studied in a project financed by the Technology Foundation STW which focuses on the application of wavelets in seismology and geophysics.

A wavelet is a wave-like function with short extension: its graph oscillates only over a short distance, or damps very fast; its mean value over the whole domain equals zero. A wavelet is localized both in frequency and position. From this ‘mother wavelet’ a whole family of other wavelets is derived by displacement and scaling. In this family every scale is represented for every wanted position. Wavelet analysis operates as a microscope: as a wavelet family contains ‘building blocks’ of arbitrarily small scale, we may zoom in at any time on any detail of the signal. Wavelets are well suited to detect the presence of fractal components in observational data. As we observe images and acoustic signals mainly by contrasts, wavelets can be very effective in storing and transmitting such images and signals by data compression. Wavelets are also successfully applied for, eg, removing noise from old sound recordings and restoring images from MRI and CT scans.
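In the standard notation (quoted here only as background), the family derived from a mother wavelet \(\psi\) by displacement b and scaling a > 0 is

\[ \psi_{a,b}(x) = \frac{1}{\sqrt{a}}\,\psi\!\left(\frac{x-b}{a}\right), \]

and the wavelet transform of a signal f collects the inner products

\[ Wf(a,b) = \int f(x)\,\overline{\psi_{a,b}(x)}\,dx, \]

which measure the detail of f at scale a around position b.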

Research on wavelets and their applications in The Netherlands takes place, for instance, at the Royal Dutch Meteorological Institute KNMI (structure of the clouds), Delft University of Technology (soil structures, for oil exploration), Groningen University (image processing in medicine) and CWI (analysis of geophysical/seismic signals). CWI's research is focused on the development and use of directional time-scale analysis methods.

An important research topic is the combination of the Radon transform (also called the X-ray transform – a well-known technique for reconstructing a 3D object from a number of cross-sections) and the wavelet transform. A rigorous mathematical formulation of this ‘Wavelet X-ray Transform’, its basic properties and its discretization have been obtained. A fast algorithm for this combined transform is also under development. The computer code has been implemented in a Matlab environment and the first results are promising.

The ‘Wavelet X-ray Transform’ is applied to the filtering of two-dimensional seismic data sets. These images contain information about the subsurface of the earth, such as the localization of geological interfaces. However, they also contain irrelevant parts, for example contributions from waves coming directly from the explosion (that is used to generate the signals) to the detection point. CWI's wavelet method of filtering away the irrelevant parts prior to extraction of the relevant parameters is based on the idea that relevant and disturbing parts of an image can be separated based on distinctions in position (or time), scale (or frequency) and direction.

Secondly, research is conducted jointly with KNMI on the problems of polarization in seismic signals, where an estimation of the spectrum is made by preprocessing the signal data. In a seismogram, the waves coming from the source and arriving at the Earth's surface at different arrival times appear as so-called phases. These phases, which can be seen as the ground motion due to the arrival of a particular wave coming from the source, are all measured during a short time period. A research aim is to find the time periods in a seismogram at which they appear. Furthermore, we want to analyze these phases accurately.

Figure: A compressed image of a brain slice (top right) is stored six times as efficiently as the original (top left). The resolution is retained. This is achieved by ignoring negligible wavelet coefficients. A corrupted image of a plane (bottom left) is cleared (bottom right). First a wavelet decomposition of the image is made. Then the coefficients corresponding to the noise part are muted. The final reconstruction of the image is based on the modified wavelet coefficients. In both cases MATLAB Graphical User Interface tools were used.

Analysis of a short segment in a longer record by spectral estimation, ie studying the frequency components of the Fourier spectrum, is hampered by the fact that its spectrum cannot be determined exactly if the data process is stochastic or a noisy deterministic signal.

A seismogram is a non-stationary data process and hence, instead of considering the Fourier spectrum, a direct estimate of the time-varying spectrum is more appropriate. For this we used the wavelet transform. Since at higher scales the wavelet transform too cannot deal with the uncertainty introduced by the short segment, we first preprocessed the segment with a tapering algorithm, which smoothes the data by multiplying the data segments with some window function, and then took the wavelet transform of the new data. The first mathematical results of our research contain error and convergence estimates for the new algorithm. Successful experiments with synthetic data will be followed by tests with real seismic data in collaboration with KNMI.
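The article does not specify which window is used; as an example only, a common taper is the Hann window

\[ w(t) = \tfrac{1}{2}\left(1 - \cos\frac{2\pi t}{T}\right), \qquad 0 \le t \le T, \]

applied to a segment of length T, so that the wavelet transform is taken of the smoothed data \(\tilde{f}(t) = w(t)\,f(t)\) rather than of the abruptly truncated segment.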

Other recently started research concerns the design of an algorithm based on the wavelet transform for the detection of the time points at which the several phases appear in the seismogram. This is relevant for, eg, distinguishing a (nuclear) explosion from an earthquake and locating the source of the seismic event. Together with the seismology department of KNMI, physical properties of the seismic signals, especially their scaling behaviour, are used to refine the algorithm. See also http://www.cwi.nl/cwi/projects/wavelets.html

■Please contact:

Nico Temme – CWI
Tel: +31 20 592 4240
E-mail: [email protected]

ATS – Adaptive Teaching System
by Marcus Specht

Web-based educational systems are becoming more and more popular and are used by a heterogeneous user group. Users of educational Web-based systems differ in their goals, professional background, interests and knowledge of the subject matter. To optimize the efficiency and effectiveness of the learning process, educational systems should adapt to these user features.

ATS is a Web-based educational system that mainly uses adaptation to the knowledge and interests of learners to generate individualized hypermedia instruction. The system is based on a domain model of the subject matter, a didactical model of how to teach a curriculum, and learner modelling on different levels, eg preferences, interests and knowledge.

The domain model contains the knowledge about the subject matter. The knowledge base is represented as a conceptual network with different types of units and relations between them. Several types of information are associated with each unit. These are different kinds of text (introduction, several stages of teachtexts, summary), examples, demonstrations, playgrounds, and tests. In addition, each unit has prerequisites (units that the student should be familiar with before working on the unit) and consequences (possible outcomes and effects on other units). The tests, prerequisites and consequences are weighted according to their importance for a unit.
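A hypothetical sketch of such a unit as a data structure (field names are illustrative, not taken from the actual ATS implementation):

import java.util.*;

class ConceptUnit {
    String name;
    List<String> teachTexts = new ArrayList<>();            // introduction, several stages, summary
    List<String> examples = new ArrayList<>();               // examples, demonstrations, playgrounds
    Map<String, Double> tests = new HashMap<>();              // test id -> weight (importance for the unit)
    Map<ConceptUnit, Double> prerequisites = new HashMap<>(); // prerequisite unit -> weight
    Map<ConceptUnit, Double> consequences = new HashMap<>();  // affected unit -> weight

    /** A unit is ready for a learner once the weighted prerequisite knowledge is sufficient. */
    boolean readyFor(Map<ConceptUnit, Double> learnerKnowledge, double threshold) {
        double score = 0, total = 0;
        for (Map.Entry<ConceptUnit, Double> p : prerequisites.entrySet()) {
            total += p.getValue();
            score += p.getValue() * learnerKnowledge.getOrDefault(p.getKey(), 0.0);
        }
        return total == 0 || score / total >= threshold;
    }
}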

The didactical model represents the teacher's knowledge of how to teach units. It consists of two main parts: teaching strategies and diagnostic knowledge. The system has a default strategy for each concept type, and teachers can specify a preferred strategy for a concept if needed. The diagnostic component stores the knowledge about several types of tests and how they have to be generated and evaluated.

The learner model stores the information about the material that a learner has used and how successfully he/she worked on the material (eg whether tests were solved correctly, whether the playground estimations were good, or to what extent teachtexts were requested). Every action of a student has consequences for updating the learner model, depending on the learning material involved and the previous experiences of the learner.

Based on these three components, individualized web content (HTML, VRML, pictures) can be generated and presented to a learner. Both knowledge and interests are diagnosed at the beginning of the learning process with ATS and updated throughout the learning process to keep track of changing needs. Taking into account the interests and the knowledge of a learner, the system can adapt the instructional process on several levels:

• adapting the curriculum by selection of content

• adapting the presentation of contents by choosing appropriate media and combining them (adaptive multimedia presentation)

• adapting the teaching strategies for specific contents

• annotating hyperlinks

• recommending hyperlinks and contents.

Some adaptive methods used in ATS have been empirically evaluated and showed improvements in the efficiency and effectiveness of learning compared to classical static hypermedia learning environments. Based on these results, further research has to be done on complex interactions between adaptive methods and on the dependencies between different user features and the effectiveness of adaptive methods for educational purposes. Besides that, further developments will try to integrate social learning processes and goal-based adaptation methods into ATS. An authoring tool for ATS could easily be integrated into the domain-independent ATS framework and would allow new ATS systems to be built by teachers without any HTML or programming experience.

■Please contact:

Marcus Specht – GMD
Tel: +49 22 41 14 2847
E-mail: [email protected]

Abstract Test Suite
by Katalin Tarnay and Abdalla Areik

With the increasing number of different communication networks and protocols, it becomes more and more important to check whether a protocol implementation conforms to its formal specification. A new method for analysing abstract test suites, a fundamental step of conformance testing, has been developed at the Communication Protocol Laboratory of SZTAKI.

A communication protocol can be considered as an (N)-protocol entity having upper and lower interfaces. The Protocol Data Units (PDUs) and the Abstract Service Primitives (ASPs) cross these interfaces. The basic blocks of conformance testing can be seen in the figure. The Implementation Under Test (IUT) is tested in a layered architecture; N stands for the layer number, and the test methods are abstract. An abstract test method describes how an IUT is to be tested, given at an appropriate level of abstraction to make the description independent of any particular realization, but with enough detail to enable tests to be specified for this method.

The fundamental notion of conformance testing is the abstract test suite. Traditional methods to derive an abstract test suite are based on extended finite state machines or labelled transition graphs. Our new idea is based on AI tools, namely on rule-based systems. The advantage of using this method is that it gives better information with higher confidence and offers wider possibilities to modify the test actions.

The abstract test suite is composed of abstract test cases. An abstract test case is a complete and independent specification of an action required to achieve a specific test purpose. According to our method, test steps and test cases are converted to a rule-based system.

The rule-based system representation provides the possibility of integrating expert systems into the conformance tester. The most popular mode of knowledge representation within expert systems is the use of rules, or rule-based systems. Such rules are called IF-THEN rules. For understandability and maintainability, validation and ease of documentation, some kind of rule organization should be followed. Each group should be ordered; the rule-group ordering is determined according to the conclusion attributes, and it is often advocated to limit each rule to a single conclusion attribute.

The application of a rule-based abstract test suite will be demonstrated on a simple protocol called the Inres protocol. This is a connection-oriented protocol that operates between two entities, the Initiator and the Responder. The protocol entities communicate by exchanging the Protocol Data Units (CC, CR, DT, AK and DR). The communication between two protocol entities takes place in three distinct phases: connection establishment, disconnection, and data transfer. In each phase only certain Protocol Data Units and Abstract Service Primitives have meaning.

In our research we try to use a rule-based system for conformance test cases. In other words, we attempt to generate rules from the test cases written in TTCN (Tree and Tabular Combined Notation). Since the description of the test cases is in TTCN and we convert these cases to rule-based systems, other specification languages (eg SDL) can be used as well. Confidence factors and uncertainty can be attached to the rules. This method is expected to be applied in protocol engineering and is especially interesting for developers of abstract test suites. These applications provide feedback for further research.
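As a purely hypothetical illustration of what such an IF-THEN rule might look like for the Inres connection establishment phase (not taken from the actual rule base generated from TTCN):

// Hypothetical rule for testing an Inres Responder implementation.
class InresRule {
    String state = "DISCONNECTED";

    /** IF a CR has been sent to the IUT AND a CC is received THEN the connection is established. */
    String onPduReceived(String pdu) {
        if (state.equals("CR_SENT") && pdu.equals("CC")) {     // conclusion attribute: state
            state = "CONNECTED";
            return "PASS: connection establishment as specified";
        }
        if (state.equals("CR_SENT") && pdu.equals("DR")) {
            state = "DISCONNECTED";
            return "PASS: connection refused, disconnection as specified";
        }
        return "FAIL: unexpected PDU " + pdu + " in state " + state;
    }
}

A confidence factor could be attached to each such rule, and the conclusion kept to a single attribute, as advocated above.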

■Please contact:

Katalin Tarnay – SZTAKI
Tel: +36 1 209 5400
E-mail: [email protected]

The basic blocks of conformance testing.


European TMR Project System Identification
by Jan van Schuppen

The System Identification project aims at joint research on selected topics, increased transnational cooperation, joint training of post-doctoral and pre-doctoral researchers, and interaction with industrial, service and commerce companies and with governmental laboratories, leading eventually to joint applied research. The research objectives are to produce algorithms and theory for system identification for nonlinear systems, for multi-input/multi-output linear systems, and for control. The project is carried out by the European Research Network System Identification (ERNSI), which consists of nine research teams in seven European countries. It receives support through the EU Training and Mobility for Researchers (TMR) – Research Networks. The project started on 1 March 1998 and will run for four years. It primarily provides fellowships for young researchers at the post-doctoral and Ph.D. student level.

System identification concerns the construction and evaluation of mathematical models, obtained from data, in the form of dynamical systems. The project is motivated by the need for such models in problems of control, signal processing, prediction or forecasting, simulation, monitoring, and analysis, as arise in engineering, economics, the environmental sciences, and biology. The problems of system identification to be considered are the modelling of phenomena by dynamical systems, the realization problem of transforming one system representation into another, the derivation of system identification algorithms, the performance evaluation of such algorithms, and combined identification and control.

The ERNSI Network was created in 1992 by several research groups, with the aim of stimulating cooperation between research teams in the field of system identification. ERNSI has held annual workshops since 1992.

Its research teams are active leaders in the field in Europe; span a broad spectrum of academic disciplines, from mathematics to engineering and econometrics; have experience in training Ph.D. students and post-docs, also from abroad; regularly receive visitors from other countries and maintain intensive international contacts; and are active in organizing research at the international level, for example in editorial boards of journals and in programme committees of conferences. Teams from the following institutions participate in ERNSI: CWI Amsterdam (coordinator); Institute for Econometrics, Operations Research and System Theory – Technical University Vienna; CESAME – University of Louvain (Leuven); INRIA Sophia Antipolis; IRISA – University of Rennes; Department of Engineering – University of Cambridge; CNR-LADSEB Padova; Royal Institute of Technology (KTH) Stockholm; Institute for System Technique – University of Linköping. For more information, see http://www.cwi.nl/cwi/departments/PNA2/tmrsi.html

■Please contact:

Jan van Schuppen – CWI
Tel: +31 20 592 4085
E-mail: [email protected]

Reflective Teams – Active Learning and Information Integration in Open Environments
by Heinz Mühlenbein

Research in the Reflective Teams project at the GMD System Design Technology Institute aims at the analysis and development of systems that operate on large data spaces with unknown properties or interact with technical or natural processes with unknown behavior. The main emphasis is on fundamental algorithmic research and on novel software architectures for real-world intelligent systems. The R-TEAMS system combines evolution of structure, active learning, reflection and teamwork of agents in a distributed fashion. The R-TEAMS architecture consists of teams of reflective actors which communicate via distributed blackboards.

R-TEAMS is designed for the following problems, which might exist individually or in combinations:

• Search of large data spaces. The data spaces are given in mathematical representation or, in the case of real-world data, as sets of value pairs obtained from observation or from simulation of real-world processes.

• Interaction with real-world processes. The tasks and the behavior of the processes cannot be completely taken into account when the system is designed, since neither the tasks nor the situations to which they will refer are known in advance.

• Simulation of real-world processes. In complex applications data often cannot be generated by mathematical models, but has to be numerically computed by simulating the physical process.


Scientific Approach

In our opinion, an artificial system with real-world intelligence cannot be preprogrammed and remain unchanged; it has to have the capability to learn and evolve. This requires:

• evolution of structure

• active learning

• reflection

• teamwork of agents.

The R-TEAMS approach tries to fulfill these requirements within a unique framework based on parallel distributed problem solving. Many agents of varying functionality and granularity cooperate or compete in teams, referred to as reflective teams. They define subtasks and subgoals, perform subtasks, pursue subgoals and eventually produce subresults that are then combined by other agents to yield a solution, eg an optimum, a model, a prediction or an action. The idea is to draw on the inherent network structure of such problems and to reflect it in the architecture of the systems. Asynchronous interaction of agents and teams of agents turned out to be a suitable principle of organization for implementing division of labor and teamwork.

Theory

In many real-world applications, especially in spatial domains, it is easy to decompose the application into agents together with their interaction structure. In the past, however, interactions have been assumed to be one of two extreme cases: either each agent interacts with every other agent, or the agents do not interact at all. It has been independently discovered that teams of agents with constrained interactions can be conveniently described by a network. The vertices of the graph represent agents. There is an edge between agent A and agent B if they interact, ie they have something in common. This can be a common variable, a common action to be agreed on, a common part of the environment, sharing of sensor information, etc. The task of the agents is to solve a given problem in a decentralized way. We have started to investigate three different approaches to obtain efficient solutions of problems defined on such networks:

• graphical models

• team theory

• game theory.

R-Teams Components

R-TEAMS applications consist of the following major components:

• Agent: ranging from simple automata to reflective cognitive agents

• Team: defining relations between the agents

• Interaction: between agents, teams and environment

• Environment: depending on the type of application.

Our research is focused on the development of different models implemented within a single platform. The models and tools developed are tested through various applications.

Current Applications

Current applications include:

• control of an anthropomorphic hand-eye robot with two manipulators and 3D stereo vision

• placement of antennas for mobile radio communication

• predicting the risk of a credit

• optimization of decomposable functions.

The project is supported by the Real World Computing Partnership. For more information, see http://set.gmd.de/AS/rteams/proj.html

■Please contact:

Heinz Mühlenbein – GMD
Tel: +49 2241 14 2405
E-mail: [email protected]

EVENTS

Design, Specification and Verification of Interactive Systems '98

by David Duce

The DSV-IS'98 workshop took place at The Cosener's House, Abingdon (CLRC's conference centre) from 3 to 5 June. Organized under the auspices of the Eurographics Association, and with sponsorship from ERCIM, this was the fifth in an annual series of workshops. The workshop attracted 41 delegates from France, Italy, The Netherlands, Norway, Spain, the UK, Australia and the USA, and two invited speakers from the USA and Ireland. ERCIM was well represented among the organizing committee and attendees.

The workshop was chaired by Peter Johnson and Panos Markopoulos from Queen Mary and Westfield College, University of London. The opening address was given by Dan Olsen from Carnegie Mellon University. Dan's paper was entitled “Interacting in Chaos”, a very impressive demonstration of chaos in technology, with electronic ‘gadgets’ taken from every available pocket: mobile phone, pager, Palm Pilot, and a tiny solid-state recording device (holding 90 seconds of audio messages). There is a chaotic state (in the mathematical sense: minor perturbations produce wildly different long-term outcomes), a divergent state, yet the creation of helpful tools demands convergence. Human usage demands regularity. He argued for a focus on surface representations in order to support truly large communities. He argued that in the chaotic order brought about by Internet-based information and collaboration, we should concentrate on human-consumable data types and the fundamental media for representing information: text/language, images and pictures, audio, video, 3D environments (stressing the importance of a sense of place and of what is around the body), and tactile information. His paper concludes: “Consider also that much of the WWW information is generated by algorithms rather than by people. There is inherent regularity in such output that is waiting to be exploited by other tools. All that remains is for us to design them. This, I believe, is where the future of interactive technology lies.”

John McCarthy, University College Cork, Ireland, gave the second invited talk, entitled “The viability of modelling socially organised activity”. He spoke about the growing interest in HCI, CSCW and cognitive science in conceptualizing activity as socially organized and situated. This brings in what John referred to as the “messy stuff” of the social and experiential. He argued that the kinds of studies characteristic of research into socially organized activity provide insights which should not be ignored by designers. He illustrated this with examples from a study carried out by the universities of Cork and York into the workings of ambulance control centres in Cork and West Yorkshire. He pointed out that decisions about technology are decisions about work. The studies focused on how staff deal with emergency calls, how they locate the source, build representations of what is happening in their area and decide which ambulance to dispatch. Such work has a moral dimension which also cannot be ignored. He concluded with a discussion of strategies and frameworks that might take such ‘messy stuff’ into account during design.

Twenty-three of the submitted papers were selected for presentation and covered a broad range of topics: modelling the design of interactive systems, the role of representations in designing interactive systems, formal support for the design of interactive systems, advances in model-based design, and the specification and verification of interactive systems. In addition, there were three working group sessions. The workshop split into three groups, each containing a mix of skills and disciplines. Groups were presented with three problems: the design of an interactive guidebook (inspired by work at the University of Lancaster in mobile computing), reading for writing, and an accident scenario from cruise control in American cars. Groups could work on one or more problems, though initially each of the groups was asked to work on a different problem. In the closing session of the workshop, the working groups reported to the full workshop.

From an organiser's point of view, the meeting was very successful. The technical presentations were of high quality and displayed a real breadth and depth of interest in the field across Europe, North America and Australia. It was pleasing to see many new faces at the workshop, and there was a real sense of a growing, dynamic, inter-disciplinary community. The workshop proceedings will be published by Springer-Wien later in the year.

It is planned that next year's workshop will be held at the University of Minho, Braga, Portugal, with Mario Martins and Jose Campos as local organisers and David Duke (University of York) and Angel Puerta (Stanford University) as programme co-chairs.

■Please contact:

David Duce – CLRC
Tel: +44 1235 445511
E-mail: [email protected]

EP98 – the Seventh Conference on Electronic Publishing

by Jacques André

INRIA recently organized (and ERCIM sponsored) EP98, the seventh Electronic Publishing conference, in Saint-Malo, within the framework of a week devoted to digital documents and typography. The conference was held from 1-4 April at Saint-Malo, France.

About three hundred people participated in the events, more than half of whom registered for EP98 and its tutorials. Attendees came from America, the Far East and Europe (with a very great number of people from Finland) and ‘even’ from France (very few people actually!).

For twelve years, the series of EP meetings has been a hot spot for technological watch. Indeed, it is at these meetings that such concepts as active documents, document restructuring and so on first appeared.

This year's conclusion is that research in this area is far from complete, even though the Web these days only seems to be concerned with the development of commercial products. Some thought that the OpenType or MM and Unicode fonts marked a pause; on the contrary, research is revealing very novel openings in object-oriented or constrained character modelling, including non-Latin characters, legibility on screens, etc. Others were of the opinion that paper is out; on the contrary, the concept of intelligent paper, which bridges the gap between the physical and digital worlds, puts it back in the spotlight. Among the best lectures was the invited talk from the Grenoble Xerox research centre. Some thought that we knew how to manage multimedia; on the contrary, Allen's logic opens up brand new and exciting possibilities in terms of temporal management. In short, we are witnessing a transition from a narrow and specialized technical subject to a wide area of research, a new discipline.

Proceedings are available as:

• R.D. Hersch, J. André, H. Brown (Eds.): Electronic Publishing, Artistic Imaging, and Digital Typography. 7th International Conference on Electronic Publishing, EP'98, held jointly with the 4th International Conference on Raster Imaging and Digital Typography, RIDT'98, St. Malo, France, March/April 1998. Proceedings. LNCS 1375, Springer-Verlag, Heidelberg, 1998. VIII, 575 pp. ISBN 3-540-64298-6, softcover, DM 114,-. http://link.springer.de/link/service/series/0558/tocs/t1375.htm

The cover was designed at EPFL, Switzerland, using very new imaging technologies, and can be found at: http://www.irisa.fr/ep98/BookCover.pdf

EP 2000?

This series is far from dying! Its main problem is to find its own approach, independent of Web fashion and far from printing technology. To that end, the EP steering committee is looking for an unambiguous new acronym (EP seems too printing-oriented, although EP means Electronic Publishing, whose meaning is broader than editing!). In any case, EP2000 should be organized in the United States. Have a look at http://www.irisa.fr/ep98/ep00.html

■Please contact:

Jacques André – INRIA-Rennes
Tel: +33 2 9984 7350
E-mail: [email protected]

ERCIM Working Group on Constraints Workshop

Amsterdam, 23-25 September 1998

The ERCIM working group on Constraints will hold its third workshop in conjunction with the 1998 annual workshop of the CompulogNet (Esprit Network of Excellence on Computational Logic) area ‘Constraint Programming’, 23-25 September 1998.

Authors are invited to submit abstracts on ongoing research in all areas of Constraint Programming and Constraint Processing. Position papers describing current projects are also welcome. The submission deadline is 20 August. Please send an electronic version of the abstract in PostScript, LaTeX or plain text to the ERCIM working group secretary, Eric Monfroy ([email protected]). Authors will be notified of acceptance/rejection by the end of August.

The workshop is jointly organized by Philippe Codognet (INRIA) and Krzysztof R. Apt and Eric Monfroy (CWI), with support from CompulogNet and ERCIM. See also http://www.cwi.nl/~eric/ERCIM/Workshops/Workshop3/CFP/index.html

■Please contact:

Eric Monfroy – CWI
Tel: +31 20 592 4075
E-mail: eric@cwi.nl

INRIA-Industry Meeting

Paris, 26 November 1998

The next INRIA-Industry meeting will be held in Paris on 26 November 1998. It will be devoted to applications in the telecommunications and multimedia sector. The meeting will give industry representatives from this extremely competitive area an opportunity to discover the latest technological innovations stemming from INRIA research. More than 50 applications will be presented, involving leading-edge technology in domains such as wireless networks, the Internet, sophisticated human-machine interaction and mobile applications.

■Please contact:

Yves Peynaud – INRIA
Tel: +33 1 3963 5817
E-mail: yves.peynaud@inria.fr

IEEE Multimedia Systems ’99

Florence, Italy, 7-11 June 1999

This is the 6th edition of the International Conference on Multimedia Computing and Systems, organized by the Institute of Electrical and Electronics Engineers. IEEE ICMCS is an annual conference that aims to bring together researchers, developers and practitioners from academia and industry working in multimedia. We encourage submissions of scientific and technical papers as well as panel, tutorial, workshop and poster proposals. Conference topics include but are not limited to:

• Operating System Support

• Distributed Multimedia

• Human Computer Interaction

• Multimedia Databases

• Multimedia Tools

• Multimedia Applications.

Important dates:

• paper submission deadline: 15 October 1998

• notification of paper acceptance: 15 January 1999

• camera ready copy: 15 February 1999

• panel, tutorial, demonstration, exhibit deadline: 31 December 1998

• notification of acceptance: 15 January 1999

For more information, see: http://www.dsi.unifi.it/~icmcs99

■Please contact:

Conference secretariat – University of Florence
E-mail: icmcs99@aguirre.ing.unifi.it

Second European Conference on Research and Advanced Technology for Digital Libraries

Heraklion, Crete, Greece, 21-23 September 1998

This conference is the second of a series of European conferences on research and technology for digital libraries. Panel sessions, result demonstrations and poster sessions are also included in the conference programme. A limited number of fellowships for the conference are available. The objectives of the conference are: to bring together researchers from multiple disciplines whose science relates to the development of digital libraries; and to establish a forum for discussion of issues such as interoperability, multilinguality, intellectual property policy, and information commerce. For more information see: http://www.ics.forth.gr/2EuroDL/

■Please contact:

Rena Kalaitzaki, Maria Stavrakaki – Conference secretariat, University of Crete
Tel: +30 81 39 35 04
Fax: +30 81 39 35 01
E-mail: ercimdl@csi.forth.gr



IN BRIEF


FORTH – Jacques Santer, President of the European Commission, visited the Foundation for Research and Technology – Hellas (FORTH), located at the Science and Technology Park of Crete, on 10 April 1998. Mr Santer was accompanied by senior Greek officials, including the Alternate Minister of Foreign Affairs, the Deputy Minister of National Economy, the General Secretary of the Region of Crete, members of the Greek Parliament, and other distinguished guests. Mr Santer was greatly impressed by the high quality of the scientific work conducted at the Institutes of FORTH. He expressed his full support for the research and development activities and for the systematic efforts made by FORTH to contribute to the industrial exploitation of new, promising technologies at the regional, national, European and international levels.

■GMD – Dennis Tsichritzis, Chairman of GMD’s Executive Committee, was awarded the ‘Premio Italia’, the German-Italian scientific prize. The ‘Premio Italia’ covers the fields of economy and science and is awarded by the German-Italian Economic Association and the German-Italian Science Committee. Tsichritzis received the prize for his merits in science and research and for furthering German-Italian co-operation. The prize was awarded on 26 May 1998 in Düsseldorf in the presence of Pier Luigi Bersani, Italian Minister for Industry, Trade and Craft Trades, and Dr Enzo Perlot, Italian Ambassador to Germany.

■CLRC – Albert Westwood has been appointed as the new Chairman and Chief Executive of the Council for the Central Laboratory of the Research Councils, CCLRC. He succeeds Paul Williams, the first Chairman and Chief Executive of CCLRC, who retired on 31 March 1998. Williams was appointed Director of the Rutherford Appleton Laboratory in 1986 and to his present post on the creation of CCLRC in 1995. Commenting on the appointment of his successor, Paul Williams said: “Dr Westwood comes to CCLRC after a distinguished career in science and engineering in the USA. His experience is just what the Rutherford Appleton and Daresbury laboratories need at this point in their lives. He has an impressive international reputation and brings many years of experience in managing large scale research. CCLRC is fortunate to have such a man in the Chief Executive’s chair.” Albert Westwood commented: “It is indeed a real privilege for me to be invited to become a member of this great laboratory. I am really looking forward to doing all I can to help sustain and enhance the laboratory’s reputation as an international scientific leader, and to continue to build up its role as a critical resource for exciting technical ideas and innovative developments that, over time, will contribute significantly to national prosperity.”

■SARIT – A new name for ERCIM’s Swiss member institute: the general assembly of SGFI (Schweizerische Gesellschaft zur Foerderung der Informatik und ihrer Anwendung) has decided to change its name to SARIT (Swiss Association for Research in Information Technology). The change of name stands for a stronger international orientation and emphasizes the focus on research. The membership structure remains basically the same: professors of computer science or related areas at the Swiss universities, and Swiss companies active in research in information technology. The SARIT web pages can be reached at http://www-dbs.ethz.ch/sarit/.

Jacques Santer, President of the European Commission (left), during his visit to the Foundation for Research and Technology – Hellas (FORTH), accompanied by Stelios Orphanoudakis, Director of ICS-FORTH (right).

The former and the new Chairman and Chief Executive of the Council for the Central Laboratory of the Research Councils, CCLRC: Paul Williams (left) and Bert Westwood.


ERCIM NEWS

ERCIM Central Office:
Domaine de Voluceau, Rocquencourt
B.P. 105
F-78153 Le Chesnay Cedex
FRANCE
E-mail: office@ercim.org

The European Research Consortium for Informatics and Mathematics (ERCIM) is an organisation dedicated to the advancement of European research and development in the areas of information technology and applied mathematics. Through the definition of common scientific goals and strategies, its national member institutions aim to foster collaborative work within the European research community and to increase co-operation with European industry. To further these objectives, ERCIM organises joint technical Workshops and Advanced Courses, sponsors a Fellowship Programme for talented young researchers, undertakes joint strategic projects, and publishes workshop, research and strategic reports, as well as a newsletter.

ERCIM News is the in-house magazine of ERCIM. Published quarterly, the newsletter reports on joint actions of the ERCIM partners, and aims to reflect the contribution made by ERCIM to the European Community in Information Technology. Through short articles and news items, it provides a forum for the exchange of information between the institutes and also with the wider scientific community. ERCIM News has a circulation of 6,500 copies.

GMD – Forschungszentrum Informationstechnik GmbH
Schloß Birlinghoven
D-53754 Sankt Augustin
Tel: +49 2241 14 0
Fax: +49 2241 14 2889
http://www.gmd.de/

Consiglio Nazionale delle Ricerche
IEI-CNR
Via S. Maria, 46
I-56126 Pisa
Tel: +39 050 593 433
Fax: +39 050 554 342
http://bibarea.area.pi.cnr.it/ERCIM/welcome.html

Instituto de Engenharia de Sistemas e Computadores
Rua Alves Redol 9
Apartado 13069
P-1000 Lisboa
Tel: +351 1 310 00 00
Fax: +351 1 52 58 43
http://www.inesc.pt/

Centrum voor Wiskunde en Informatica
Kruislaan 413
NL-1098 SJ Amsterdam
Tel: +31 20 592 9333
Fax: +31 20 592 4199
http://www.cwi.nl/

Institut National de Recherche en Informatique et en Automatique
B.P. 105
F-78153 Le Chesnay
Tel: +33 1 39 63 5511
Fax: +33 1 39 63 5330
http://www.inria.fr/

Central Laboratory of the Research Councils
Rutherford Appleton Laboratory
Chilton, Didcot
GB-Oxon OX11 0QX
Tel: +44 1235 82 1900
Fax: +44 1235 44 5385
http://www.cclrc.ac.uk/

Swedish Institute of Computer Science
Box 1263
S-164 28 Kista
Tel: +46 8 752 1500
Fax: +46 8 751 7230
http://www.sics.se/

Stiftelsen for Industriell og Teknisk Forskning ved Norges Tekniske Høgskole
SINTEF Telecom & Informatics
N-7034 Trondheim
Tel: +47 73 59 30 00
Fax: +47 73 59 43 02
http://www.informatics.sintef.no/

Technical Research Centre of Finland
VTT Information Technology
P.O. Box 1200
FIN-02044 VTT
Tel: +358 9 456 6041
Fax: +358 9 456 6027
http://www.vtt.fi/

Foundation for Research and Technology – Hellas (FORTH)
Institute of Computer Science
P.O. Box 1385
GR-71110 Heraklion, Crete
Tel: +30 81 39 16 00
Fax: +30 81 39 16 01
http://www.ics.forth.gr/

Czech Research Consortium for Informatics and Mathematics
FI MU
Botanicka 68a
CZ-602 00 Brno
Tel: +420 2 6884669
Fax: +420 2 6884903
http://www.utia.cas.cz/CRCIM/home.html

Swiss Association for Research in Information Technology
Dept. Informatik
ETH-Zentrum
CH-8092 Zürich
Tel: +41 1 632 72 41
Fax: +41 1 632 11 72
http://www-dbs.inf.ethz.ch/sgfi/

Slovak Research Consortium for Informatics and Mathematics (SRCIM)
Dept. of Computer Science, Comenius University
Mlynska Dolina M
SK-84215 Bratislava
Tel: +421 7 726635
Fax: +421 7 727041

Danish Consortium for Information Technology (DANIT)
CIT, Forskerparken
Gustav Wieds Vej 10
8000 Århus C
Tel: +45 8942 2440
Fax: +45 8942 2443
http://www.cit.dk/ERCIM/

Central Editor: Peter Kunz
E-mail: office@ercim.org
Telephone: +33 1 3963 5040

Local Editors:
Lars Bergman (SISU) – bergman@sisu.se – Tel: +46 8 752 1613
Erzsébet Csuhaj-Varjú (SZTAKI) – csuhaj@sztaki.hu – Tel: +36 1 1810 990
Truls Gjestland (SINTEF) – truls.gjestland@informatics.sintef.no – Tel: +47 73 59 26 45
Michal Haindl (CRCIM) – haindl@utia.cas.cz – Tel: +420 2 6605 2350
Fritz Henglein (DANIT) – henglein@diku.dk – Tel: +45 35 32 14 02
Bernard Hidoine (INRIA) – bernard.hidoine@inria.fr – Tel: +33 1 39 63 5484
Pia-Maria Linden-Linna (VTT) – pia-maria.linden-linna@vtt.fi – Tel: +358 0 456 4501
Siegfried Münch (GMD) – siegfried.muench@gmd.de – Tel: +49 2241 14 2303
Henk Nieland (CWI) – henk@cwi.nl – Tel: +31 20 592 4092
Carol Peters (CNR) – carol@iei.pi.cnr.it – Tel: +39 050 593 429
Martin Prime (CLRC) – M.J.Prime@rl.ac.uk – Tel: +44 1235 44 6555
Constantine Stephanidis (FORTH) – cs@csi.forth.gr – Tel: +30 81 39 17 41

You can subscribe to ERCIM News free of charge by sending e-mail to your local editor, posting paper mail to the address above, or filling out the form at the ERCIM web site at http://www.ercim.org/

Magyar Tudományos Akadémia – Számítástechnikai és Automatizálási Kutató Intézete
P.O. Box 63
H-1518 Budapest
Tel: +361 166 5644
Fax: +361 166 7503
http://www.sztaki.hu/


Director of Publication: B. Larrouturou. A periodical publication produced by GEIE-ERCIM.

ISSN 0926-4981