ST–ECF Newsletter · Number 24 · March 1997
Space Telescope – European Coordinating Facility

Cover: STS-82 before launch with STIS and NICMOS, on 11th February 1997. Photograph by Bob Fosbury.


STS-82 and beyond

Piero Benvenuti & Robert Fosbury

Early in the morning of Tuesday the 11th of February, the Space Shuttle 'Discovery' set off on the second Hubble Space Telescope servicing mission. Following the very careful preparation and extensive training of the STS-82 crew and an undelayed launch, the installation of the new instruments and other telescope components proceeded with a precision which we have come to expect but which is nonetheless impressive. Four days of scheduled EVA saw the replacement of GHRS and FOS by STIS and NICMOS respectively; the exchange of an ailing FGS (Fine Guidance Sensor) with a new one; the installation of a new-technology solid-state 'tape' recorder of much higher capacity; the insertion of a number of other pieces of hardware; and a modest orbit reboost. A fifth EVA day was scheduled to patch up some areas of the telescope where the thermal insulation showed signs of age. As this is being written, the new instruments are in the first stages of their Science Mission Orbital Verification (SMOV).

While this was happening at an altitude of about 360 miles, successful Cycle 7 proposers were beginning the task of transforming their scientific programmes into completed Phase II proposals. For STIS and NICMOS, this will necessarily be an iterative process as more is learned about the in-orbit performance of the instruments.

After nearly seven years of HST operations it is, perhaps, timely to sketch out what we know about the future of the observatory and what plans are being made for a successor. It is clear that HST is not only a primary source of astronomical data in its own right but that, in combination with new, large groundbased telescopes and other space missions, we have a complementary set of observational tools of formidable power.

Two more HST servicing missions are firmly scheduled by NASA. In 1999, the Advanced Camera for Surveys (ACS) will be installed in place of ESA's FOC, another FGS will be replaced and the telescope – with new solar arrays – will be reboosted to a higher orbit in order to withstand the next solar cycle maximum. For the mission in 2002, NASA issued an announcement of opportunity for a new instrument late in 1996, with a proposal deadline set for mid-April of this year. What will happen in

2005 – the nominal end of the HST mission – is still very much under discussion and will depend on the progress of other plans. The recommendation of the 'HST and Beyond' committee (see Box) is to operate HST beyond 2005, perhaps in a reduced-cost mode, with emphasis on ultraviolet spectroscopy and imaging and on wide-field optical imaging. In 2001, the current Memorandum of Understanding (MoU) between ESA and NASA concerning HST expires. An interagency working group is currently exploring options for renewing this MoU. Unfortunately, because of budget restrictions in the Science Programme, ESA is not in a position to offer a new European instrument for the year 2002, an option which would have been more than acceptable as a contribution to the extension. However, if the STJ detector (see ST–ECF Newsletter No. 23, p6, 1995) is a component of a selected proposal for the NASA 2002 instrument, ESTEC will support the development of the STJ flight unit, including the cryogenic section.

It is important for us to appreciate how closely Europe is involved with the HST at the moment, what splendid value we are getting from the project, and how crucial it is to ensure that the relationship continues beyond 2001. Indeed, Europe has always obtained more HST observing time than the formally agreed 15% lower limit: in Cycle 7 the fraction is 22% of the orbits (see the article by Leon Lucy in this issue). Continued access to HST by European astronomers will become vital in the near future when the potential for joint programmes with new groundbased facilities, such as the VLT and Gemini, is realised.

A major initiative is currently underway in the USA to develop the second of the 'HST and Beyond' recommendations, namely the planning of a space observatory, with an aperture of 4m or larger, optimised for imaging and spectroscopy in the 1–5µm wavelength range. This has become known as the Next Generation Space Telescope (NGST), and 1996 saw three major studies of a concept for a passively-cooled telescope, up to 8m in aperture, in an orbit much further from Earth than HST. There is a considerable quantity of information about the NGST on the Web (see Box) so we will not attempt a technical description here. The assessment of the scientific case used to derive a 'Design Reference Mission' is described in the article by Stiavelli et al. in this issue.


Go Discovery! The afternoon before lift-off on pad 39A at the Kennedy Space Center in Florida. (Photo: Bob Fosbury)

Discovery clearing the tower. The view from 3.5 miles, before the sound arrives. (Photo: Bob Fosbury)

After the solid rocket boosters have separated, Discovery – on main engines alone – is seen as a trail projected behind the floodlights from the smoke-enshrouded launchpad. (Photo: Bob Fosbury)


Gregory Harbaugh (left) and Joseph Tanner (right) exchanging places on the foot restraint during Day 5 of the servicing mission. (Photo: NASA)

HST is released at the end of the second servicing mission. (Photo: NASA)

Information sources for HST and NGST

Information about STS-82, including lots of pictures and movies, is available from:
http://ecf.hq.eso.org/sts82/
This includes the report of the SM2 Independent Science Review team.

The new HST instruments STIS and NICMOS are described at:
http://www.stsci.edu/ftp/instrument_news/NICMOS/topnicmos.html
http://www.stsci.edu/ftp/instrument_news/STIS/topstis.html

The NASA AO for a 2002 instrument can be obtained from:
http://ecf.hq.eso.org/AO.html

The report of the 'HST and Beyond' committee, chaired by Alan Dressler, is available as a PDF file at:
http://ecf.hq.eso.org/priority/Dressler_report.html

The NGST story is told at:
http://saturn1.hst.nasa.gov/ngst/ and http://ecf.hq.eso.org/ngst/ngst.html

STScI NEWSLETTER

The Space Telescope Science Institute publishes a Newsletter 2–4 times per year. The STScI Newsletter contains information of interest to proposers, including updates on the status of the HST and its instruments. Subscriptions are available at no cost to all interested scientists; requests to be added to the mailing list should be sent (by regular or electronic mail) to the User Support Branch at the following address:

Space Telescope Science Institute
3700 San Martin Drive
Baltimore, MD 21218
USA

E-mail: [email protected] (Internet)

Requests should also specify whether the subscriber wishes to receive future Calls for Proposals.

SPACE TELESCOPE SCIENCE INSTITUTE

Operated for NASA by AURA

There is, as yet, no official European involvement in NGST, although quite a number of European astronomers have been and are involved in various aspects of the study. ESA has been formally invited by NASA to participate in the 'Origins' program, of which NGST is an integral part. ESA has responded positively, with the proviso that the NGST collaboration should find its place within the ESA Science Programme, Horizon 2000+. This is not a particularly easy task at the moment since the entire programme is under revision following the loss of the Cluster experiment. Nonetheless, the Space Science Advisory Committee, considering the very high scientific return for European astronomers that an ESA participation in NGST would represent, recommended the appointment of an external NGST advisory team which will have the task of examining possible European contributions. In particular, the team will analyse the advantages and technological risk reductions which could be achieved by using Ariane V as a launcher. It is expected that the team will issue its recommendations to ESA in about a year from now, when NASA's intentions and schedule on NGST become better defined. Meanwhile, interest in NGST in Europe has been growing and we have been asked to illustrate its science goals and technical challenges with several seminars in different institutes. A European Workshop on NGST later this year is under consideration.


Next Generation Space Telescope – design reference mission

Massimo Stiavelli (1,2,3), Pete Stockman (1) & Richard Burg (1,4)

1 Space Telescope Science Institute. 2 On assignment by the Space Science Dept. of ESA. 3 On leave from the Scuola Normale Superiore, Pisa, Italy. 4 Johns Hopkins University, Dept. of Physics and Astronomy.

In order to match the technical capabilities of the Next Generation Space Telescope (NGST) designs with the scientific drivers, a Design Reference Mission (DRM) has been developed and used to quantify the effects of optical, detector and operational parameters.

The 'HST and Beyond' Committee was formed to advise NASA on future missions for space astronomy after the nominal end of the HST mission in 2005. Chaired by Alan Dressler, the committee made three recommendations in their report 'Exploration and the Search for Origins: A Vision for Ultraviolet-Optical-Infrared Space Astronomy':
1. A low-cost extension of the life of the Hubble Space Telescope beyond 2005.
2. The development of a near-IR optimised telescope of 4m diameter or larger.
3. The development of space interferometry capabilities, with emphasis on high-resolution imaging and the detection of Earth-like planets in neighbouring planetary systems.

Despite previous discussions – and even scientific workshops – on NGST, the release of the 'HST and Beyond' report represented the real beginning of a number of activities intended to show that NGST was not only desirable but also feasible. Three independent teams, led respectively by Lockheed Martin Corp., by TRW, and by NASA's own Goddard Spaceflight Center, produced complete designs of the spacecraft with estimated development costs near the cost target of US$ 500M, excluding launch and operations. Concurrent with these efforts, a volunteer Science Working Group (SWG, chaired by John Mather and Peter Stockman) was set up to further quantify and extend the scientific goals discussed in the 'HST and Beyond' report. A Science Advisory Committee (SAC, chaired by Robert Kennicutt) was also created to review the progress of both the SWG and the technical studies. The SAC, the SWG and the Goddard-led design team included European and ESA scientists. The group lead of the instrument module design team (known as the Instrument Integrated Product Team, or IPT) was an ESA engineer (Pierre Bely). An artist's view of the Goddard version of NGST is shown in Figure 1.


In the following, we will concentrate principally on one aspect of the activity of the Goddard team, in order to illustrate the interplay between the science goals and the instrument design at these early stages of the project. Clearly, astronomers would always prefer a larger telescope to a smaller one. However, in order to build something that will fly and be cheap to operate, we must achieve a compromise between scientific capabilities and technical and financial feasibility. As an example, improvement in certain technical areas, eg, detector performance, might gain significant scientific return for a relatively modest cost. To obtain similar gains in other areas, eg, by increasing the primary mirror size, could be extremely expensive. On the other hand, increasing the main mirror size with all else kept constant would increase the performance for most of the science; improvements in other areas may have a more limited impact. Since NGST must accomplish a broad range of scientific goals, striking the correct balance is not straightforward. As a tool to achieve this balance, the Instrument IPT and the SWG established a strawman observing program based on the key scientific questions identified in the 'HST and Beyond' report as well as a broad program of astronomical research

that requires NGST capabilities.

Because of its importance in the actual design of the mission, the program is termed the Design Reference Mission (DRM). The DRM may be thought of as a shopping list of the highest priority observations, together with a price list which represents the time that each observation would require. This price list was produced by including all the major attributes of the observatory: primary mirror size, detector noise and quantum efficiency, pixel size, field of view, optics temperature, celestial background, settling time after a major slew, cosmic ray flux, etc. By comparing the time required for each observation with the total time required by the whole program, we can see how each parameter affects the mission outcome and how the various science aspects affect the overall mission.
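To make the 'price list' idea concrete, here is a minimal sketch in Python. Everything in it – the function name, the toy scaling, and all numbers – is our own illustrative assumption, not the actual DRM tool:

    def exposure_time(flux, snr_goal, aperture_m, throughput, sky_rate):
        """Toy background-limited point-source exposure time (arbitrary units).

        The source rate scales with collecting area (D^2); for a
        diffraction-limited PSF the background under the PSF is roughly
        aperture-independent, since the D^2 collection gain cancels the
        D^-2 PSF solid angle. All normalisations are absorbed into the
        (made-up) input units.
        """
        source = flux * throughput * aperture_m**2
        return snr_goal**2 * (source + sky_rate) / source**2

    # A toy "shopping list": (name, flux, S/N goal) -- values invented.
    program = [("deep field", 1e-3, 10.0), ("SN follow-up", 5e-3, 20.0)]

    total = sum(exposure_time(f, s, aperture_m=6.0, throughput=0.4,
                              sky_rate=1e-4) for _, f, s in program)
    print(f"total 'price' of the program: {total:.3g} (arbitrary time units)")

Re-running such a loop while varying one observatory attribute at a time is exactly the kind of bookkeeping that lets one see how each parameter affects the mission outcome.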

Science goals

We will focus on one particular DRM as an illustration. About 72% of the time is dedicated to a 'Core program' directly linked to cosmological studies as outlined in the 'HST and Beyond' report. This Core program is the minimum that a mission of this type should achieve. The remainder is devoted to science projects not feasible with other kinds of facility but not included in 'HST and Beyond'. The


Figure 1: An artist's view of the Next Generation Space Telescope, as designed by the Goddard-led team. The principal elements are a deployable telescope, passively cooled and shaded from sunlight, and a warm service package on the opposite side of the large sunshield.



selection of the scientific program elements is somewhat subjective and depends on current scientific interests. However, it is broad enough to be representative. A brief explanation of each of the research goals and of the corresponding observational parameters is given below.

Supernovæ (Core). The interest is twofold: one can use the SN as standard candles to improve our knowledge of the geometry of the Universe (q₀ and Λ), and also use them to study the material universe before the birth of galaxies. These goals are achieved by identifying about 100 SN at redshift greater than 2 (ie, down to K_AB ~ 31), and following them both before and after the maximum for two 'rest frame' weeks. Additional SN will be identified while monitoring the 100 main SN objects.

NGST Deep Fields (Core). One deep field (down to K_AB ~ 32) and 100 less deep (K_AB ~ 30) flanking fields will be observed in several broadband filters. Several classes of objects will be identified in these fields. The need for both a very deep survey and a shallower survey covering a very large area arises from our lack of knowledge of how exactly galaxies form: whether as many small subunits or rather as rarer, more luminous aggregations. The deep field will be optimised to investigate the small-subunits scheme and will enable us to detect proto-globular clusters in formation up to a redshift of about 10. The large-area shallower survey allows us to look for other rare, although important, objects – proto-quasars, primeval spheroids, proto-disks – and is sensitive to L* objects up to a redshift of 10. At the limit of the flanking survey there are about 1000 galaxies per square arcmin. In order to have significant statistics as a function of redshift on objects such as AGNs and quasars, which appear only in a fraction 10⁻²–10⁻³ of galaxies, one needs to observe about 10⁷ galaxies. To study the early evolution of galaxies and their dark matter content, galaxies at redshift greater than 2 will be studied spectroscopically at low and high resolution to derive their redshift, their spectral energy distribution, and their internal kinematics. These spectroscopic observations are also split into single deep fields, required to obtain data as a function of redshift on the faint end of the galaxy luminosity function, and shallower, more extended surveys to obtain data on the rarer, brighter objects.

Universe at z > 2 (Core). This includes follow-up studies (pointed observations) of objects identified in the NGST Deep Fields survey. In particular, primeval spheroids, the birth and evolution of disks, the origin of heavy elements, and the birth and evolution of AGN are all included in this category. The observations include both NIR and T(hermal)IR spectroscopic and photometric studies. High-redshift galaxies and star forming regions discovered in the HST HDF and the NGST shallow surveys will often display complex structures due to merging, galaxy-galaxy interactions, spiral-wave star formation, and chance superpositions. We will obtain moderate-resolution imaging spectroscopy of these structures and those of nearby galaxies to study the dynamics of these systems. We can confirm that different elements are bound by gravity if their velocities are identical. Moreover, we can study the mass within each galaxy (mass which may be dark) by measuring the temperature or the collective motions of the stars and gas clouds within the galaxy. Disk structures should show strong velocity shifts with distance from the centre. Imaging spectrometers in the NIR and MIR are required, with resolutions of R = 1000–3000.

Gravitational Astronomy. Deep HST images and the NGST SN fields and shallow surveys will discover good examples of gravitational lensing by clusters of galaxies. These chance alignments – approximately 0.1% of all sightlines – provide excellent opportunities to see distant galactic structures at high magnification, as well as superb opportunities to study the total mass distribution within the cluster. NGST's NIR sensitivity, wide-field imaging, and HST-like resolution can reveal hundreds of background galaxies at moderate redshift, z = 1–3, as well as the magnified images of star forming regions at z = 10–30. We will study the evolution of cluster potentials and serendipitous ultra-distant galaxies using deep imaging and follow-up spectroscopy of approximately 20 gravitational lenses at redshifts between z = 0.3 and 2.0.

Stellar Populations in the Nearby Universe. Our understanding of the fossil stellar record is limited to our own galaxy and a half-dozen satellite galaxies. Using NGST's light-gathering power and HST-like resolution, we will extend our understanding of the stellar populations within the Local Group of galaxies and the Virgo cluster galaxies. Accurate, wide-field imaging of single stars in 30 galaxies can provide the star-formation fossil record for the disks and outer portions of the inner bulge components (spheroids). The exposure times are selected so as to reach the horizontal branch luminosity in the Virgo cluster.

Kuiper Belt Objects. The outer portion of our solar system contains millions of asteroids, with the largest almost 1000km in diameter. Many lie outside the orbit of Neptune and are presumed to be remnants of the formation of our solar system. Before NGST is launched, astronomers will have discovered about ~100 of the brightest and nearest asteroids in the Kuiper Belt. These objects and their distribution are not likely to be pristine, since they lie within or near the orbit of Neptune. We will obtain optical and NIR images in a wide-field Ecliptic Survey (about 800 square arcmin). Our goal is to discover an equivalent number of Kuiper Belt objects at greater radii from the Sun (> 40 AU) and over a wide range in sizes. Mid-infrared signatures, such as silicate features, will be important in linking these objects, our closest protoplanets, to the great dust disks seen around young systems in Orion and around β Pictoris-like stars. Searches will be carried out down to magnitude AB ~ 30 in the optical and NIR and AB ~ 25 in the TIR. Any object discovered will be studied by follow-up NIR and TIR imaging and TIR spectroscopy.

Individual Object Classes. A variety of pointed studies in both imaging and spectroscopy that can take advantage of the NGST performance. Included are observations of normal and active galaxies, the ISM in galaxies and its evolution, and studies of brown dwarfs and stellar evolution. Collectively these projects make up 12% of the program.

Program optimisation

Having defined a strawman science program, the role of observatory parameters in the scientific capability of the mission can readily be studied. The figure of merit being optimised is the completion rate of the program over a given mission lifetime. The parameters to vary are telescope diameter, instrument field of view, optics temperature, optical throughput, parallel use of instruments (several instruments observing the same field, either simultaneously or successively if they do not share the same field of view), detector readout noise, and dark current.

In the example discussed here, the time to execute the above program was calculated under the following assumptions (a rough sketch of the bookkeeping follows this list):
❏ Observations made in a region where zodiacal light and Galactic emission are near the minimum
❏ Repointing overhead of 18 minutes following major slews
❏ Negligible single-exposure overhead (36 seconds)
❏ 10% mission-time allowance for engineering and calibration
❏ Cosmic-ray rate of 1 proton cm⁻² s⁻¹ (resulting in a maximum exposure time of 1000 seconds for 27µm pixels).
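The following Python sketch turns the listed assumptions into a simple time budget; the constants come from the list above, while the function names and the example program are our own placeholders:

    import math

    MAX_EXPOSURE = 1000.0      # s, set by the cosmic-ray rate for 27 micron pixels
    READ_OVERHEAD = 36.0       # s per single exposure ("negligible")
    SLEW_OVERHEAD = 18 * 60.0  # s per major slew
    ENGINEERING_FRACTION = 0.10

    def field_time(total_integration_s, new_pointing=True):
        """Clock time for one field: long integrations are split into
        exposures no longer than the cosmic-ray limit."""
        n_exp = max(1, math.ceil(total_integration_s / MAX_EXPOSURE))
        t = total_integration_s + n_exp * READ_OVERHEAD
        if new_pointing:
            t += SLEW_OVERHEAD
        return t

    # Hypothetical example: 100 flanking fields of 3000 s each.
    science = sum(field_time(3000.0) for _ in range(100))
    mission = science / (1.0 - ENGINEERING_FRACTION)  # calibration allowance
    print(f"{mission / 86400:.1f} days of mission time")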

The main results of this parametric study are shown in Figure 2, where the fraction of the completed strawman program described previously is plotted as a function of the various parameters being


optimised. The conclusions which can be drawn from these trade-off studies can be summarised as follows:
❏ The constraint of having the same spatial resolution as HST in the NIR forces one to have a mirror close to 8m in diameter.
❏ A 6m effective-aperture telescope with the proposed set of instruments is adequate to complete all core programs and most of the complementary program in a 5-year mission.
❏ Below about 5m of effective aperture, the core programs cannot be completed. An effective aperture larger than 5m appears necessary to fulfil the requirements set forth by the 'HST and Beyond' report.
❏ To some extent, telescope diameter and field of view can be traded against each other. A 6m telescope with a 4 × 4 arcmin instrument does as well as a 7.2m with a more modest field of view (2.8 × 2.8 arcmin). This is because the scientific program is dominated by surveys in background-limited mode, not by individual targets or high spectral resolution programs. Since instruments are typically less expensive than telescopes, this indicates that instruments should be built with as large a field of view as possible, within the limits allowed by optical aberrations and packaging. That there are obvious limits on the overall size of the observatory also indicates that there is an optimum combination of instrument field of view and telescope size. (A quick numerical check of this trade-off follows the list.)
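One way to see the quoted equivalence is to assume (our scaling, not stated in the article) that background-limited point-source survey speed goes roughly as D⁴ × FOV²: D² from collecting area, another D² from PSF concentration, and FOV² from sky covered per pointing. In Python:

    def survey_speed(d_m, fov_arcmin):
        # Assumed scaling for background-limited point-source surveys.
        return d_m**4 * fov_arcmin**2

    print(survey_speed(6.0, 4.0))    # 20736.0
    print(survey_speed(7.2, 2.8))    # ~21070 -- essentially the same

Under this assumption the 6m/4-arcmin and 7.2m/2.8-arcmin combinations are indeed nearly interchangeable, as the text asserts.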

Figure 2: Completed fraction of the strawman program, for a 5-year mission, as a function of the various parameters. Red curves represent the Core program, and green curves the complete program. The upper left panel shows the dependence on the telescope effective aperture. The upper right shows the effect of the NIR field of view on both a 6m and a 7.2m telescope. The lower panels show, also for a 6m and a 7.2m telescope, the effect of relative detector noise and mirror temperature. The detector noise includes both dark current and read-out noise and is given relative to that of the baseline design.

In conclusion, a Next Generation Space Telescope with a 5.5m effective aperture and a large field of view, optimised for NIR imaging and low-resolution spectroscopy, appears to be the minimum requirement for understanding the origin of galaxies. We should note that a telescope with this minimum size would not be able to carry out a significant,

broad-based General Observer (GO) science mission until after the first 4–5 years of its mission. As experience with HST forewarns us, our knowledge of astronomy may have improved in unexpected ways by the time NGST is launched. Also, it is clear that a significant, if not the most important, fraction of the scientific results come from the GO science program. We feel, therefore, that the 8m diameter goal for the primary mirror is appropriate and better addresses the intent of the 'HST and Beyond' Committee recommendations.

SCIENCE WITH THE NGST
Goddard Space Flight Center, Greenbelt, Maryland, USA
April 7–9, 1997

The Next Generation Space Telescope is a new facility for space astronomy being considered by NASA in collaboration with ESA and other space agencies, and planned for launch in the first decade of the next century. The telescope is optimised for near-IR observations. The meeting will address the science feasible with such a facility.

Main Topics: Status of the NGST project, Precursors to Galaxy Formation, Galaxy Evolution and High Redshift Objects, Observational Cosmology, Nearby Galaxies and their Stellar Populations, Star and Planet Formation, Solar System Science.

Scientific Organizing Committee: R. Kennicutt (U. Arizona, chair), J. Mather (GSFC, co-chair), J. Bahcall (IAS/Princeton), S. Beckwith (MPIA/Heidelberg), A. Cochran (U. Texas), A. Dressler (OCIW, Pasadena), R. Ellis (U. Cambridge), R. Giacconi (ESO), M. Hauser (STScI), G. Illingworth (Lick, Santa Cruz), J. Knapp (Princeton U.), J. Ostriker (Princeton U.), M. Rowan-Robinson (Imperial College), P. Stockman (STScI), H. Thronson (NASA, Washington DC), V. Trimble (U. Maryland), M. Werner (JPL, Pasadena), S. White (MPIA, Garching), R. Williams (STScI).

Local Organizing Committee: P. Bely (STScI), E. Dwek (GSFC), D. Fahey (GSFC), M. Hauser (STScI), M. Hurlbut (Jorge Sc.), A. Koratkar (STScI), S. Maran (GSFC), J. Mather (GSFC), A. Michalitsianos (GSFC, chair), E. Smith (GSFC), M. Stiavelli (ESA/STScI), P. Stockman (STScI).

Contributed papers are invited in the form of posters.
Contact Address: Mary Hurlbut, [email protected], (301) 220-1701
Registration Deadline: March 24, 1997. Registration Fees: sessions $45, banquet $25, proceedings $40.
Detailed information on meeting logistics and registration can be obtained at:
http://ngst.gsfc.nasa.gov/~ngst/science/meetings/NGSTmeeting1.html


NICMOS grism spectroscopy – software development at the ST–ECF

Wolfram Freudling

As part of a collaboration with the NICMOS IDT and with the STScI, software is being developed for handling data from the NICMOS grism spectroscopy modes.

A unique capability of NICMOS, the near-IR camera which was installed during the second servicing mission in February, is the grism spectroscopy mode. Camera 3 (NIC3) is fitted with three grisms giving slitless spectroscopy in the 0.8 to 2.3 micron range at a spectral resolving power of 200. The image area is 51 × 51 arcsec with 0.2 arcsec pixels. The three grisms cover the wavelength ranges 0.8–1.2µm (grism A), 1.1–1.9µm (grism B) and 1.4–2.5µm (grism C). In the 1.0–1.8µm range, the NICMOS grism has unsurpassed sensitivity in comparison with much larger aperture groundbased facilities due, principally, to the very low background above the atmosphere. The sensitivity in the longest wavelength range will, however, be compromised by thermal emission from the warm HST optics. At 1.4µm, for example, a signal-to-noise of 10 can be achieved in a 1000 second exposure on a source of 0.07 mJy (H = 18). See the NICMOS Instrument Handbook for details (also available at http://www.stsci.edu/instrument-news/handbooks/nicmos/NIH.html).
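The quoted sensitivity point can be extrapolated to other source fluxes if one assumes the background-limited scaling t ∝ (S/N / flux)² – our assumption, not an official sensitivity calculator:

    def grism_exposure(flux_mjy, snr):
        """Exposure time (s) scaled from the quoted reference point:
        S/N = 10 on 0.07 mJy in 1000 s at 1.4 microns (background limited)."""
        t_ref, f_ref, snr_ref = 1000.0, 0.07, 10.0
        return t_ref * (snr / snr_ref)**2 * (f_ref / flux_mjy)**2

    print(grism_exposure(0.035, 10.0))  # a source twice as faint: ~4000 s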

Typically, one or more direct images are taken in conjunction with the grism images for the wavelength calibration. Both direct and grism images are routinely calibrated by the standard STSDAS pipeline (calnica and calnicb). However, there is currently no NICMOS-specific software within STSDAS to extract one-dimensional spectra from such image sets. Two IDL programs are being developed at the ST–ECF for that purpose. The first of these, called NICMOSlook, is an interactive program to extract spectra for individual sources imaged with one set of grism and direct images. The second, called calnicc, is a non-interactive program which automatically detects and extracts spectra from a set of image pairs.

In addition, a program GRISMdecon is being developed to allow the deconvolution of NICMOS grism data taken at a number of distinct roll angles.

NICMOSlook

NICMOSlook is an interactive program using an IDL widget. It relies on the presence of the Xmanager and has been tested under Linux and Solaris versions of IDL. The program reads a pair of user-defined NICMOS images which are

Figure 1: Screen layout of a typical NICMOSlook session. This is the interactive version of the program developed at the ST–ECF for the extraction of spectra from NICMOS grism data. In this session, the direct image and a zoomed region are displayed along with an extracted spectrum. Also visible is the tutorial, which is provided in HTML format.



assumed to be calibrated by the STSDAS pipeline. From the image header, the image type (grism or direct) is identified and the frame placed into the appropriate buffer. There are a number of options for the display of the images and simple image manipulation. Objects can be identified on the direct image either automatically (using DAOPHOT) or through user interaction. Subsequently, spectra for individual objects, or for all identified objects, can be extracted. Several options for the weighting of the extraction are available. The weights for point sources used by NICMOSlook have been computed from TinyTim point spread functions (see Richard Hook's article in this Newsletter). The extracted spectra can be saved in an output file, displayed on the screen or printed as a PostScript file. Figure 1 shows a screen snapshot of a NICMOSlook session.
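For readers unfamiliar with PSF-weighted extraction, the sketch below shows the idea in Python (NICMOSlook itself is written in IDL; this is our illustration, not its source code, and the test data are synthetic):

    import numpy as np

    def extract_spectrum(grism_cutout, profile, variance):
        """Inverse-variance, profile-weighted ('optimal') extraction.

        grism_cutout, variance: 2-D arrays (rows = spatial, cols = wavelength);
        profile: 1-D spatial profile, e.g. derived from a TinyTim PSF.
        """
        p = profile / profile.sum()
        w = p[:, None] / variance                       # profile / variance
        return (w * grism_cutout).sum(axis=0) / (w * p[:, None]).sum(axis=0)

    rng = np.random.default_rng(0)
    cut = rng.normal(0.0, 1.0, (11, 100)) + np.hanning(11)[:, None] * 50.0
    spec = extract_spectrum(cut, np.hanning(11), np.ones_like(cut))

Weighting each spatial pixel by profile/variance down-weights the noisy wings of the source, which is why it outperforms a plain column sum for point sources.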

The extraction of the spectra is controlled by a calibration database in which properties of the grisms and flatfields are stored. This database is easily maintainable. It currently contains pre-flight calibrations and will be updated as new information becomes available during the course of SMOV. On-line documentation and a tutorial, written by Lin Yan, are available.

calnicc

NICMOSlook is a convenient tool for extracting spectra from a single pair of images. Since it is interactive, it can be used to experiment with different strategies, such as weighting methods and object detection, as well as different calibration files. However, for large data sets with large numbers of images, a more automatic approach is frequently desirable. Such a need may arise from large survey-type grism programmes, from archival research, or from programmes where grism images contain a larger number of objects which are not the primary target(s) of interest. Ideally, such a grism extraction could be run at the same time as the STScI pipeline data reduction, so that extracted spectra can be provided as a standard data product. This need is addressed by calnicc. Like NICMOSlook, calnicc is written in IDL, but it is non-interactive. Some modules of the program are written in C. Therefore, the program needs an IDL version which allows external functions to be called. Currently, the program has only been tested under SunOS 4 and Solaris. Linux users will need IDL version 5.0.

The capabilities of the program are similar to NICMOSlook, with all aspects of the extraction controlled by setup files. Objects are identified on a direct image and classified as stars or galaxies using a neural network approach implemented in the SExtractor program (Bertin & Arnouts, 1996). The positions of the objects are used to extract spectra from the grism image. The wavelength calibration of the extracted spectra is performed using the position of the objects, as determined by the SExtractor program, as the zero point, together with parameterised dispersion relations. After extraction, the spectra are corrected for the wavelength dependence of the quantum efficiency of the detector. The flux scale is then computed using the same calibration database as NICMOSlook. The extracted spectra are corrected for contamination by nearby objects under the assumption that the shape of extended objects does not change with wavelength. Subsequently, the extracted spectra are automatically searched for emission and absorption lines. In addition, the continuum emission is automatically determined. The final data products are binary FITS tables with the spectra, error estimates, object parameters derived from the direct image, and details of the spectrum extraction process. PostScript files with plots of the individual spectra are also provided.
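The zero-point-plus-dispersion-relation step is simple enough to show directly. In this Python sketch the coefficients are invented placeholders, not real NICMOS grism values:

    import numpy as np

    def wavelengths(x_object, x_pixels, coeffs=(1.10, 8.0e-3)):
        """lambda(x) = c0 + c1 * (x - x_object): the object's position on the
        direct image sets the zero point; coeffs parameterise the dispersion
        (microns and microns/pixel here, purely illustrative values)."""
        dx = np.asarray(x_pixels, dtype=float) - x_object
        return np.polynomial.polynomial.polyval(dx, coeffs)

    lam = wavelengths(x_object=128.4, x_pixels=np.arange(100, 200))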

Grism deconvolution

For crowded fields where the object spectra overlap, the spectral information on a single grism image for a given spatial position is not unique, since there is a degeneracy between the flux at a given wavelength from one position and the flux at a different wavelength from another position. calnicc tries to correct extracted spectra for contamination by neighbouring objects by assuming that the spatial shape of extended objects is independent of wavelength. This strategy fails when this assumption is not valid (such as for an AGN with a continuum nucleus and an extended emission-line region) or when the overlap between two or more objects is such that there is not enough independent information in each of the spectra. However, this degeneracy can be broken if several grism images are obtained with different roll angles of the spacecraft, giving rise to differing directions of the dispersion. Other constraints can be obtained with direct images of the same field without the grism. A program GRISMdecon has been developed which performs such a deconvolution based on a Lucy-Richardson restoration. The program is written in IDL and is non-interactive. However, a sufficient amount of memory and a fast computer are necessary to run it efficiently.
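The Lucy-Richardson iteration at the heart of such a restoration is compact. Below is a toy 1-D version in Python (our sketch; GRISMdecon itself is IDL, works on 2-D images, and uses the multiple roll angles as additional observations constraining the same scene):

    import numpy as np

    def richardson_lucy(observed, psf, n_iter=20):
        """Multiplicative Richardson-Lucy updates for a 1-D blurred signal."""
        psf = psf / psf.sum()
        psf_rev = psf[::-1]                      # mirrored kernel for the update
        estimate = np.full_like(observed, observed.mean())
        for _ in range(n_iter):
            predicted = np.convolve(estimate, psf, mode="same")
            ratio = observed / np.maximum(predicted, 1e-12)
            estimate *= np.convolve(ratio, psf_rev, mode="same")
        return estimate

    truth = np.zeros(64); truth[20] = 5.0; truth[40] = 2.0
    kernel = np.exp(-0.5 * (np.arange(-5, 6) / 1.5) ** 2)
    blurred = np.convolve(truth, kernel / kernel.sum(), mode="same")
    restored = richardson_lucy(blurred, kernel)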

Availability of programs

All programs described in this article have been tested on simulated data. They will be further developed and tested on real NICMOS data as they become available. Soon afterwards, official releases of the programs and their progress will be announced. For up-to-date information, check the ST–ECF NICMOS page at:
http://ecf.hq.eso.org/nicmos/nicmos.html

Acknowledgments: Most of the code for both calnicc and NICMOSlook was written by Robert Thomas. Lin Yan prepared the flat-fields and background estimates used in both programs. Other people involved in the software development are: Rudolf Albrecht, Hans-Martin Adorf, Alberto Micol and Benoît Pirenne.

Reference

Bertin, E. & Arnouts, S., 1996, A&A Suppl. 117, 393

ADASS '97
Seventh Annual Conference on Astronomical Data Analysis Software and Systems
14–17 September 1997
Sonthofen, Bavaria, Germany

The ADASS conferences provide a forum for scientists and programmers concerned with algorithms, software and software systems employed in the reduction and analysis of astronomical data. ADASS '97 is organised jointly by the Space Telescope – European Coordinating Facility and the European Southern Observatory.

Dates:
July 7, 1997: deadline for early registration, abstract submission and demo requests
August 15, 1997: hotel reservations
September 4, 1997: late registration
September 14–17, 1997: ADASS '97

For reasons of cost and convenience we decided not to hold ADASS '97 in Munich proper, but instead to move it to Sonthofen, Bavaria, about 100 km southwest of Munich near the Bavarian Alps. The conference facilities in the Stern Hotel are first class, the setting in the alpine environment is spectacular, and the weather should be beautiful in mid September.

Location: Stern Hotel, Sonthofen, Bavaria, Germany

Contact: c/- Britt Sjöberg, ST–ECF/ESO, Karl-Schwarzschild-Str. 2
email: [email protected]
Tel: +49-89-320 06 291
Fax: +49-89-320 06 480
WWW: http://ecf.hq.eso.org/


Drizzling dithered WFPC frames

Richard Hook & Andrew Fruchter†

† Space Telescope Science Institute

Originally designed for processing the Hubble Deep Field, a method has been developed for combining dithered WFPC images using Variable Pixel Linear Reconstruction. It is known, for reasons which have become obscure, as 'drizzling'.

A series of articles in the February 1995 issue of this Newsletter presented some methods by which dithered WFPC images can be combined to recover some of the information lost through the undersampling of the image by the large pixels of the camera CCDs. These are non-linear, iterative techniques which work well for small areas of images. Unfortunately, such methods cannot readily be used for large-scale combination because they are computationally demanding and can handle neither the significant geometrical distortion of the WFPC optics nor rotations between dithered frames. They also cannot handle PSFs which vary across the image, and they give resultant images in which the noise properties are very different from those of the input images and hard to quantify for subsequent photometric analysis.

For these reasons, a new method – formally known as Variable Pixel Linear Reconstruction, but usually called drizzling – was developed by us to handle the major image combination problem posed by the Hubble Deep Field project. We wished to preserve resolution while obtaining the highest possible signal-to-noise on faint galaxies. Drizzling is a non-iterative, linear, direct technique which may be considered to be an enhanced, fully flexible generalisation of the familiar 'shift-and-add' method widely used in infrared imaging and elsewhere. Instead of simply shifting the large input pixels into the correct position on the output and adding them, the drizzling method shrinks the input pixels before mapping onto the output and hence minimises the effects of an additional convolution with the pixel-response function (see Fig. 1).

Figure 2: A small subset of the Hubble Deep Field showing a peculiar disturbed or interacting galaxy. The left-hand image was produced by 'shift-and-add' on the original pixel grid. The right-hand image was produced using drizzling and has 2.5 times finer pixels.

Figure 1: A cartoon showing how the drizzling algorithm projects input pixels, via a geometric transformation, from the coarse input pixel grid onto a finer output pixel grid. The small blue squares show the reduced footprint assigned to each original pixel.

In the extreme case, where the input pixel is shrunk to a point, this method becomes equivalent to pixel interlacing, which is an optimum strategy but rarely practical because the fractional shifts required cannot normally be executed to an adequate accuracy. Furthermore, geometric distortion prevents interlacing from being applied when there are large shifts between images. The degree of pixel shrinking is normally limited by a desire to avoid empty pixels in the output or, in the less extreme case, large variations in the weights of adjacent output pixels.
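A heavily simplified 1-D illustration of the algorithm, in Python (our sketch, not the STSDAS task: the real code works in 2-D and applies the full geometric distortion rather than a plain shift):

    import numpy as np

    def drizzle_1d(data, shift, out_scale=0.4, pixfrac=0.6, n_out=100):
        """Drop shrunken input pixels onto a finer output grid.

        Each input pixel's footprint is shrunk by 'pixfrac', shifted, mapped
        to output-pixel units (1 output pixel = out_scale input pixels), and
        its flux shared among the output pixels it overlaps. A weight image
        is accumulated for later statistically-correct combination.
        """
        flux = np.zeros(n_out); weight = np.zeros(n_out)
        half = 0.5 * pixfrac
        for i, value in enumerate(data):
            lo = (i + 0.5 - half + shift) / out_scale   # footprint edges on
            hi = (i + 0.5 + half + shift) / out_scale   # the fine output grid
            for j in range(max(0, int(lo)), min(n_out, int(hi) + 1)):
                overlap = max(0.0, min(hi, j + 1) - max(lo, j))
                flux[j] += value * overlap / (hi - lo)  # conserve total flux
                weight[j] += overlap / (hi - lo)
        return flux, weight

    f1, w1 = drizzle_1d(np.random.rand(30), shift=0.00)
    f2, w2 = drizzle_1d(np.random.rand(30), shift=0.25)
    combined = (f1 + f2) / np.maximum(w1 + w2, 1e-12)

Setting pixfrac close to zero reproduces the interlacing limit described above, while pixfrac = 1 recovers plain shift-and-add.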

Drizzling can handle bad pixel masks, geometrical distortion, rotations and shifts, and combines images in a statistically optimum way with minimum loss of resolution. Since its application to the HDF, the software has been considerably enhanced and has subsequently been used on many datasets, including ISO imaging of the HDF as well as WFPC 2 imaging. A full description, along with the current implementation of the software (easily installed to run under IRAF), is available at:
http://www.stsci.edu/~fruchter/dither/
and an example of its use is shown in Fig. 2. A detailed published description of the method will appear in the near



Figure 3: An example of the use of drizzling to combine dithered images and simultaneously detect and suppress cosmic-ray events. The left-hand frame shows a small part of a single input image, one of 12 public ones extracted from the archive for this work. The right-hand image shows the result of combining all twelve images using cosmic-ray detection and image combination procedures currently under development at STScI.

future, and this brief note is to alert interested observers to the code's availability and to encourage anyone with dithered data to download the software and try it out.

The drizzling task will form part of a more comprehensive software package being developed at STScI by Andrew Fruchter, Ivo Busko and others, which will also address issues such as the accurate measurement of the shifts and possible rotations between dithered images. This dither package will appear in a future release of STSDAS. There is also very promising work in progress in adapting drizzling to detect and flag cosmic-ray events in small sets of dithered images which do not have the multiple exposures at each telescope pointing that are required for traditional CR rejection schemes (Fig. 3). As such sets are common in typical GO programs, it is hoped that this method will become a valuable tool.

NICMOS point spread functions now available from Tiny Tim

Richard Hook & John Krist†

Point-spread functions (PSFs) are required for many aspects of HST data flow. These include simulations and exposure time estimation during proposal preparation, image restoration, and optimum weighting for grism spectral extraction. The Tiny Tim program, designed and written by one of us (JK), has proved convenient for PSF simulation as it is easy to install and use, quick to run, and produces PSFs which are close enough to empirical ones for most purposes. As a collaboration between the ST–ECF and STScI, facilities for creating PSFs for the three NICMOS cameras have now been added to the latest version of Tiny Tim (V4.2), which is available for downloading via the Web at:
http://ecf.hq.eso.org/~rhook/nicmos-tinytim.html
There is also a general Tiny Tim page at:
http://scivax.stsci.edu/~krist/tinytim.html

The model of the NICMOS imaging system used by this version of Tiny Tim includes the current best values for the orientation and position of the NICMOS apertures in the HST focal plane, the focal ratios of the cameras and the geometry of the cold masks. No aberrations of the final beam are currently well known (although they are known to be small) and none are included. More precise information will be available after the instrument has been in orbit for some time, although pre-launch optical tests and phase retrieval suggest that the PSFs produced by this version of the software are good approximations.

This version of Tiny Tim does not include filter curves for NICMOS. For the other cameras, the spectral energy distribution to be used in the construction of a weighted, polychromatic PSF is determined by the choice of a stellar spectral type. In the infrared, where few objects have a black-body spectrum corresponding to typical stars, this makes little sense. Instead there is a new option, which may also be used with the other cameras, whereby the user can supply Tiny Tim with a list of wavelengths and weights which are used to create a polychromatic 'bespoke' PSF. It is planned to include NICMOS filter curves, along with several options for SED definition (black-body, power-law, user-defined, etc.), in a future release. Unfortunately, the coronagraphic mode of NICMOS cannot be properly modeled by Tiny Tim (which assumes that all the obscurations are in a single pupil plane and hence cannot model the apodizing effects). Another factor which is not currently included is the NICMOS pixel-response function – the pixels are known to be less sensitive at the edges than in the centre, and this will have some effect on the PSF.
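The wavelengths-and-weights option amounts to a weighted sum of monochromatic PSFs. A Python sketch of the idea (the Gaussian stand-in below is our own toy substitute for a real Tiny Tim monochromatic computation):

    import numpy as np

    def monochromatic_psf(wavelength_um, size=64, scale=0.1):
        """Toy diffraction proxy: PSF width grows linearly with wavelength."""
        y, x = np.mgrid[:size, :size] - size // 2
        sigma = scale * wavelength_um / 0.5
        psf = np.exp(-(x**2 + y**2) / (2 * sigma**2))
        return psf / psf.sum()

    def polychromatic_psf(wavelengths_um, weights):
        """Normalise the user-supplied weights and sum the monochromatic PSFs."""
        w = np.asarray(weights, dtype=float); w /= w.sum()
        return sum(wi * monochromatic_psf(li) for li, wi in zip(wavelengths_um, w))

    psf = polychromatic_psf([1.0, 1.4, 1.8], [0.2, 0.5, 0.3])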

It should be stressed that this is a preliminary release and, although initial comparisons with actual NICMOS data are good, there will need to be significant revisions after orbital verification.

† Space Telescope Science Institute

Examples of monochromatic PSFs for NICMOS produced by Tiny Tim V4.2. The left column is for 1µm and the right for 2µm. The bottom row is for camera 1, the centre for camera 2 and the top for camera 3. For each PSF, an area 5" on a side is shown. The intensity scaling is such that the maximum scaling value is the same fraction of the total intensity of the PSF in each case. Because the pixels of camera 3 are larger than those of cameras 1 and 2 and so contain a larger fraction of the total, the fainter outer structure is more prominent in these images.


More reflections on the TAC

Leon Lucy

A second commentary on the HST time allocation process, with special reference to European proposers. More details derived from the comments of European members of the Cycle 7 subject panels and allocation committee will be given on the ST–ECF Web site before the Cycle 8 deadline.

Following the allocation of HST time for Cycle 5, I wrote an article (ST–ECF Newsletter No. 22, February 1995) under a similar title summarising the operations of the Proposal Review Panels (PRPs) and of the Telescope Allocation Committee (TAC). I also explained the circumstances under which otherwise scientifically excellent proposals ended up with no time.

In view of favourable comments on the usefulness of that article, following the meetings of the PRPs and the TAC for Cycle 6, I asked the European Panellists to debrief themselves and summarize for us what they learned about the do's and don'ts of successful proposal writing for HST. These make fascinating reading and, at the urging of the ST–ECF's Users' Committee, a lightly edited version was made available via our Web site (http://ecf.hq.eso.org/tacomments.html) for the benefit of proposers for Cycle 7. A new version of this material, organized by topic and including input from our Cycle 7 Panellists, will be put on the Web well before the Cycle 8 proposal deadline.

Following the allocation of time for Cycle 7, I again asked the European Panellists to debrief themselves. However, instead of do's and don'ts, I asked them to focus on the proposed use by the world community of NICMOS and STIS and, in particular, to alert us to any generic scientific opportunities opened up by these new instruments that are being neglected by European astronomers. Gratifyingly, the Panellists are essentially in unanimous agreement that the scientific quality of the European proposals was fully competitive with those from US-based PIs. Moreover, these admittedly subjective impressions are resoundingly confirmed by the hard statistics of orbits allocated to proposals with ESA PIs and CoIs (see below).

A second topic on which I requested comments is that of current research possibilities available using the HST archive. As most readers will know, US-based researchers can submit proposals purely for research with archived data. If approved, such proposals get financial support, often at a level allowing a postdoc to be hired to do most of the actual work. Having participated in the reviewing of such proposals, several of our Panellists are convinced that cutting-edge science can now be done with archived data, and that the lack of comparable financial support in Europe puts us at a disadvantage in exploiting this increasingly valuable resource. Note that these qualitative comments on the scientific quality of the best archival proposals are decisively supported by their being ranked among the best of the proposals for new telescope time.

Though funding in Europe for postdocs to mine the HST archive is undoubtedly desirable and is being sought, European researchers would be well advised not to be inactive in the interim. In this regard, they should inform themselves of powerful software tools developed at the ST–ECF by Benoît Pirenne and colleagues that greatly facilitate HST data retrieval, recalibration and combination (see the article about WFPC 2 combinations and on-the-fly recalibration in this Newsletter). Quite possibly, a reader's first guess of the effort required to complete an HST archive project would in fact prove a gross overestimate.

Given the severe oversubscription for HST time, many excellent proposals are inevitably rejected. Not unnaturally, the PIs of such proposals are upset, and a handful (no more) complain to the ST–ECF that, judging from the comments received from the STScI, their proposals were not understood by the reviewing Panel. I find it hard, if not impossible, to believe that this could happen to a clearly-argued proposal. Since each Panellist is required to submit electronically a grade for each proposal before the Panel meets, they reach their understandings of the proposals independently, and so the probability that all fail to grasp the point of a well-written proposal is surely minuscule. Disappointed PIs with lingering doubts should look at the list of Cycle 7 Panellists in the recent STScI Newsletter (January 1997) and ask themselves if the expertise was indeed present to understand and evaluate their proposals. If the answer is yes, as I would confidently expect, then the point of failure must lie in the proposal itself. With increasing specialization, there are some topics with only 2 or 3 real experts in the whole world. If you are one of them, you should not assume that the STScI will have the foresight to recruit one of the others as a Panellist. Accordingly, proposals must be written to be understood by colleagues with adjacent expertise: after all, each Panel deals with a considerable diversity of topics.

Summary of the Cycle 7 allocations to ESA member state Principal Investigators (PIs), shown by country (Belgium, France, Germany, Italy, Netherlands, Spain, Sweden, UK, ESO and STScI (ESA)), both as the number of ESA PIs and as the number of orbits. The ESO and the (ESA) STScI PIs have been counted separately. Overall, ESA PIs account for ~17% by PI and ~21% by orbits.



WFPC 2 images of the Lagoon nebula

Adeline Caulet

In a dramatic demonstration of the wealth of high quality data now publicly available from the HST archives, these colour-composite pictures reveal new features in the familiar Lagoon nebula.

The Lagoon Nebula (Messier 8) is a well-studied H II region in our Galaxy. It is 1.5 kpc from us and appears a few degrees northwest of the Great Sagittarius Star Cloud. This H II region is, like the well-known Orion Nebula, a site where new stars are being born from dusty molecular clouds. The HST picture reveals, in great detail, the small-scale structures in the interstellar medium: Bok globules, bow shocks around stars, ionized wisps, rings, knots and jets. The images here are constructed from archival observations taken in filters which select ionized sulphur (red colour), hydrogen (green) and doubly-ionized oxygen (blue). This choice of colour assignment is extremely valuable for revealing the different levels of excitation in the clouds. This type of image presentation has been applied before to other nebulæ imaged with WFPC 2 (see STScI press releases and published articles on the Helix Nebula and the Orion Nebula). By comparing the morphology and the flux of extended features in different emission lines, and also in the continuum at blue and red


wavelengths, it is possible to identify objects which appear to be circumstellar disks around young forming stars, like those seen in the WFPC 2 images of the Orion Nebula and nicknamed 'proplyds' in the literature. For example, among them is the bright blueish blob seen close to the brightest star, Herschel 36.

The HST images have sufficient spatial resolution to reveal the different positions of the ionization fronts that penetrate the parent cloud giving birth to the star. Here, the images have clearly revealed the high-excitation [O III] front on the near side (facing Herschel 36), followed by the hydrogen and sulphur fronts. These fronts have a characteristic arc/moon shape, which is a small cross-section of the illuminated cloud. The new-born star embedded in the cloud is identified in the continuum light as a stellar object surrounded by scattered light. It is not always possible to detect the newborn stars in the optical because of heavy dust extinction. Near-infrared images help to identify the obscured stars and are complementary to the optical identifications of circumstellar disks.

These colour-coded images show the distribution of the interstellar molecular clouds in much more detail than before. In particular one can see the smoothness of several cloud surfaces and the very thin dark ridges aligned almost vertically in the middle of the image. Their widths can be as thin as 0.004 pc (recall that the whole field, or 1600 WF pixels, corresponds to 1.2 pc). With the finer sampling of the Planetary Camera (0.0455 arcsec/pixel = 3.4 × 10⁻⁴ pc), the structure of the clouds of the Hourglass region shows details never seen before. It shows the process of the wrapping of an ionization and heating front around the clouds. As the gas gets ionized, it is heated from some 30 K to about 10,000 K with a correspondingly sudden jump in pressure, inducing what is called photoevaporation of the clouds. This is seen directly in the HST image as the blue haze in the upper-right WF quadrant.
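As a quick check of the quoted plate scale, using only the 1.5 kpc distance stated above and 1 rad = 206265 arcsec:

    \[
    0.0455'' \times \frac{1500\ \mathrm{pc}}{206265''/\mathrm{rad}}
    \approx 3.3 \times 10^{-4}\ \mathrm{pc\ per\ pixel},
    \]

consistent, to rounding, with the 3.4 × 10⁻⁴ pc quoted for the Planetary Camera.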

These public images were retrieved from the HST archive at the ST–ECF and have been processed with IRAF and STSDAS tasks. The processing used for scientific analysis included the combination of all data in the archive in different filters, two series having been taken at different pointings. Removal of artifacts, cosmic rays, and hot and dead pixels was carried out, and saturated pixels were flagged using STSDAS tasks such as crrej. Image combination was carried out using the drizzling technique described elsewhere in this Newsletter by Andy Fruchter and Richard Hook.


Predictive calibration using instrument models

Michael Rosa

In some respects, HST has become a paradigm for planning, executing, calibrating and archiving astronomical observations. Astronomers are now accustomed to receiving data which are calibrated to high accuracy. Here, I discuss some new concepts for data calibration based on a physical understanding of the instruments and the detection process. The operational advantages are outlined and the interrelation between this predictive calibration strategy and model-based data analysis is shown. Finally, reference is made to working implementations for HST instruments as well as ground-based equipment in use or under construction at ESO.

The classical calibration concept

With ever more complex instruments entering service at telescopes accessible to the worldwide astronomical community, an increasing importance is assigned to 'calibration' tasks. However, the concepts currently employed for the calibration of observational data derive from the historical development of instrument and detector technology. In order to assess whether we employ the best possible method of analysing the precious data from large groundbased telescopes and space observatories, it is necessary to take a careful look at the calibration concept currently in use.

When the only detector was the human eye, calibration basically consisted of the proper transfer of some linear readings (eg, from micrometers) into angles on the sky and the determination of the so-called 'personal equations' for timings and brightness estimation (ie, calibration by comparison with empirical readings from 'standards'). Since that time, most calibration strategies have been based on that very same principle. Admittedly, computational tools have become much more elaborate, so that the 'standard' readings can be filtered, statistical analysis can be performed, and long-term trends can be studied with the help of archives. Still, at the heart of the process are empirical relations which are based on noisy data. A typical example is the fitting of low-order polynomials to the observations of wavelength calibration lamp spectra, with data taking and analysis usually repeated for each new target night after night.

Universe

uO

O × C = 1 ?

dC

Observation

Raw data

Calibration

Analysis

Perception of the Universe

Figure 1: The generic data gathering and analysis scheme in astronomy.

servations of wavelength calibration lampspectra with data taking and analysisusually repeated for each new target nightafter night.

The calibration strategies currentlyadopted in most of optical astronomy arean adaption of empirical methods aimedat ‘cleaning’ data from the effects ofinstrumental and environmental (eg,atmosphere, particle radiation in orbit)processes. One of the most obvious signsof the empirical nature of this calibrationconcept is the linearised decompositionof the calibration process into individualtasks which are designed to correct forjust a single effect at each step. Thetransformation of detector-based units toastrophysical units is usually doneseparately as a final step at the boundarybetween calibration and data analysis.

This whole sequence makes it difficultto arrive at a physical understanding ofvariations in instrumental characteristicswhich could direct the subsequent effortsat maintenance or algorithmic design. Itfurther requires a long learning process to

determine the optimum deployment ofresources in the form of observing timefor calibration observations and manpowerfor subsequent analysis. While the linkbetween the calibration of individualobservations and the monitoring ofinstrument performance can be strength-ened by adopting improved operationalprocedures, the calibration process itselfis still not seen as an entity requiring acomprehensive gathering of instrumentknowledge and observed data.

From raw data to astrophysicalparametersFigure 1 depicts the generic informationflow in observational astronomy. Duringdata gathering an operator that describesthe telescope/instrument combinationprojects a sub-volume of the Universe’sparameter space onto the raw data domain.In the classical interpretational approach,we attempt to apply the ‘calibration’operator in order to re-project onto thephysical parameter space (eg, a flux scale)to obtain the ‘true’ spectral energy

By constructing instrument models whch incorporate as full as possible a knowledge ofoptical and detector physics, the calibration of astronomical data can be placed on a

firmer footing than is currently the norm. Ultimately, the comparison of observations withmodels can take place in the raw data domain. This article describes a methodology,

developed jointly at the ST–ECF and ESO, which is designed for application to HST andVLT instruments. It is now implemented in the form of a software library.

Figure 2: The observed count rate for the star 16 Cyg B (in black)—which is very similar to the Sun. The blue spectrum shows the expected signal observed with an ideal instrument and the red line shows the way this is affected by the scattered light predicted from the model.

The usual procedure therefore is to decompose the inversion process into a sequence of instrument signature removal steps. In imaging this might be bias and background subtraction, flat fielding, photometric correction and possibly deconvolution. In spectroscopy one has to deal, in addition, with dispersion relations and the wavelength dependence of efficiencies. Cross-dispersed and long-slit spectra require additional complexity. Major efforts have been expended in obtaining more accurate and less noisy empirical calibration relations.
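For imaging, such a chain can be caricatured in a few lines of Python (a minimal sketch with synthetic frames and an idealised bias/dark/flat model; the names are illustrative and are not those of any actual HST pipeline task):

```python
import numpy as np

def remove_signature(raw, bias, dark_rate, flat, exptime):
    """Classical sequential signature removal for an imaging detector."""
    corrected = raw - bias                 # remove electronic offset
    corrected -= dark_rate * exptime       # remove thermal dark signal
    corrected /= flat / flat.mean()        # normalise pixel-to-pixel response
    return corrected

# Synthetic 100x100 example frames:
rng = np.random.default_rng(0)
bias = np.full((100, 100), 100.0)             # counts
dark_rate = np.full((100, 100), 0.01)         # counts per second
flat = rng.normal(1.0, 0.02, (100, 100))      # response map
raw = rng.poisson(200.0, (100, 100)) + bias   # toy raw frame
clean = remove_signature(raw, bias, dark_rate, flat, exptime=1000.0)
```

Each line corresponds to one of the linearised steps criticised above: nothing in the chain knows about the physics that produced the bias, dark or flat in the first place.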

The classical concept takes noisy raw data back through the ‘calibration’ operator, which is only an approximation of the inverse ‘observation’ operator because of the sequential treatment of effects and because of the presence of noise. However, Figure 1 reveals a different route for arriving at statistically sounder estimates of the astrophysical parameters. It rests on the ability to simulate the observational process in all its (relevant) detail on noise-free models of targets (eg, theoretical surface brightness and energy distributions). The range of acceptable model targets that could result in the observed raw data can then be explored using, for example, Monte-Carlo optimisation procedures.
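A toy sketch of this alternative route (all numbers and function names invented for illustration): the ‘observation’ operator is simulated forward on noise-free trial targets, and the acceptable parameter range is found by comparing simulations with the actual raw data, here through a Poisson likelihood:

```python
import numpy as np

def observe(flux, throughput, background, rng):
    """Forward model: project a noise-free target into the raw-data domain."""
    expected = flux * throughput + background
    return rng.poisson(expected)           # photon shot noise

def log_likelihood(raw, flux, throughput, background):
    expected = flux * throughput + background
    return np.sum(raw * np.log(expected) - expected)  # Poisson, up to a constant

rng = np.random.default_rng(1)
throughput, background = 0.3, 5.0          # illustrative instrument parameters
raw = observe(flux=120.0 * np.ones(512), throughput=throughput,
              background=background, rng=rng)

# Monte-Carlo exploration of the acceptable target fluxes:
trials = rng.uniform(50.0, 200.0, 2000)
lnL = np.array([log_likelihood(raw, f, throughput, background) for f in trials])
print(f"best-fit source flux ~ {trials[np.argmax(lnL)]:.1f} counts/pixel")
```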

Concepts based on instrument models

It is well known that even software models that address only one aspect of the instrument in question greatly help to recognise adverse characteristics of the design and the impact of various solutions on the later performance at the telescope. Such models, often in the form of exposure time estimators, also support the preparation and planning of observations. However, most of these models are linear and empirical, ie, inverted versions of the classical calibration process.

Yet we know much more about our instruments and about the observational process than is usually incorporated into the data calibration and analysis process. Already during the design phase we employ physical concepts to optimize the layout—for example by using ray tracing software. In contrast, when dealing with wavelength calibration, all we do is to repeatedly re-use polynomial approximations to very basic optical trigonometric equations.

Instrument models which are based on first principles clearly have the predictive power required to make substantial progress in the areas of calibration and data analysis. In the road map of Figure 1, they can be applied at ‘calibration’, where such models will provide noise-free calibration relations that can be projected even into ‘uncalibrated’ modes of the instrument: this is the concept of predictive calibration. A preferred alternative for the physical models is at ‘observation’, where they enable the full simulation of the observation process on synthetic targets. The astrophysical data analysis is then performed by comparing actual raw data with simulations of the expected appearance of raw data for the instrument configuration being used. This is the concept of model-based data analysis. Such a scheme has been familiar to X-ray astronomers for a long time, a procedure driven in their case by the relatively small number of photons collected in a normal observation.

Practical considerations

Before implementing the two concepts of ‘predictive calibration’ and ‘model-based data analysis’, two questions must be answered:
❏ Can one construct from first principles ‘physical’ software models of astronomical instruments which
   i. are able to predict observations of ‘standards’, ie, calibration relations, to a sufficient degree of accuracy, and
   ii. work reasonably fast so that they can be embedded into routine calibration and data analysis procedures?
❏ Are such implementations
   a) affordable in comparison with the effort needed to maintain the classical calibration scheme, and
   b) do they bring any additional benefits for the operation of common-user instrumentation?

The ST–ECF has long experience with the transfer of such concepts from ideas into working software packages. The first coherent database of HST instrument characteristics, with the associated tools to simulate observations, was created as the STMODEL package in the MIDAS environment at the ST–ECF in 1985. This tool and the data served as a basis for the SYNPHOT package in IRAF/STSDAS which has been widely used by the HST community ever since. In a mathematical sense, the various methods for image deconvolution that blossomed in the early years of HST operation, and in particular those developed by Lucy and Hook at the ST–ECF (eg, ST–ECF Newsletter No. 19, 6, 1993), are implementations of the ‘model-based data analysis’ concept.

An established application of the ‘instrument model’ concept in HST data analysis is the development of the understanding of scattered light in the FOS. The initial analysis (Rosa 1993, 1994a) identified the cause as the combination of the broad wavelength range sensitivity of the detectors and the non-zero light levels in the far wings of the line spread function of this spectrograph for red targets. The resulting spectra show veiling of the scarce UV photons by the abundance of red photons present in the LSF wings. It is obvious that the intrinsic UV properties of such targets can only be recovered via a software model that simulates to a precision of better than one part per million the dispersion and image formation of the aperture-collimator-grating-detector combination. This software model is available in MIDAS (Rosa 1994b) and IRAF/STSDAS (Bushouse, Rosa & Müller 1995). The simulation of an FOS/BLUE G190H observation of a solar type star with actual data is shown in Figure 2 (see also the FOS Instrument Handbook V5.0 or 6.0, Appendix C).
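The veiling mechanism can be illustrated with a few lines of Python (invented numbers, not the MIDAS/STSDAS FOS model itself): convolving a spectrum dominated by red photons with a line spread function whose far wings are non-zero over the whole detector inflates the scarce UV counts appreciably:

```python
import numpy as np

# Illustrative sketch of far-wing scattered light: faint UV, bright red
# continuum, and an LSF with a narrow core plus a constant low-level wing.
npix = 2000
pix = np.arange(npix)
intrinsic = np.where(pix < 500, 1.0, 500.0)   # UV pixels faint, red bright

core = np.exp(-0.5 * ((pix - npix // 2) / 2.0) ** 2)   # 2-pixel Gaussian core
lsf = core / core.sum() + 1e-5                         # add a flat far wing
lsf /= lsf.sum()

observed = np.convolve(intrinsic, lsf, mode="same")
veiling = observed[:500].mean() / intrinsic[:500].mean()
print(f"UV counts inflated by a factor ~ {veiling:.1f} by scattered red light")
```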

This example demonstrates that a software model going beyond a simple throughput calculation, ie, correctly describing all relevant physical effects, can be very powerful in solving problems encountered during the scientific analysis of data. Used in this way, the model appears simply as an additional data analysis tool in support of the calibration process. Ultimately, the goal is to go one step further, ie, to advance from the calibration strategy of ‘signature removal’ currently in use.

There are operational benefits to be gained from the use of instrument models with high predictive power. Instruments built and tested in the laboratory according to detailed performance specifications can be understood in all aspects of their performance before and during commissioning. The physics of optical elements and the interior workings of detectors and electronics are known to a degree that is usually much better than can be deduced from the analysis of noisy data from astronomical ‘standards’. During routine operations, the model—fed by updated calibration databases—provides the reference for the nominal performance of the instrument. Large deviations from the anticipated behaviour will signal the necessity of maintenance, while small deviations will usually be easily recognised as parameters not yet properly accounted for in the coded model.

The Model Project at ST–ECF and ESO

With the experience gained during actual case studies, and with the recognition that there is a large overlap in requirements for the operation of the pipeline calibration between HST and VLT instrumentation, the ST–ECF has started a joint project with ESO to study, implement and exploit the ‘predictive calibration’ and ‘model-based data analysis’ concepts. This effort is three-fold: to further develop the general modelling concept; to construct instrument models based on first principles for specific instruments and modes; and to establish a generic model software library. Thus code generated to handle the 2D geometry (order location and shape, dispersion relations) of a particular ESO 2D-echelle spectrograph (CASPEC) can easily be re-used to provide the same functionality for HST’s STIS and ESO’s VLT instrument UVES (cf. Ballester & Rosa 1997, A&A in press).

Short-term goals for HST are the capability to support HST observers by predicting calibration data for STIS modes not covered in the early stages of in-orbit calibration, and the development of advanced methods for echelle data extraction based on model-predicted order location and inter-order background profiles. For NICMOS, the impact of models on the extraction and analysis of grism data is being studied. Long-term goals are to bind such models into the calibration processes and ultimately implement ‘model-based data analysis’ as a general methodology. An important aspect will be the routine use of models to provide calibration reference data.

Users of both the new HST instruments, particularly STIS, and the archival data of the previous generation instruments FOS and GHRS are likely to see the impact of the instrument models soon. The finalising of the FOS and GHRS archives (see article in this Newsletter) will rely heavily on such models. The ESO La Silla community already makes use of the model software library any time one of the new exposure simulators is activated on the World Wide Web (eg, http://www.eso.org/ntt/susi/simulator.html), and extensions are in preparation for the first generation of VLT instruments.

Acknowledgments: It is a pleasure to acknowledge the continued support and encouragement from Riccardo Giacconi for this project. The outstanding collaboration with Pascal Ballester (ESO/DMD) has resulted in mutual benefits for HST and the ESO VLT.

References
Bushouse, H., Rosa, M.R., Müller, Th.: 1995, in ‘Astronomical Data Analysis Software and Systems’, R.A. Shaw, H.E. Payne and J.J.E. Hayes (eds), ASP Conf Ser Vol. 77, p. 345, ‘Modeling Scattered Light in the HST Faint Object Spectrograph’
Rosa, M.R.: 1993, ST–ECF Newsletter, 20, August 1993, p. 16–19, ‘Correcting FOS background’
Rosa, M.R.: 1994a, in ‘Calibrating Hubble Space Telescope’, C. Blades and S. Osmer (eds), STScI, Baltimore, p. 190, ‘Background signals in FOS data’
Rosa, M.R.: 1994b, in ‘HST FOS Handbook’, V 5.0 (and higher versions), A. Kinney (ed), STScI, p. 54, ‘Appendix C. Grating Scatter’
Rosa, M.R.: 1995, in ‘Calibrating and Understanding HST and VLT instruments’, P. Benvenuti (ed), ESO/ST–ECF Workshop, p. 48, ‘Predictive calibration strategies’

Final archives for the FOS and GHRS

Michael Rosa

This February, the first generation spectrometers aboard HST were returned to the ground at the end of a very successful second servicing mission. The FOS and GHRS data collection is therefore complete and it is appropriate to consider the production of final archives.

At the ST–ECF we are preparing reprocessed versions of these data under the project ‘The FOS and GHRS final archives’. This is in close consultation with the planning group at STScI, where activities have concentrated on closing out the FOS and GHRS cycle 6 proposals. The aim here is to avoid any duplication, to provide mutual support in these activities within the HST project and, most importantly, to arrive at products which result in an increased scientific value of FOS/GHRS data for the user community.

Why should one bother with ‘oldies’?

STIS is a new HST spectrograph which combines the resolving powers and wavelength ranges of FOS and GHRS, while providing large spatial and spectral multiplex gains from 2D detectors. The total time available for UV-visible spectroscopy on HST is, however, unlikely to increase substantially as imaging in the infrared (with NICMOS) and later deep imaging on wider fields (with ACS) compete for observing time. It is clear that precious time will not be made available to re-observe targets in modes that would simply duplicate observations already secured with FOS or GHRS. Therefore the FOS and GHRS science data sets will remain as valuable as they are now. Similarly, the IUE project is preparing its final archive, which will provide a unique resource of observations that cannot be made from the ground.

The processed data currently accessible from the FOS/GHRS part of the HST archive span the period from cycles 0 to 6. Both instruments underwent secular changes in sensitivity and both were fed with a greatly improved HST PSF from cycle 4. The presence of COSTAR resulted in smaller projected aperture sizes and modified ultraviolet sensitivity curves. The calibration procedures used in the pipeline have also seen a continuous improvement throughout these seven cycles.

We can now take a global look at the calibration of these data and can produce a more homogeneous archive than could be produced by simply applying the ‘best’ calibration reference files to the previous cycles of observations. In particular, a global recalibration can make use of the predictive power of instrument models (see the article in this issue) for the dispersion curve analysis, the background determination and the scattered light in FOS.

Most of the information and the tools required to optimise the recalibration are already available. A substantial building block is the ‘on-the-fly’ recalibration scheme implemented in the ST–ECF HST archive user interface a year ago and described in this issue. An area calling for a rather large effort, however, is the exploitation of information contained in the engineering data stream, which is currently not readily accessible. Several items, such as jitter information, instrument temperatures and magnetometer readings, are required in order to improve the current pipeline calibration.

Given the appropriate manpower for this level-of-effort project, we are planning to make available a self-contained volume for each of the two instruments, FOS and GHRS, on CD-ROM. These final archives will contain:
❏ The (publicly available) raw data sets, accompanied by a re-calibrated version from the standard pipeline using the best available reference data. This is essentially the version derived from the current HST archive by requesting re-calibration with the best reference files.

The value-added items planned are:
❏ All calibration reference data ever produced to that date for both instruments.
❏ All software items required to run the latest pipeline on the raw data, assuming the presence of IRAF/STSDAS on a UNIX platform.
❏ All publicly available records of Phase I and II proposals with SMS logs, so that the exact sequence of observing events can be reconstructed, eg, the location of an aperture on a target galaxy.
❏ All relevant documents in electronic form. These will include all versions of the Handbooks, the data analysis guides, the Phase II instructions, technical documents and Instrument Science reports.
❏ The procedures, reference data and documentation for the model-based recalibration, along with copies of the resulting data products.

The initial release of such an instrument archive should be viewed as a first installment. A delivery date late this year can only be met if some sacrifices are made. A number of data sets will still be proprietary at that time and the first issue could contain only the core of the items listed above. The exercise itself will provide us with the insights to devise improvements to calibration items and tools. Not least, the response and suggestions from users will be incorporated into enhancements, enabling a truly final archive to be produced.

The 1997 HST Calibration Workshop — with a new generation of instruments

22–24 September, 1997

The Space Telescope Science Institute will host the third HST Calibration Workshop on September 22–24, 1997. The workshop will be organized by STScI with the collaboration of the ST–ECF.

Topics will include:

❏ Preliminary results of the calibration of the new instruments (STIS, NICMOS, and the new FGS)
❏ Closure information on the calibration history of FOS and GHRS
❏ Up-to-date calibration results for the continuing instruments (FOC and WFPC 2)
❏ New pipeline and post-processing software
❏ An advance look at future instruments

The format will include invited talks, contributed posters, and splinter sessions for more detailed technical discussions.

If you are interested in participating, please complete the form at
http://www.stsci.edu/stsci/meetings/cal97/CalibWSForm.html

or send email to [email protected]. More information will be sent to all respondents, and the information you provide will be very helpful in our planning.



The Archive column

Benoît Pirenne & Alberto Micol

In the past year, the archive has been active both in operations and development. We have seen the archive usage increase by a substantial factor: see fig. 1 for accesses to the database services and fig. 2 for the evolution of the distributed data volume. The on-the-fly recalibration scheme has been released and user requests can be delivered on CD-ROM. In addition, a Web interface for retrieval of the best calibration files has been implemented.

On-the-fly recalibration

This novel approach to archive data distribution is described in a separate article in this newsletter. In brief, it allows one to submit a request for HST archive data as before, but with one extra option. The data files can be re-calibrated at the time of the request using the best available calibration reference files together with the most up-to-date pipeline software applying at the instant of the request submission.

The system, developed in collaboration with the CADC, has been undergoing heavy testing in the past few months and has just been released for wider use. It can be accessed on the Web using the following URL:

http://archive.eso.org/wdb/wdb/hst/science/form

To support the reprocessing exercise, we have added extra hardware power to the archive system such that we achieve a recalibration time of about 3–5 minutes per WFPC(2) dataset.

The overall service time (from submission to reception of results) should also be greatly reduced due to the presence of all the raw data on CD-ROM located in a jukebox. This setup allows for very fast data access. The calibration reference files are all on direct-access magnetic disk.

HST data now also on CD-ROM

In addition to FTP, Exabyte and DAT tapes, we now offer HST and ESO/NTT data on CD-ROM. This facility is limited to requests involving between 300 MB and 1200 MB of data. For larger and smaller datasets and special requests, please send email to ‘[email protected]’.

Retrieval of best calibration files

This new Web interface has been designed principally for HST users willing to calibrate the observations themselves even though the on-the-fly recalibration service is available.

Using a web form, users can enter a list of datasets and/or calibration files. If available, the requested files for the given dataset(s) are retrieved and made available for immediate downloading through the user’s web browser.

The Hubble Deep Field

The archive currently offers versions 1 and 2 of the HDF data products. Please have a look at:

http://ecf.hq.eso.org/hdf/hdf.html

for current information on availability, or send email to ‘[email protected]’ to request the collection of processed/raw data on tape or CD-ROM. Fig. 3 shows a very small section of the ‘BVI’ and ‘UBV’ colour composites.

The Digitised Sky Survey

The northern part of the Digitised Sky Survey has been available for public access without restriction since early February 1996. The entire sky can now be accessed from:

http://archive.eso.org/dss/dss

To improve the speed with which regions of the southern sky can be retrieved, we have copied over 60 CD-ROMs entirely onto large, fast-access magnetic disks. The new setup has been available since July 1996 and reduces the time to get any random image of the southern sky from 30 to about 5 seconds.

Figure 1: Average number of accesses per month to the archive services, 1990–1996.

Figure 2: Evolution of the distributed data volume (GB/year), 1990–1996.


Automatic recalibration — a new archive paradigm

Dennis Crabtree†, Daniel Durand†, Norman Hill†, Séverin Gaudet† & Benoît Pirenne

The CADC and ST–ECF have together implemented a pipeline which automatically recalibrates data as it is requested from the archive. This process uses the latest recalibration software and the recommended calibration reference files, which may be different from the calibration files used in the original calibration. The service is now available through the World Wide Web.

Background

HST science data are automatically calibrated when they are received at the STScI and are subsequently included in the archive. The calibration software, which is contained in the IRAF/STSDAS hst_calib package, takes as input the raw data and any necessary calibration reference images or tables if they are already available. The software determines which calibration steps to perform by checking the values of the calibration switches in the header. It selects which reference files to use in the calibration by examining the reference file keywords. The values of these switches and keywords depend upon the exact configuration of the instrument, the date of the observation and any other constraints. The values are set in the headers of the raw data in the RSDP (Routine Science Data Pipeline).

Until now, when users requested calibrated data from the HST archive, they received the data produced by the RSDP pipeline. However, instrument properties change with time, better approaches for the calibration of some instruments have been introduced and there have been other general improvements to the HST calibration procedures. So what is a user to do?

† Canadian Astronomy Data Centre, Dominion Astrophysical Observatory, Herzberg Institute of Astrophysics, National Research Council of Canada, Victoria, BC

Fortunately, the same, or improved, software which runs in the calibration pipeline at STScI is also available in the released version of IRAF/STSDAS. One can recalibrate data from the archive by starting with the raw data, editing the appropriate header keywords to activate the new calibration files and then running the appropriate software. The STScI maintains a database which contains the recommended calibration reference files for each observation. However, this is not the most convenient approach for users and this has led us to develop an automatic recalibration process for HST data which essentially duplicates what a user would do manually.

Automatic recalibration of HST data

In order to offer automatic, quasi-on-line recalibration of HST data, several components need to be in place:
❏ the raw science data need to be available on-line
❏ the required calibration reference files and tables need to be on-line
❏ a pipeline to recalibrate the data needs to be in place.

Unlike the STScI, we do not have the resources to store the large 12 inch optical disks in jukeboxes. The approach we have taken for this recalibration effort is to store the compressed raw data for all science observations on-line on CD-ROM. Using standard ‘gzip’ compression, the raw data for a typical WFPC 2 observation occupies less than 1 MB. The raw data for all public HST data currently fit on fewer than 70 CDs which are stored in a Pioneer 500 CD jukebox. Data are copied to CD-ROM as they become public at the CADC where a second disk is made for the ST–ECF.

In order for the recalibration to work properly, all of the necessary calibration files and tables need to be on-line. The CADC maintains a directory structure containing all of the necessary files by querying the ‘bestref’ tables each night to identify the distinct reference files which are needed. By comparing this list to the ones currently on-line, a report is generated of which files need to be installed. Currently, there are approximately 13 GB of reference files on-line. Often a reference file is not available at the CADC, ie, it has not yet been distributed and must be retrieved from STScI using Starview. A process at the ST–ECF copies the reference files regularly over the network from CADC to ensure that both sites have all of the needed reference files on-line.
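The nightly bookkeeping amounts to a set difference. A hedged sketch (the file names and the /refdata directory are purely illustrative; the needed list stands in for the actual ‘bestref’ query):

```python
from pathlib import Path

def missing_reference_files(needed, refdir="/refdata"):
    """Report which recommended reference files are not yet on-line."""
    online = {p.name for p in Path(refdir).glob("*")}   # files currently on disk
    return sorted(set(needed) - online)                 # those still to install

needed = ["e3o1458gu.r4h", "f8413081u.r6h"]   # hypothetical 'bestref' output
for name in missing_reference_files(needed):
    print("to be retrieved from STScI:", name)
```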

While there are tasks within IRAF/STSDAS for changing the header keywords associated with recalibration, they must be run manually. We have developed a small IRAF/STSDAS pipeline to automate this process and to perform the complete ‘recalibration pipeline’. One of the key steps in this process is a script which queries the database to identify the recommended calibration reference files and tables. The output from this script is a small IRAF script which updates the necessary IRAF parameters needed to update the raw data headers.

The concept of ‘on-the-fly’ recalibration means a smaller archive and better quality calibrated products.

Astrocat at the CDS

The CDS has developed a Web interface to a large collection of astronomical catalogues (Vizier: see a description under http://vizier.u-strasbg.fr/). This service, which was not available from our own WWW interface, is now offered directly by the CDS. It offers all the nice features of a Web form query interface such as WDB, but also allows for the step-by-step selection of catalogues to query as well as the combination of several catalogues to be searched together. Search by (solid angle) cone, with astronomical name translation, is also available from the query screens.

Figure 3: A small section of version 2 of the Hubble Deep Field data product with the BVI image on the left and the UBV on the right, each coded with blue, green and red respectively.



The process which occurs when a request is received is the following:
1. the raw data are copied from CD-ROM to a staging area and decompressed
2. an IRAF process is started which runs the recalibration pipeline. This pipeline does the following:
   a. it reads the data into IRAF/STSDAS (converts from FITS to internal format)
   b. runs the script to find out the recommended reference files
   c. updates the raw headers to represent the new calibration information
   d. runs the appropriate calibration pipeline and writes the results back as FITS files
3. copies the recalibrated data to the anonymous ftp area for retrieval.

Our automatic process duplicates what a user would do manually. That is: get the raw data, update the header to reflect the current information on which reference files to use and run the standard calibration pipeline.
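A sketch of this request-time sequence in Python (the directory layout and the ‘recalib.cl’ command file are hypothetical stand-ins; the real pipeline drives the IRAF/STSDAS hst_calib tasks):

```python
import gzip
import shutil
import subprocess
from pathlib import Path

def serve_request(dataset, cdrom="/cdrom", staging="/staging", ftp="/ftp/pub"):
    stage = Path(staging) / dataset
    stage.mkdir(parents=True, exist_ok=True)

    # 1. copy the compressed raw files from CD-ROM and decompress them
    for gz in Path(cdrom).glob(f"{dataset}*.gz"):
        with gzip.open(gz, "rb") as src, open(stage / gz.stem, "wb") as dst:
            shutil.copyfileobj(src, dst)

    # 2. run an IRAF job which looks up the recommended reference files,
    #    updates the raw headers and runs the calibration pipeline
    with open(stage / "recalib.cl") as script:   # hypothetical command file
        subprocess.run(["cl"], stdin=script, cwd=stage, check=True)

    # 3. move the recalibrated FITS files to the anonymous ftp area
    for fits in stage.glob("*_c*.fits"):
        shutil.move(str(fits), ftp)
```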

Developments in progress

There are several improvements to be made to this system. Some of the possibilities being addressed at the moment are:
❏ inclusion of the pipelines for the new instruments (STIS and NICMOS)
❏ addition of extra calibration steps for the WFPC 2 (like cosmic ray removal, co-addition of frames). This, however, entails the need to create the currently unavailable notion of ‘products’ or groups of observations (see the article in this Newsletter)
❏ addition of more ‘experimental’ calibration steps with, eg, improved background treatment (FOS)
❏ the instrument physical models now being developed at the ST–ECF for FOS and STIS
❏ interactive recalibration for expert users where processing steps can be turned off or on, and user-specified calibration files used.

Other archives have also shown interest in this new archive paradigm and one could imagine the final archive of IUE being operated in this manner with the addition of a physical model of its instrument. The ESO VLT observatory will also consider this approach for its own data archive.

Access to recalibrated data

Although automatic re-calibration does not necessarily guarantee better results than the original pipeline calibration, we expect better results in almost all cases. It is always advisable, however, to carefully examine the results and, when in doubt, to compare both old and new calibration results. We will do our best to keep checking the re-calibration results ourselves but it would understandably be difficult to make an exhaustive examination of all results.

Access to recalibrated HST data is offered through the CADC and ST–ECF Web interfaces. Typical processing time for a WFPC 2 dataset is 3–5 minutes, while for the FOS and GHRS it is approximately 1 minute. Users are notified automatically and can monitor the progress of the job. The relevant URLs are:
http://cadcwww.dao.nrc.ca/hst
and
http://arch-http.hq.eso.org/cgi-bin/wdb/hst/science/form

Recent ST–ECF staff changes

Caulet, Adeline: Contract ended on 3 March 1997
Dolensky, Markus: Took up duty on 3 February 1997
Tolstoy, Eline: ESA External Research Fellow from 1 June 1996
Murtagh, Fionn: Contract ended on 14 October 1996
Villar-Martin, Montserrat: DFG studentship ended, Spring 1996

Mirror of NOAO IRAF Archive now available at ST–ECF

The IRAF system and its layered packages such as STSDAS and PROS are now widely used in Europe and within ESO. When new versions of such packages are required, or documentation about the current ones needs to be consulted, it is normally necessary to contact NOAO in Tucson or access the Web pages there. This typically means that multiple copies of many things cross the Atlantic unnecessarily.

In order to make access to this information easier for local ESO users and European users in general, a ‘mirror’ of the NOAO IRAF ftp directory tree, including all the Web pages, has been established at the ST–ECF. This mirror is automatically updated each night to keep it identical to the Tucson version. This means that the latest information about IRAF will be available here at the ST–ECF within a day of the changes being made in Tucson and can be accessed directly over fast internal network connections.

The Web pages which contain most of the information of interest to users can be accessed at:

http://ecf.hq.eso.org/iraf/web

The IRAF ftp archive, which contains the software distributions for many platforms and also many layered products, is available at:

http://ecf.hq.eso.org/iraf/ftp

or via anonymous FTP at:

ftp://ecf.hq.eso.org/iraf

Another mirror has already been established at the Rutherford Laboratory in England and has been found very useful. We hope that the ST–ECF one will also be of help. I am very grateful for help and encouragement from Mike Fitzpatrick (NOAO) and Dave Terrett (RAL) who did most of the work the first time around when they set up the RAL mirror.

Richard Hook

Editorial note (see pages 12–13): A paper entitled “The ultracompact H II region G5.97–1.17 — an evaporating circumstellar disk in M8” by B. Stecklum et al. has been accepted for publication in the Astronomical Journal and is also the subject of a forthcoming ESO press release. This paper makes independent use of the same public archival HST WFPC 2 data illustrated and described on pages 12 and 13 of this Newsletter. The research utilises NIR high-resolution imaging with the ADONIS instrument at ESO in addition to the HST data and it reports the discovery of the M8 proplyd.


WFPC 2 data associations

Paul Bristow, Benoît Pirenne & Alberto Micol

A project at the ST–ECF to group positionally associated WFPC 2 images from the archive.

The data reduction procedure required to obtain a final observation from WFPC 2 datasets is an arduous task, particularly so for some archival research projects. The implementation of on-the-fly re-calibration at the ST–ECF and CADC goes some way towards alleviating this problem. The data access paradigm remains, however, to consider each exposure individually, re-calibrate it and then offer the set of results to users for subsequent data processing.

The archive stores each of the files generated by individual exposures (raw data, calibration files, calibrated data, ancillary data) but does not store cosmic ray cleaned or co-added exposures, nor does it currently have the capacity to handle these data. The process of cosmic ray removal can be rather time consuming, requiring the identification of the relevant exposures and the application of CR-removal algorithms to only those exposures which are appropriately aligned. Further combination possibly involves complex techniques such as the ‘drizzling’ of dithered data (see Dickinson & Fosbury 1995, Fruchter & Hook 1996 and the article by Hook & Fruchter in this Newsletter).

A more user-friendly approach would let users request the ‘product’ of any given observation. A similar system is currently being prepared for the new HST instruments (NICMOS and STIS), where the ground system will actually produce ‘observation products’ rather than individual exposures in a semi-finished state as is currently the case.

To tackle this problem and provide a more complete service to archive researchers, the ST–ECF recently started a project which aims to:
❏ Group together into ‘associations’ those WFPC 2 images which form a logical entity. This is the case for perfectly aligned or dithered exposures.
❏ Subject these associations to CR-removal using appropriate algorithms such as STSDAS crrej.
❏ With suitable software, it should be possible to offer automatic co-addition of dithered images.

This scheme is intended to retrofit existing and future WFPC 2 exposures into observation products.

CR rejection and construction of exposure groups

The most straightforward, and therefore most reliably automated, algorithms for CR-rejection perform a comparison of corresponding pixels in several well-aligned exposures. If a pixel from one exposure contains an anomalously high flux then it is disregarded and replaced by a scaled average derived from the other exposure(s). Hence it is essential that the input exposures are very closely aligned to avoid degrading the spatial resolution. Moreover, a small shift in the position of the pixel grid with respect to the centroid of a star may confuse the comparison for the pixels involved.
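A minimal numpy sketch of this comparison scheme, assuming equal exposure times and a crude Poisson noise estimate (the real STSDAS crrej task handles scaling, noise models and iteration far more carefully):

```python
import numpy as np

def reject_cosmic_rays(stack, nsigma=4.0):
    """stack: (n_exposures, ny, nx) of aligned frames; returns a cleaned image."""
    median = np.median(stack, axis=0)
    noise = np.sqrt(np.maximum(median, 1.0))   # crude Poisson noise estimate
    hit = stack > median + nsigma * noise      # anomalously high pixels
    clean = np.where(hit, np.nan, stack)       # disregard the flagged pixels
    return np.nanmean(clean, axis=0)           # average of the remaining values

# Three synthetic 64x64 exposures of the same field, one with a CR hit:
rng = np.random.default_rng(2)
stack = rng.poisson(100.0, (3, 64, 64)).astype(float)
stack[1, 10, 20] += 5000.0                     # simulated cosmic ray
image = reject_cosmic_rays(stack)
```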

In principle, one would expect to find many such perfectly aligned exposures in the archive. After all, most observing strategies will implicitly allow for this kind of CR-removal and also dynamic range considerations necessitate multiple exposures of any target (though observers frequently choose to dither these exposures). However, our initial attempts to identify such associations were thwarted by the unreliability of the recorded positional information for many exposures. This stems not from a lack of precision (relative offsets can be accurately measured to within a fraction of a PC pixel) but from inconsistently recorded positions, usually for one of the following reasons:
❏ A set of consecutive exposures is carried out for a given target and dithering is requested for some of this set. In some cases, exactly the same co-ordinates are recorded in the headers for all of the exposures and no indication as to the dithering offset is recorded.
❏ A target is re-acquired at a later epoch for further examination; the re-acquisition may not be precise, but the recorded co-ordinates reflect the desired position (ie, that of the earlier exposure), not the actual position.

These problems were more frequent early on, and the reliability of these parameters has since improved. In both cases, however, exposures which should not be grouped together for CR-removal will be grouped if the header information is trusted.

In order to avoid this we have opted to extract the mean RA and Dec as recorded in the so-called ‘Jitter’ files, which contain information concerning HST’s pointing and other physical characteristics extracted from the engineering data. We are currently updating the ST–ECF database to incorporate this information.

We are currently investigating the most suitable set of parameters for the crrej task; however, the resulting images are not critically dependent upon them. Nevertheless, some users may still prefer to carry out this step for themselves, tweaking the parameters to suit their preference or even using preferred algorithms. Such users will still benefit from the information we will be supplying concerning accurately aligned exposures and may find our CR-removed image useful as a comparison. In most cases though, we expect that our CR-removed images will be sufficient.

Future possibilities

We want only to provide CR-removed images which we are confident will be scientifically useful and reliable, hence the stringent criteria for grouping exposures for CR-removal reflect a conservative approach. However, this means that an observation of a given object remains a collection of CR-removed images with the addition, perhaps, of some individual exposures which could not be grouped for CR-removal. Therefore, we intend to group these images together and present them as observation products. These groups will contain images deliberately dithered relative to one another and those acquired at different epochs. They will come with the jitter position information which will aid subsequent co-addition.

At present we are studying the problem of co-adding these images using various techniques, including drizzling. However, for optimum reconstruction of dithered data (Hook 1995), an interactive approach remains essential. The implementation of an automated ‘shift and add’ type co-addition remains a possibility but will only be offered where the products can be expected to be of the same standard as achieved by an interactive approach.

References
Dickinson, M. & Fosbury, R.A.E.: 1995, ST–ECF Newsletter, 22, 14
Fruchter, A.S. & Hook, R.N.: 1996, http://www.stsci.edu/~fruchter/dither
Hook, R.N.: 1995, ST–ECF Newsletter, 22, 16



The WFPC 2 exposure time calculator — a retrospective

Hans-Martin Adorf

The WFPC 2 exposure time calculator (ETC for short) has served its purpose well in the past two proposal submission Cycles and is now operated and maintained at the STScI. Here I review its origin and some of the salient features which have made it the model for other exposure time calculators which can be found on the WWW.

The Wide Field and Planetary Camera 2 was the first of HST’s science instruments to offer an on-line exposure time calculator accessible on the World Wide Web (WWW). Its development was triggered during 1994 by Mark Johnston, who—then still at the STScI—sensed that the time had come to revive an old idea which had been lying dormant for several years.

Mark was familiar with some work which had been pursued at the ST–ECF in 1987/88—years before the HST launch—under the umbrella of the so-called ‘Artificial Intelligence Pilot Project’ (Adorf & Johnston 1987), and to which the history of the present WFPC 2 ETC can be traced. One of the developments at the time was a fully-fledged exposure time calculator (Adorf & di Serego Alighieri 1989a, b) for all first generation science instruments on-board HST (ie, WF/PC, FOC, GHRS, FOS, and HSP). This tool had been used at the ST–ECF to aid astronomers in the first proposal Cycle.

This first-generation ETC was designed according to quite advanced software engineering principles which can still be called ‘modern’ by today’s standards. No paper manual should be required for using the software; instead a self-explanatory graphical (WIMP) user interface was offered along with an on-line help facility using ‘hypertext’—a word virtually unknown at the time within the astronomical community.

Using the characteristics of the object, the background and the chosen instrument as a generic input, the first-generation ETCs allowed the estimation of the exposure time from a given signal-to-noise ratio, and vice versa. Underlying this ETC was a simplified instrument database derived from HST’s Calibration Database System (CDBS). The exposure time and signal-to-noise algorithms for the original ETC were devised by Sperello di Serego Alighieri (di Serego Alighieri & Adorf 1988).
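At the heart of such a calculator is the familiar CCD signal-to-noise equation. The sketch below (with invented numbers, not WFPC 2 values) evaluates it in one direction and inverts it for the exposure time by bisection in the other:

```python
import numpy as np

# S/N = S*t / sqrt(S*t + B*t*npix + (D*t + R**2)*npix)
# S: source rate (e-/s), B: sky rate per pixel, D: dark rate per pixel,
# R: read noise (e-), npix: pixels in the measuring aperture.

def snr(t, S, B, D, R, npix):
    return S * t / np.sqrt(S * t + B * t * npix + (D * t + R**2) * npix)

def exposure_time(target_snr, S, B, D, R, npix):
    """Invert snr(t) for t by bisection in log space (snr is monotonic in t)."""
    lo, hi = 1e-3, 1e6
    while hi / lo > 1 + 1e-6:
        mid = np.sqrt(lo * hi)
        lo, hi = (mid, hi) if snr(mid, S, B, D, R, npix) < target_snr else (lo, mid)
    return np.sqrt(lo * hi)

# eg, a source giving 5 e-/s over 9 pixels of 0.05 e-/s/pix sky:
t = exposure_time(10.0, S=5.0, B=0.05, D=0.005, R=5.0, npix=9)
print(f"t ~ {t:.0f} s for S/N = 10")
```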

While the first-generation ETC software was conceived without prior exposure to spreadsheets, it implemented the important concept of data-driven ‘forward-chaining’ (which is lurking behind the scenes in the automatic update mechanism of every spreadsheet). The computational dependencies are explicitly recorded, and the result—typically the exposure time—is updated in near real-time, whenever any of the input values (eg, filter, star brightness, etc.) changes.


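The mechanism can be caricatured in a few lines of Python; this toy version uses, purely for illustration, the background-free relation t = (S/N)²/S, and recomputes every dependent cell as soon as an input changes:

```python
# Toy data-driven 'forward-chaining': each derived quantity records its
# inputs and is recomputed whenever one of them changes, exactly as a
# spreadsheet cell would be. Purely illustrative, not the ETC's code.

class Cell:
    def __init__(self, value=None, formula=None, inputs=()):
        self.value, self.formula, self.inputs = value, formula, inputs
        self.dependents = []
        for cell in inputs:
            cell.dependents.append(self)

    def set(self, value):                  # changing an input ...
        self.value = value
        for dep in self.dependents:        # ... propagates forward
            dep.recompute()

    def recompute(self):
        self.value = self.formula(*[c.value for c in self.inputs])
        for dep in self.dependents:
            dep.recompute()

flux = Cell(value=5.0)                     # source rate, e-/s (toy number)
target_snr = Cell(value=10.0)
exptime = Cell(formula=lambda S, snr: snr**2 / S, inputs=(flux, target_snr))
exptime.recompute()
target_snr.set(20.0)                       # exposure time updates immediately
print(exptime.value)                       # -> 80.0
```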

The major disadvantage of the original ETC implementation, severely restricting its wider use, was that the software only executed on dedicated LISP workstations, of which the ST–ECF had one and the STScI had several.

Software technology

The first-generation ETCs were implemented in Common Lisp and used some non-standard graphics extensions. (Recall that no universally accepted graphics standards existed at the time, neither within the Lisp community, nor outside.) In addition, KEE-rules (KEE = Knowledge Engineering Environment) were used for data-driven ‘forward-chaining’, leading to the near-real time updating of the result, as explained above.

In the six years from 1988 to 1994, when the idea of an ETC was revived for WFPC 2 (then a new HST instrument), software and hardware technology had advanced significantly, particularly in the area of graphical screens and networks. When the original suite of ETCs was implemented, they executed on one of ESO’s first workstations equipped with a full-scale graphics screen. By 1994, however, workstations with graphics screens were abundant at ESO as elsewhere in astronomy. The astronomical community had also moved away from DEC VAXes and the DECnet-based SPAN network towards UNIX workstations and to the universally accepted TCP/IP-based Internet. The concept of client-server computing was becoming widely accepted.

Not least in importance, the WWW was about to bury several costly user interfaces, such as that of STARCAT developed by the ST–ECF, that of ADS developed by NASA, and that of STARVIEW, developed at the STScI. The question naturally arose of how to revive the ETC, considering the enormous changes in the software environments now common in the astronomical community.

Using the instrument models

At some stage, consideration was given to using the fully-fledged SYNPHOT instrument model software as the back-end compute engine of the WFPC 2 ETC. In the end, however, it turned out to be too complicated and to require too much co-ordination within too limited a time period. While SYNPHOT was not excluded as a long-term solution, the quick fix required for Cycle 6 effectively ruled it out.

Implementation language

During the design phase in 1994 it was contemplated to revive the ETC in Lisp, which is such a beautiful, well-engineered language and so much fun to work with. This choice was not so completely out of the question as it may appear, since there are now robust commercial Lisp interpreters/compilers for UNIX workstations and, using an Internet-based client-server approach, the compute engine could reside on the server machine, together with the simplified instrument and object databases. Only the user-interface had to be exported to the client machine at the user’s institution.

The major drawback of the Lisp approach appeared to be the long start-up time required before a Lisp-based server process becomes fully operational upon request by a remote client. Thus, in the end, a decision was made to re-implement the compute engine in Tcl which, apart from its LISP-likeness, is one of the most prominent languages for WWW CGI-bin scripts, but with a less arcane syntax than Perl.

User interface

Having chosen the implementation language for the compute engine, a decision still had to be made concerning the user-interface (UIF). Two choices were contemplated. One was to establish a dedicated UIF using the Toolkit language ‘Tk’. This clearly had the advantage of allowing the implementation of a completely unconstrained UIF. However, the potential user would always have first to download the current UIF code before being able to run the ETC.

The other option was to base the UIF on HTML and so be ready for use via the WWW. Since the user interactions with the ETC’s UIF were not too complicated, the options offered by an HTML-based UIF appeared to be marginally sufficient. Such an implementation had the advantage that the current UIF code would be downloadable transparently from the server by simply clicking on the link (the URL) pointing to the ETC compute engine on the server. In the end, the reasons favouring the latter solution outweighed the concerns that it might be too inflexible and the decision was taken to go with an HTML-based UIF to be interfaced to the Tcl-based compute engine.

While a single UIF or ‘spreadsheet’, usable for both point sources and extended sources, would have had some virtues, the desire for reducing the apparent complexity to the user combined with the restrictions of the HTML implementation language enforced a split into two. (At the STScI the split was later carried further when four such ‘spreadsheets’ were created by adding two which included a stellar background.)

The algorithms

For the second generation WFPC 2 ETC, support was sought from the STScI, and delivered by Anatoly Suchkov, who at the time was working in the WFPC 2 instrument team. Anatoly’s algorithms were based on those originally devised by Sperello, but enhanced at various points.

Implementation

After the algorithm design was completed, the implementation languages chosen and the go-ahead given by ST–ECF and STScI management for the second-generation version, the actual coding could proceed, and the initial version was finished within about two days.

Subsequently, the software was successfully exported to the STScI where it was welcomed by John Biretta from the WFPC 2 team, who quickly familiarised himself with the code and started incorporating substantial enhancements.

A few bugs were weeded out, some features added, and the results checked against SYNPHOT. The modified code was eventually returned to the ST–ECF. Thus, the WFPC 2 ETC was put into operation on both sides of the Atlantic just in time for the Cycle 6 proposal deadline in 1995.

Usage

The acceptance of the ETC by proposers was overwhelming. The log-records showed that the ETC was invoked 300 times per week, apparently without overloading system resources, totalling around 12,000 calculations during the weeks immediately preceding the 1995 proposal deadline. (Actually, the many requests occasionally caused an overflow in the system log, which therefore had to be deleted daily.)

The use of the WFPC 2 ETC was quoted in many proposals. One of the most missed features was the batch-processing capability which was actually implemented and available to local users at the ST–ECF and the STScI but, due to the restrictions of the HTML-based UIF, was not offered to remote users. Some people wanted to import the whole ETC to their home machines. A questionnaire was sent out by the STScI to sample user opinions more systematically.

Conclusion

After its early experimental evolution, the WFPC 2 ETC eventually came to life in 1995, largely due to the foresight and persistent encouragement of Mark Johnston. Since then the ETC has served as a forerunner for other exposure time calculators, such as those for STIS and NICMOS on HST and the ESO SUSI imager.

For us at the ST–ECF, it was possible to quickly recognise the power and potential of the WWW, which the ST–ECF had started using already in 1993 (Adorf 1994). This was due in part to our experience with the ‘Artificial Intelligence Pilot Project’, its advanced user-interface technology, data-driven forward-chaining and hypertext help system. For the same reason we were able to quickly realise the virtues of a simple exposure time calculator accessible via the WWW and, having had prior experience of all the above techniques, could offer it in a timely fashion.

In addition to HST’s long-term scheduler called SPIKE, which was partially implemented while Mark Johnston was visiting the ST–ECF on leave from the STScI, the ‘AI Pilot Project’ had the effect of spawning the current generation of ETCs.

References
Adorf, H.-M.: 1994, ‘Electronic access to HST information II — the World Wide Web’, ST–ECF Newslett. 21, 31–34
Adorf, H.-M., Johnston, M.D.: 1987, ‘The Artificial Intelligence Pilot Project of the ST–ECF’, ST–ECF Newslett. 8, 4–5
Adorf, H.-M., di Serego Alighieri, S.: 1989a, ‘An Expert Assistant Supporting Hubble Space Telescope Proposal Preparation’, in: Proc. Workshop ‘Data Analysis in Astronomy III’, Jun 1988, Erice, Italy, V. Di Gesù et al. (eds), Plenum Press, New York, USA, pp. 225–235
Adorf, H.-M., di Serego Alighieri, S.: 1989b, ‘HST — Expert Assistant Supports Proposal Preparation’, ST–ECF Newslett. 10, 6–10
di Serego Alighieri, S., Adorf, H.-M.: 1988, ‘Hubble Space Telescope — Expert Assistant, Supplement to the Exposure Time Calculators’, ST–ECF Technical Report

Availability of HST instrument exposure time calculators

The WFPC 2 ETC is accessible at the STScI via the URL
http://www.stsci.edu/ftp/instrument_news/WFPC2/Wfpc2_etc/wfpc2-etc.html

At the ST–ECF, a copy is held at the URL
http://ecf.hq.eso.org/wfpc2-etc/wfpc2-etc.html

The NICMOS ETC is accessible at the STScI via the URL
http://www.stsci.edu/ftp/instrument_news/NICMOS/NICMOS_tools/NICMOS_etc_fast_2/nicmos_etc.html

The two STIS ETCs (one for imagery, one for spectroscopy) are accessible at the STScI via the STIS proposal tools page at the URL
http://www.stsci.edu/ftp/instrument_news/STIS/stis_prop.html


We should like this Newsletter to reach as wide an audience of interested astronomers as possible. If you are not on the mailing list but would like to receive future issues, please write to the editor stating your affiliation.

Contacts at ST–ECF

Head: P. Benvenuti
Tel. +49–89–320 06 290, Computer: PBENVENU

Science Data and Software: R. Albrecht
Tel. +49–89–320 06 287, Computer: RALBRECH

Science Instrument Information: R.A.E. Fosbury
Tel. +49–89–320 06 235, Computer: RFOSBURY

Computer hot-line and information
Hot-line: [email protected] (for e-mail)
WWW: http://ecf.hq.eso.org/

Editor: Robert Fosbury
Sub-editor: Jeremy Walsh
Editorial assistant: Britt Sjöberg
Production: Robert Fosbury

Contents

STS-82 and beyond ........................................................................ 2
Next Generation Space Telescope design reference mission ........ 4
NICMOS grism spectroscopy software .......................................... 7
Drizzling dithered WFPC frames .................................................... 9
NICMOS point spread functions ................................................... 10
More reflections on the TAC ......................................................... 11
WFPC 2 images of the Lagoon nebula ........................................ 12
Predictive calibration using instrument models ............................ 14
Final archives for FOS and GHRS ............................................... 16
The Archive column ...................................................................... 18
Automatic recalibration — a new archive paradigm ..................... 19
WFPC 2 data associations ........................................................... 21
The WFPC 2 exposure time calculator — a retrospective ............ 22


Cover picture: Shortly before liftoff on the morning of the 11th February for the second Hubble Space Telescope servicing mission, Discovery — with two new instruments, a new fine guidance sensor and various other pieces of hardware aboard — is illuminated by floodlights on pad 39A at the Kennedy Space Center in Florida.

Below: NASA photograph of liftoff. The same event, seen by the editor from 3.5 miles, is shown on page 2. It must be assumed that NASA employs a braver photographer!

Typeset with Apple® Macintosh™
Printed by Royal Media Werbgesellschaft mbH, Ottobrunn

Published by the
Space Telescope–European Coordinating Facility
European Southern Observatory
Karl-Schwarzschild-Str. 2
D–85748 Garching bei München
Germany
Telephone: +49–89–320 06 291
Telefax: +49–89–320 06 480
Internet: <user>@eso.org