Australian Virtual Observatory: A distributed volume rendering grid service
Gridbus 2003
June 7, Melbourne University
David Barnes, School of Physics, The University of Melbourne
Overview
• what is a virtual observatory?
• astronomy data cubes 101
• volume rendering
• distributed data volume rendering
• turning it into a grid service
• future projects
Virtual observatories
• bring legacy astronomy archives on-line and ensure future project compliance
• describe data fully, and support a finite, well-chosen set of interoperability protocols
• develop tools and interfaces to find, acquire, process and visualise data
• build national and international grids and embed the data, tools and interfaces in those grids
Astronomy data cubes 101
• you may have only seen 2d astronomy images
• an increasing number of telescopes and simulations produce multi-dimensional data
• astronomy data cubes are 3d arrays of pixels (voxels)
• typically the axes might be latitude and longitude on the sky, and frequency of radiation
• lots of information!
[Figure: data cube with axes labelled Right ascension, Declination and Radio frequency]
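To make the cube structure concrete, here is a minimal sketch (with hypothetical axis sizes) treating a data cube as a 3-D array indexed by declination, right ascension and frequency channel:

```python
import numpy as np

# Hypothetical spectral-line cube: 64 x 64 sky pixels, 128 frequency channels.
# Axis order (declination, right ascension, frequency) is an assumption here.
cube = np.zeros((64, 64, 128), dtype=np.float32)

# A single spectrum: all frequency channels along one line of sight.
spectrum = cube[10, 20, :]          # shape (128,)

# A single channel map: the whole sky at one frequency.
channel_map = cube[:, :, 42]        # shape (64, 64)
```

Indexing a cube this way gives either a spectrum per sky pixel or an image per frequency channel, which is why a modest cube already carries so much information.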
Volume rendering
• 3d data can be viewed in slices, or we can render lines of sight through the entire volume - this is volume rendering, and may offer new insights into complex data collections
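One common way to render a line of sight is the maximum intensity projection, which keeps the brightest voxel along each ray. This is a sketch of that idea only, not the renderer described in these slides:

```python
import numpy as np

def max_intensity_projection(cube, axis=2):
    """Render the volume by taking, for each line of sight,
    the brightest voxel along the given axis."""
    return cube.max(axis=axis)

# Tiny synthetic cube with a single bright voxel.
cube = np.zeros((4, 4, 8), dtype=np.float32)
cube[1, 2, 5] = 9.0

# Projecting along the frequency axis yields a 4 x 4 image in which
# the bright voxel shows up at sky position (1, 2).
image = max_intensity_projection(cube)
```

Other transfer functions (sums, weighted emission/absorption models) follow the same pattern of collapsing one axis per ray.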
Distributed data volume rendering
• split large volume into smaller pieces
• share the pieces out to nodes of a Beowulf cluster
• on demand the nodes render their piece of data
• other nodes glue the pieces together to form the final image
• provides increased speed and the ability to handle larger-than-memory volumes
• see Beeson, Barnes & Bourke, 2003, PASA, submitted
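The split-render-glue pattern above can be sketched serially (the loop standing in for the cluster nodes). How sub-images are glued depends on the transfer function; for a maximum intensity projection, compositing reduces to a per-pixel maximum over the partial images:

```python
import numpy as np

def render_piece(piece):
    # Each "node" renders its sub-volume along the frequency axis.
    return piece.max(axis=2)

def composite(sub_images):
    # For a maximum intensity projection, gluing the pieces is a
    # per-pixel maximum over the partial images.
    return np.maximum.reduce(sub_images)

cube = np.random.default_rng(0).random((8, 8, 32)).astype(np.float32)

# Split along the frequency axis into 4 pieces, one per node.
pieces = np.array_split(cube, 4, axis=2)
final = composite([render_piece(p) for p in pieces])

# The distributed result matches rendering the whole cube at once.
assert np.array_equal(final, cube.max(axis=2))
```

Because each node holds only its own piece, the volume as a whole never needs to fit in any single node's memory, which is the larger-than-memory advantage claimed above.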
Distributed data volume rendering
• Rendering controlled by a remote client connected on a socket
• Joint project with AstroGrid (UK) to recast the software as a grid service for demonstration in July at a major astronomy conference in Sydney.
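Remote control over a socket can be sketched as a simple line-oriented command protocol. The command names here are hypothetical, not the actual protocol used by the renderer:

```python
import socket
import threading

def rendering_master(server_sock, log):
    """Accept one remote client and read newline-terminated commands
    (hypothetical protocol: e.g. "rotate 15", "quit")."""
    conn, _ = server_sock.accept()
    with conn, conn.makefile("r") as f:
        for line in f:
            cmd = line.strip()
            log.append(cmd)          # a real master would drive the renderer here
            if cmd == "quit":
                break

# Start a master on an ephemeral local port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
log = []
t = threading.Thread(target=rendering_master, args=(server, log))
t.start()

# The remote client connects and issues commands.
client = socket.create_connection(server.getsockname())
client.sendall(b"rotate 15\nquit\n")
client.close()
t.join()
server.close()
```

This is the shape of interaction the applet described later relies on: the cluster opens a port, and a thin client steers the rendering by sending commands to it.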
Making a grid service
• Collaborating groups now include:
– Melbourne (Physics & Computer Science / SE)
– AstroGrid (Cambridge, Leicester)
– VPAC, APAC, CSIRO CMIS, …, as data centres and rendering clusters
• Lead is being set by Guy Rixon (Cambridge) who has designed the system and is managing the project plan day-to-day
• Why?
– saves you from fetching large data files
– enables use of distributed computing resources
– demonstrator of grid technologies for VOs
Structure
• Portal provides an interface for the user to find and select data and to select a rendering cluster (80% complete)
• Data centre service provides a registry of its data holdings and some tools to, e.g., extract sub-images (60%)
• Data centre runs a gsiftp server to provide authenticated access to the data (~100%)
• Cluster centre service fetches the data, starts up a rendering tree, loads the data and opens up a port (90%)
• Portal provides an applet to connect to that port and control and display the rendering (25%)
Development environment
• Globus 2.4 for gsiftp servers
• Tomcat 4.1.24 for portals and service wrappers
• Globus 3.0 alpha 4 for grid services deployed within Tomcat
• Sun J2SDK 1.4.1_03
• Netscape 7.02 (Gecko/20030208)
• all data and rendering centres are Linux
• tested clients include Linux, Windows and Mac OS X
“Release 0” - June 6 2003
• One hard-coded compressed FITS image in place of final data selection result
• One hard-coded rendering cluster in place of final cluster selection result
• Rendering cluster retrieves image from data centre via HTTP, decompresses it, converts it to volume rendering input format and stores it locally
• Applet served from portal server, running in client’s browser, successfully connects to rendering cluster and requests an image.
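The staging step at the rendering cluster (retrieve a compressed image, decompress it, store it locally) can be sketched as below. The helper name, URL scheme and paths are illustrative only; the real service retrieves over HTTP and then converts to volume rendering input format:

```python
import gzip
import os
import tempfile
import urllib.request

def stage_image(url, dest_path):
    """Fetch a gzip-compressed FITS image, decompress it, and store it
    locally, ready for conversion to the renderer's input format.
    (Hypothetical helper; not the project's actual code.)"""
    with urllib.request.urlopen(url) as resp:
        data = gzip.decompress(resp.read())
    with open(dest_path, "wb") as f:
        f.write(data)
    return len(data)

# Offline demonstration: a file:// URL stands in for the data centre.
payload = b"SIMPLE  =                    T"   # first card of a FITS header
tmpdir = tempfile.mkdtemp()
src = os.path.join(tmpdir, "image.fits.gz")
with open(src, "wb") as f:
    f.write(gzip.compress(payload))
dest = os.path.join(tmpdir, "image.fits")
n = stage_image("file://" + src, dest)
```

Replacing the plain HTTP fetch with an authenticated GridFTP transfer is exactly the step described in "The future" below.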
The future
• Jia’s GridFTP client code to be incorporated next week - render cluster service complete!
• Data registry and data centre grid service including selection to be ready in ~two weeks
• Display and control applet to be largely completed over next four weeks.
Beyond the demo…
• Review demonstration in August
• CSIRO ATNF group developing Java interface to legacy astronomy software
– suitable long-term location of this project?
• Conversion of Beowulf-class rendering tree to genuine distributed grid service for the piecewise rendering?
• Integration with massive on-line parameterised databases?