EuroVO-DCA Grid Workshop

The workshop Agenda

Day Session Time Speaker Title Abstract
Day one - 9th April, Wednesday
Registration 9:00 - 13:00
Welcome 14:00-14:10 Giuliano Taffoni
Application and Data Centres Experience Session
14:10-14:50 Charles Loomis Application experience in EGEE abstract/slides
14:50-15:30 Matthias Steinmetz D-Grid and AstroGrid-D abstract/slides
Coffee break / Posters 15:30 - 16:00
16:00-16:40 Luigi Fusco ESRIN and earth science experience abstract/slides
16:40-17:10 Claudio Vuerli Astro cluster in EGEE abstract/slides
17:10-17:50 Frank le Petit Astro Cluster in Paris abstract/slides
17:50-18:20 Petr Skoda Identification of Important VO Spectral Services benefitting from deployment on the GRID abstract/slides
18:20-18:40 All Discussion
Day two - 10th April, Thursday
Infrastructure and interoperability session

Grid concepts

9:00-9:40 Erwin Laure EGEE infrastructure and interoperability issues abstract/slides
9:40-10:20 Claudio Gheller The DEISA HPC Grid infrastructure for astrophysical applications abstract/slides

International scientific grid projects

10:20-11:00 Edwin Valentijn Lofar Information System Design abstract/slides
Coffee break / Posters 11:00 - 11:30

National grid infrastructure

11:30-12:00 Fabio Pasian Italian grid for Astrophysics abstract/slides
12:00-12:30 Ruben Alvarez ESAC Grid Infrastructure abstract/slides
12:30-12:55 Juan de Dios Santander Vela Grid and VO activities in Spain abstract/slides
Lunch 12:55 - 13:55

National grid infrastructure (continued)

13:55-14:20 Grazina Tautvaisiene E-infrastructure in Baltic states abstract/slides/demo
14:20-14:50 Ugo Becciani PI2S2 : grid and new challenges abstract/slides

Grid and VO interoperability

14:50-15:20 Francoise Genova EuroVO projects abstract/slides
15:20-16:00 Ohishi Masatoshi JVO and NaReGi (Japanese Grid middleware initiative) abstract/slides
Coffee break / Posters 16:00 - 16:20
16:20-17:00 Guy Rixon EuroVO-TECH and AstroGrid abstract/slides
17:00-17:30 André Schaaff Workflow systems and VO standards abstract/slides
17:30-18:00 Giuseppe Longo Using Support Vector Machines for AGN classification  abstract/slides
18:00-18:20 Iliya Nickelt Stellaris abstract/slides
Social Dinner 19:00-...
Day three - 11th April, Friday
Infrastructure and interoperability session (continued)
9:00-9:40 Franck Cappello Grid'5000 abstract/slides
9:40-10:00 Natalia Deniskina AstroGrid and GRID interfacing abstract/slides
10:00-10:20 All Discussion
Coffee break / Posters 10:20 - 10:40
Tools session
10:40-11:10 Harald Kornmayer g-Eclipse abstract/slides
11:10-11:40 Torsten Ensslin, Wolfgang Hovest The Planck Process Coordinator workflow engine on the Grid abstract/slides - demo
11:40-12:00 Kevin Benson Taverna in the VO abstract/ slides
12:00-12:15 Richard Hook ESO Reflex abstract/slides
12:15-12:45 Guy Rixon Astrogrid tools abstract/slides
Close 12:45-13:00

The Abstract ebook

Dr. Charles Loomis

Applications Using the EGEE Grid Infrastructure

The presentation will give a general overview of the scientific disciplines that are using the EGEE grid infrastructure. It will describe how the grid helps people within those disciplines achieve their goals, including which grid features they use. As the grid middleware continues to evolve, the talk will conclude with a brief list of features that can be expected in the future.

Prof. Matthias Steinmetz

D-Grid and AstroGrid-D

Dr. Luigi Fusco

ESRIN and earth science experience

Dr. Claudio Vuerli

EGEE AA cluster

The Astronomical and Astrophysical Cluster in EGEE aims at establishing and consolidating a well-motivated astronomical community that makes use of Grid technology. I will present the status of the cluster, its participants, and its activities, and highlight the future work planned for EGEE-III.

Dr. Frank le Petit

Astro Cluster in Paris

An Astronomy & Astrophysics cluster has been created in EGEE-III. The French participation gathers projects from five observatories: Paris, Grenoble, Lyon, Nice and Strasbourg. These cover a broad range of hot topics. In the Horizon project, scientists are interested in using the Grid for simulations in cosmology: to constrain dark-energy scenarios, to perform radiative transfer on large-scale structures in preparation for SKADS, and to model mergers of galaxies. The Herschel/ALMA projects cover computations in atomic and molecular physics and modelling of the interstellar medium, for a fast scientific return from these large facilities. The IMCCE projects aim at computing orbits in the solar system. Some Grid projects are directly linked to the Virtual Observatory, such as data processing with workflows, data mining to discover small planetary bodies, and access to legacy codes and computing resources through the VO. I will present a review of these scientific projects and of the scientists' expectations towards the Grid, and will summarise the status of the French A&A Grid infrastructure.

Dr. Petr Skoda

Identification of Important VO Spectral Services benefitting from deployment on the GRID

The majority of VO-compatible spectra-handling applications operate on only a few spectra, downloaded in their entirety from one or several SSAP servers. We try to identify the science cases that could immediately benefit from future SSAP applications designed for GRID deployment. Their key feature is sophisticated spectra preselection and preprocessing done on distributed servers, with an intelligent agent summarising the results and performing the final high-level processing or analysis.

Dr. Erwin Laure

Towards Seamless Grid Computing - The EGEE Experience on Interoperable Grid Infrastructures

As a result of the significant research effort put into Grid computing, large-scale international as well as regional production Grid infrastructures offer their users access to distributed computing and data resources at an unprecedented level. EGEE (Enabling Grids for E-sciencE), for instance, operates a large-scale production Grid infrastructure federating over 250 sites from 48 countries worldwide, providing over 45,000 CPUs and about 15 PB of disk storage to a wide variety of scientific applications. In this talk we review the challenges and successes of EGEE in building, operating, and evolving the Grid infrastructure and highlight a few example applications. We discuss challenges and successes in interoperating with other Grid infrastructures and present future directions of Grids in Europe, in particular how National Grid Infrastructures will pave the way to the sustainable provision of production Grids.

Dr. Claudio Gheller

The DEISA HPC Grid infrastructure for astrophysical applications

Many scientific projects require large computational resources. Fast CPUs, large amounts of memory, and huge, reliable file systems and long-term storage devices are needed to complete all the steps of the data production, reduction, management and analysis pipeline. Distributed architectures can provide the necessary power to go through the whole workflow. We present here the solutions proposed by the DEISA HPC Grid infrastructure, with its Extreme Computing Initiative (DECI). The DECI focused mainly on “Grand Challenge” applications that could be migrated and adapted to the DEISA environment with little or moderate application-enabling work. Some of the selected and successfully run applications are in the field of astrophysics. We will focus in particular on the simulation program for the ESA Planck mission.

Prof. Edwin Valentijn

Lofar Information System Design

We present an architectural design of the long term storage and user-enabling of the LOFAR post Blue Gene data stream. The study describes the LOFAR Long Term Archive (LTA) and involves connectivity to and possible enhancements of various existing infrastructures, ranging from the TARGET/Groningen Center for Information technology to Astro-WISE, EGEE/Grid and the EURO-VO. While the life cycle and storage of data items around the Central Processing (CEP) will in general be limited by a 2-week cycle, the LTA will host all relevant LOFAR data items beyond this 2-week cycle.

Dr. Fabio Pasian

IGI - the Italian Grid initiative and its impact for the Astrophysics community

IGI - the Association for the Italian Grid Infrastructure - has been established as a consortium of 12 different national institutions to provide long-term sustainability for the Italian Grid. Its formal predecessor project came to a close in 2006; to extend that project's benefits, IGI has taken over and acts as the national coordinator for the different pieces of the Italian e-Infrastructure present in EGEE. IGI plans to support activities in a vast range of scientific disciplines - e.g. Physics, Astrophysics, Biology, Health, Chemistry, Geophysics, Economy, Finance - and possible extensions to other sectors such as Civil Protection, e-Learning, and dissemination in universities and secondary schools. Among these, the Astrophysics community is active as a user, porting applications of various kinds, but also as a resource provider in terms of computing power and storage, and as a middleware developer.

Dr. Ruben Alvarez Timon

Grid usage in an Astronomical Data Centre

After a few years of using Grid technologies at the European Space Astronomy Centre (ESAC), and now that astronomers are becoming accustomed to them, new possibilities both for enhancing results and for collaborating with other astronomy institutes have started to be explored. This talk will focus on the evolution of the Grid infrastructure at ESAC, and some examples of such usage will be presented, showing the current status and also the immediate future development:

  • Herschel pipeline processing
  • The XMM-Newton Remote Interface for Science Analysis (RISA)
  • The production of mosaics on the Grid from XMM-Newton data, and
  • The Integral bulk data processing
Future evolution and conclusions will then be presented.

Juan de Dios Santander Vela

Virtual Observatory and Grid in Spain

The Virtual Observatory is nearing maturity, and in Spain we have had a Spanish Virtual Observatory since June 2004. There have also been numerous attempts at providing more or less encompassing grid initiatives at the national level, and finally Spain has an official National Grid Initiative (NGI). In this talk we will show the VO and Grid development status in Spain, and we will hint at potential joint VO-Grid use-cases.

Prof. Gražina Tautvaišienė

E-infrastructure in Baltic states and its application in astrophysics

The e-infrastructure in the Baltic states is currently developing rapidly. A large impact in this area has been made by the FP6 project "BalticGrid". A review of progress in this field, as well as of currently developed grid applications in astrophysics, will be presented.

Dr. Ugo Becciani

PI2S2: grid and new challenges

Dr. Francoise Genova

EuroVO projects

Prof. Ohishi Masatoshi

JVO and NaReGi(Japanese Grid middleware initiative)

The Japanese Virtual Observatory (JVO) project has been operating its data service since March 2008. It provides access to more than 1,000 data resources via the IVOA standards SIAP, SSAP, and so on. JVO has constructed a Grid-like data-analysis server system to process Suprime-Cam data from the Subaru telescope, and the processed mosaic images can be downloaded on demand. JVO has also constructed a trial Grid system by means of the NaReGi middleware, federating the National Astronomical Observatory of Japan and KEK (the Institute for High Energy Physics, located in Tsukuba, Japan).

Dr. Guy Rixon

Grid computing with IVOA standards and VOTech components

The VOTech project is a design study to prepare for the construction of the European Virtual Observatory. VOTech is providing infrastructure software based on IVOA standards. This report describes the standards and products relevant to Grid computing in two respects: the Virtual Observatory used as a grid of applications, and the Virtual Observatory connected to external grids of processing and storage resources.

AstroGrid tools

The AstroGrid project of the UK provides software for remote data processing via the Virtual Observatory. Some of this software is included in the EuroVO infrastructure provided by VOTech. This report shows how these capabilities are presented to scientists in the user interface. EuroVO relies on member institutions to host remote data-processing applications, and the report outlines how this may be done with AstroGrid software.

André Schaaff

Workflow systems and VO standards

After a quick introduction to workflow systems, the presentation will focus on the use of Characterization, an IVOA standard, in a workflow test bed architecture. The execution of a workflow may require substantial computing resources and this can take a significant amount of time. Our goal is to introduce a validation step in the workflow process before the "real" execution. This work is done in the frame of the VOTECH project.

Prof. Giuseppe Longo

Classifying AGN with Support Vector Machines in a distributed computing environment

Accurate and robust clustering algorithms (both supervised and unsupervised) scale very badly with the number of records and features to be processed, and this has so far prevented their use on massive astronomical data sets. We discuss how recent advances in the implementation of the Virtual Observatory infrastructure, and in particular of the VONeural tool implemented within the Euro-VO, can be used to perform complex data-mining tasks. After summarising the general characteristics of Support Vector Machines (SVM) for supervised clustering and how they have been implemented in a distributed computing environment, we present the results of an application of Support Vector Machines to the recognition and characterisation of AGN within the SDSS data set.

Dr. Iliya Nickelt-Czysykowski

Stellaris

Stellaris is a flexible information service that we use as a central instance in AstroGrid-D to manage and structure resource and job information, as well as other necessary metadata. It uses XML and RDF and understands SPARQL requests. We are also planning to extend Stellaris for VO use, e.g. to store and publish meta-information on scientific data.
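To illustrate the kind of SPARQL access described above, the sketch below builds an HTTP GET request following the standard W3C SPARQL Protocol, in which the query is sent as the `query` parameter. The endpoint URL and the RDF vocabulary are hypothetical placeholders, not Stellaris's actual schema.

```python
from urllib.parse import urlencode

# Hypothetical endpoint; the real Stellaris service URL and RDF
# vocabulary are not given in the abstract.
ENDPOINT = "http://example.org/stellaris/sparql"

def sparql_get_url(endpoint: str, query: str) -> str:
    """Build a SPARQL Protocol GET request URL: the query text is
    URL-encoded and passed as the 'query' parameter."""
    return endpoint + "?" + urlencode({"query": query})

# Ask for jobs and their states, using a made-up vocabulary.
query = """
PREFIX ex: <http://example.org/grid#>
SELECT ?job ?state
WHERE { ?job ex:hasState ?state . }
"""

url = sparql_get_url(ENDPOINT, query)
print(url)
```

In a real deployment the returned URL would be fetched over HTTP and the service would reply with SPARQL query results (typically XML), which is omitted here to keep the sketch self-contained.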

Dr. Franck Cappello

Grid'5000

The Computer Science discipline, especially in large-scale distributed systems such as Grids and P2P systems and in high-performance computing, tends to address issues related to increasingly complex systems gathering thousands to millions of non-trivial components. Theoretical analysis, simulation and even emulation are reaching their limits. As in other scientific disciplines such as physics, chemistry and the life sciences, there is a need to develop, run and maintain generations of scientific instruments for the observation of, and experimentation on, complex distributed systems running at real scale and under reproducible experimental conditions. Grid'5000 is a large-scale system designed as a scientific instrument for researchers in the domains of Grids, P2P and networking. More than a testbed, Grid'5000 has been designed as a "Computer Science Fully Reconfigurable Large Scale Distributed and Parallel System". It allows researchers to share experimental resources spanning large geographical distances, to allocate resources, to configure them, to run their experiments, to make precise measurements, and to replay the same experiments under the same experimental conditions. Computer scientists use this platform to address issues in the different software layers between the hardware and the users: networking protocols, OS, middleware, parallel application runtimes, and applications. In this talk, we will present:

  1. the motivations, design and current status of Grid'5000,
  2. some key results at different level of the software stack,
  3. the impact of this system as a research tool,
  4. ALADDIN, the INRIA initiative to make Grid'5000 a sustainable research platform.

Dr. Natalia Deniskina

AstroGrid and GRID interfacing

N. Deniskina, G. Longo, G. D'Angelo, Dipartimento di Scienze Fisiche, Università Federico II di Napoli, via Cinthia 6, 80131 Napoli
We present GRID-launcher: an interface between the ASTROGRID and GRID infrastructures. GRID-launcher allows an ASTROGRID user (who has an ASTROGRID certificate but no GRID certificate) to start computational tasks on the Grid from the ASTROGRID Workbench. GRID-launcher has been implemented and tested on:
VONeural_MLP (supervised clustering), VONeural_SVM (supervised clustering), SExtractor (extraction of object catalogues from astronomical images), and SWarp (resampling and co-adding FITS images using any arbitrary astrometric projection defined in the WCS standard).
All these programs are registered inside the CEC of ASTROGRID.

Dr. Harald Kornmayer

g-Eclipse

The g-Eclipse project provides an integrated workbench framework to access existing Grid infrastructures. The framework is built on top of the reliable Eclipse platform. The focus of the framework is the development of reliable and easy-to-use tools to access Grids in a middleware independent way. These tools may easily be extended for many different middlewares. The g-Eclipse framework will support Grid users, Grid operators and Grid developers. g-Eclipse itself comes with support for the gLite and the GRIA middleware. Additionally the first prototype implementation for Amazon Storage (S3) is available. The talk will give an introduction to the g-Eclipse framework, its status and a short demo.

Torsten Ensslin, Wolfgang Hovest

The Planck Process Coordinator workflow engine on the Grid

The 'Process Coordinator' (ProC) is a generic scientific workflow engine, originally developed for ESA's Planck Surveyor mission. Within the AstroGrid-D project it is being interfaced with Grid middleware to allow simple utilisation of supercomputer clusters. In this talk the project will be presented and a live demo will be shown.

Kevin Benson

Taverna in the VO

K M Benson, N A Walton, D K Witherick, T Oinn

Taverna has been developed by the bioinformatics community, where it is now well established in use. The European Southern Observatory (ESO) has used Taverna as the basis for its Reflex workflow system. ESO aims to utilise Reflex in implementing data-reduction pipelines specifically tailored for a range of their instruments on the VLT and other ESO telescopes.
This demo shows how AstroGrid has developed a plug-in adaptor for Taverna, thereby exposing the standard range of IVOA services, namely those for images (SIAP), spectra (SSAP), tables (TAP), and applications (CEA, the Common Execution Architecture), to Taverna.
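For readers unfamiliar with these services, the sketch below shows the shape of a basic SIAP query: a plain HTTP GET carrying POS and SIZE parameters, as defined in the IVOA Simple Image Access specification. The service URL here is a placeholder, not a real endpoint.

```python
from urllib.parse import urlencode

def siap_query_url(base_url: str, ra: float, dec: float, size: float) -> str:
    """Build a SIAP (Simple Image Access) query URL.

    Per the IVOA SIAP standard, POS is 'ra,dec' in decimal degrees and
    SIZE is the angular search size in degrees; the service replies
    with a VOTable describing the matching images.
    """
    params = {
        "POS": f"{ra},{dec}",
        "SIZE": str(size),
        "FORMAT": "image/fits",  # restrict results to FITS images
    }
    return base_url + "?" + urlencode(params)

# Placeholder endpoint for illustration only.
url = siap_query_url("http://example.org/siap", ra=180.0, dec=-30.5, size=0.2)
print(url)
```

SSAP queries follow the same GET-with-parameters pattern for spectra, which is what makes wrapping such services as Taverna workflow steps straightforward.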

Dr. Richard Hook

ESO Reflex : A Graphical Workflow Engine for Data Reduction

Sampo was a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal was to look into the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Sampo concluded early in 2008.
The project developed a prototype application called ESO Reflex that integrates a modern graphical workflow system with existing legacy data reduction algorithms. Most of the raw data produced by ESO instruments is reduced using recipes: compiled C programs following an ESO standard and utilizing routines provided by the Common Pipeline Library (CPL). Currently these are run in batch mode as part of the data flow system to generate the input to the ESO/VLT quality control process, and are also exported for use offline. ESO Reflex can invoke CPL-based recipes in a flexible way through a general-purpose interface.
ESO Reflex is based on the Taverna system that was originally developed within the UK life-sciences community. Workflows have been created so far for three VLT/VLTI instruments, and the GUI allows the user to make changes to these or create workflows of their own. Python scripts or IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available. Taverna is intended for use with web services and experiments using ESO Reflex to access VO web services have been successfully performed.
This contribution will describe ESO Reflex and show several examples of its use both locally and with the VO. ESO Reflex is expected to be released to the community in late 2008/early 2009.


Topic attachments
Attachment History Size Date Who Comment
20080409-eurovo-v1.2.ppt r1 5468.0 K 10 Apr 2008 - 07:51 TaffoniGiuliano Loomis Talk
20080411-g-EclipseAstroWorkshop.pdf r1 2744.8 K 28 Apr 2008 - 11:57 AndreSchaaff
AAG.pdf r1 1420.4 K 10 Apr 2008 - 07:52 TaffoniGiuliano Vuerli Talk
D-Grid+AstroGrid_Steinmetz.ppt r1 14367.5 K 10 Apr 2008 - 07:53 TaffoniGiuliano D-Grid and AstroGrid-D Talk
EURO-VO-2008April10_Ohishi.pdf r1 5983.2 K 10 Apr 2008 - 15:54 TaffoniGiuliano JVO and NaReGi
EURO-VO-DCA-GRID-08-valentin-pub.ppt r1 5026.0 K 10 Apr 2008 - 11:32 TaffoniGiuliano e-LOFAR Talk
EuroVODCA-Garching-Schaaff.pdf r1 925.8 K 10 Apr 2008 - 16:29 AndreSchaaff Andre Schaaff Workflow systems and VO standards
EuroVO_April_2008.ppt r1 1758.0 K 11 Apr 2008 - 08:54 AndreSchaaff Richard Hook ESO Reflex
FrancklePetit.pdf r1 19502.0 K 10 Apr 2008 - 08:01 TaffoniGiuliano Astro Cluster in Paris
Garching.SpanishVOGrid.20080410.pdf r2 r1 12919.3 K 10 Apr 2008 - 11:55 AndrewBelikov Juan de Dios Santander Vela Grid and VO activities in Spain
Grid5000EuroVo.pdf r1 12072.8 K 11 Apr 2008 - 09:31 AndreSchaaff Franck Cappello Grid'5000
Grid_at_ESA_Apr_08.pdf r1 7814.6 K 10 Apr 2008 - 12:35 AndreSchaaff Luigi Fusco ESRIN and earth science experience
Laure-EGEE-EuroVO.ppt r1 9686.5 K 10 Apr 2008 - 09:55 AndrewBelikov Erwin Laure
Nickelt_Stellaris.ppt r1 991.0 K 10 Apr 2008 - 19:43 AndrewBelikov Iliya Nickelt Stellaris
ProC_Grid.pdf r1 2226.1 K 11 Apr 2008 - 11:09 AndreSchaaff
RubenAlvarez_garching.pdf r1 10813.8 K 10 Apr 2008 - 11:23 TaffoniGiuliano ESAC Grid
SYNTSPEC_DEMO.mpg r1 18094.6 K 10 Apr 2008 - 15:54 AndrewBelikov Grazina Tautvaisiene SYNTSPEC demo
VO-Grid-Tautvaisiene.ppt r1 2739.0 K 10 Apr 2008 - 12:22 AndrewBelikov Grazina Tautvaisiene E-infrastructure in Baltic states
ag-tools-dca-april-2008.pdf r1 1200.6 K 10 Apr 2008 - 08:02 TaffoniGiuliano AstroGrid Tools
defanimated_VOGRID.ppt r1 6577.5 K 10 Apr 2008 - 07:58 TaffoniGiuliano Skoda Talk
fpasian_IGI.ppt r1 907.5 K 10 Apr 2008 - 09:45 TaffoniGiuliano Pasian Talk
garching-deisa-2008.ppt r1 4961.5 K 10 Apr 2008 - 08:04 TaffoniGiuliano DEISA Talk
grid_astrogrid_interface.pdf r1 423.9 K 10 Apr 2008 - 10:05 TaffoniGiuliano EGEE AstroGrid Interface
gridworkshop_genova.pdf r1 3833.0 K 10 Apr 2008 - 13:56 TaffoniGiuliano Genova Presentation
longo.pdf r1 2420.4 K 11 Apr 2008 - 08:19 AndrewBelikov Giuseppe Longo pdf converted from presentation
longo.pptx r1 2465.7 K 11 Apr 2008 - 08:20 AndrewBelikov Giuseppe Longo original presentation
taverna_vopdf.pdf r1 374.4 K 11 Apr 2008 - 11:48 AndreSchaaff
ube_10Aprile08.pdf r1 2777.2 K 10 Apr 2008 - 13:47 TaffoniGiuliano Becciani Talk
video4.avi r1 2409.4 K 11 Apr 2008 - 11:09 AndreSchaaff
votech-ivoa-grid-dca-april-2008.pdf r1 462.2 K 10 Apr 2008 - 08:03 TaffoniGiuliano IVOA and EuroVO-TECH
Topic revision: r36 - 28 Apr 2008 - AndreSchaaff