
Page 1: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)

ANASAC, 2003-08-25, Chicago

Computing

B.E. Glendenning (NRAO)

Page 2: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Outline

1. Context

2. Management

3. PDR results

4. Software Overview / Architecture

5. Science testing plans

6. AIPS++

7. Pipeline

Page 3: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


ALMA Context

• Timeline: Interim operations ~2007.5, Regular operations 2012.0

• Computing is one of 9 Integrated Product Teams
• $35M / $552M (Computing/Total) = 6.3%

– FTE-y ratio = 230/1515 = 15%

Page 4: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Scope

• Computing:
– Software development
– Necessary operational computer equipment
– System/network administration (operations phases)

• Subsystems: proposal preparation, monitoring, dynamic scheduling, equipment control and calibration, correlator control and processing, archiving, automated pipeline processing, offline processing
– Not: embedded software (hardware groups), algorithm development (Science)

• Activities: management, requirements, analysis, common software, software engineering, integration and test

Page 5: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Scope (2)

Scope: Observing preparation and support through archival access to automatically produced (pipeline) images

Page 6: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Development Groups

• 50%/50% Europe/North America
• 10 Institutes, 13 Sites!

– Range: HIA/Victoria = 0.5 FTE, NRAO/AOC = 15 FTE

– Communications difficult, more formal processes

+ Access to world expertise

Page 7: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Buy vs. Build

• Want to be parsimonious with development funds
• Want system sufficiently “modern” that it will suffice for the construction period and some time thereafter (CORBA, XML, Java, C++, Python)
– Many open tools/frameworks (ACE/TAO, omniORB, GNU tools, etc.)

• After search, ALMA Common Software (ACS) code base adopted from accelerator community

• Simplified CORBA framework, useful services

• Astronomy-domain adopted code bases
– Calibration, imaging: AIPS++
– Archive: NGAST
– Atmosphere: ATM

Page 8: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Strategies

• ACS
– Provide a common technical way of working

• Continuous scientific input through Subsystem Scientists

• Synchronized 6-month releases
– A common pace for the project
– Check requirements completion

• Yearly design/planning reviews
– React to surprises

• Retain some construction budget to allow for discoveries made during interim operations period

Page 9: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Management Planning

• Computing Plan (framework) and 15 subsystem agreements prepared
– Computing Management Plan approved by JAO
– Subsystem agreements approved by Computing IPT

• Management model:
– Agreement “contracts” followed by subsystem scientists (scope) and software management (schedule, cost)

– Initial PDR, yearly CDRs and release planning

– Synchronized 6-month releases across all subsystems

– SSR requirements mapped to releases for progress tracking

Page 10: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Management Planning (2)

– Readiness reviews, acceptance tests, and delivery to interim operations in 2007

– “Data flow” items get a second development period during interim operations

• Review requirements after first experience with operations
• Not subject to current agreements – institutional responsibility could shift

Package | Feature name | Short description | SSR requirement numbers | To be completed at release (R1.0-3.0) | Status at milestone T0 (N,P,C)
ObsProject | CheckObsDatabase | Check observation database for conflict | 3.0-R10 | R3.0 | N
ObsProject | Local save | Save/recover programs to/from local disk in human-readable form | 3.0-R11, 3.1-R3, 3.1-R12 | R1.0 | N

Page 11: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Preliminary Design Review

• March 18-20, Tucson
– Panel:
• R. Doxsey (Chair) (STScI, Head HST Mission Office)
• P. Quinn (ESO, Head of Data Management and Operations Division)
• N. Radziwill (NRAO, Head of GBT Software Development)
• J. Richer (Cambridge, ALMA/UK Project Scientist, Science Advisory Committee)
• D. Sramek (NRAO, ALMA Systems Engineering IPT Lead)
• S. Wampler (NSO, Software Architect)
• A. Wootten (NRAO, ALMA/North America Project Scientist)

Page 12: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Preliminary Design Review (2)

• Well prepared PDR; at or ahead of where similar projects have been at this stage

• The ALMA Project needs to develop an understanding of operations, and the Computing IPT needs to fold this into its planning and priorities. This might result in reprioritization of SSR requirements

• ALMA project needs to define clearly the steps necessary for getting to, and implementing, Interim Operations and the consequent requirements for computing support

Page 13: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Preliminary Design Review (3)

• The interaction with AIPS++ requires careful management. Operational (Pipeline) requirements on AIPS++ need further development.

• Testing effort seems satisfactory; management will need to follow up to ensure subsystem unit tests are in fact carried out

Page 14: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Software Scope

• From the cradle…
– Proposal Preparation

– Proposal Review

– Program Preparation

– Dynamic Scheduling of Programs

– Observation

– Calibration & Imaging

– Data Delivery & Archiving

• Afterlife:
– Archival Research & VO Compliance

Page 15: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


And it has to look easy…

• “1.0-R1 The ALMA software shall offer an easy to use interface to any user and should not assume detailed knowledge of millimeter astronomy and of the ALMA hardware.”

• “1.0-R4 The general user shall be offered fully supported, standard observing modes to achieve the project goals, expressed in terms of science parameters rather than technical quantities. Observing modes shall allow automatic fine tuning of observing parameters to adapt to small changes in observing conditions.”

• Which means that what is simple for the user will be complex for the software developer.
– Architecture should relieve developer of unnecessary complexity
– Separation of functional from technical concerns (a sketch follows below)

• But the expert must be able to exercise full control
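To make the functional/technical separation concrete: a request stated in science parameters gets translated by the software into technical quantities. The sketch below is purely illustrative; the function, the Tsys value, and the constant are invented, and only the standard radiometer scaling (noise ∝ Tsys/√(Δν·t)) is real.

```python
import math

# Illustrative only: turn a science parameter (target rms sensitivity) into a
# technical quantity (integration time) via the radiometer equation,
# sigma ~ C * Tsys / sqrt(bandwidth * time). C and Tsys are made-up numbers,
# not ALMA design values.

def integration_time(target_rms_jy: float, bandwidth_hz: float,
                     tsys_k: float = 150.0, c_jy: float = 1.0) -> float:
    """Seconds of integration needed to reach the requested rms noise."""
    return (c_jy * tsys_k / (target_rms_jy * math.sqrt(bandwidth_hz))) ** 2

# The general user asks for 5 mJy rms in a 1 MHz channel; the software derives
# the observing time, and could fine-tune it as observing conditions change.
print(f"{integration_time(5e-3, 1e6):.0f} s")  # -> 900 s
```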

Page 16: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Observatory tasks

• Administration of projects
• Monitoring and quality control
• Scheduling of maintenance
• Scheduling of personnel
• Security and access control

Page 17: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


The numbers

• Average/peak data rates of 6/60 Mbyte/s
– Raw (uv) data ~ ⅔, image data ~ ⅓ of the total
– Assumes:
• Baseline correlator, 10 s integration time, 5000 channels
• Can trade off integration time vs. channels
– Implies ~ 180 Tbyte/y to archive (back-of-envelope check below)
– Archive access rates could be ~5× higher (cf. HST)

• Feedback from calibration to operations
– ~ 0.5 s from observation to result (pointing, focus, phase noise)

• Science data processing must keep pace (on average) with data acquisition
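As a back-of-envelope check of the archive figure above (a sketch; the ~90% observing duty cycle is an assumed number, not from the slide):

```python
# Rough check that a 6 Mbyte/s average rate implies ~180 Tbyte/y archived.
# The 0.9 duty cycle is an assumption for illustration only.
avg_rate = 6e6                      # average data rate, bytes/s
duty_cycle = 0.9                    # assumed fraction of the year spent observing
seconds_per_year = 365.25 * 86400

tbytes_per_year = avg_rate * duty_cycle * seconds_per_year / 1e12
print(f"~{tbytes_per_year:.0f} Tbyte/y")  # -> ~170 Tbyte/y, of order the ~180 quoted
```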

Page 18: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Meta-Requirements

• “Standard Observing Modes” won’t be standard for a long time
– e.g., OTF mosaics, phase calibration at submm λ

• Instrument likely to change & grow
– Atacama Compact Array (ACA)
– Second-generation correlator
• Could drastically increase data rate (possible even with baseline correlator, might be demanded for OTF mosaics)

– Computer hardware will continue to evolve

• Development spread across ≥ 2 continents & cultures

Page 19: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


What do these requirements imply for the architecture of the software?

• Must facilitate development of new observing modes (learning by doing)

• Must allow scaling to new hardware, higher data rates

• Must enable distributed development
– Modular
– Standard
• Encourage doing the same thing either a) in the same way everywhere; or b) only once (one illustrative sketch follows this list)
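One hedged illustration of standardization: a single lifecycle contract that every subsystem implements, so the Executive can treat them all uniformly. The interface below is hypothetical, not the actual ACS/Executive design.

```python
from abc import ABC, abstractmethod

# Hypothetical common lifecycle contract: defining start/stop/status once and
# having every subsystem implement it is one way to do "the same thing in the
# same way everywhere" across ten institutes and thirteen sites.

class Subsystem(ABC):
    @abstractmethod
    def initialize(self) -> None: ...

    @abstractmethod
    def start(self) -> None: ...

    @abstractmethod
    def stop(self) -> None: ...

    @abstractmethod
    def status(self) -> str: ...

class Scheduling(Subsystem):
    def initialize(self) -> None: print("scheduling: initialized")
    def start(self) -> None: print("scheduling: started")
    def stop(self) -> None: print("scheduling: stopped")
    def status(self) -> str: return "idle"

# The Executive can then start, stop, and monitor any subsystem the same way.
for subsystem in (Scheduling(),):
    subsystem.initialize()
    subsystem.start()
    print(subsystem.status())
    subsystem.stop()
```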

Page 20: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


[Architecture diagram. ALMA software subsystems (Observation Preparation, Scheduling, Data Reduction Pipeline, Archive, Executive, Control System, Correlator, Calibration Pipeline, Quick Look Pipeline) built on the ALMA Common Software; external agents: Principal Investigator, Telescope Operator, Archive Researcher.

Primary functional path:
1. Create observing project (Principal Investigator)
2. Store observing project
3. Get project definition
4. Dispatch scheduling block id
5. Execute scheduling block (real-time)
  5.1. Get SB
  5.2. Setup correlator
  5.3. Store raw data
  5.4. Store meta-data
  5.5a/5.5b. Access raw data & meta-data (Calibration Pipeline / Quick Look Pipeline)
  5.6. Store calibration results
  5.7. Store quick-look results
6. Start data reduction
7.1. Get raw data & meta-data
7.2. Store science results
8. Notify PI
9. Get project data (Archive Researcher)

Additional functions (Telescope Operator):
a/b. Monitor points
c. Alter schedule / override action
d. Notify of special condition
e. Start/Stop/Configure
f. Get science data
g. Breakpoint response
h. Store admin data]
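Read as pseudocode, the primary functional path above reduces to the following sketch (every object and method name is invented for illustration; the real interfaces are CORBA-based via ACS, not plain Python calls):

```python
# Pseudocode rendering of the primary functional path (steps 1-9 above).

def primary_functional_path(pi, obs_prep, archive, scheduler, control, pipeline):
    project = obs_prep.create_observing_project(pi)        # 1
    archive.store(project)                                 # 2
    definition = archive.get_project_definition(project)   # 3
    for sb_id in scheduler.dispatch(definition):           # 4
        raw, meta = control.execute_scheduling_block(sb_id)  # 5 (real-time)
        archive.store(raw)                                 # 5.3
        archive.store(meta)                                # 5.4
    results = pipeline.reduce(archive, project)            # 6, 7.1
    archive.store(results)                                 # 7.2
    pi.notify()                                            # 8
    # 9: an archive researcher later retrieves project data from the Archive
```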

Page 21: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Functional Aspects

• Executive, Archive, ACS are global: all other subsystems interact with them.
– ACS: common software – a foundational role
– Executive: start, stop, monitor – an oversight role
– Archive: object persistence, configuration data, long-term science data – a structural support role

• Instrument operations
– Control, correlator, quick-look pipeline, calibration pipeline – real-time
– Scheduling (near real-time)

• “External” subsystems
– Observation preparation and planning
– Science data reduction pipeline (not necessarily “online”)

Page 22: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


[Data-flow diagram with rates. Subsystems: Observation Preparation, Scheduling, Archive, Data Reduction Pipeline; agents: PI, Operator.
1. Create observing project
2. Store observing project
3. Get project definition
4. Scheduling block id
5. Configure, Control, Acquire, Calibrate, Image (real-time):
  5.1. Get SB: 10 Kb/30 min
  5.3. Raw data: 4-40 Mbyte/s
  5.4. Meta-data: 100 Kbyte/s
  5.6. Calibration results; QL results
  Monitor data: 25-500 Kbyte/s
6. Start data reduction
7.1. Raw data & meta-data: ~4 Mbyte/s
7.2. Science results: 2 Mbyte/s
8. Notify PI
9. Get project data]

Page 23: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Scheduling Block Execution

[Diagram: scheduling block execution in detail. The Scheduler dispatches a scheduling block id to the Control System (4., once per ~30 min).
5. Execute scheduling block (real-time):
  5.1. Get SB (from Archive)
  5.2. Setup correlator
  5.3. Store raw data: 4-40 Mbyte/s
  5.4. Store meta-data
  5.5a/5.5b. Access raw data & meta-data (Calibration Pipeline / Quick Look Pipeline)
  5.6. Store calibration results (pointing, focus, phase noise)
  5.7. Store quick-look results: 1-10 Mbyte/30 min
a/b. Monitor points]

Page 24: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


ALMA Software User Test Plan: Status

Software test plans being developed by SSR subsystem scientists and subsystem leads.

Test plan components:
– Use Cases – descriptions of operational modes and what external dependencies exist. Designed to exercise subsystem interfaces, functionality, & user interfaces.
– Test Cases – Use Case subset designed to test specific functions.
– Testing timeline (when tests run in relation to Releases, CDRs).
– Test Definitions – specifies which test case will be run, what the test focus is, and whether the test is automated or involves users.
– Test Reports (e.g., user reports, audit updates, summary).

Test Plan drafts for all subsystems to be completed by Oct 1.
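A Test Definition of the kind listed above could be captured as a small structured record, e.g. (the field names are invented for illustration; the use case name is from the list two slides below):

```python
from dataclasses import dataclass

# Illustrative encoding of a Test Definition: which test case runs, what the
# test focuses on, and whether it is automated or involves users.

@dataclass
class TestDefinition:
    test_case: str     # which test case will be run
    focus: str         # what the test focus is
    automated: bool    # automated, or involves users?
    release: str       # release the test is tied to

example = TestDefinition(
    test_case="OT.UC.SingleFieldSetup",
    focus="Observing Tool single-field, single-line setup",
    automated=False,   # first user test (Observing Tool) scheduled Nov 2003
    release="R1.0",
)
print(example)
```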

Page 25: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


ALMA Software User Test Plan: Status

Software test plan guidelines:
– June 5, 2003: Test Plan approved by Computing management & leads.
– June 11, 2003: Test Plan presented to SSR (ALMA sitescape, SSR draft documents).

Use Case development:
– July 9, 2003: Use Case guidelines, HTML templates, & examples put on the web (www.aoc.nrao.edu/~dshepher/alma/usecases) & presented to SSR.
– Aug 2003: Detailed Use Cases being written for all subsystems.

Test Plan development:
– Sept 2003: Draft test plans to be completed.
– Oct 1, 2003: Test plan problems identified/reconciled, resources identified.
– Nov 2003: First user test scheduled (for Observing Tool subsystem).

Page 26: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


ALMA Software User Test Plan: Status

Use Cases written to date:

Observing Preparation Subsystem:
– OT.UC.SingleFieldSetup.html – Single Field, Single Line setup
– OT.UC.MultiFieldMosaicSetup.html – Multi-Field Mosaic setup
– OT.UC.SurveyFewRegionsSeveralObjects.html – Set Up to do a Survey of a Few Regions with Several Objects
– OT.UC.SpectralSurvey.html – Set Up to do a Spectral Line Survey

Control Subsystem:
– Control.UC.automatic.html – Automatic Operations Use Case

Offline Subsystem:
– Offline.UC.SnglFldReduce.html – Reduce & Image Single Field Data
– Offline.UC.MosaicReduce.html – Reduce & Image Multi-Field Mosaic
– Offline.UC.TotPower.html – Reduce, Image Auto-Correlation Data

Pipeline Subsystem:
– Pipeline.UC.ProcSciData.html – Process Science Data
– Pipeline.UC.SnglFld.html – Science Pipeline: Process Single Field Data
– Pipeline.UC.Mosaic.html – Science Pipeline: Process Mosaic, no short spacings
– Pipeline.QLDataProc.html – Quick-Look Pipeline: Data Processing
– Pipeline.QLCalMon.html – Quick-Look Pipeline: Monitor Calibration Data
– Pipeline.QLArrayMon.html – Quick-Look Pipeline: Monitor Array Data

Scheduling Subsystem:
– Sched.UC.dynamic.html – Dynamic Mode (Automatic) Operations
– Sched.UC.interactive.html – Interactive Mode (Manual) Operations

Page 27: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Page 28: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Page 29: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


AIPS++ Evaluation

• AIPS++ (along with the ESO Next Generation Archive System) is a major package used by ALMA
– Both to ensure at least one complete data reduction package is available to users and in implementing ALMA systems (notably the Pipeline)

• AIPS++ is a very controversial package (long development period, has not received wide acceptance)

• ALMA Computing has arranged several evaluations
– Audit of capabilities based on documentation
– AIPS++/IRAM test of suitability for millimeter data
– Benchmarking tests

• Technical review of AIPS++, March 5-7, 2003
– Sound technical base, management changes needed

Page 30: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


AIPS++ Audit

[Bar chart, explanatory example – not results: audit grade counts (Acceptable, Inadequate, Unavailable, TBD; y-axis 0-4) for each priority class (Critical, Important, Desirable). Annotations: Inadequate and Unavailable items are work to be done by ALMA and should be 0 in ~2007; TBD should be <10% of the total.]

Page 31: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


[Bar chart: overall AIPS++ audit results – number of requirements (y-axis 0-200) graded Acceptable, Inadequate, Unavailable, or TBD, for each priority class (Critical, Important, Desirable).]

Page 32: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


AIPS++ Audit Results - Summary

All: 58% (Acceptable) / 16% (Inadequate) / 16% (Unavailable) / 10% (TBD)

– Critical 66% / 14% / 12% / 8%

– Important 52% / 19% / 19% / 10%

– Desirable 35% / 17% / 33% / 15%

14% of requirements have had differing grades assigned by auditors

Page 33: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


AIPS++/IRAM Tests

• Phase 1: Can AIPS++ reduce real mm-wave data?
– Yes, but schedule was very extended
• Partly underestimated effort, mostly priority setting

• ALMA/NRAO and EVLA now directly manage AIPS++
– And for the next 12 months ALMA has complete control of priorities

• Phase 2: Can new users process similar but new data?
– Generally yes, but it is too hard

• Phase 3: Performance (described next)

Page 34: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


AIPS++ Benchmark Status:

Four requirements related to AIPS++ performance:

2.1.1 R4 – Performance of the Package shall be quantifiable and commensurate with data processing requirements of ALMA and scientific needs of users. Benchmarks shall be made for a fiducial set of reduction tasks on specified test data.

2.2.2 R1.1 – GUI window updates shall be < 0.1s on same host.

2.3.2 R4 – Package must be able to handle, efficiently & gracefully, datasets larger than main memory of host system.

2.7.2 R3 – Display plot update speed shall not be a bottleneck. Speed shall be benchmarked and should be commensurate with comparable plotting packages.

ASAC: AIPS++ must be within a factor of 2 of comparable packages.

Page 35: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


AIPS++ Benchmark Strategy

• Finish AIPS++/IRAM Phase 3 (performance) test

• Set up automated, web-accessible performance regression tests of AIPS++ against AIPS, Gildas, and Miriad
– Start simple, then extend to more complex data (a minimal harness sketch follows this list)

• Systematically work through performance problems in importance order
– Resolution of some issues will require scientific input (e.g., when is an inexact polarization calculation OK?)

• Decide in Summer 2004 (CDR2) if AIPS++ performance issues have arisen from lack of attention or for fundamental technical reasons (“fatal flaw”)
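A minimal sketch of such an automated regression check (the timing harness is generic; the run_* callables stand in for real package drivers, and only the factor-of-2 threshold comes from the ASAC statement quoted earlier):

```python
import time

FACTOR_LIMIT = 2.0  # ASAC: AIPS++ within a factor of 2 of comparable packages

def timed(task) -> float:
    """Wall-clock one reduction step."""
    start = time.perf_counter()
    task()
    return time.perf_counter() - start

def regression_check(name: str, run_aipspp, run_reference) -> float:
    """Time the same step in AIPS++ and a reference package; flag regressions."""
    ratio = timed(run_aipspp) / timed(run_reference)
    flag = "OK" if ratio <= FACTOR_LIMIT else "REGRESSION"
    print(f"{name}: AIPS++/reference = {ratio:.2f} [{flag}]")
    return ratio

# Example wiring (placeholders, not real package drivers):
# regression_check("bandpass_cal", run_aipspp=..., run_reference=...)
```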

Page 36: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Full AIPS++/AIPS/Gildas/Miriad Comparison not possible

• Different processing capabilities (polarization) and data formats

[Table: data formats readable by each package (rows: GILDAS, MIRIAD, AIPS, AIPS++) across Standard FITS, ALMA-TI FITS, AIPS++ format, PdBI format, MIRIAD format, and VLA Export format; the individual cell markings did not survive transcription.]

• Compare AIPS++ with GILDAS on one dataset in ALMA-TI FITS format
• Compare AIPS++ with MIRIAD & AIPS on another dataset in FITS format

Page 37: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


AIPS++/IRAM Phase 3 (ALMA-sized data, single field spectroscopic)

GILDAS/CLIC AIPS++ A/G Comments

Filler 1873 109395.8Init (write header info) 385 n/aFill model/corr data cols. 2140n/aPhCor (Check Ph-corr data) 889 34843.9 (AIPS++ Glish)RF (Bandpass cal) 5572 22980.4Phase (Phase cal) 3164 11110.4Flux (Absolute flux cal) 1900 20931.2 (AIPS++ Glish)Amp (Amplitude cal) 2242 6140.3Table (Split out calib src data) 1200 51504.3Image 332 7502.3

Total 17600s 28600s 1.6

• Caveats: DRAFT results; a bug in AIPS++ bandpass calibration requires too much memory (1.7 GB AIPS++ vs. 1.0 GB Gildas)
• Gildas executables copied, not compiled, on the benchmark machine
• Several AIPS++ values are still amenable to significant improvement

Page 38: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


AIPS++ Benchmark Status:

SSR has identified 2 initial benchmark datasets:

Pseudo GG Tau – PdBI data of 25 March. Original observation expanded to 64 antennas with GILDAS simulator & source structure converted to point source. 3 & 1 mm continuum & spectral line emission. Data in ALMA-TI FITS format (same data used during AIPS++ re-use Phase III test).

– Ensure continuous comparisons in time with AIPS++ Ph III ‘re-use’ test
– Compare core functions (fill, calibrate, image) on ALMA-size dataset
– Exercise mm-specific processing steps

Polarized continuum data – VLA polarized continuum emission in grav lens 0957+561, 6cm continuum, 1 spectral window. Snapshot observation extended in time with AIPS++ simulator to increase run-time. Data in Std FITS format.

Exercise full polarization calibration, self-calibration, non-point source imaging (polarization processing can only be compared with MIRIAD/AIPS).

Results to be published on web for each AIPS++ stable release.

Page 39: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Calibrater Performance Improvements vs. Time

Page 40: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Calibration Performance vs. AIPS

Page 41: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Calibrater – Still TODO

Page 42: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Imager Performance

Imaging Performance Improvements:

[Chart: execution time (sec) vs. image size (N×N pixels)]

Imaging performance:
– Improved by a factor of 1.8 for 2048 pixels
– Improved by a factor of 4.4 for 4096 pixels
– AIPS++/AIPS ratio now 1.6 for 2048 pixels & 1.8 for 4096 pixels
– Now dominated by more general polarization processing in AIPS++? (this is Stokes I only)
– Multi-polarization should be relatively faster in AIPS++, but needs to be demonstrated

Page 43: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


AIPS++ Benchmark Status:

Dataset expansion: SSR will identify datasets in the following areas:

– Spectral line, polarized emission; multi-config dataset if possible
– Multi-field interferometric mosaic
– Large, simulated dataset, including atmospheric opacity variations and phase noise
– Single-dish + interferometer combination in uv plane (no SD reduction now; MIRIAD/AIPS do not process SD data, GILDAS only processes IRAM-format SD data & cannot convert to FITS)

NOTE: Glish-based GUIs will be replaced with Java GUIs once the ACS/CORBA framework conversion is complete. Benchmark comparisons affecting the GUI and plotting interface will be delayed until the Java GUIs are ready to test.

Page 44: ANASAC, 2003-08-25, Chicago Computing B.E. Glendenning (NRAO)


Pipeline

Three current development tracks:
• Paperwork: assemble use cases, write test plans, develop heuristics “decision trees”
• Implement top-level interfaces required by the rest of the system (e.g., to start a “stub” pipeline when ordered to by the scheduling subsystem)
• Technology development / prototype pipeline
– VLA GRB observations
– Bind AIPS++ “engines” to ALMA technology (Python, CORBA (i.e., ACS))
– Gain experience for possible package-independent execution framework, as sketched below
• To be used by at least AIPS++
• Allow pipeline computations to be performed by different packages
• Possible relevance for VO
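The package-independent execution idea in the last bullets could look roughly like this (a hypothetical interface, not the actual design):

```python
from typing import Protocol

# Hypothetical package-independent pipeline stage: the heuristics layer calls
# a neutral interface, and an adapter binds it to a concrete package (AIPS++
# here; another package could be swapped in). All names are illustrative.

class ReductionEngine(Protocol):
    def calibrate(self, dataset: str) -> None: ...
    def image(self, dataset: str, npixels: int) -> None: ...

class AipsPlusPlusEngine:
    """Adapter that would bind AIPS++ "engines" via Python/CORBA (ACS)."""
    def calibrate(self, dataset: str) -> None:
        print(f"AIPS++: calibrating {dataset}")
    def image(self, dataset: str, npixels: int) -> None:
        print(f"AIPS++: imaging {dataset} at {npixels}x{npixels}")

def run_pipeline(engine: ReductionEngine, dataset: str) -> None:
    engine.calibrate(dataset)    # the heuristics decide the steps...
    engine.image(dataset, 2048)  # ...the engine decides the package

run_pipeline(AipsPlusPlusEngine(), "example-sb-001")
```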