APECS 2.0: Upgrading the APEX Control System
APECS 2.0 Upgrade, Group Meeting, 10.3.2009 1
Dirk Muders, Heiko Hafok, MPIfR, Bonn
Outline:
- APECS History & Development
- APECS 2.0: The Upgrade (Jan. 2009)
- Changes compared to APECS 1.1
APECS = Atacama Pathfinder Experiment Control System
APECS Origins
The ALMA Test Interferometer Control Software (TICS) was immediately usable at APEX due to the common hardware interface
We decided to re-use the ALMA Common Software (ACS) and TICS and to benefit from the large ALMA development team (already a dozen people in 2000)
ACS provides the Common Object Request Broker Architecture (CORBA) middleware
CORBA in 30 Seconds…
[Diagram: a Client (which does not know the deployment) talks via the ACS ORB (which knows the deployment) to a CORBA Component running inside an ACS Container.]
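The deployment transparency sketched in the diagram can be illustrated with a toy registry. All names here (Manager, HeterodyneReceiver, "APEX/HET345") are illustrative stand-ins, not the actual ACS API:

```python
# Minimal sketch of the deployment-transparency idea behind ACS/CORBA.
# The class and component names are invented for illustration only.

class HeterodyneReceiver:
    """A 'component': the client only ever sees its interface."""
    def __init__(self, name):
        self.name = name
        self._state = "OFFLINE"

    def switch_on(self):
        self._state = "ONLINE"
        return self._state


class Manager:
    """Plays the role of the ORB/Manager: it alone knows the deployment,
    i.e. which container hosts which component."""
    def __init__(self):
        self._deployment = {}          # component name -> (container, instance)

    def deploy(self, name, container, instance):
        self._deployment[name] = (container, instance)

    def get_component(self, name):
        # The client never sees the container; it only gets a reference.
        _container, instance = self._deployment[name]
        return instance


manager = Manager()
manager.deploy("APEX/HET345", "instruments2", HeterodyneReceiver("HET345"))

# Client code: no knowledge of where the component actually runs.
receiver = manager.get_component("APEX/HET345")
print(receiver.switch_on())            # -> ONLINE
```

The point of the indirection is that moving a component to another container only changes the deployment table, never the client code.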
TICS
TICS is used to control the antenna via ACS/CORBA components to set up observing patterns
These components run under VxWorks to meet the real-time requirements of the Controller Area Network (CAN) bus protocol
Testing the ALMA prototypes was performed mainly via low-level Python scripts using those components directly
What else was needed for APE(X)CS?
To use ACS and TICS at APEX, all our devices (instruments, auxiliary hardware, etc.) needed to be represented as CORBA components
An astronomer-friendly user interface was needed to set up typical sub-mm observing scans
An online calibration pipeline was required to provide data products for astronomers
APECS Component Interface
In contrast to ALMA, APEX was always supposed to have many different instruments (receivers and spectrometers), both facility and PI
Thus we decided to define generic high-level instrument interfaces to be used by all devices of the same kind
This facilitates adding new devices enormously and makes setting up observations generic as well
SCPI Interface Level
The low-level hardware control systems could not use CORBA directly
Instead we adopted an SCPI (Standard Commands for Programmable Instrumentation) ASCII protocol via sockets to communicate between the CORBA components and the hardware controllers
APECS SCPI Setup
[Diagram: a CORBA Component inside an ACS Container talks CORBA to the ACS ORB, and SCPI to either the real hardware or to simulated hardware (emuEmbSys).]
SCPI Communication
Decoupling CORBA and the hardware controllers proved to be extremely useful because:
- it isolates APECS from hardware that is developed by many different groups / institutes
- it allows plugging in simple emulator scripts to simulate a full system of instruments for APECS development without real hardware
- all CORBA components can be created fully automatically without any further manual interaction
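The emulator idea can be sketched in a few lines of Python: a toy SCPI-style server stands in for something like emuEmbSys, and a client talks to it over a socket. The command names are invented for illustration; the real APEX SCPI command set differs:

```python
# Hedged sketch of the APECS SCPI idea: a tiny socket-based emulator stands in
# for a hardware controller, and a client exchanges SCPI-style ASCII commands
# with it. The command hierarchy (APEX:HET345:FREQ) is hypothetical.
import socket
import threading

def emulator(server_sock):
    """Answer SCPI-style queries; a stand-in for a hardware emulator script."""
    conn, _addr = server_sock.accept()
    with conn:
        freq = "0.0"
        for line in conn.makefile("r"):
            cmd = line.strip()
            if cmd.endswith("?"):                      # query -> send a reply
                conn.sendall((freq + "\n").encode())
            else:                                      # set command, no reply
                freq = cmd.rsplit(" ", 1)[-1]

server = socket.socket()
server.bind(("127.0.0.1", 0))                          # any free local port
server.listen(1)
threading.Thread(target=emulator, args=(server,), daemon=True).start()

# Client side (what a CORBA component would do towards the controller):
client = socket.create_connection(server.getsockname())
client.sendall(b"APEX:HET345:FREQ 345.8\n")            # set
client.sendall(b"APEX:HET345:FREQ?\n")                 # query
reply = client.makefile("r").readline().strip()
print(reply)                                           # -> 345.8
```

Because the protocol is plain ASCII over a socket, swapping the emulator for real hardware requires no change on the client side, which is exactly the decoupling described above.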
The APECS Pipeline
All communication between the Observing Engine and devices or other high-level applications is performed via ACS/CORBA
APECS 0.1 Installation (09/2003)
APECS 0.1 Installation (09/2003)
APECS 0.2-1.1
Following the initial phase, APECS was extensively debugged and extended during the antenna and early instrument commissioning
We continued to deliver patches and new releases to provide control for more complex instruments and observing modes and to fix bugs
APECS 1.1 was still based on ACS 2.0.1 and ran on machines with RedHat Linux 7.2 (released in 2001, i.e. stone age!)
The APECS Upgrade
Why upgrade at all? Mainly, the old Linux was causing trouble:
- Data rates keep increasing and new servers are needed to handle them
- RH 7.2 can no longer be installed on them
- In addition, ACS 2.0.1 has a number of known issues that have been cured in later versions
- The old Linux libraries limit the development of new APECS applications
APECS 2.0 Development
An upgrade was already planned in 2004
In an effort led by J. Ibsen, ALMA had ported TICS to ACS 3.0/4.0
But the ongoing APEX commissioning delayed our APECS porting
Only in 2007 did we find the time to port the APECS and TICS codes to the then-current ACS 5.0
The VxWorks Drama
Unfortunately, we discovered bugs in the ACS property monitoring that required at least ACS 6.0.4
But that ACS version no longer worked with VxWorks → canceled upgrade in 01/2008
A joint effort of ESO, UTFSM, the Keck Observatory, and Remedy IT in the Netherlands made ACS 7.0.2 work under VxWorks in mid-2008
APECS 2.0 Software
APECS 2.0 is based on ACS 8.0, Scientific Linux 5.2 (ahead of ALMA, which uses SL 4.4!) and VxWorks 6.6
Using an up-to-date ACS allows us to benefit again from future ALMA developments and bug fixes
During 2008 most of the TICS and APECS functionality was kept aligned to APECS 1.1
Some areas are more advanced in APECS 2.0 (e.g. the Calibrator → Heiko's presentation)
APECS 2.0 Hardware

Name         | Type                  | CPU                      | RAM  | Disks                                    | Location
control2     | HP ProLiant DL380 G4  | Dual Xeon HT 3.2 GHz     | 4 GB | 2 x high-altitude SCA disks (RAID 1)     | New server room
instruments2 | HP ProLiant DL380 G4  | Dual Xeon HT 3.2 GHz     | 4 GB | 2 x high-altitude SCA disks (RAID 1)     | Instrument container
apexdev2     | HP ProLiant DL380 G4  | Dual Xeon HT 3.2 GHz     | 4 GB | 2 x high-altitude SCA disks (RAID 1)     | New server room
display2     | HP ProLiant DL360 G5  | Dual quad-core Xeon E5345| 4 GB | 1 x high-altitude SCA disk, DotHill RAID | New server room
opt2         | Industry PC           | Core2Duo T2500           | 4 GB | 1 x high-altitude SCA disk               | Instrument container
APECS 2.0 Deployment
[Diagram, all machines connected via CORBA:
- control2: CORBA services, Observing Engine
- display2: FitsWriter, Calibrator
- instruments2: CORBA containers with instrument components, SCPI parsers; SCPI links to the instruments
- opt2: frame grabber component
- db2: database
- apexdev2: observing clients, monitoring clients
- ABM: CORBA container with antenna components; CAN bus to the antenna (ACU, PTC, WIU)]
Software Changes
No changes in “apecs” CLI. Observing scripts will run like before
Access to CORBA properties via “apexObsUtils” did not change
Component and method access did change with new ACS (→ special training session)
Operational Changes
To reduce the dependency on a working microwave link, APECS now runs only at the high site
Access from Sequitor and remote sites is via VNCs
"stopAllAPECSClients" now stops observer processes too. Always use "restartAPECSServers", which includes stopping the clients. A restart now takes only 6 min.!
More details in the training session
New Network Setup
In addition to the new APECS, the network has been upgraded to 1(/10) Gbit (reaching 55 MB/s (!) for file transfers) and physically split into 3 subnets at Chajnantor:
- Control (CORBA, SCPI)
- Data (TCP streams from backends to the FitsWriter)
- Maintenance (everything else: web cams, thin clients, notebooks, etc.)
An additional "SciOps" subnet serves the observing machines in Sequitor
VNC Connections
[Diagram: VNC connections between Sequitor and Chajnantor.
Chajnantor (control network, data network; maintenance network):
- opt2: optical camera, terminal to ABM
- instruments2: instrument control
- control2: ObsEngine, CORBA services, VNC server
- display2: FitsWriter, Calibrator
- apexdev2: development APECS, vncserver
- Hardware devices: pointing telescope, backends, telescope/wobbler, ABM
- Monitoring things: web cams, tcserver
Sequitor:
- pular, apexdev, tacora, apexdb2: VNC clients
- lastarria: archive (rsync)]
APECS 2.0 Testing [1]
During the installation a number of tests were made to verify the performance:
- Initial drive tests: no strange motions / vibrations
- Tracking tests at different az/el: like in APECS 1.1
- Optical pointing runs: agree with previous results
- Radio scans with SHFI and SZ (calibration, pointing, focus, skydip, on, raster, otf (also holo mode), spiral raster) produced MBFITS and CLASS data as expected. Line profiles and intensities agree.
APECS 2.0 Testing [2]
Throughput tests:
- AFFTS with 28 x 8192 channels @ 10 Hz (8.75 MB/s!) without delays or other problems
- SZ beam maps (which failed in Nov. 2008) with 141 subscans, OTF, continuous data (i.e. one 25-minute subscan!) without problems
Data acquisition via the data network was tested with the PBEs and the SZ backend. ABBA/Bridge tests soon. Spectrometers later.
Issues
APECS 2.0 is good, but not perfect:
- The ABM NC tasks (az/el & wobbler pos.) crash sometimes. Being iterated with B. Jeram (ACS).
- The online bolometer reduction needs iteration with real data.
- Gildas as of 12/2008 has some bugs. We will need to update to a more recent version soon.
- The cron jobs handling the syslogs need to be adjusted.
- The optView GUI freezes after "movie" mode.
Conclusions
APECS 2.0 with the new network performs much better than the previous system
We are no longer bound to very old hardware and can fulfill the requirements of new instruments
Observing modes that started failing last year are now possible again
New observing modes with higher data rates can now be used
Future developments are made easier by the new Linux and libraries
APECS Design
APECS is designed as a pipeline system starting with a scan description (“scan object”) and eventually leading to data products
The pipeline is coordinated by the central “Observing Engine”
Most APECS applications are written in Python, but they use a lot of compiled libraries to speed up computations and transactions
The astronomer interface is an IPython shell
Multi-Beam FITS (MBFITS)
The lack of a good format to store array receiver data of single-dish radio telescopes led to the development of the MBFITS raw data format
MBFITS stores the instrument and telescope data in a hierarchy of FITS files
MBFITS is now used at the APEX, Effelsberg, and IRAM 30m telescopes
APECS 0.1
The first version of APECS was installed in Chile in September 2003
Much of the time was spent on hardware installations (network, racks, servers)
We had to fight initial problems with failing pressurized hard disk boxes and missing infrastructure
The APECS Pipeline
For each scan the Observing Engine accepts a scan object from a user CLI and:
- Sets up the receivers (tuning; amplifier calibration)
- Configures the backends
- Sets up auxiliary devices such as synthesizers, IF processors or the wobbler
- Tells the telescope to move in the requested pattern
- Starts and stops the data acquisition
- Asks the Calibrator to process the raw data to produce the final data product or result
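The scan pipeline above can be sketched as a toy Observing Engine. The class, method and field names here are our own illustration, not the actual APECS interfaces:

```python
# Sketch of the per-scan pipeline the Observing Engine coordinates.
# All names are hypothetical stand-ins for the real APECS components.

class ObservingEngine:
    def __init__(self):
        self.log = []                       # record of the pipeline steps

    def _step(self, what):
        self.log.append(what)

    def execute_scan(self, scan):
        """Run one scan object through the full pipeline, in order."""
        self._step("setup receivers: " + scan["receiver"])
        self._step("configure backends: " + ", ".join(scan["backends"]))
        self._step("setup auxiliary devices")
        self._step("telescope pattern: " + scan["pattern"])
        self._step("start data acquisition")
        self._step("stop data acquisition")
        self._step("calibrate raw data")
        return self.log

engine = ObservingEngine()
steps = engine.execute_scan(
    {"receiver": "HET345", "backends": ["AFFTS"], "pattern": "otf"}
)
for s in steps:
    print(s)
```

The important structural point is that the engine is the single coordinator: devices never talk to each other directly, they are all driven in sequence from the scan object.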
APECS 2.0 Software (ctd.)
Basic operator / observer interfaces are unchanged
Many improvements, such as:
- jlog with automatic filter loading
- New Qt GUI behavior (e.g. the Calibrator Client)
- New Qt widgets available
- New Python (2.5)
- New libraries (e.g. SciPy, NumPy)
- New Gildas (12/2008), which fixes weighting errors when combining spectra
APECS' Astronomer CLI
Software Changes [1]
Names of some variables have changed:
- APEXROOT → APECSROOT
- APEXCONFIG → APECSCONFIG
- APEXSYSLOGS → APECSSYSLOGS
The CORBA component name separator is now "/" instead of ":", e.g. "APEX/RADIOMETER/RESULTS"
Component access (needs the above syntax): apexObsUtils.sc.get_device → apexObsUtils.sc.getComponentNonSticky
Software Changes [2]
Property access is compatible with APECS 1.1: apexObsUtils.getMCPoint works with an arbitrary separator chosen from "/", ":" and ".", e.g. apexObsUtils.getMCPoint('APEX:HET345.state') or apexObsUtils.getMCPoint('APEX.HET345/state')
List of components: apexObsUtils.sc.COBs_available → apexObsUtils.sc.availableComponents
Enums in MonitorQuery are now strings instead of integers (e.g. SHUTTER_OPEN)
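The separator tolerance of getMCPoint can be emulated with a small normalization helper. This is our own sketch of the idea, not the APECS implementation:

```python
# Sketch: treat "/", ":" and "." interchangeably in monitor point names by
# normalizing everything to the canonical "/" separator before lookup.
import re

def normalize_mc_point(name):
    """Split a monitor point name on any of '/', ':' or '.' and rejoin it
    with '/'. A toy helper, not the real apexObsUtils code."""
    return "/".join(re.split(r"[/:.]", name))

# All three spellings address the same monitor point:
assert normalize_mc_point("APEX:HET345.state") == "APEX/HET345/state"
assert normalize_mc_point("APEX.HET345/state") == "APEX/HET345/state"
assert normalize_mc_point("APEX/HET345/state") == "APEX/HET345/state"
print(normalize_mc_point("APEX:HET345.state"))   # -> APEX/HET345/state
```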
Software Changes [3]
Web scripts now need to run on “opt2” to fetch data from the DB2 (e.g. for the weather page)
Python wrapper for Gnuplot has been removed from SciPy. Use Pylab / matplotlib instead.
Observer account administration now on “apexdev2”
Operational Changes [1]
Separate AMD maps to access /apexdata in Sequitor (lastarria:/apex_archive) and at Chajnantor (display2:/apexdata) to minimize microwave link traffic. Initial reduction on "apexdev2"; later maybe also on "display2".
System VNC: control2:1
Observing VNC(s): apexdev2:1
Other VNC(s) (Wobb. GUI, SHFI, etc.): apexdev2:2/3/4
Observing accounts only on "apexdev2"
Operational Changes [2]
Syslogs are now split per hour to reduce link traffic
Syslogs can be loaded into “jlog” for inspection (“jlog –u $APECSSYSLOGS/APECS-<time stamp>.syslog.xml”)
Operational Changes [3]
Two new accounts:
- "apexops": pointing & focus model, system source & line catalog administration in $APECSCONFIG
- "apexdata": the raw & science data and obslog areas are owned by this account to avoid manipulations via the "apex" account, whose password is not secret. The "apexdata" password must not be given to observers!
Operational Changes [4]
As a consequence, the data production programs have to run under "apexdata"
This is accomplished by special scripts that use corresponding "ssh" and "sudo" commands:
- fitsWriter start | stop | restart
- onlineCalibrator start | stop | restart
- obsLoggerServer start | stop | restart
- obsEngine start | stop | restart (for convenience)
Do not use the old restart scripts! The overall system still runs under "apex"!
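A minimal sketch of how such a start | stop | restart wrapper might look, assuming illustrative host, account and service names; this dry-run version only echoes the ssh/sudo command instead of executing it, since the real APECS scripts are not shown here:

```shell
#!/bin/sh
# Sketch of a start|stop|restart wrapper that would run a data production
# program under the "apexdata" account via ssh + sudo. Host and service
# names are hypothetical; the command is echoed (dry run), not executed.

SERVICE="fitsWriter"
HOST="display2"

dispatch() {
    case "$1" in
        start|stop)
            # Real scripts would execute this line; we only print it.
            echo "ssh $HOST sudo -u apexdata $SERVICE.py --$1" ;;
        restart)
            dispatch stop
            dispatch start ;;
        *)
            echo "usage: $SERVICE start | stop | restart" >&2
            return 1 ;;
    esac
}

dispatch restart
```

Funneling every stop/start through one dispatcher is what lets "restart" reliably imply a clean stop first, mirroring the advice above to avoid the old restart scripts.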
Operational Changes [5]
tycho, apexOptRun and MonitorQuery now to be run on “opt2”
DB2 entries now obey the complete naming hierarchy
“stopAllAPECSClients” now stops observer processes too. Always use “restartAPECSServers” which includes stopping the clients. Restart takes only 6 min. now !
ABM console now via “minicom 1” on “opt2”
APEX Staff To Do List
All Linux machines in Sequitor should be updated to SL 5.2
At least “llaima” should be available for offline data reduction with SL 5.2 and APECS 2.0 soon
The LDAP and DNS services need to be moved to new servers (ideally off of normal PCs)
The remaining web cams etc. need to be reconfigured to the maintenance subnet
Port “fieldtalk” programs for optical camera to SL 5.2
We would like to thank the APEX staff for their support during the installation and tests of APECS 2.0!