Page 1

ITER Fast Plant System Controller Prototype Based on PXI Platform
M. Ruiz & J. Vega, on behalf of the CIEMAT/UPM/IST/ITER team
Universidad Politécnica de Madrid, Asociación Euratom/CIEMAT,
IPFN, Instituto Superior Técnico, ITER Organization



Page 2 07/02/2011 Meeting

Outline

•  Project scope and requirements
•  FPSC HW elements
•  FPSC SW elements
   –  Applications running in the controller
   –  Data acquisition; FPSC control using EPICS PVs
   –  Streaming/archiving applications
•  Conclusions

Page 3 07/02/2011 Meeting

Project scope and requirements

•  The ITER CODAC design identifies two types of Plant System Controllers (PSC):
   –  The slow PSC is based on industrial automation technology (control loop rates < 1 kHz).
   –  The fast PSC is based on embedded technology, with higher sampling rates and more stringent real-time requirements.
•  Essential requirements of the FPSC:
   –  Data acquisition and preprocessing.
   –  Interfacing with the networks (PON, TCN, SDN, streaming/archiving networks).
   –  Linux OS and EPICS IOC; system setup and operation using process variables.
   –  COTS solutions.
•  Developing a prototype FPSC targeting data acquisition for ITER IO:
   –  Two different form factors for the implementation:
      •  ATCA-based solution (IST)
      •  PXIe-based solution (CIEMAT/UPM)
   –  A two-step approach: alpha and beta versions.

Page 4 07/02/2011 Meeting

ATCA form factor (IST/IPFN): Alpha version

[Architecture diagram: the Fast Plant System Controller occupies a 19-inch rack (CPU shelf, IO shelf, PCIe/MXI connector board, PCIe bus extension) between the fusion experiment (diagnostics and actuators) and the CODAC systems. It connects to the Plant Operational Network (PON), the Time Communication Network (TCN) via an IEEE 1588 switch and master clock, the Synchronous Databus Network (SDN), and the Scientific Data Archiving (SDA) high-performance computers. A self-description toolkit links the controller to the CODAC central database and HMI.]

Page 5 07/02/2011 Meeting

PXI form factor: alpha version

•  Requirements:
   –  Hardware: COTS.
   –  Software: Linux RHEL 64-bit and EPICS.
•  Issues (2010):
   –  The drivers (and device support) were not available under 64-bit Linux.
      •  Other people were in charge of that development.
      •  This greatly complicated finishing the development in the limited time available.
•  Solution for the alpha version:
   –  LabVIEW Real-Time based (to avoid third-party dependencies, to test system capabilities, and to learn about problems and gain experience for the beta version).
   –  PXIe solution using:
      •  National Instruments hardware (PXI chassis, timing modules, DAQ using FlexRIO, and an external controller).
      •  LabVIEW RT Module applications running in the controller.
      •  LabVIEW FPGA for FlexRIO.
      •  LabVIEW EPICS IOC on the real-time target to support Channel Access.
      •  Specific applications running in external computers for streaming/archiving, data processing with GPUs, and monitoring using the ITER CODAC Core System.

Page 6 07/02/2011 Meeting

HW elements: Block Diagram

[Block diagram: the FPSC system controller is an NI 8353 RT real-time controller (19-inch 1U chassis) connected over PXIe/PCIe links (NI 8372, NI 8370) to a PXI chassis holding an NI PXI-7952R FlexRIO DAQ module and NI PXI-6682/PXI-6653 timing modules sharing the PXI 10 MHz clock. Ethernet links the controller to the MiniCODAC and desktop PCs, the data archiving servers, and the development host.]

Page 7 07/02/2011 Meeting

GPU

[Same block diagram, extended with a GPU server running the RTPGPU module for real-time processing.]

Page 8 07/02/2011 Meeting

Development tools

[Same block diagram, with the DEVELOPMENT HOST highlighted: applications are developed on the development host and deployed to the NI 8353 RT controller and MiniCODAC over Ethernet.]

Page 9 07/02/2011 Meeting

FPSC software elements

[Diagram: the FPSC LabVIEW RT system and the FPSC GPU system each run an EPICS IOC. A CODAC IOC hosts the CODAC state machine, pulse control, archiving monitoring, and configuration management. Archiver services (each with its own EPICS IOC) store "d1wave" and event files, read by the Archive Viewer over NFS. A signal generator with an EPICS IOC, a development system, and an iocLog service complete the setup.]

Page 10 07/02/2011 Meeting

FPSC software elements

[FPSC software elements diagram repeated.]

Page 11 07/02/2011 Meeting

FPSC applications running in the NI 8353 computer

•  LabVIEW modules implemented:
   –  CORE. General queue management: creation, destruction, and state machine control.
   –  EPICS-IOC. Channel Access and PV management.
   –  TCN. Management of the PXI-6653 and PXI-6682 for clock generation and event time-stamping.
      •  The PXI 10 MHz clock is in phase with the PXI-6682 IEEE 1588 clock.
   –  ACQ. Data acquisition and selection.
   –  FPGADAQ. Data acquisition application for RIO devices with time-stamping; also includes a signal simulator (inside the FPGA) for debugging purposes.
   –  EVT. Event management.
   –  SDN. Implemented using NI time-triggered variables.
   –  RTP. Real-time processing with basic algorithms.
   –  RTPGPU. GPU management.
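The queue-based interplay between the ACQ and data-flow modules can be sketched as follows. This is a minimal Python illustration, not the LabVIEW RT implementation; function names and the block layout are hypothetical, but the timestamping scheme (first-sample time derived from an IEEE 1588 epoch and the sampling rate) follows the TCN/ACQ description above.

```python
import queue
import threading

def acq_producer(q, n_blocks, block_size, fs, t0):
    """ACQ module sketch: push timestamped sample blocks onto a queue.

    Each block carries the timestamp of its first sample, derived from
    the IEEE 1588 reference time t0 and the sampling rate fs.
    """
    for b in range(n_blocks):
        first = b * block_size
        q.put({
            "samples": list(range(first, first + block_size)),
            "t_start": t0 + first / fs,  # time of the block's first sample
        })
    q.put(None)  # sentinel: acquisition finished

def flow_consumer(q):
    """Data-flow sketch: drain the queue and count forwarded samples."""
    total = 0
    while True:
        block = q.get()
        if block is None:
            break
        total += len(block["samples"])
    return total

q = queue.Queue()
producer = threading.Thread(target=acq_producer, args=(q, 4, 100, 100e3, 0.0))
producer.start()
total = flow_consumer(q)
producer.join()
print(total)  # 400 samples moved through the queue
```

A bounded queue between producer and consumer is also how back-pressure would surface if the consumer (e.g. streaming) cannot keep up with the acquisition rate.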

[Software architecture diagram: the hardware management module (firmware, synchronization [TMG], data tag and channel info, provided by ITER IO) feeds data acquisition and the output generator. An internal data flow management module [IDF] routes samples, status, and errors between data acquisition, the events management module, the real-time processes module [RTS], and the external interfaces (SDN for fast events, SDA, TCN, and PON to CODAC/MiniCODAC via the PSH switch). The EPICS IOC module exchanges configuration and status with every module; dotted arrows mark flows implemented in software. Low-level programming, control, and algorithms run on the development host / mini-CODAC. Software modules implemented in the FPSC project also report errors and traces.]

Page 12 07/02/2011 Meeting

Main features of the FPSC software

•  DAQ parameters are controlled and changed using PVs (also during the pulse):
   –  Sampling rate and block size for the FlexRIO device.
   –  Decimation factor and modes (samples and blocks) for EPICS monitoring.
•  FPSC state machine control and status using PVs: start/stop, memory used, CPU load, etc.
•  Acquired data can be sent to streaming, EPICS monitoring, real-time processing, and the GPU using "FANOUT" PVs.
•  Preprocessing algorithms can be dynamically selected using PVs.

[Data-flow diagram: acquired data and remote I/O data enter at the DAQ rate, pass through data selection, and a data fanout distributes them to the EPICS IOC (monitoring waveforms, MON), output data streaming (STM), and the real-time processor (RTP). PVs such as FPSC-CHx:DM, FPSC-CHx:DF, and FPSC-CHx:FANOUT (= 1, 2, 4) set the fanout properties and data rates; an XML file holds the DAQ rules.]
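The fanout-with-decimation behaviour controlled by PVs can be sketched in a few lines. This is an illustrative Python model, not the LabVIEW implementation: the PV table is a plain dictionary standing in for EPICS process variables, and the bit meanings of the FANOUT mask are assumed here for the example.

```python
# Hypothetical PV table standing in for EPICS process variables such as
# FPSC-CH1:DF (decimation factor) and FPSC-CH1:FANOUT (destination mask).
pvs = {
    "FPSC-CH1:DF": 4,         # keep 1 of every 4 samples for monitoring
    "FPSC-CH1:FANOUT": 0b101, # bit0 = streaming, bit1 = RTP, bit2 = EPICS MON
}

STREAM, RTP, MON = 1, 2, 4  # assumed fanout destination bits

def fanout(samples, pvs, channel="FPSC-CH1"):
    """Route one acquired block according to the channel's PVs."""
    mask = pvs[f"{channel}:FANOUT"]
    df = pvs[f"{channel}:DF"]
    out = {}
    if mask & STREAM:
        out["streaming"] = samples      # full-rate stream to the archiver
    if mask & RTP:
        out["rtp"] = samples            # full-rate to the real-time processor
    if mask & MON:
        out["monitor"] = samples[::df]  # decimated EPICS monitoring waveform
    return out

routed = fanout(list(range(16)), pvs)
print(sorted(routed))     # ['monitor', 'streaming']
print(routed["monitor"])  # [0, 4, 8, 12]
```

Because the destinations are read from the PV table on every block, writing a new FANOUT or DF value mid-pulse takes effect on the next acquired block, which is exactly the "changed during the pulse" behaviour described above.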

Page 13 07/02/2011 Meeting

FPSC software elements

[FPSC software elements diagram repeated.]

Page 14 07/02/2011 Meeting

GUI using EDM (EPICS), LOG, and state machine

•  Manual start/stop of the FPSC.
•  Basic control of PVs during the pulse.
•  Implementation of an iocLog client in LabVIEW.
•  IOC with the pulse state machine and configuration management (XML files).

[Diagram: the CODAC IOC (CODAC state machine, configuration management) drives the FPSC IOC.]
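A pulse state machine of the kind driven from the CODAC IOC can be sketched as a small transition table. The state and event names below are illustrative, not ITER's actual CODAC operating states; in the real system the events arrive as PV writes.

```python
# Minimal pulse state machine sketch; states/events are illustrative.
class PulseStateMachine:
    TRANSITIONS = {
        ("IDLE", "configure"): "READY",    # load XML configuration
        ("READY", "start"): "RUNNING",     # begin acquisition
        ("RUNNING", "stop"): "IDLE",       # end of pulse
    }

    def __init__(self):
        self.state = "IDLE"

    def handle(self, event):
        """Apply an event (e.g. received via a PV write) to the machine."""
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            raise ValueError(f"illegal event {event!r} in state {self.state}")
        self.state = self.TRANSITIONS[key]
        return self.state

sm = PulseStateMachine()
sm.handle("configure")
sm.handle("start")
print(sm.state)  # RUNNING
```

Rejecting illegal transitions (rather than silently ignoring them) makes configuration errors visible in the log, which matches the iocLog-based monitoring described above.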

Page 15 07/02/2011 Meeting

FPSC software elements

[FPSC software elements diagram repeated.]

Page 16 07/02/2011 Meeting

Archiving system

•  Data sources can be assigned to data archivers.
•  The netCDF file is the fundamental storage unit.
•  One file per data source (signal) and pulse.
•  Two types of data are currently implemented: "d1wave" and "event".
•  An EPICS IOC is currently used for monitoring.
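The file-per-signal-per-pulse layout can be sketched as below. The real archiver writes netCDF "d1wave" records; JSON is used here only so the sketch has no dependencies beyond the Python standard library, and the directory/file naming convention is an assumption for illustration.

```python
from pathlib import Path
import json
import tempfile

def archive_block(root, pulse, signal, t_start, samples):
    """Store one data block as one file per signal and pulse.

    Hypothetical layout: <root>/pulse<NNNNNN>/<signal>.json.
    The real system stores a netCDF file per data source and pulse.
    """
    path = Path(root) / f"pulse{pulse:06d}" / f"{signal}.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    record = {"type": "d1wave", "t_start": t_start, "samples": samples}
    path.write_text(json.dumps(record))
    return path

root = tempfile.mkdtemp()
p = archive_block(root, 42, "FPSC-CH1", 0.0, [1, 2, 3])
print(p.name)                              # FPSC-CH1.json
print(json.loads(p.read_text())["type"])   # d1wave
```

One file per signal and pulse keeps each archive unit independently readable (e.g. over NFS by the viewer) without locking a shared store during acquisition.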

Page 17 07/02/2011 Meeting

Archiving viewer and monitoring

•  "Online" and "offline" modes.
•  Remote access via NFS (Network File System).
•  Time-slice positioning.
•  Self-description data visualization.
•  Flexible plotter:
   –  Zooms.
   –  Export options.
•  Completely based on EPICS Channel Access:
   –  Every archiver implements its own EPICS IOC.
•  System variables:
   –  CPU load.
   –  Memory usage.
•  Archiving system performance:
   –  Receiving data rate per channel.
   –  Total received data rate.
   –  Storing data rate per channel.
   –  Total stored data rate.
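The performance figures above reduce to byte counters sampled over an interval. A minimal sketch (names and units assumed, not taken from the actual archiver code):

```python
# Sketch of the archiver's performance metrics: per-channel and total
# data rates are byte counters divided by the elapsed interval.
def data_rates(byte_counts, interval_s):
    """Return (per-channel, total) data rates in bytes/s."""
    per_channel = {ch: n / interval_s for ch, n in byte_counts.items()}
    total = sum(per_channel.values())
    return per_channel, total

# Bytes received in a 2-second monitoring interval (illustrative values).
received = {"FPSC-CH1": 800_000, "FPSC-CH2": 400_000}
per_ch, total = data_rates(received, interval_s=2.0)
print(per_ch["FPSC-CH1"])  # 400000.0 bytes/s
print(total)               # 600000.0 bytes/s
```

Publishing both the receiving and the storing rate per channel makes it easy to spot when disks, rather than the network, are the bottleneck.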

Page 18 07/02/2011 Meeting

Conclusions

•  Implementation of a basic FPSC devoted to data acquisition, following essential ITER requirements:
   –  "Intelligent data acquisition" using FPGA DAQ devices with IEEE 1588 time-stamping.
   –  System DAQ parameters controlled by EPICS PVs (changed dynamically during the pulse).
   –  Streaming capabilities.
   –  Preprocessing algorithms using the local processor and GPU (controlled with EPICS PVs).
   –  Integration with the EPICS CODAC system (v1.1).
   –  100 kS/s per channel with streaming, time-stamping, EPICS monitoring, and preprocessing on 2 channels.
•  LabVIEW-based tools (RT/FPGA) have been a good choice for quick prototyping in a short period of time (3 months).
   –  The graphics-oriented design simplifies the definition of complex software models, the debugging of the different applications, and the testing of complex hardware setups.

Page 19

ITER Fast Plant System Controller Prototype Based on PXI Platform

Thank you for your attention!

CIEMAT: J. Vega, R. Castro
IST: B. Gonçalves, J. Sousa, B. Carvalho
ITER: N. Utzel, P. Makijarvi
Technical University of Madrid: M. Ruiz, J.M. López, E. Barrera, G. Arcas, D. Sanz & J. Nieto

Thank you to NI for the strong support in the development of this project.