
Russian State Scientific Center for Robotics and Technical Cybernetics (RTC)
Russia, 194064, St. Petersburg, Tikhoretsky pr., 21; tel.: (812) 552-0110, (812) 552-1325; fax: (812) 556-3692; http://www.rtc.ru; e-mail: [email protected]
Computer Vision Laboratory
Dmitrii Stepanov, Aleksandr Bakhshiev, D. Gromoshinsky, N. Kirpan, F. Gundelakh
Determination of the Relative Position of Space Vehicles by Detection and Tracking of Natural Visual Features with the Existing TV-cameras
(Russian title, translated: Determination of the relative position of spacecraft from a television image based on detection and tracking of structural features of the observed vehicle)

TRANSCRIPT

1. Russian State Scientific Center for Robotics and Technical Cybernetics (RTC), Computer Vision Laboratory. Russia, 194064, St. Petersburg, Tikhoretsky pr., 21; tel.: (812) 552-0110, (812) 552-1325; fax: (812) 556-3692; http://www.rtc.ru; e-mail: [email protected]
Dmitrii Stepanov, Aleksandr Bakhshiev, D. Gromoshinsky, N. Kirpan, F. Gundelakh
Determination of the Relative Position of Space Vehicles by Detection and Tracking of Natural Visual Features with the Existing TV-cameras

2. Introduction
Goal: determine all six relative coordinates using a single TV-camera mounted on a docking spacecraft.
Proposed solution: select feature points on the ISS with known 3D coordinates (relative to the ISS) at the training stage; train a cascade classifier from existing video recordings of previous dockings; at a new docking, capture the video signal (images) from the TV-camera of a Progress or Soyuz spacecraft; detect and track the feature points on the images; given the known 3D and 2D coordinates of the feature points, solve the PnP problem (determine all six relative coordinates); estimate the quality of the pose estimation. (Illustrative code sketches of these steps follow the slide 7 transcript below.)

3. Coordinate Systems
$d = \sqrt{x^2 + y^2 + z^2}$, $\theta_x = \arctan(x/z)$, $\theta_y = \arctan(y/z)$, where:
d: distance between the origins;
$\theta_x$, $\theta_y$: angles from the camera (spacecraft) Z-axis to the line of sight;
the analogous pair of angles from the target (ISS) Z-axis to the line of sight;
x, y, z: linear displacements between the origins;
Rx, Ry, Rz: rotation angles about the X-, Y- and Z-axes.

4. Algorithm of relative 3D pose estimation

5. Analog Signal Distortion and Camera Model Correction

6. Distance-based Sets of Points Switching (separate point sets for 200 m, 60 m, 20 m and 10 m)

7. Software Implementation
Implements all the proposed methods. Adds: user (cosmonaut) interface, frame capture, camera calibration, global preprocessing, Course module.
Supported modes: automatic (without operator interaction); semi-automatic (the operator marks the initial point positions); freeze (the image is frozen so the operator can analyze a specific moment).
Video sources: analogue TV-signal (via a frame grabber); digital MPEG stream; video files (including recordings of previous rendezvous); a detailed ISS 3D-model.
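The slides do not name the software library used for detection and tracking; the sketch below assumes OpenCV's Python bindings, with a cascade classifier (trained, as slide 2 says, on recordings of previous dockings) detecting the feature regions and pyramidal Lucas-Kanade optical flow tracking their centers between frames. The cascade file name and all parameter values are illustrative assumptions, not taken from the slides.

```python
# Hypothetical sketch: detect the trained feature regions with a cascade classifier,
# then track their centers between frames with pyramidal Lucas-Kanade optical flow.
import cv2
import numpy as np

detector = cv2.CascadeClassifier("iss_feature_cascade.xml")  # hypothetical trained cascade

def detect_points(gray):
    """Return centers of detected feature regions as an Nx1x2 float32 array."""
    rects = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=3)
    centers = [(x + w / 2.0, y + h / 2.0) for (x, y, w, h) in rects]
    return np.array(centers, dtype=np.float32).reshape(-1, 1, 2)

def track_points(prev_gray, gray, prev_pts):
    """Track previously detected points into the new frame; keep only the good ones."""
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
    good = status.ravel() == 1
    return next_pts[good], good
```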
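For the camera model correction named on slide 5, a minimal sketch under the same OpenCV assumption: intrinsics and distortion coefficients from an offline calibration (for example with cv2.calibrateCamera) are applied to each captured frame before detection, so that the pinhole model used by PnP holds. The stored file names are hypothetical.

```python
# Hypothetical sketch of the camera-model correction step.
import cv2
import numpy as np

camera_matrix = np.load("camera_matrix.npy")   # 3x3 intrinsics (assumed file)
dist_coeffs = np.load("dist_coeffs.npy")       # k1, k2, p1, p2, k3 (assumed file)

def correct_frame(frame):
    """Remove lens/analog-path distortion so the pinhole camera model holds for PnP."""
    return cv2.undistort(frame, camera_matrix, dist_coeffs)
```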
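Slide 6 only names the distance-based switching of point sets and the 200 m, 60 m, 20 m and 10 m ranges; the selection rule below is a guess at how such switching might look, with placeholder set names.

```python
# Hypothetical sketch of distance-based point-set switching: several reference
# point sets, each prepared for a nominal distance, and the one nearest to the
# current distance estimate is used. Only the 200/60/20/10 m labels come from
# the slide; the nearest-set rule and the placeholders are assumptions.
POINT_SETS = {      # nominal distance (m) -> reference point set for that range
    200: "set_200m",
    60: "set_60m",
    20: "set_20m",
    10: "set_10m",
}

def select_point_set(distance_m):
    """Pick the set whose nominal distance is nearest to the current estimate."""
    nominal = min(POINT_SETS, key=lambda n: abs(n - distance_m))
    return POINT_SETS[nominal]
```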
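For the pose step of slides 3-4, a sketch assuming OpenCV's solvePnPRansac: the known 3D coordinates of the selected ISS features and their tracked 2D image positions give the rotation and translation, from which the quantities defined on the Coordinate Systems slide (d, the line-of-sight angles, the displacements and the rotation angles) can be derived. The RANSAC variant and the Euler angle convention are assumptions, not stated in the slides.

```python
# Hypothetical sketch of the PnP-based relative pose estimation.
import cv2
import numpy as np

def estimate_relative_pose(object_pts, image_pts, camera_matrix, dist_coeffs):
    """object_pts: Nx3 points in the ISS frame; image_pts: Nx2 pixel coordinates."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(object_pts, np.float32), np.asarray(image_pts, np.float32),
        camera_matrix, dist_coeffs)
    if not ok:
        return None

    x, y, z = tvec.ravel()
    d = float(np.sqrt(x * x + y * y + z * z))   # distance between the origins
    theta_x = float(np.arctan2(x, z))           # camera Z-axis to line of sight (x)
    theta_y = float(np.arctan2(y, z))           # camera Z-axis to line of sight (y)

    R, _ = cv2.Rodrigues(rvec)                  # rotation matrix, target to camera
    # One common Rz*Ry*Rx Euler decomposition (the convention is an assumption).
    Ry_ang = np.arcsin(-R[2, 0])
    Rx_ang = np.arctan2(R[2, 1], R[2, 2])
    Rz_ang = np.arctan2(R[1, 0], R[0, 0])
    return d, theta_x, theta_y, (x, y, z), (Rx_ang, Ry_ang, Rz_ang), inliers
```

If fewer than four points are tracked in a frame, the solver cannot run, which is one place where the quality estimation step mentioned on slide 2 would have to fall back on the previous result.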
8. Evaluation
Feature point detection and tracking stability: evaluated on docking video recordings (Progress, Soyuz, ATV; split into training and test sets); result: stable tracking of at least 5 points in most cases (excluding frames with no visibility at all).
Feature point detection and tracking precision: evaluated on a mockup (precise 6-DOF camera and target positioning system); result: MSE of about 2 pixels in most cases.
3D pose estimation module (PnP): evaluated on the mockup, 3D modelling and docking video recordings; result: active angles 0.1, passive angles 2.9, distance 4%, roll 2.5.
Complete system: evaluated on docking video recordings and at the Mission Control Center (real-time MPEG and analogue video) together with Course telemetry (2 sessions passed, 2 left: 28 April and 28 May); result: 2-5 FPS, depending on the number of points and the detection and tracking stability.

9. Evaluation on Course data

10. Evaluation on Course data

11. Conclusion
The system calculates all 6 coordinates and is based on detection and tracking of natural visual features from an existing docking TV-camera.
The algorithms are implemented and run on a Lenovo T60p laptop at 1-5 FPS.
The system has been evaluated against models, mockups and recordings of previous rendezvous of "Progress", "Soyuz" and ATV spacecraft (with the corresponding reference data received from the Course rendezvous system), demonstrating promising results.
The most important problems of the proposed approach are the dependency on visual conditions and on analog distortions (fusion is needed) and the relatively high error of the passive angle estimation (mutual measurements are a possible solution).
Two real-time experiments at the Mission Control Center have passed successfully, and two more are on the way (April-May); each time, optimizations and improvements to the user interface and the algorithms are proposed and implemented.
Next stage: evaluation aboard the ISS using an existing laptop, aiming to become an independent docking control (evaluation) system.

12. MCC: 17 Feb 2015, Progress M-26M to the Zvezda module, KL-154 camera.

13. Russian State Scientific Center for Robotics and Technical Cybernetics (RTC). Russia, 194064, St. Petersburg, Tikhoretsky pr., 21; tel.: (812) 552-0110, (812) 552-1325; fax: (812) 556-3692; http://www.rtc.ru; e-mail: [email protected]
Thank you for your attention!
Head of Computer Vision Laboratory: Dmitrii Stepanov, [email protected]