
    Industrial Haptic Robot Guidance System for

    Assembly Processes

Marwan Radi and Gunther Reinhart
Institute for Machine Tools and Industrial Management (iwb)
Technische Universität München (TUM)
Garching bei München, Germany
Email: [email protected]
Tel.: +49 (0) 89 289 15500
Fax: +49 (0) 89 289 15555

Abstract— The human haptic sense is an important aspect of numerous assembly processes, especially in manual assembly, where the worker uses this sense to identify the occurring contact forces and tries to reduce them. Such an act minimizes the risk of unintended collisions or deformation of parts and thus guarantees a successful completion of the assembly process. Although the assembly of small/medium lots is normally carried out manually by a worker, the help of robots is mandatory in some cases where the ability of the human being is hindered, such as the assembly of heavy parts. Some industrial robotic systems are available on the market in which the robot is guided by a human operator to perform some tasks. However, those systems lack haptic feedback. This work aims at the design and evaluation of an industrial Haptic Robot Guidance System (HRGS). The main role of the HRGS in assembly is to combine both the human haptic sense and industrial robot capability. In this paper, some research issues and approaches for the design of the HRGS are introduced. Some preliminary experimental results are also included.

I. INTRODUCTION

The human haptic sense (sense of touch) is an important aspect of numerous assembly processes, especially in manual assembly, where 1) human intelligence is necessary for the successful execution of given assembly tasks and 2) automation is costly [1]. In manual assembly, the worker uses his haptic sense to identify the position and orientation of components with respect to each other while trying to reduce the occurring contact forces. Such an act minimizes the risk of unintended collisions or deformation of parts and thus guarantees a successful completion of the assembly process.

Although the assembly of small/medium lots is normally carried out manually by a worker, the ability of the human being is hindered in some cases, such as the assembly of heavy parts. Therefore, the help of lifting machines such as industrial robots is mandatory if the product to be assembled is heavier than predefined limits. The assembly of physically large and heavy products calls for a system in which the worker can easily and ergonomically manipulate the parts to be assembled without exerting huge physical effort. The industrial Haptic Robot Guidance System (HRGS), which is introduced in this paper, can be considered a solution to this problem.

Up to now, the industrial robot and the worker have mostly been separated during assembly. In manual assembly work-cells, the worker uses his superior sensory capability and intelligence to accomplish the task. Flexibility is one of the advantages of such a system. However, the load-bearing capacity of the worker and the production rate of the work-cell are low. In addition, the running costs are high. In contrast, the automated assembly work-cell has a higher production rate and accuracy as well as steady quality. Although the robots in such a work-cell have a higher load-bearing capacity than the worker, they are inflexible, and the initial costs of the automated work-cell are very high. In the case of small lot sizes, a high number of variants and short product life times, there is a need for a flexible and changeable system. Combining the positive points of both manual and automated work-cells enables new concepts of flexible systems and opens up new application scopes. One way to realize such a combination is Human-Robot Interaction (HRI).

    Figure 1. Spatial Classification of Human-Robot Interaction (HRI)

HRI is a field of study dedicated to understanding, designing and evaluating robotic systems for use by or with humans [2]. To achieve interaction in an HRI system, the human operator has to communicate with the robot. According to the communication link and the spatial range of the human and the robot, HRI can be classified (as shown in Fig. 1) into two main categories [2][3][4]:

978-1-4244-4218-8/09/$25.00 ©2009 IEEE


• Remote interaction (discrete workspaces) - the human and the robot are not co-located and are separated into discrete workspaces.

• Proximate interaction (common workspace) - the human and the robot are co-located; they either share a common workspace or have overlapping workspaces.

Within these main categories, there are subcategories according to the application. Remote interaction with a mobile robot, e.g. the Mars rover, is often referred to as supervisory control, while performing a physical manipulation during remote interaction is referred to as telemanipulation. A proximate interaction with direct physical contact between the human and the robot is called physical interaction. The second subcategory within proximate interaction is overlapping workspaces, in which the human and the robot cooperate to accomplish a task but do not necessarily have physical contact.

Following this introduction, this paper is organized as follows: The next section introduces the Haptic Robot Guidance System (HRGS) and its role in assembly. Afterwards, the state of the art of research work and industrial systems in which the human and the robot have direct physical interaction is given in section III. The research challenges and main objectives are then introduced in section IV. The experimental setup and preliminary results are given in sections V and VI, respectively. The conclusion is finally given in section VII.

II. HAPTIC ROBOT GUIDANCE SYSTEM (HRGS)

In this work, HRI with physical telemanipulation and physical interaction will be considered. From these two subcategories the HRGS is derived. In such a system, the robot is guided manually using a haptic input device, and the forces exerted by the robot on the environment during the manipulation are displayed to the human operator. Fig. 2 shows two concepts of such a system.

    Figure 2. Two Concepts of the HRGS

In the tele-guidance concept, the human guides the robot remotely and performs an assembly task, while in direct guidance, the human has direct contact with the robot through the haptic input device and the robot and the human are co-located in one workspace. In tele-guidance, the robot can be moved over its whole workspace with different velocities (fast movement is allowed). In contrast, the robot has to move slowly in direct guidance (for safety reasons), which slows the production rate. In this concept the workspace is also smaller and depends on the workspace of the human arm. However, these concepts are intended for the manual assembly of small lots, in which the production rate is already lower compared to dedicated automated systems.

A. The Role of the HRGS in Assembly Processes

In general, forces and torques arise during an assembly process. In manual assembly, the worker uses his haptic sense and intelligence to reduce the occurring contact forces. Such an act minimizes the risk of unintended collisions or deformation of parts and thus guarantees a successful completion of the assembly process. In the case of heavy parts, which should be assembled manually, the human capability is hindered and thus the help of the robot is mandatory. Guiding a robot manually can solve the problem of carrying heavy parts, but haptic feedback is also important in this case to give the human operator the haptic sense of the assembly. Thus, the role of the HRGS is to combine both the human haptic sense and the capability of the guided robot by using a haptic feedback device as a guiding tool.

III. STATE OF THE ART

Several research projects have successfully addressed automated assembly [5] and automated assembly in motion [6]. In spite of significant advances in the fields of artificial intelligence and industrial automation [7], human intelligence is far superior in terms of reasoning, language comprehension, vision, and ingenuity, among others [8]. Some tasks require both the acute reasoning and the perceptive abilities of a human. Therefore, manual assembly continues to be an important feature of many industrial processes. In heavy part assembly, some pieces of raw material or equipment are too heavy to be safely handled by a worker, and therefore assistive devices are required.

This section presents some work done within the research community in this area and some industrial systems used to assist the worker in this regard. Cobots, introduced in [9], provide guidance through the use of servomotors, while the human operator provides motion commands. Cobots are passive (which means stable) mechanical devices and are used for the assembly of car doors. Virtual surfaces are used to constrain and guide the worker's motion. Schraft et al. introduced the PowerMate as a robot assistant [10]. The PowerMate works together with the worker and conforms to safety category 3 according to DIN ISO 954. The main flaws of the PowerMate system are the need for a large floor space and the limited velocity. Krüger et al. [11] introduced an intelligent power assist device (IPAD) which integrates sophisticated force-feedback and programming functions, but requires direct interaction with the manipulator. The IPAD is, however, restricted to 2-DOF.

There are some industrial robotic systems in which the robot is guided by a human operator to perform tasks like assembly, lifting a load, maintenance, etc. These systems can be divided into two classes: systems without haptic feedback and systems with haptic feedback. However, an industrial assembly system in which an industrial robot and a worker haptically interact has not been found.


Figure 3. Safe-Handling with joystick from KUKA GmbH (left) and robot handling with 6D mouse from Reis Robotics (right)

    Figure 4. Industrial teleoperation: Operator side (left) and the industrial

    overhead crane (right)

    Figure 5. EMSM from Telerob: Operator side (left), and dapple arm slave

    robot (right)

A. Industrial Systems without Haptic Feedback

The company KUKA Roboter GmbH [12] provides a system called Safe-Handling (Fig. 3), by which the human operator can manually move the robot in all DOF using a joystick. However, this system lacks haptic feedback, which means that the operator is never informed about the contact between the parts. Safe technology is also integrated in this system to allow the human operator to work beside the robot. Another system found in industry is from Reis Robotics [13], by which the operator moves and teaches the robot using a 6D mouse (Fig. 3). This 6D mouse can be arbitrarily attached to the robot. For safety, the Reis Robotics Safety Controller is integrated.

B. Industrial Systems with Haptic Feedback

Although extensive research work has been conducted on haptic feedback, wide-scale deployment in the industrial and commercial domain has yet to be realized. One example is introduced in [14], where the sway of the load in an industrial overhead crane is reduced using haptic feedback and a teleoperation system (Fig. 4). The user moves the crane with a haptic input device, and the force feedback depends on the sway angle. A second example is introduced by the company Telerob (Fig. 5). A master-slave system is used for maintenance and military tasks. Although this system has haptic feedback, it is a special and expensive solution, and standard off-the-shelf components are not used.

IV. RESEARCH CHALLENGES AND MAIN OBJECTIVES

As mentioned before, the combination of the superior sensory capabilities of the human being and the enormous load-bearing capacity of the industrial robot enables new concepts of flexible production systems and opens up new application scopes. Yet an industrial manual assembly system in which the human and the robot haptically interact has not been deployed. Therefore, the main objectives of this work are:

• Design an industrial HRGS for assembly processes.

• Study the effects of the scaling between the human scale and the robot scale on the stability and performance of such a system.

• Evaluate the performance quantitatively.

The following research issues, which are described in the following subsections, will be considered:

• Requirements and specifications of the HRGS

• Scaling and stability

• Assistance tools

• Evaluation methods

A. Requirements and Specifications

In the design of an industrial HRGS, off-the-shelf components will be used. Several haptic devices are available on the market. They range from very cheap devices, such as the Falcon from Novint Technologies [15], to very expensive ones, such as the Phantom from SensAble Technologies [16]. The robot used in such a system is a standard articulated industrial robot, e.g. from KUKA, FANUC or ABB.

From the software side, a graphical user interface should be designed to give the user the possibility to set parameters such as force and velocity limits, the degrees-of-freedom mapping (between the haptic device and the robot) and virtual walls. The controller of the system will be a bilateral controller, by which the desired position is measured by the haptic device and sent to the robot controller (the standard robot position controller), and the measured forces are sent back to the human operator. This bilateral controller has a communication channel between the haptic device side and the robot side, which could cause a stability problem due to time delay.

Safety and quality are also important and will be considered in the requirements and specifications of the HRGS. Standards dealing with safety include DIN EN 954-1 (cat. 3) and ISO-10218. For example, ISO-10218 states that one of the following conditions always has to be fulfilled for allowing human-robot interaction: the tool center point/flange velocity must not exceed 0.25 m/s, the maximum dynamic power 80 W,


or the maximum static force 150 N. That means that in such a system the forces and the velocity have to be monitored online. In addition, the quality of the products would be raised because the worker is informed about the contact forces, which minimizes unintended collisions or deformation of the parts.
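The online monitoring described here can be sketched as a check evaluated in every control cycle. The following is a minimal illustration, not a certified safety function; the availability of the measured quantities each cycle is an assumption:

```python
# Online check of the ISO-10218 conditions quoted above. At least one
# condition must hold to allow human-robot interaction; otherwise the
# motion has to be stopped.

MAX_TCP_VELOCITY = 0.25   # m/s, tool center point/flange velocity
MAX_DYN_POWER = 80.0      # W, maximum dynamic power
MAX_STAT_FORCE = 150.0    # N, maximum static force

def motion_allowed(tcp_velocity, dyn_power, stat_force):
    """Return True if at least one ISO-10218 condition is fulfilled."""
    return (tcp_velocity <= MAX_TCP_VELOCITY
            or dyn_power <= MAX_DYN_POWER
            or stat_force <= MAX_STAT_FORCE)

# e.g. slow motion remains allowed even under a high static load:
print(motion_allowed(0.1, 200.0, 500.0))   # -> True
```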

B. Scaling and Stability

Since the human and robot scales are different, a scaling between the movement of the haptic device (which is moved within the human scale) and the robot is needed. Furthermore, the forces measured by a force/torque sensor mounted on the robot have to be scaled down to the human scale (to the range of forces which the haptic device can display).

In free movement, while the robot is not in contact with the environment, only position scaling is needed. Some methods are position control, rate control [17], indexing [18], and hybrid control [19]. In the contact case, both scalings have to be handled together in order to keep a consistent displayed stiffness (the stiffness felt by the human operator, which corresponds to the environment stiffness). This can be achieved by setting the same scaling factor for both position and force [20], or by using the impedance shaping concept [21].
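The stiffness-consistency argument can be made concrete: if the haptic motion is scaled up by a position factor before being sent to the robot, and the contact force is scaled by a force factor before being displayed, the operator feels the product of both factors times the environment stiffness. A small sketch under a linear contact model; all numeric values are illustrative assumptions:

```python
# Displayed stiffness for a linear contact model, as discussed above:
# x_robot = pos_scale * x_haptic and f_haptic = force_scale * f_robot,
# so the operator feels pos_scale * force_scale * k_env. Choosing
# force_scale = 1 / pos_scale keeps the displayed stiffness equal to
# the environment stiffness.

def displayed_stiffness(k_env, pos_scale, force_scale):
    """Stiffness felt at the haptic device (N/m)."""
    return pos_scale * force_scale * k_env

k_env = 1000.0   # N/m, assumed environment stiffness
s_p = 4.0        # position scaling, human scale -> robot scale
print(displayed_stiffness(k_env, s_p, 1.0 / s_p))   # -> 1000.0
```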

Regarding stability, if there is a time delay in the communication channel between the two sides (haptic device and robot sides), this channel becomes an active element in the system, which leads to instability. To solve this problem, control methods such as the time domain passivity controller [22] and the wave variables controller [23] are used to stabilize the system.
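As a rough illustration of the time domain passivity idea of [22] (the one-port simplification and variable names are our assumptions, not the paper's implementation): a passivity observer accumulates the energy flowing through a network port, and when it goes negative a passivity controller injects just enough damping to dissipate the excess:

```python
# Sketch of a time domain passivity observer/controller at one network
# port. The observer integrates the energy flow f * v; if the observed
# energy becomes negative, the port behaves actively and an adaptive
# damper dissipates the excess energy.

def passivity_step(energy, force, velocity, dt):
    """One cycle: returns the (possibly damped) force and new energy."""
    energy += force * velocity * dt              # passivity observer
    if energy < 0.0 and velocity != 0.0:         # port became active
        alpha = -energy / (dt * velocity ** 2)   # adaptive damping gain
        force += alpha * velocity                # passivity controller
        energy = 0.0                             # excess energy dissipated
    return force, energy
```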

C. Evaluation Methods

Evaluating the HRGS will also be pursued in this work. Several methods are considered, such as transparency [24], industry acceptance, and economic feasibility. Ideal transparency means that the human operator feels a direct haptic interaction with the remote environment (as if the robot and other components were not present between the human and the task). This ideal transparency is, however, affected by barriers such as time delay, scaling, and distance. To measure the transparency of the system quantitatively, two methods are introduced in [25]. The first is an analytical model-based method, based on the maneuverability index [26], and the second is an empirical measurement, based on the Z-width [27]. Industry acceptance and economic feasibility will also be used as evaluation metrics.

V. EXPERIMENTAL SETUP

In this section the system setting is described. Fig. 6 shows the architecture of the industrial HRGS, which consists of the human operator side, the industrial robot side, a central controller unit, and the communication links in between. An experimental setup is shown in Fig. 7.

At the human operator side there is a haptic input device, which displays the forces/torques sensed at the tool center point of the robot to the human operator, receives the motion commands from the human operator and sends them to the robot side via the communication link. In general, a simple input device is preferred in order to make it easier for the human operator to understand its movement. Therefore, a 2-DOF force feedback joystick is used in our system as the input device. This joystick can display forces of up to 8.9 N.

    Figure 6. System architecture

    Figure 7. Experimental Setup: operator workplace (left) and teleoperator

    station (right)

The robot side is where the robot performs the remote assembly tasks. The robot should have a force/torque sensor mounted at the flange to measure the interaction forces/torques generated during the execution of the assembly tasks. The KUKA industrial robot KR6 is used in our system. It is a 6-DOF articulated industrial robot with a payload of 6 kg. The robot has a controller with a real-time communication interface (Remote Sensor Interface), which is explained in the following.

Between the operator and robot sides lies the central bilateral controller. This controller is implemented on a central controller unit and runs on the real-time operating system QNX, which guarantees real-time execution of the controller software.


The force feedback joystick is connected to the central controller through a User Datagram Protocol (UDP) connection. Avoiding the overhead of checking whether every packet actually arrived makes UDP faster and more efficient than, e.g., the Transmission Control Protocol (TCP). Since our system is time-sensitive, it was decided to use UDP, because dropped packets are preferable to delayed packets [28].
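A minimal sketch of such a UDP link; the payload layout, port, and address are assumptions for illustration. Each joystick sample is packed into one datagram and sent without any delivery guarantee:

```python
# One joystick sample per datagram: x/y axis values and a button mask,
# packed in network byte order. UDP gives no delivery guarantee, which
# matches the "dropped over delayed" preference above.

import socket
import struct

def send_command(sock, addr, x, y, button_mask):
    """Pack one joystick sample and send it as a single datagram."""
    payload = struct.pack("!ffI", x, y, button_mask)
    sock.sendto(payload, addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_command(sock, ("127.0.0.1", 49152), 0.12, -0.30, 0b01)  # assumed address
sock.close()
```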

The KUKA.Ethernet Remote Sensor Interface (RSI) XML (Extensible Markup Language) is used to connect the robot with the central controller. The exchanged data are transmitted via the Ethernet TCP/IP protocol as XML strings. The cyclical data transmission from the robot controller to the central controller takes place in the interpolation cycle of 12 milliseconds. This interface allows a direct intervention in the path planning of the robot during motion in real time.
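Cyclic XML exchange of this kind can be sketched as follows; the tag and attribute names below are illustrative assumptions, not the actual KUKA RSI schema:

```python
# Building one cyclic XML string for the robot controller. The names
# "Sen", "RKorr" and the X/Y/Z attributes are illustrative assumptions.

import xml.etree.ElementTree as ET

def build_correction(dx, dy, dz):
    """Serialize a small Cartesian path correction as an XML string."""
    root = ET.Element("Sen")
    ET.SubElement(root, "RKorr",
                  {"X": f"{dx:.4f}", "Y": f"{dy:.4f}", "Z": f"{dz:.4f}"})
    return ET.tostring(root, encoding="unicode")

print(build_correction(0.1, 0.0, -0.2))
```

In a real system, one such string would be sent every interpolation cycle, and the reply from the robot controller would be parsed the same way.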

To facilitate the manipulation task, some assistance tools are implemented. First, a degrees-of-freedom (DOF) mapping strategy is needed when the number of DOF of the haptic device is not equal to the DOF of the task to be accomplished. For example, the task could have a rotational DOF which the haptic device does not. In this case, one translational DOF of the haptic device can be mapped to that rotational DOF of the task.

The mapping strategy presented in this paper is to decouple the translational and rotational motions of the robot, i.e. to limit the motion to one mode at a time, either translational or rotational. This strategy makes it easier for the operator to understand the action of the joystick. The switching between these modes is done by means of a manual switch in the central controller interface.

In the translational mode, the two buttons of the joystick are used to switch between the three translational DOF; i.e. the first 2-DOF (in the xy-plane) and the third DOF (z direction) are enabled by the first and the second button, respectively. When the human operator switches to the rotational mode, he can use the same buttons again to switch between the three rotational DOF; i.e. the rotations around the x-axis and y-axis are enabled by the first button, while the rotation around the z-axis is enabled by the second button.
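The mode/button mapping described above can be sketched as a small lookup table; the representation is an illustrative assumption:

```python
# Lookup table for the decoupled mapping described above: the manual
# switch selects the mode, and the two joystick buttons select which
# robot DOF the 2-DOF joystick currently drives.

DOF_MAP = {
    ("translational", 1): ("x", "y"),    # button 1: motion in xy-plane
    ("translational", 2): ("z",),        # button 2: motion in z direction
    ("rotational", 1): ("rx", "ry"),     # button 1: rotations about x and y
    ("rotational", 2): ("rz",),          # button 2: rotation about z
}

def active_dofs(mode, button):
    """Robot DOF currently driven by the joystick axes."""
    return DOF_MAP[(mode, button)]

print(active_dofs("rotational", 2))   # -> ('rz',)
```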

    Figure 8. Virtual walls as forbidden region borders

The second assistance tool is the use of virtual walls. They can be implemented either as guiding (assisting) virtual walls or as forbidden-region borders (e.g. surrounding a particular region to avoid collisions with other devices or machines, see Fig. 8). The contact between the robot and the virtual wall should be stable. This calls for stabilization methods such as the time domain passivity controller [22].

VI. PRELIMINARY RESULTS

An important process in any assembly scenario, the pick-and-place process, was realized as an experiment. To study the effect of the haptic feedback during this process, the pressure force on the assembly platform (in the vertical direction) is used as a metric variable.

The participants (N = 12) are middle-aged (mean age = 27.1 years), and most of them (83.33%) are right-handed and have no previous experience with such force feedback joysticks. They were instructed to pick a block from an initial position and place it as accurately and quickly as possible at a target position. The block to be gripped is thin in order to ensure contact between the gripper and the platform during the pick and place processes. After a sufficient training period, the subjects were asked to repeat the task eight times (trials = 8) with and without force feedback (no force feedback = NF, with force feedback = FF). To avoid learning effects, the two types of feedback were applied in random order. Within each trial, the teleoperator position and the pressure force on the platform were recorded.

The evaluation of the results shows a significant decrease of the pressure force with haptic feedback compared to robot guidance without haptic feedback (Fig. 9). Furthermore, the subjects were asked to qualitatively evaluate the usefulness of haptic feedback in performing the pick-and-place task. 16.7% found the task easier without haptic feedback (NF), whereas 41.7% preferred haptic feedback (FF). It thus became clear that with haptic feedback the pressure force is recognized faster, and the risk of damage to the assembly part is therefore reduced.

    Figure 9. Experimental results

VII. CONCLUSION

Although extensive research work has been conducted in human-robot interaction, wide-scale deployment in the industrial and commercial realm has yet to be realized. This could be attributed to safety reasons. However, many industrial robot


manufacturers have recently integrated safe robot technology into their robotic systems. This allows for a proximate physical interaction between the worker and the robot. Although some industrial robotic systems in which the robot is guided by a human operator to perform some tasks are available on the market, they lack haptic feedback. This haptic feedback is very important for accomplishing some tasks, especially in assembly.

In this work, the design of an industrial HRGS for assembly processes is introduced. Some research issues and approaches, such as the mapping strategy and evaluation methods, are also highlighted. It is also shown by preliminary experimental results that the integration of haptic feedback in the HRGS is of great benefit for assembly processes, since the worker is informed about the occurring contact forces. This minimizes the risk of unintended collisions or deformation of parts and thus guarantees a successful assembly and raises the quality of the products.

    ACKNOWLEDGMENT

This work is supported in part by the German Research Foundation (DFG) within the Collaborative Research Center SFB 453 on High-Fidelity Telepresence and Teleaction.

    REFERENCES

[1] G. Reinhart, M. Radi and S. Zaidan, Industrial telepresence robot assembly system: Preliminary simulation results. In 2nd CIRP Conference on Assembly Technologies and Systems, Toronto, 2008.

[2] M. A. Goodrich and A. C. Schultz, Human-Robot Interaction: A Survey. Foundations and Trends in Human-Computer Interaction, 1(3), 203-275, 2007.

[3] H. A. Yanco and J. Drury, Classifying human-robot interaction: an updated taxonomy. In IEEE International Conference on Systems, Man and Cybernetics, 3(1), 2841-2846, 10-13 October, 2004.

[4] E. Helms, Roboterbasierte Bahnführungsunterstützung von industriellen Handhabungs- und Bearbeitungsprozessen. PhD thesis, Fraunhofer Institute for Manufacturing Engineering and Automation IPA, Stuttgart, Germany, 2007.

[5] K. S. Chin, M. M. Ratnam and M. Rajeswari, Force-guided robot in automated assembly of mobile phone. Assembly Automation, 23(1), 75-86, 2003.

[6] G. Reinhart and J. Werner, Flexible automation for the assembly in motion. CIRP Annals, 56(1), 25-28, 2007.

[7] R. A. Brooks, L. Aryananda, A. Edsinger, P. Fitzpatrick, C. Kemp, U.-M. O'Reilly, E. Torres-Jara, P. Varshavskaya and J. Weber, Sensing and manipulating built-for-human environments. International Journal of Humanoid Robotics, 1(1), 1-28, 2004.

[8] Z. Nichol, Y. Liu, P. Suchyta, M. Prokos, A. Goradia and N. Xi, Super-media enhanced internet-based real-time teleoperation. In Hands-On International Mechatronics and Automation Conference, 2005.

[9] J. E. Colgate, W. Wannasuphoprasit and M. Peshkin, Cobots: Robots for collaboration with human operators. In Proceedings of the International Mechanical Engineering Congress and Exhibition, 58(1), 433-439, Atlanta, GA, 1996.

[10] R. Schraft, C. Meyer, C. Parlitz and E. Helms, PowerMate - a safe and intuitive robot assistant for handling and assembly tasks. In IEEE Proceedings of Robotics and Automation, ICRA 2005.

[11] J. Krüger, R. Bernhardt and D. Surdilovic, Intelligent assist systems for flexible assembly. CIRP Annals, 55(1), 29-32, 2006.

[12] KUKA Roboter GmbH: http://www.kuka.com, 06/2009.

[13] Reis Robotics: http://www.reisrobotics.com, 06/2009.

[14] I. Farkhatdinov and J. Ryu, A Study on the Role of Force Feedback for Teleoperation of Industrial Overhead Crane. In Haptics: Perception, Devices and Scenarios. Lecture Notes in Computer Science, Springer Berlin/Heidelberg, pp. 796-805, 2008.

[15] Novint Technologies, Inc.: http://www.novint.com, 06/2009.

[16] SensAble Technologies, Inc.: http://www.sensable.com, 06/2009.

[17] W. S. Kim, F. Tendick, S. Ellis and L. Stark, A comparison of position and rate control for telemanipulations with consideration of manipulator system dynamics. IEEE Journal of Robotics and Automation, 3(5), 426-436, October 1987.

[18] F. Conti and O. Khatib, Spanning large workspaces using small haptic devices. In Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics 2005, pp. 183-188, 18-20 March 2005.

[19] S. E. Salcudean, N. M. Wong and R. Hollis, Design and control of a force-reflecting teleoperation system with magnetically levitated master and wrist. IEEE Transactions on Robotics and Automation, 11(6), 844-858, December 1995.

[20] C. Preusche and G. Hirzinger, Scaling Issues for Teleoperation. 5th Phantom User Group Workshop, Aspen, Colorado, October 2000.

[21] J. E. Colgate, Robust Impedance Shaping Telemanipulation. IEEE Transactions on Robotics and Automation, 9(4), 374-384, 1993.

[22] B. Hannaford and J. Ryu, Time Domain Passivity Control of Haptic Interfaces. IEEE International Conference on Robotics and Automation, Seoul, pp. 1863-1869, May 2001.

[23] G. Niemeyer and J. Slotine, Telemanipulation with Time Delays. The International Journal of Robotics Research, 23(9), 873-890, 2004.

[24] D. A. Lawrence, Stability and transparency in bilateral teleoperation. IEEE Transactions on Robotics and Automation, 9(5), 624-637, 1993.

[25] M. Radi, J. Artigas, C. Preusche and H. Roth, Transparency Measurement of Telepresence Systems. In Haptics: Perception, Devices and Scenarios. Lecture Notes in Computer Science, Springer Berlin/Heidelberg, pp. 766-775, 2008.

[26] Y. Yokokohji and T. Yoshikawa, Bilateral control of master-slave manipulators for ideal kinesthetic coupling - formulation and experiment. IEEE Transactions on Robotics and Automation, 10(5), 605-620, 1994.

[27] J. E. Colgate and J. M. Brown, Factors affecting the Z-Width of a haptic display. In Proceedings of IEEE International Conference on Robotics and Automation, 4(1), 8-13 May 1994.

[28] R. Oboe and P. Fiorini, Internet-Based Telerobotics: Problems and Approaches. In Proceedings of the International Conference on Advanced Robotics ICAR '97, pp. 765-770, Monterey, CA, US, 1997.