D. Tsai, I. A. Nesnas, and D. Zarzhitsky, Autonomous vision-based tethered-assisted rover docking, 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp.2834-2841, 2013.

J. Iqbal, S. Heikkila, and A. Halme, Tether tracking and control of ROSA robotic rover, Control, Automation, Robotics and Vision, pp.689-693, 2008.

S. Prabhakar and B. Buckham, Dynamics modeling and control of a variable length remotely operated vehicle tether, Proceedings of OCEANS 2005 MTS/IEEE, vol.2, pp.1255-1262, 2005.

Z. Echegoyen, I. Villaverde, R. Moreno, M. Graña, and A. d'Anjou, Linked multi-component mobile robots: Modeling, simulation and control, Robotics and Autonomous Systems, vol.58, issue.12, pp.1292-1305, 2010.

J. Estevez and M. Graña, Robust Control Tuning by PSO of Aerial Robots Hose Transportation, ch. Bioinspired Computation in Artificial Systems: International Work-Conference on the Interplay Between Natural and Artificial Computation, IWINAC 2015, pp.291-300, 2015.

T. Lee, Geometric controls for a tethered quadrotor uav, 2015 54th IEEE Conference on Decision and Control (CDC), pp.2749-2754, 2015.

T. Dallej, M. Gouttefarde, N. Andreff, R. Dahmouche, and P. Martinet, Vision-based modeling and control of large-dimension cable-driven parallel robots, 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp.1581-1586, 2012.
URL : https://hal.archives-ouvertes.fr/lirmm-00737658

F. Chaumette and S. Hutchinson, Visual servo control. i. basic approaches, IEEE Robotics Automation Magazine, vol.13, issue.4, pp.82-90, 2006.
URL : https://hal.archives-ouvertes.fr/inria-00350638

F. Chaumette and S. Hutchinson, Visual servo control. ii. advanced approaches [tutorial], IEEE Robotics Automation Magazine, vol.14, issue.1, pp.109-118, 2007.

B. Espiau, F. Chaumette, and P. Rives, A new approach to visual servoing in robotics, IEEE Transactions on Robotics and Automation, vol.8, issue.3, pp.313-326, 1992.

A. Comport, E. Marchand, and F. Chaumette, Robust model-based tracking for robot vision, IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, IROS'04, vol.1, pp.692-697, 2004.
URL : https://hal.archives-ouvertes.fr/inria-00352025

A. Petit, E. Marchand, and K. Kanani, A robust model-based tracker for space applications: combining edge and color information, IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, IROS'2013, pp.3719-3724, 2013.

O. Tahri and F. Chaumette, Application of moment invariants to visual servoing, International Conference on Robotics and Automation, vol.3, pp.4276-4281, 2003.
URL : https://hal.archives-ouvertes.fr/inria-00352082

P. Li, F. Chaumette, and O. Tahri, A shape tracking algorithm for visual servoing, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, pp.2847-2852, 2005.
URL : https://hal.archives-ouvertes.fr/inria-00351892

A. Y. Yazicioglu, B. Calli, and M. Unel, Image based visual servoing using algebraic curves applied to shape alignment, 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp.5444-5449, 2009.

A. Comport, E. Marchand, and F. Chaumette, Kinematic sets for realtime robust articulated object tracking, Image and Vision Computing, IVC, vol.25, issue.3, pp.374-391, 2007.
URL : https://hal.archives-ouvertes.fr/inria-00350642

M. Higashimori, K. Yoshimoto, and M. Kaneko, Active shaping of an unknown rheological object based on deformation decomposition into elasticity and plasticity, 2010 IEEE International Conference on Robotics and Automation (ICRA), pp.5120-5126, 2010.

D. Navarro-Alarcon, Y. Liu, J. G. Romero, and P. Li, On the visual deformation servoing of compliant objects: Uncalibrated control methods and experiments, The International Journal of Robotics Research, vol.33, issue.11, pp.1462-1480, 2014.

B. Gerkey and K. Conley, IEEE Robotics Automation Magazine, vol.18, issue.3, pp.16-16, 2011.

J. Stewart, Calculus, p.464, 2007.
URL : https://hal.archives-ouvertes.fr/hal-00554549

R. I. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, p.155, 2004.

D. Tsai, I. A. Nesnas, and D. Zarzhitsky, Autonomous vision-based tethered-assisted rover docking, 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp.2834-2841, 2013.

J. Iqbal, S. Heikkila, and A. Halme, Tether tracking and control of ROSA robotic rover, Control, Automation, Robotics and Vision, pp.689-693, 2008.

S. Prabhakar and B. Buckham, Dynamics modeling and control of a variable length remotely operated vehicle tether, Proceedings of OCEANS 2005 MTS/IEEE, vol.2, pp.1255-1262, 2005.

O. Khatib, X. Yeh, G. Brantner, B. Soe, B. Kim et al., Ocean one: A robotic avatar for oceanic discovery, IEEE Robotics Automation Magazine, vol.23, issue.4, pp.20-29, 2016.
URL : https://hal.archives-ouvertes.fr/lirmm-01567467

D. P. Perrin, A. Kwon, and R. D. Howe, A novel actuated tether design for rescue robots using hydraulic transients, Proceedings. ICRA '04. 2004 IEEE International Conference on Robotics and Automation, vol.4, pp.3482-3487, 2004.

P. McGarey, F. Pomerleau, and T. D. Barfoot, System Design of a Tethered Robotic Explorer (TReX) for 3D Mapping of Steep Terrain and Harsh Environments, pp.267-281, 2016.

M. Krishna, J. Bares, and E. Mutschler, Tethering system design for Dante II, Proceedings of International Conference on Robotics and Automation, vol.2, pp.1100-1105, 1997.

V. A. Rajan, A. Nagendran, A. Dehghani-Sanij, and R. C. Richardson, Tether monitoring for entanglement detection, disentanglement and localisation of autonomous robots, Robotica, vol.34, issue.3, pp.527-548, 2016.

Z. Echegoyen, I. Villaverde, R. Moreno, M. Graña, and A. d'Anjou, Linked multi-component mobile robots: Modeling, simulation and control, Robotics and Autonomous Systems, vol.58, issue.12, pp.1292-1305, 2010.

D. Navarro-Alarcon, Y. Liu, J. G. Romero, and P. Li, On the visual deformation servoing of compliant objects: Uncalibrated control methods and experiments, The International Journal of Robotics Research, vol.33, issue.11, pp.1462-1480, 2014.

T. Dallej, M. Gouttefarde, N. Andreff, R. Dahmouche, and P. Martinet, Vision-based modeling and control of large-dimension cable-driven parallel robots, 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp.1581-1586, 2012.
URL : https://hal.archives-ouvertes.fr/lirmm-00737658

J. Estevez and M. Graña, Robust Control Tuning by PSO of Aerial Robots Hose Transportation, ch. Bioinspired Computation in Artificial Systems: International Work-Conference on the Interplay Between Natural and Artificial Computation, IWINAC 2015, pp.291-300, 2015.

T. Lee, Geometric controls for a tethered quadrotor uav, 2015 54th IEEE Conference on Decision and Control (CDC), pp.2749-2754, 2015.

M. Laranjeira, C. Dune, and V. Hugel, Catenary-based visual servoing for tethered robots, 2017 IEEE International Conference on Robotics and Automation (ICRA), pp.732-738, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01657118

F. Chaumette and S. Hutchinson, Visual servo control. i. basic approaches, IEEE Robotics Automation Magazine, vol.13, issue.4, pp.82-90, 2006.
URL : https://hal.archives-ouvertes.fr/inria-00350638

F. Chaumette and S. Hutchinson, Visual servo control. ii. advanced approaches [tutorial], IEEE Robotics Automation Magazine, vol.14, issue.1, pp.109-118, 2007.

R. I. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, p.155, 2004.

C. Collewet and F. Chaumette, Positioning a camera with respect to planar objects of unknown shape by coupling 2-d visual servoing and 3-d estimations, IEEE Transactions on Robotics and Automation, vol.18, issue.3, pp.322-333, 2002.
URL : https://hal.archives-ouvertes.fr/inria-00352088

A. Cherubini, F. Chaumette, and G. Oriolo, An image-based visual servoing scheme for following paths with nonholonomic mobile robots, 2008 10th International Conference on Control, Automation, Robotics and Vision, pp.108-113, 2008.
URL : https://hal.archives-ouvertes.fr/inria-00351859

B. Espiau, F. Chaumette, and P. Rives, A new approach to visual servoing in robotics, IEEE Transactions on Robotics and Automation, vol.8, issue.3, pp.313-326, 1992.

F. Chaumette, P. Rives, and B. Espiau, Classification and realization of the different vision-based tasks, Visual servoing, vol.7, pp.199-228, 1993.
URL : https://hal.archives-ouvertes.fr/hal-01548352

N. Koenig and A. Howard, Design and use paradigms for gazebo, an open-source multi-robot simulator, 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566), vol.3, pp.2149-2154, 2004.

B. Gerkey and K. Conley, IEEE Robotics Automation Magazine, vol.18, issue.3, pp.16-16, 2011.

J. T. Feddema, C. S. Lee, and O. R. Mitchell, Automatic selection of image features for visual servoing of a robot manipulator, International Conference on Robotics and Automation, vol.2, pp.832-837, 1989.

M. Laranjeira, C. Dune, and V. Hugel, Catenary-Based Visual Servoing for Tethered Robots, IEEE Int. Conf. on Robotics and Automation, ICRA'17, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01657118

M. Laranjeira, C. Dune, and V. Hugel, Local Vision-Based Tether Control for a Line of Underwater Robots, Workshop on "New Horizons for Underwater Intervention Missions: from Current Technologies to Future Applications", 2018.

M. Laranjeira, C. Dune, and V. Hugel, Embedded Visual Detection and Shape Identification of Underwater Umbilical for Vehicle Positioning, MTS/IEEE OCEANS 2019
URL : https://hal.archives-ouvertes.fr/hal-02350067

M. Laranjeira, Contrôle référencé vision pour la robotique sous-marine, 8es Rencontres CARTT'16, 2016.

M. Laranjeira, La commande de robots interconnectés, 2016.

M. Laranjeira, Asservissement visuel pour objets déformables paramétriques, 9es Rencontres CARTT'17, 2017.

M. Laranjeira, Étude comparative de contrôleurs référencés vision pour la commande coordonnée d'un ombilical, 2017.

M. Laranjeira, Asservissement visuel appliqué au contrôle de forme d'ombilicaux pour la robotique sous-marine, Journée GT2 Robotique marine et sous-marine, 2017.

1. the maximum distance between robots is about 10 meters;

2. the robots may navigate at slightly different depths (difference below 5 meters, depending on the length of the umbilical);

3. the roll and pitch motions of the robots are mechanically stabilized or regulated at a low level;

4. the robots are equipped with a front and/or rear camera that films the cable;

5. each robot in the chain must manage the cable that precedes it;

6. the leader robot does not manage any part of the umbilical and remains free to explore its environment and to carry out other tasks;

7. the leader robot may lie outside the field of view of the follower robot's camera;

8. the cable is detectable in the camera image stream;

9. the umbilical linking the two robots has non-negligible weight and the cable remains in a vertical plane; its lowest point is always located between the two robots.

The cable linking the robots is modelled as a catenary, whose geometric parameters serve as inputs to the control algorithms. The parameterization is made symmetric with respect to the attachment points. These parameters are defined with respect to each robot, namely the difference in height between the attachment points ΔH_i, the cable height H_i, and the cable orientation.
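For reference, a minimal LaTeX sketch of this symmetric catenary parameterization is given below. It assumes the standard catenary equation written in the vertical cable plane with the lowest point of the cable taken as the origin; the notation D_i (horizontal distance from the attachment point of robot i to the lowest point) and the exact symbols may differ from those used in the original text.

% Minimal sketch (assumption: lowest point of the cable at the origin of the
% vertical cable plane; C is the catenary parameter, x the horizontal abscissa):
\[
  z(x) = C\left(\cosh\frac{x}{C} - 1\right), \qquad C > 0 .
\]
% Writing D_i for the horizontal distance between the attachment point of
% robot i and the lowest point, the parameters named above read
\[
  H_i = C\left(\cosh\frac{D_i}{C} - 1\right), \qquad
  \Delta H_i = H_j - H_i \quad (j \neq i),
\]
% and the symmetry relations recalled later in this document follow directly:
\[
  \Delta H_1 = -\Delta H_2, \qquad H_1 = H_2 + \Delta H_2, \qquad H_2 = H_1 + \Delta H_1 .
\]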

B. A. Abel, Underwater Vehicle tether management systems, Proceedings of OCEANS'94, vol.2, 1994.

A. Aili and E. Ekelund, Model-Based Design, Development and Control of an Underwater Vehicle, 2016.

P. K. Allen, A. Timcenko, B. Yoshimi, and P. Michelman, Automated tracking and grasping of a moving object with a robotic hand-eye system, IEEE Transactions on Robotics and Automation, vol.9, issue.2, pp.152-165, 1993.

G. Antonelli, T. I. Fossen, and D. R. Yoerger, Underwater robotics, Springer Handbook of Robotics, pp.987-1008, 2008.

M. P. Bell, Flexible Object Manipulation, 2010.

W. M. Bessa, M. S. Dutra, and E. Kreuzer, Thruster dynamics compensation for the positioning of underwater robotic vehicles through a fuzzy sliding mode based approach, ABCM Symposium Series in Mechatronics, vol.2, pp.605-612, 2006.

BlueRobotics, T200 Thruster Characteristics, 2019.

A. D. Bowen, M. V. Jakuba, D. R. Yoerger, L. L. Whitcomb, J. C. Kinsey et al., Nereid UI: A Light-Tethered Remotely Operated Vehicle for Under-Ice Telepresence, OTC Arctic Technology Conference. Offshore Technology Conference, 2012.

T. Bretl and Z. Mccarthy, Quasi-static manipulation of a Kirchhoff elastic rod based on a geometric analysis of equilibrium configurations, The International Journal of Robotics Research, vol.33, issue.1, pp.48-68, 2014.

L. Brignone, E. Raugel, J. Opderbecke, V. Rigaud, R. Piasco et al., First sea trials of HROV the new hybrid vehicle developed by IFREMER, OCEANS 2015 -Genova, pp.1-7, 2015.

T. Brown, A. Stefanini, N. Georgiev, J. Sawoniewicz, and I. Nesnas, Series Elastic Tether Management for Rappelling Rovers, IROS: International Conference on Intelligent Robots and Systems, pp.2893-2900, 2018.

B. J. Buckham, Dynamics Modelling of Low-Tension Tethers for Submerged Remotely Operated Vehicles, 2003.

B. J. Buckham, F. R. Driscoll, B. Radanovic, and M. Nahon, Three dimensional dynamics simulation of slack tether motion in an ROV system, The Thirteenth International Offshore and Polar Engineering Conference. International Society of Offshore and Polar Engineers, 2003.

M. Carreras, A. Carrera, N. Palomeras, D. Ribas, N. Hurtós et al., Intervention payload for valve turning with an AUV, International Conference on Computer Aided Systems Theory, pp.877-884, 2015.

F. Chaumette, Visual servoing using image features defined upon geometrical primitives, Proceedings of 1994 33rd IEEE Conference on Decision and Control, vol.4, pp.3782-3787, 1994.

F. Chaumette, Image moments: A general and useful set of features for visual servoing, IEEE Transactions on Robotics, vol.20, issue.4, pp.713-723, 2004.
URL : https://hal.archives-ouvertes.fr/inria-00352019

F. Chaumette and S. Hutchinson, Visual servo control. I. Basic approaches, IEEE Robotics Automation Magazine, vol.13, issue.4, pp.82-90, 2006.
URL : https://hal.archives-ouvertes.fr/inria-00350638

F. Chaumette and S. Hutchinson, Visual servo control. II. Advanced approaches, IEEE Robotics Automation Magazine, vol.14, issue.1, pp.109-118, 2007.
URL : https://hal.archives-ouvertes.fr/inria-00350638

F. Chaumette, P. Rives, and B. Espiau, Classification and realization of the different vision-based tasks, World Scientific Series in Robotics and Intelligent Systems, vol.7, pp.199-228, 1993.
URL : https://hal.archives-ouvertes.fr/hal-01548352

A. Cherubini, F. Chaumette, and G. Oriolo, An image-based visual servoing scheme for following paths with nonholonomic mobile robots, 10th International Conference on Control, Automation, Robotics and Vision, pp.108-113, 2008.
URL : https://hal.archives-ouvertes.fr/inria-00351859

B. A. Childers, D. K. Gifford, R. G. Duncan, M. T. Raum, M. E. Vercellino et al., Fiber optic position and shape sensing device and method relating thereto, US Patent, vol.7, p.724, 2010.

R. D. Christ and R. L. Wernli Sr., The ROV Manual: A User Guide for Remotely Operated Vehicles, 2013.

C. Collewet and F. Chaumette, Positioning a camera with respect to planar objects of unknown shape by coupling 2-D visual servoing and 3-D estimations, IEEE Transactions on Robotics and Automation, vol.18, issue.3, pp.322-333, 2002.
URL : https://hal.archives-ouvertes.fr/inria-00352088

A. Comport, E. Marchand, and F. Chaumette, Robust model-based tracking for robot vision, IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, IROS'04, vol.1, pp.692-697, 2004.
URL : https://hal.archives-ouvertes.fr/inria-00352025

A. Comport, E. Marchand, and F. Chaumette, Kinematic sets for real-time robust articulated object tracking, Image and Vision Computing, vol.25, issue.3, pp.374-391, 2007.
URL : https://hal.archives-ouvertes.fr/inria-00350642

G. Conte, D. Scaradozzi, D. Mannocchi, P. Raspa, L. Panebianco et al., Development and Experimental Tests of a ROS Multi-agent Structure for Autonomous Surface Vehicles, Journal of Intelligent & Robotic Systems, vol.92, issue.3-4, pp.705-718, 2018.

V. Creuze, Robots marins et sous-marins. Perception, modélisation, commande, vol.661, p.7783, 2014.
URL : https://hal.archives-ouvertes.fr/lirmm-01084620

T. Dallej, M. Gouttefarde, N. Andreff, R. Dahmouche, and P. Martinet, Vision-based modeling and control of large-dimension cable-driven parallel robots, Intelligent Robots and Systems (IROS), 2012 IEEE/RSJ International Conference On, pp.1581-1586, 2012.
URL : https://hal.archives-ouvertes.fr/lirmm-00737658

T. Dallej, M. Gouttefarde, N. Andreff, M. Michelin, and P. Martinet, Towards vision-based control of cable-driven parallel robots, Intelligent Robots and Systems (IROS), pp.2855-2860, 2011.
URL : https://hal.archives-ouvertes.fr/hal-00691562

F. R. Driscoll, M. Nahon, and R. G. Lueck, A comparison of ship-mounted and cage-mounted passive heave compensation systems, Journal of Offshore Mechanics and Arctic Engineering, vol.122, issue.3, pp.214-221, 2000.

C. Dune, Localisation et Caractérisation d'Objets Inconnus à Partir d'Informations Visuelles, 1920.

Z. Echegoyen, I. Villaverde, R. Moreno, M. Graña, and A. d'Anjou, Linked multi-component mobile robots: Modeling, simulation and control, Robotics and Autonomous Systems, vol.58, issue.12, pp.1292-1305, 2010.

O. A. Eidsvik and I. Schjølberg, Time domain modeling of ROV umbilical using beam equations, IFAC-PapersOnLine, vol.49, issue.23, pp.452-457, 2016.

B. Espiau, F. Chaumette, and P. Rives, A new approach to visual servoing in robotics, IEEE Transactions on Robotics and Automation, vol.8, issue.3, pp.313-326, 1992.

J. Estevez and M. Graña, Robust control tuning by PSO of aerial robots hose transportation, International Work-Conference on the Interplay Between Natural and Artificial Computation, pp.291-300, 2015.

J. Estevez, J. M. Lopez-guede, and M. Graña, Quasi-stationary state transportation of a hose with quadrotors, Robotics and Autonomous Systems, vol.63, pp.187-194, 2015.

J. T. Feddema, C. G. Lee, and O. R. Mitchell, Weighted selection of image features for resolved rate visual feedback control, IEEE Transactions on Robotics and Automation, vol.7, issue.1, pp.31-47, 1991.

J. T. Feddema, C. S. Lee, and O. R. Mitchell, Automatic selection of image features for visual servoing of a robot manipulator, 1989 International Conference on Robotics and Automation Proceedings, vol.2, pp.832-837, 1989.

J. T. Feddema and O. R. Mitchell, Vision-guided servoing with feature-based trajectory generation (for robots), IEEE Transactions on Robotics and Automation, vol.5, issue.5, pp.691-700, 1989.

T. I. Fossen, Handbook of Marine Craft Hydrodynamics and Motion Control, 2011.

T. I. Fossen, T. A. Johansen, and T. Perez, A survey of control allocation methods for underwater vehicles, Underwater Vehicles, 2009.

J. E. Frank, R. Geiger, D. R. Kraige, and A. Murali, Smart tether system for underwater navigation and cable shape measurement, US Patent, vol.8, p.979, 2013.

E. F. Fukushima, N. Kitamura, and S. Hirose, A new flexible component for field robotic system, Proceedings 2000 ICRA. Millennium Conference. IEEE International Conference on Robotics and Automation. Symposia Proceedings (Cat. No.00CH37065), vol.3, pp.2583-2588, 2000.

D. B. Gennery, Visual tracking of known three-dimensional objects, International Journal of Computer Vision, vol.7, issue.3, pp.243-270, 1992.

B. Gerkey and K. Conley, IEEE Robotics Automation Magazine, vol.18, issue.3, pp.16-16, 2011.

K. Hashimoto, Visual Servoing, vol.7, 1993.

G. S. Hawkes and D. C. Jeffrey, Tether cable management apparatus and method for a remotely-operated underwater vehicle, US Patent, vol.4, p.927, 1987.

C. T. Howell, Investigation of the Dynamics of Low-Tension Cables, 1992.

P. Huber, Robust Statistics. Wiley Series in Probability and Statistics, 1981.

T. L. Huntsberger, A. Trebi-ollennu, H. Aghazarian, P. S. Schenker, P. Pirjanian et al., Distributed control of multi-robot systems engaged in tightly coupled tasks, Autonomous Robots, vol.17, issue.1, pp.79-92, 2004.

A. Huster, H. Bergstrom, J. Gosior, and D. White, Design and operational performance of a standalone passive heave compensation system for a work class ROV, OCEANS 2009, MTS/IEEE Biloxi-Marine Technology for Our Future: Global and Local Challenges, pp.1-8, 2009.

A. Inzartsev and A. Pavin, AUV Application for Inspection of Underwater Communications, p.884025486, 2009.

J. Iqbal, S. Heikkila, and A. Halme, Tether tracking and control of ROSA robotic rover, Control, Automation, Robotics and Vision, pp.689-693, 2008.

H. M. Irvine, Cable Structures, 1981.

E. R. Johnston, F. Beer, and E. Eisenberg, Vector Mechanics for Engineers: Statics and Dynamics, 2009.

C. Katlein, M. Schiller, H. J. Belter, V. Coppolaro, D. Wenslandt et al., A New Remotely Operated Sensor Platform for Interdisciplinary Observations under Sea Ice, Frontiers in Marine Science, vol.4, p.281, 2017.

C. Kervrann and F. Heitz, A hierarchical Markov modeling approach for the segmentation and tracking of deformable shapes, Graphical Models and Image Processing, vol.60, issue.3, pp.173-195, 1998.

O. Khatib, X. Yeh, G. Brantner, B. Soe, B. Kim et al., Ocean one: A robotic avatar for oceanic discovery, IEEE Robotics & Automation Magazine, vol.23, issue.4, pp.20-29, 2016.
URL : https://hal.archives-ouvertes.fr/lirmm-01567467

S. Kiribayashi, K. Yakushigawa, and K. Nagatani, Position estimation of tethered micro unmanned aerial vehicle by observing the slack tether, 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), pp.159-165, 2017.

V. V. Klemas, Coastal and environmental remote sensing from unmanned aerial vehicles: An overview, Journal of Coastal Research, vol.31, issue.5, pp.1260-1267, 2015.

N. Koenig and A. Howard, Design and use paradigms for Gazebo, an opensource multi-robot simulator, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), vol.3, pp.2149-2154, 2004.

M. Krishna, J. Bares, and E. Mutschler, Tethering system design for Dante II, Robotics and Automation, 1997. Proceedings, vol.2, pp.1100-1105, 1997.

S. Krupinski, R. Desouche, N. Palomeras, G. Allibert, and M. Hua, Pool testing of AUV visual servoing for autonomous inspection, IFAC-PapersOnLine, vol.48, issue.2, pp.274-280, 2015.
URL : https://hal.archives-ouvertes.fr/hal-01230498

M. Laranjeira, C. Dune, and V. Hugel, Catenary-based visual servoing for tethered robots, 2017 IEEE International Conference on Robotics and Automation (ICRA), pp.732-738, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01657118

T. Lee, Geometric controls for a tethered quadrotor UAV, Decision and Control (CDC), 2015 IEEE 54th Annual Conference On, pp.2749-2754, 2015.

M. L'Hour and V. Creuze, French Archaeology's Long March to the Deep - The Lune Project: Building the Underwater Archaeology of the Future, Experimental Robotics, pp.911-927, 2016.

D. G. Lowe, Robust model-based motion tracking through the integration of search and estimation, International Journal of Computer Vision, vol.8, issue.2, pp.113-122, 1992.

Y. Ma, S. Soatto, J. Kosecka, and S. S. Sastry, An Invitation to 3-d Vision: From Images to Geometric Models, vol.26, 2012.

V. Mallapragada, N. Sarkar, and T. K. Podder, Toward a robot-assisted breast intervention system, IEEE/ASME Transactions on Mechatronics, vol.16, issue.6, pp.1011-1020, 2011.

M. M. Manhães, S. A. Scherer, M. Voss, L. R. Douat, and T. Rauschenbach, UUV Simulator: A Gazebo-based package for underwater intervention and multi-robot simulation, OCEANS 2016 MTS/IEEE Monterey, 2016.

N. Mansard and F. Chaumette, Task Sequencing for High-Level Sensor-Based Control, IEEE Transactions on Robotics, vol.23, issue.1, pp.60-72, 2007.

G. Marani, S. K. Choi, and J. Yuh, Underwater autonomous manipulation for intervention missions AUVs, Ocean Engineering, vol.36, issue.1, pp.15-23, 2009.

E. Marchand, P. Bouthemy, F. Chaumette, and V. Moreau, Robust real-time visual tracking using a 2D-3D model-based approach, Computer Vision, 1999. The Proceedings of the Seventh IEEE International Conference On, vol.1, pp.262-268, 1999.
URL : https://hal.archives-ouvertes.fr/inria-00352549

E. Marchand and F. Chaumette, Feature tracking for visual servoing purposes, Robotics and Autonomous Systems, vol.52, issue.1, pp.53-70, 2005.
URL : https://hal.archives-ouvertes.fr/inria-00351898

E. Marchand, F. Spindler, and F. Chaumette, ViSP for visual servoing: A generic software platform with a wide class of robot control skills, IEEE Robotics and Automation Magazine, vol.12, issue.4, pp.40-52, 2005.
URL : https://hal.archives-ouvertes.fr/inria-00351899

P. McGarey, K. MacTavish, F. Pomerleau, and T. D. Barfoot, The line leading the blind: Towards nonvisual localization and mapping for tethered mobile robots, 2016 IEEE International Conference on Robotics and Automation (ICRA), pp.4799-4806, 2016.

P. McGarey, K. MacTavish, F. Pomerleau, and T. D. Barfoot, TSLAM: Tethered simultaneous localization and mapping for mobile robots, The International Journal of Robotics Research, vol.36, issue.12, pp.1363-1386, 2017.

P. McGarey, F. Pomerleau, and T. D. Barfoot, System design of a tethered robotic explorer (TReX) for 3D mapping of steep terrain and harsh environments, Field and Service Robotics, pp.267-281, 2016.

P. J. McKerrow and D. Ratner, The design of a tethered aerial robot, Proceedings 2007 IEEE International Conference on Robotics and Automation, pp.355-360, 2007.

J. Merlet, Computing Cross-Sections of the Workspace of a Cable-Driven Parallel Robot with 6 Sagging Cables Having Limited Lengths, International Symposium on Advances in Robot Kinematics, pp.392-400, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01965231

J. Merlet, An Experimental Investigation of Extra Measurements for Solving the Direct Kinematics of Cable-Driven Parallel Robots, 2018 IEEE International Conference on Robotics and Automation (ICRA), pp.6947-6952, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01965232

J. Merlet, Some properties of the Irvine cable model and their use for the kinematic analysis of cable-driven parallel robots, European Conference on Mechanism Science, pp.409-416, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01965230

M. Milutinović, N. Kranjčević, and J. Deur, Multi-mass dynamic model of a variable-length tether used in a high altitude wind energy system, Energy Conversion and Management, vol.87, pp.1141-1150, 2014.

M. Molchan, The role of micro-ROVs in maritime safety and security, 2005.

A. C. Murtra and J. M. Tur, IMU and cable encoder data fusion for in-pipe mobile robot localization, TePRA, pp.1-6, 2013.

D. Navarro-Alarcon and Y. Liu, Fourier-Based Shape Servoing: A New Feedback Method to Actively Deform Soft Objects into Desired 2-D Image Contours, IEEE Transactions on Robotics, vol.34, issue.1, pp.272-279, 2018.

D. Navarro-Alarcon, Y. Liu, J. G. Romero, and P. Li, Visually servoed deformation control by robot manipulators, 2013 IEEE International Conference on Robotics and Automation, pp.5259-5264, 2013.

D. Navarro-Alarcon, Y. Liu, J. G. Romero, and P. Li, Model-free visually servoed deformation control of elastic objects by robot manipulators, IEEE Transactions on Robotics, vol.29, issue.6, pp.1457-1468, 2013.

D. Navarro-Alarcon, Y. Liu, J. G. Romero, and P. Li, On the visual deformation servoing of compliant objects: Uncalibrated control methods and experiments, The International Journal of Robotics Research, vol.33, issue.11, pp.1462-1480, 2014.

K. G. Nayar, M. H. Sharqawy, L. D. Banchik, and J. H. Lienhard V, Thermophysical properties of seawater: A review and new correlations that include pressure dependence, Desalination, vol.390, pp.1-24, 2016.

K. Nickels and S. Hutchinson, Model-based tracking of complex articulated objects, IEEE Transactions on Robotics and Automation, vol.17, issue.1, pp.28-36, 2001.

M. M. Nicotra, R. Naldi, and E. Garone, Nonlinear control of a tethered UAV: The taut cable case, Automatica, vol.78, pp.174-184, 2017.

N. P. Papanikolopoulos, P. K. Khosla, and T. Kanade, Visual tracking of a moving target by a camera mounted on a robot: A combination of control and vision, IEEE transactions on robotics and automation, vol.9, issue.1, pp.14-35, 1993.

J. Park, B. Jun, P. Lee, and J. Oh, Experiments on vision guided docking of an autonomous underwater vehicle using one camera, Ocean Engineering, vol.36, issue.1, pp.48-61, 2009.

D. P. Perrin, A. Kwon, and R. D. Howe, A novel actuated tether design for rescue robots using hydraulic transients, 2004 IEEE International Conference on Robotics and Automation (ICRA '04), vol.4, pp.3482-3487, 2004.

A. Pettersson, T. Ohlsson, S. Davis, J. Gray, and T. Dodd, A hygienically designed force gripper for flexible handling of variable and easily damaged natural food products, Innovative Food Science & Emerging Technologies, vol.12, issue.3, pp.344-351, 2011.

S. Prabhakar and B. Buckham, Dynamics modeling and control of a variable length remotely operated vehicle tether, OCEANS, 2005. Proceedings of MTS/IEEE, pp.1255-1262, 2005.

M. Prats, N. Palomeras, P. Ridao, and P. J. Sanz, Template Tracking and Visual Servoing for Alignment Tasks with Autonomous Underwater Vehicles, IFAC Proceedings Volumes, vol.45, pp.256-261, 2012.

K. S. Pratt, R. R. Murphy, J. L. Burke, J. Craighead, C. Griffin et al., Use of Tethered Small Unmanned Aerial System at Berkman Plaza II Collapse, IEEE International Workshop on Safety, Security and Rescue Robotics, pp.134-139, 2008.

M. Pressigout, Approches Hybrides Pour Le Suivi Temps-Réel d'objets Complexes Dans Des Séquences Vidéos, Rennes, vol.1, 2006.

V. A. Rajan, A. Nagendran, A. Dehghani-sanij, and R. C. Richardson, Tether monitoring for entanglement detection, disentanglement and localisation of autonomous robots, Robotica, vol.34, issue.3, pp.527-548, 2016.

C. F. Reverte, S. M. Thayer, W. Whittaker, E. C. Close, A. Slifko et al., Autonomous inspector mobile platform, US Patent, vol.8, p.66, 2011.

B. Ropars, A. Lasbouygues, L. Lapierre, and D. Andreu, Thruster's deadzones compensation for the actuation system of an underwater vehicle, 2015 European Control Conference (ECC), pp.741-746, 2015.

T. Sahin and M. Unel, Globally stabilized 3L curve fitting, International Conference Image Analysis and Recognition, pp.495-502, 2004.

T. Salgado-jimenez, J. L. Gonzalez-lopez, L. F. Martinez-soto, E. Olguin-lopez, P. A. Resendiz-gonzalez et al., Deep water ROV design for the Mexican oil industry, OCEANS'10 IEEE SYDNEY, pp.1-6, 2010.

S. Shimono, O. Matsubara, S. Toyama, U. Nishizawa, S. Kato et al., Development of underwater inspection system for dam inspection, OCEANS 2015 -MTS/IEEE Washington, pp.1-6, 2015.

Y. Shirai and H. Inoue, Guiding a robot by visual feedback in assembling tasks, Pattern recognition, vol.5, issue.2, pp.99-106, 1973.

A. D. Short and C. D. Woodroffe, The Coast of Australia, 2009.

B. Siciliano and O. Khatib, Springer Handbook of Robotics, 2016.

R. J. Smolowitz, S. H. Patel, H. L. Haas, and S. A. Miller, Using a remotely operated vehicle (ROV) to observe loggerhead sea turtle (Caretta caretta) behavior on foraging grounds off the mid-Atlantic United States, Journal of Experimental Marine Biology and Ecology, vol.471, pp.84-91, 2015.

O. Tahri and F. Chaumette, Application of moment invariants to visual servoing, International Conference on Robotics and Automation, vol.3, pp.4276-4281, 2003.
URL : https://hal.archives-ouvertes.fr/inria-00352082

K. A. Talke, M. De-oliveira, and T. Bewley, Catenary Tether Shape Analysis for a UAV -USV Team, IROS: International Conference on Intelligent Robots and Systems, pp.7803-7809, 2018.

M. Tognon and A. Franchi, Control of motion and internal stresses for a chain of two underactuated aerial robots, Control Conference (ECC), pp.1620-1625, 2015.
URL : https://hal.archives-ouvertes.fr/hal-01137736

M. Torabi, K. Hauser, R. Alterovitz, V. Duindam, and K. Goldberg, Guiding medical needles using single-point tissue manipulation, Robotics and Automation, 2009. ICRA'09. IEEE International Conference On, pp.2705-2710, 2009.

M. S. Triantafyllou and M. A. Grosenbaugh, Robust control for underwater vehicle systems with time delays, IEEE Journal of Oceanic Engineering, vol.16, issue.1, pp.146-151, 1991.

D. Tsai, I. A. Nesnas, and D. Zarzhitsky, Autonomous vision-based tethered-assisted rover docking, Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference On, pp.2834-2841, 2013.

R. Tsai, A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses, IEEE Journal on Robotics and Automation, vol.3, issue.4, pp.323-344, 1987.

S. van der Zwaan, A. Bernardino, and J. Santos-Victor, Visual station keeping for floating robots in unstructured environments, Robotics and Autonomous Systems, vol.39, issue.3-4, pp.145-155, 2000.

T. Vishnu, A. Kumar, and R. Richardson, Tether monitoring techniques for environment monitoring, tether following and localization of autonomous mobile robots, 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp.2109-2114, 2008.

H. Wakamatsu, E. Arai, and S. Hirai, Knotting/unknotting manipulation of deformable linear objects, The International Journal of Robotics Research, vol.25, issue.4, pp.371-395, 2006.

X. Wang and S. Bhattacharya, A Topological Approach to Workspace and Motion Planning for a Cable-Controlled Robot in Cluttered Environments, IEEE Robotics and Automation Letters, vol.3, issue.3, pp.2600-2607, 2018.

Z. Wang and S. Hirai, Modeling and estimation of rheological properties of food products for manufacturing simulations, Journal of food engineering, vol.102, issue.2, pp.136-144, 2011.

L. Weiss, A. Sanderson, and C. Neuman, Dynamic sensor-based control of robots with visual feedback, IEEE Journal on Robotics and Automation, vol.3, issue.5, pp.404-417, 1987.

R. L. Wernli and R. D. Christ, Observation Class ROVs Come of Age, Sixth International Symposium on Underwater Technology Wuxi, 2009.

R. B. Wynn, V. A. I. Huvenne, T. P. Le Bas, B. J. Murton, D. P. Connelly et al., Autonomous Underwater Vehicles (AUVs): Their past, present and future contributions to the advancement of marine geoscience, Marine Geology, vol.352, pp.451-468, 2014.

W. Yang, Z. Zhang, and A. Zhang, Research on an active heave compensation system for remotely operated vehicle, Intelligent Computation Technology and Automation (ICICTA), 2008 International Conference On, vol.2, pp.407-410, 2008.

A. Y. Yazicioglu, B. Calli, and M. Unel, Image based visual servoing using algebraic curves applied to shape alignment, IEEE/RSJ International Conference on Intelligent Robots and Systems, pp.5444-5449, 2009.

J. Zhu, B. Navarro, P. Fraisse, A. Crosnier, and A. Cherubini, Dual-arm robotic manipulation of flexible cables, IROS: International Conference on Intelligent Robots and Systems, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01734740

L. Zikou, C. Papachristos, and A. Tzes, The power-over-tether system for powering small UAVs: Tethering-line tension control synthesis, 2015.

An illustration presenting the symmetry of the catenary features according to the reference frame chosen (F_1 or F_2). The catenary slackness values are H_1 = ^0Z_1 and H_2 = ^0Z_2. The attachment point height differences are ΔH_1 = ^1Z_2 and ΔH_2 = ^2Z_1. The catenary parameter C(H_1, ΔH_1) = C(H_2, ΔH_2) is invariant whatever the point P_i chosen as the starting attachment point of the cable. Thanks to the symmetry we have: ΔH_1 = -ΔH_2, H_1 = H_2 + ΔH_2, H_2 = H_1 + ΔH_1, ΔD_1 = -ΔD_2, D_1 = D_2 + ΔD_2, D_2 = D_1 + ΔD_1. As a reminder: H_i > 0, H_i + ΔH_i > 0, D_i > 0 and D_i + ΔD_i > 0.

Some examples of the mock tether images captured by the BlueROV1 camera in different configurations. (a) the tether is completely visible. (b) the tether plane is aligned with the camera optical axis. (c) and (d) the tether is partially visible. The mock tether is made of a leaded curtain wire wrapped in an orange cotton ribbon.

Frontal pinhole imaging model: the perspective projection of a 3-D point P_a on the image plane is the point p_a.

Transformation from pixel to meter coordinates.

Visible light spectrum for varying wavelengths. Extracted from shutha.

Courtesy of NYC Department of Education.

Tether detection examples in both aerial (a,b,c) and underwater (d,e,f) experiments. The mode corresponding to the specific color of the tether is selected by two thresholds in the hue histogram; the minimum and maximum thresholds are displayed as green and red lines, respectively (b,e). The detected points are refined by morphological closing and skeletonization (c,f).

Top view of a robot-tether system presenting the singularities of the catenary shape estimation with respect to the tether orientation parameter b (the sine of the tether orientation angle). (a) the tether is perpendicular to the robot longitudinal axis. (b) the tether plane is aligned with the segment P_c P_2 and the catenary curve degenerates into a straight line in the image; the tether attachment point has an offset with respect to the camera axis (^cX_2 ≠ 0). (c) the specific case where ^cX_2 = 0.

Case of remote points and large residuals. (a) an illustration of residuals (black arrows) between the tether detected points (in green) and the projection of the current catenary estimation. The tether length is L = 1.50 m and the height difference between the attachment points is ΔH = 0. The tether shape is defined by the feature vector s = [0.6, 0.6]^T, with H_max = 0.70 m. The red dashed line represents the catenary points that lie beyond the tether length (L = 1.50 m). (b) a top view illustration of catenary remote points that become increasingly tight and vertical in the image plane.

The simulation of two BlueROV1 robots linked by an orange sagging tether in Gazebo.

The (a) source image and (b) tether detected points of study case 1. The tether shape is given by the feature vector s, which corresponds to a slackness H = 0.28 meter and an orientation angle of 27°.

3.23 The descent of the GN algorithm marked by black dots. The starting point is at s_0 = [0.5, 0.5]^T and the solution is at s = [0.40, 0.45]^T, marked by a red star. The cost function values are plotted in the background using the color scale depicted on the right.

The tether detected points are drawn in green dots, the tether catenary initial estimation is drawn by a red dashed line and the final estimation is drawn by a red full line. The tether shape estimation is achieved by the three methods: (a) GN, (b) IGN and (c) IGN + IG algorithms. The initial guess of the tether shape is given by s_0.

The (a) source image and (b) tether detected points of study case 2. The tether shape is given by the feature vector s.

3.28 Comparison of the descent of the IGN (white dots) and IGN+IG (black dots) algorithms. The solution is at s = [0.4, 0.6]^T, marked by a red star. The cost function values are plotted in the background using the color scale depicted on the right. The solution is only reached if an initial guess is used.

Catenary curve fitting problem depicted in the image plane for the three methods studied: (a) GN, (b) IGN and (c) IGN+IG. The tether detected points are drawn in green, the initial estimation is drawn by a red dashed line and the final estimation is drawn by a red full line. The correct tether shape estimation is only achieved by the improved Gauss-Newton with initial guess (IGN+IG).

Estimation and real values of the tether shape during the execution of a path by the follower robot. (a) the fitting procedure is used during the whole path execution. (b) during the singularity crossing (around 8 seconds), the fitting procedure is stopped and the previously estimated values of the tether shape are maintained.

(a) detected feature points drawn as large dots and their desired positions drawn as small white dots; extracted from Espiau et al. (1992). (b) an example of an amorphous planar shape used to validate the use of image moments in visual servoing; extracted from Tahri and Chaumette (2003). (c) an example of shape alignment using a technique of contour fitting to estimate object feature points to be used in visual servoing.

Flow chart of the vision-based tether shape control scheme, with details of the visual servoing control loop.

Algorithm flow chart for catenary-based visual servoing.

4.6 Experimental setup: two Turtlebots (Gerkey and Conley, 2011) simulate a tether handling system for remotely operated robots. The leader robot freely explores its surroundings while the follower robot is in charge of managing the tether.

(a) the parameters evolution. The tether goes from an initial to a desired shape (s_0 = (0.9, 0.8) and s* = (0.5, 0.5), respectively). (b) the control velocities: the linear velocity along x in m/s and the angular velocity about z in rad/s.

Results of a real experiment for tether shape control. (a) the tether parameters evolution. The tether goes from an initial to a desired shape (s_0 = (0.9, 0.8) and s* = (0.5, 0.5), respectively). (b) the control velocities: the linear velocity along x in m/s and the angular velocity about z in rad/s.

The leader robot freely moves while the follower robot maintains a desired tether shape s* = (0.7, -0.5). (a) the leader and follower trajectories with time indications in seconds. (b) the tether parameters evolution. (c) the fitting quality index Q evolution during the experiment. Feature prediction is used in cases of wrong rope detection and inaccurate fitting.

Image features used to manage the tether shape: its highest and lowest points in the image (p_A and p_B) and the line segment p_A p_B in blue. (a) 3D simulated scene where the tether 3D lowest point P_0 is out of the follower's camera field of view. (b) corresponding embedded view with p_0 being the perspective projection of P_0. (c) another situation where P_0 is inside the follower's camera field of view.

Follower robot camera view for the (a) first, (b) second and (c) third simulation. Initial and desired tether shapes in the image are drawn in blue and red lines, respectively.

Normal case: error evolution for (a) tether slackness and (b) orientation as well as (c) linear and (d) angular command velocities. Tether initial and desired shapes are respectively (H_0 = 0.20 m, orientation 20°) and (H* = 0.20 m, orientation 45°).

Limit case 1: error evolution for (a) tether slackness and (b) orientation as well as (c) linear and (d) angular command velocities. Tether initial and desired shapes are respectively (H_0 = 0.25 m, orientation 10°) and (H* = 0.25 m, orientation 60°).

Limit case 2: error evolution for (a) tether slackness and (b) orientation as well as (c) linear and (d) angular command velocities. Tether initial and desired shapes are respectively (H_0 = 0.10 m, orientation 20°) and (H* = 0.24 m, orientation 0°).

Tether lowest point trajectory in the image plane for the (a) normal case, (b) limit case 1 and (c) limit case 2. (d) legend.

Legend in Fig. 4.15d. (d) condition number evolution of the curve-fitting Gauss-Newton Jacobian during the normal and limit cases.

(a,b,c,d) leader and (e,f,g,h) follower points of view. The (a,c,e,g) initial and (b,d,f,h) final images of the visual servoing. Figures (a,b,e,f) are the source images while figures (c,d,g,h) depict the tether detected points (in blue), the shape estimation (in red) and the desired shape in the image.

Simulation results for the controller using both leader and follower cameras. (a) tether shape evolution from the leader and follower points of view. Full and dashed lines represent estimated and real simulated values, respectively. (b) follower robot velocity evolution. The tether initial and desired shapes are given in Table 4.

Catenary solutions with ΔH negative.

Top view and lateral view of the robot setup.

Left: follower features for a positive linear velocity along x; right: follower features for a negative linear velocity along x (blue a_i, red b_i, green d_i).

Left: leader features for a positive linear velocity along x; right: leader features for a negative linear velocity along x (blue a_i, red b_i, green d_i).

Left: follower features for a positive linear velocity along y; right: follower features for a negative linear velocity along y (blue a_i, red b_i, green d_i).

Left: leader features for a positive linear velocity along y; right: leader features for a negative linear velocity along y (blue a_i, red b_i, green d_i).

F.6 Left: follower features for a positive linear velocity along z; right: follower features for a negative linear velocity along z (blue a_i, red b_i, green d_i).

Left: leader features for a positive linear velocity along z; right: leader features for a negative linear velocity along z (blue a_i, red b_i, green d_i).

Left: follower features for a positive angular velocity about z; right: follower features for a negative angular velocity about z (blue a_i, red b_i, green d_i).

Left: leader features for a positive angular velocity about z; right: leader features for a negative angular velocity about z (blue a_i, red b_i, green d_i).