RGB-D Fusion for Wide Field of View User Feedback in Teleoperation Context
Author: CNRS-AIST JRL
Uploaded: 2026-01-22
Views: 280
Description:
Accompanying video of the IEEE SII 2026 publication by Raphaël d’Orfani, Antoine André, Mehdi Benallegue, Rafael Cisneros-Limon and Guillaume Caron, work done at CNRS-AIST JRL:
Effective teleoperation requires immersive and responsive visual feedback to support the depth perception and spatial understanding needed for precise control. Standard camera views naturally constrain the operator's Field of View (FoV) of the remote scene, especially in cluttered or dynamic scenarios. We present a real-time RGB-D fusion system that expands the operator's FoV through immersive 3D reconstruction. Our system integrates the Azure Kinect sensor into Unreal Engine via Robot Operating System (ROS) communication, rendering live sensor data onto a spherical mesh. This enables smooth, wide-FoV rendering of the scene with greater peripheral context and depth continuity. In contrast to planar or depth-free systems, the proposed method adds live depth retranscription for more interactive teleoperation, leading to better scene understanding. This architecture lays the basis for flexible, high-fidelity remote interaction in robotics applications. All our developments and implementations are publicly available at https://github.com/isri-aist/RGB-D_Fu...
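The core geometric step the abstract describes, projecting live RGB-D frames onto a spherical surface for wide-FoV display, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (see the repository above); it assumes a standard pinhole camera model with hypothetical intrinsics `fx`, `fy`, `cx`, `cy`, back-projects each depth pixel to a 3D point, and converts the point to azimuth/elevation texture coordinates on a sphere.

```python
import numpy as np

def backproject_depth(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) to 3D points in the camera frame,
    assuming a pinhole model with the given intrinsics."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)  # shape (h, w, 3)

def spherical_uv(points):
    """Map 3D points to normalized spherical texture coordinates in [0, 1],
    as one might use to texture a spherical mesh."""
    x, y, z = points[..., 0], points[..., 1], points[..., 2]
    azimuth = np.arctan2(x, z)  # angle around the vertical axis, [-pi, pi]
    r = np.linalg.norm(points, axis=-1)
    # Guard against zero-range pixels (invalid depth readings).
    elevation = np.arcsin(np.divide(y, r, out=np.zeros_like(r), where=r > 0))
    u = azimuth / (2 * np.pi) + 0.5
    v = elevation / np.pi + 0.5
    return np.stack([u, v], axis=-1)  # shape (h, w, 2)
```

In a live pipeline these coordinates would be recomputed per frame from the incoming ROS depth topic and used to place the corresponding RGB samples on the sphere; the optical axis (a pixel at the principal point) maps to the center of the texture, (0.5, 0.5).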
Article: https://hal.science/hal-05339552
This paper is based on results obtained from a project
of Programs for Bridging the gap between R&D and the
IDeal society (society 5.0) and Generating Economic and
social value (BRIDGE)/Practical Global Research in the AI
x Robotics Services, implemented by the Cabinet Office,
Government of Japan.
We also thank all the subjects who volunteered their effort
and time for the experiments.