TLDR: TiltXter is a system for improving telemanipulation of deformable objects such as plastic Pasteur pipettes. It pairs a robotic gripper fitted with tactile sensor arrays with an electro-tactile haptic interface for the human operator. A convolutional neural network (CNN) classifies the object's tilt from the sensor data and triggers corresponding electro-tactile patterns. Experiments showed that while rendering raw, downsized sensor data was largely ineffective, the CNN-generated tactile patterns raised human tilt recognition from 21.67% to 57.6% and boosted teleoperated grasping success from 53.12% to 92.18%. The system also reduced operator workload and frustration, demonstrating a substantial improvement in precision and dexterity for remote handling of soft objects.
Teleoperation and telepresence systems are becoming increasingly common, but accurately manipulating deformable objects remotely remains a significant challenge. When a robotic gripper grasps a soft object, its shape can change, making it difficult for a human operator to accurately perceive its alignment. This ambiguity can lead to errors in positioning and manipulation. To address this, researchers have been exploring various methods to provide clear tactile feedback to users, enhancing their precision and dexterity during telemanipulation.
A new research paper introduces a novel system called TiltXter, which aims to improve the telemanipulation of deformable objects, specifically plastic Pasteur pipettes. The system integrates a Force Dimension Omega.7 haptic interface for the human operator with a 2-finger Robotiq gripper equipped with tactile sensor arrays at the remote site. The core innovation lies in its use of a convolutional neural network (CNN) to detect the tilt of deformable objects and render this information as electro-tactile stimuli to the user.
How TiltXter Works
The TiltXter system operates by sensing the pressure applied to a grasped object at the remote site. High-density tactile sensor arrays embedded in the Robotiq gripper measure local pressure points. This raw sensor data is then processed by a central computer running a CNN algorithm. The CNN is specifically trained to classify the tilt angle of the object based on the tactile sensor input. Once the tilt is recognized, the CNN generates a corresponding tactile pattern. This pattern is then delivered to the user’s fingertips (thumb and index finger) via electro-stimulation arrays mounted on the Omega.7 haptic device. This allows the operator to ‘feel’ the tilt of the remote object, providing crucial haptic feedback.
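The feedback loop above can be sketched in a few lines of Python. Everything concrete here is an illustrative assumption rather than taken from the paper: the tilt classes, the 2×3 electrode grids per fingertip, and the centroid heuristic standing in for the trained CNN.

```python
# Sketch of the TiltXter feedback loop: tactile frame -> tilt class -> electrode pattern.
# Tilt classes, electrode layouts, and the classifier below are illustrative
# assumptions, not taken from the paper.

TILT_CLASSES = [-40, -20, 0, 20, 40]  # hypothetical tilt angles in degrees

# Hypothetical electro-tactile patterns: one 2x3 electrode grid per fingertip,
# 1 = electrode active. The CNN emits the class; the pattern is then looked up.
PATTERNS = {
    -40: {"thumb": [[1, 0, 0], [1, 0, 0]], "index": [[0, 0, 1], [0, 0, 1]]},
    -20: {"thumb": [[1, 1, 0], [0, 0, 0]], "index": [[0, 1, 1], [0, 0, 0]]},
      0: {"thumb": [[0, 1, 0], [0, 1, 0]], "index": [[0, 1, 0], [0, 1, 0]]},
     20: {"thumb": [[0, 1, 1], [0, 0, 0]], "index": [[1, 1, 0], [0, 0, 0]]},
     40: {"thumb": [[0, 0, 1], [0, 0, 1]], "index": [[1, 0, 0], [1, 0, 0]]},
}

def classify_tilt(frame):
    """Stand-in for the trained CNN: estimate tilt from the pressure centroid.

    `frame` is a rows x cols grid of pressure readings; the real system feeds
    the full frame to a CNN rather than using this centroid heuristic."""
    cols = len(frame[0])
    total = sum(sum(row) for row in frame) or 1
    centroid = sum(c * v for row in frame for c, v in enumerate(row)) / total
    offset = centroid - (cols - 1) / 2          # signed column offset from center
    idx = round(offset / ((cols - 1) / 2) * 2)  # map offset to class index -2..2
    idx = max(-2, min(2, idx))
    return TILT_CLASSES[idx + 2]

def render_feedback(frame):
    """Classify tilt and return the electro-tactile pattern for both fingertips."""
    tilt = classify_tilt(frame)
    return tilt, PATTERNS[tilt]

# A frame whose pressure is skewed toward the rightmost columns reads as a
# strongly positive tilt.
frame = [[0, 0, 1, 3],
         [0, 0, 1, 3],
         [0, 0, 1, 3]]
tilt, pattern = render_feedback(frame)
```

The key design point mirrored here is the indirection: instead of streaming pressure values to the electrodes, the system collapses each frame to a discrete tilt class and plays back a pre-designed, easily distinguishable pattern.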
The choice of Pasteur pipettes for this study was deliberate, as their inherent deformability under gripper pressure makes them an ideal candidate for testing the system’s ability to handle challenging objects. The research explores how different methods of rendering tactile information impact human perception and teleoperation success.
Experimental Evaluation and Results
The researchers conducted several experiments to evaluate the effectiveness of the TiltXter system. Fourteen participants were involved in assessing human perception of pipette tilt, while nine participants took part in teleoperated grasping tasks.
Initially, participants were asked to identify the tilt of pipettes by directly grasping them, achieving an average recognition rate of 77.92%. This established a baseline for natural human perception.
Next, the study evaluated electro-tactile perception using two different rendering methods:
- Downsized Direct Data Rendering: In this method, raw sensor data was simply downsized and directly rendered as electro-tactile stimuli. The results showed a very low recognition rate of 21.67%, indicating that this approach was largely ineffective for users to distinguish tilt angles.
- CNN-based Tactile Pattern Rendering: This method utilized the tactile patterns generated by the CNN based on its tilt classification. With this approach, the recognition rate significantly improved to 57.6%. When patterns were grouped by their orientation (e.g., distinguishing between positive and negative angles), the recognition rate further increased to 72.5%. This demonstrates the substantial benefit of using CNN-generated patterns for clearer tactile feedback.
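The downsizing baseline amounts to average-pooling the high-density pressure frame down to the electrode resolution and thresholding it into on/off stimuli. A minimal sketch follows; the 6×6 sensor and 2×3 electrode resolutions are assumptions for illustration, not the actual hardware dimensions.

```python
# Downsized direct rendering: average-pool a high-density tactile frame down to
# the electrode grid, then threshold into binary stimuli. The 6x6 sensor and
# 2x3 electrode resolutions are illustrative assumptions.

def downsize(frame, out_rows, out_cols):
    """Average-pool `frame` (list of lists) to an out_rows x out_cols grid."""
    in_rows, in_cols = len(frame), len(frame[0])
    rh, cw = in_rows // out_rows, in_cols // out_cols
    pooled = []
    for r in range(out_rows):
        row = []
        for c in range(out_cols):
            block = [frame[r * rh + i][c * cw + j]
                     for i in range(rh) for j in range(cw)]
            row.append(sum(block) / len(block))
        pooled.append(row)
    return pooled

def to_stimuli(pooled, threshold=0.5):
    """Turn pooled pressure values into binary electrode activations."""
    return [[1 if v >= threshold else 0 for v in row] for row in pooled]

# 6x6 frame with pressure concentrated in the upper-left corner.
frame = [[1.0] * 3 + [0.0] * 3 if r < 3 else [0.0] * 6 for r in range(6)]
stimuli = to_stimuli(downsize(frame, 2, 3))
```

A mapping this direct preserves little of the tilt signature once the contact patch is smeared across pooled cells, which is consistent with the low 21.67% recognition rate the study reports for this method.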
A critical experiment involved teleoperated grasping of pipettes at different angles under three conditions:
- Visual Feedback Only: The average success rate for grasping was 53.12%.
- Downsizing Method: The success rate slightly increased to 57.81%.
- CNN Tactile Patterns: The success rate dramatically jumped to 92.18%. This highlights the profound impact of the CNN-based electro-tactile feedback on improving telemanipulation performance.
Furthermore, a NASA Task Load Index (NASA-TLX) questionnaire revealed that using the CNN-generated tactile patterns significantly reduced mental demand, temporal demand, effort, and frustration for operators, while also increasing their perceived performance compared to both visual-only feedback and the downsizing method.
Conclusion and Future Directions
The TiltXter system, with its CNN-based electro-tactile rendering, offers a promising solution for enhancing the telemanipulation of deformable objects. The research clearly demonstrates that while direct downsizing of tactile data is largely ineffective, the application of CNN-generated tactile patterns can significantly improve an operator’s perception of object tilt and lead to a much higher success rate in teleoperated grasping tasks. This approach also reduces the cognitive and physical workload on the human operator, making telemanipulation more intuitive and efficient.
The authors, Miguel Altamirano Cabrera, Jonathan Tirado, Aleksey Fedoseev, Oleg Sautenkov, Vladimir Poliakov, Pavel Kopanev, and Dzmitry Tsetserukou, suggest that future work will focus on acquiring a more extensive dataset to enable more accurate tilt detection at higher angular resolutions. The proposed system has potential for broad application, particularly in remote co-working labs, where it could support dexterous telemanipulation of a wider range of laboratory instruments.