A virtual forearm can bend in a blink. It can also take its time, easing toward a target as if it's thinking about the move.
In a new virtual reality study, both extremes felt wrong. When a prosthetic arm moves on its own, speed turns out to be more than a performance setting. It can shape whether the arm feels like it belongs to you, whether you feel any control over it, whether you would want to use it, and even whether the “robot” comes across as capable or unsettling. Researchers found that the sweet spot for acceptance was a movement that closely resembled a natural human reach: roughly one second.
The work, published this month, comes from Harin Manujaya Hapuarachchi and colleagues. Hapuarachchi was a doctoral student when the study was conducted and is now an Assistant Professor in the School of Informatics at Kochi University of Technology. Their research addresses a question that sits a step ahead of today’s prosthetics: how to make autonomous prosthetic limbs feel truly integrated with the user.
Much of current prosthetic research focuses on helping users control artificial limbs through intention, often by reading biosignals such as electromyography (EMG) or electroencephalography (EEG). But as machine learning improves, the possibility of prosthetic devices that sometimes act autonomously or semi-autonomously – assisting a user by moving without a direct command – is becoming increasingly realistic. These systems could anticipate needs and provide support automatically.
That promise, however, carries a risk. If a body part moves independently of your will, it can feel “unsettling” or like it is not part of you. This mismatch could become a major barrier to real-world acceptance.
A speed test for “does this feel like my arm?”
To probe this problem safely, the researchers used virtual reality to simulate a scenario in which a participant’s own arm had been replaced by a robotic prosthetic forearm. Nineteen male university students with an average age of 24.15 years participated in the study, which was approved by the Ethical Committee for Human Subject Research at Toyohashi University of Technology, and all participants provided written informed consent.
Participants wore a high-resolution head-mounted display (Varjo Aero, 2880 × 2720 pixels per eye, 90 Hz) and a motion capture suit tracked by a VICON system with 12 cameras recording at 250 Hz. A rigid brace on each participant's real left arm prevented bending at the elbow, ensuring that the virtual prosthetic, rather than the real arm, performed the bending motion.
The task involved a reaching exercise. A purple sphere, 5 centimeters in diameter, appeared in front of the avatar. Participants moved their upper arm to bring the virtual elbow toward the sphere. Once the elbow was close enough, the virtual prosthetic forearm autonomously flexed toward the target in a minimum-jerk trajectory.
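For a concrete sense of what a minimum-jerk trajectory means, here is a minimal Python sketch of the standard quintic formulation, in which position eases in and out with zero velocity and acceleration at both endpoints. The joint angles and function names are illustrative assumptions, not details from the study.

```python
import numpy as np

def minimum_jerk(start, goal, duration, n_steps=100):
    """Standard minimum-jerk profile: position follows a quintic
    polynomial of normalized time, with zero velocity and zero
    acceleration at both endpoints, so motion eases in and out."""
    t = np.linspace(0.0, duration, n_steps)
    tau = t / duration  # normalized time in [0, 1]
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return t, start + (goal - start) * s

# Illustrative values only: a 90-degree elbow flexion completed in
# 1 second, the duration participants rated most favorably.
t, elbow_angle = minimum_jerk(start=0.0, goal=90.0, duration=1.0)
```

Because the profile is defined in normalized time, stretching or compressing the duration reproduces conditions like the study's six speed settings without changing the shape of the motion.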
Testing six movement durations
The key experimental variable was the duration of that autonomous bend. The study tested six movement durations: 125 milliseconds, 250 milliseconds, 500 milliseconds, 1 second, 2 seconds, and 4 seconds. Each condition consisted of 15 reaches, and each participant completed two blocks per condition.
After each block, participants filled out questionnaires measuring embodiment, usability, and social impressions of the prosthetic as a robot. Embodiment was assessed through two components: a sense of agency and a sense of ownership. Agency was measured by agreement with the statement, “The movements of the virtual prosthetic arm seemed to be my movements.” Ownership was measured by agreement with, “I felt as if the virtual prosthetic arm I saw was my own left arm.” Usability was measured using the System Usability Scale (SUS), which produces a score from 0 to 100. Social impressions were measured using the Robotic Social Attributes Scale (RoSAS), which assesses competence, warmth, and discomfort.
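For reference, the SUS arrives at that 0-to-100 score through a fixed arithmetic recipe applied to ten 1-to-5 ratings. The sketch below shows the conventional calculation; the example ratings are invented, not the study's data.

```python
def sus_score(ratings):
    """Standard SUS scoring: odd-numbered items contribute
    (rating - 1), even-numbered items contribute (5 - rating);
    the summed contributions are multiplied by 2.5, giving 0-100."""
    assert len(ratings) == 10
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # indices 0, 2, ... are items 1, 3, ...
        for i, r in enumerate(ratings)
    )
    return total * 2.5

# Made-up example ratings for the ten SUS items:
print(sus_score([4, 2, 4, 2, 5, 1, 4, 2, 4, 2]))  # -> 80.0
```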
One second felt best, and the extremes paid a penalty
Across all measurements, the one-second condition consistently stood out. Statistical analysis revealed that ownership in the 1-second condition was significantly higher than in the fastest and slowest conditions, and ownership at 500 milliseconds was also significantly higher than at 125 milliseconds and 4 seconds. Agency followed the same pattern: both the 1-second and 500-millisecond conditions produced significantly higher agency than the 125-millisecond and 4-second extremes.
Usability also climbed with the more human-like timing. The 1-second condition yielded significantly higher usability scores than the 125-millisecond and 4-second conditions, and the 500-millisecond condition also scored significantly higher than the 125-millisecond condition. The 2-second condition scored significantly higher than 4 seconds, while its lower score relative to 1 second fell just short of statistical significance.
In short, very fast and very slow movements both diminished the feeling that the prosthetic was part of the body and reduced its usability. The middle range, particularly 1 second, produced the strongest sense of ownership and agency, along with the best usability.
Social impressions shift too, especially discomfort
The “robot personality” scores also shifted with speed, though not uniformly. Competence ratings at 500 milliseconds and 1 second were significantly higher than at 4 seconds. Competence at 1 second was also significantly higher than at 2 seconds. There were no significant differences among the faster and moderate conditions (125 milliseconds, 250 milliseconds, 500 milliseconds, and 1 second).
Warmth did not show a clear dependence on speed. However, discomfort spiked when the prosthetic moved fastest. The 125-millisecond condition produced significantly higher discomfort than the 500-millisecond, 1-second, 2-second, and 4-second conditions.
Optimizing solely for speed, then, could result in a device perceived as capable but also unsettling or awkward: the fastest movement was the most uncomfortable to experience.
Why one second might feel “right”
The researchers suggest that 1 second may be close to the timing people naturally expect from reaching movements. They cite prior work which reported that when people reached “naturally” under instructions to be accurate, they tended to choose a movement duration close to 1 second. This resemblance could explain why embodiment and usability peaked at 1 second in this experiment.
However, the paper cautions that reaching timing can vary depending on the task. Fitts’ law predicts that movement duration changes with factors like distance to the target and target size. A “human-like” duration might not always be one second in every situation.
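In its common form, Fitts' law states MT = a + b · log2(2D/W), where D is the distance to the target, W is its width, and a and b are constants fitted empirically per person and task. The Python sketch below illustrates how predicted movement time shifts with target geometry; the coefficient values are arbitrary placeholders, not fitted parameters from any real experiment.

```python
import math

def fitts_movement_time(distance, width, a=0.2, b=0.15):
    """Fitts' law: MT = a + b * log2(2D / W), where D is the distance
    to the target and W is the target width. The coefficients a and b
    here are arbitrary placeholders for illustration."""
    index_of_difficulty = math.log2(2 * distance / width)
    return a + b * index_of_difficulty

# A nearer or larger target predicts a shorter movement:
print(fitts_movement_time(distance=0.30, width=0.05))  # harder reach, ~0.74 s
print(fitts_movement_time(distance=0.15, width=0.10))  # easier reach, ~0.44 s
```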
Interestingly, the study also revealed a behavioral pattern. Participants slowed their own movements when the prosthetic was slow, almost as if they were matching their pace to the “arm’s” movement.
Limits of a virtual arm, and why VR still helps
This study deliberately used a virtual prosthetic with healthy participants and a brace to constrain real arm bending. This design allowed researchers to isolate speed as the primary variable, but it cannot fully replicate the experience of an amputee using a physical device. Factors such as the forces generated by a physical prosthesis, its weight, and the forces at the connection points with the residual limb were not included.
The authors also note that in their setup, the target was always visible, making the prosthetic’s intention more predictable. Prior work suggests predictability can increase agency and ownership, so future studies may need to explore scenarios with less clear intention. They also suggest expanding the measurements beyond subjective questionnaires, using methods such as intentional binding or physiological measures, to capture agency and embodiment in different ways.
Despite these limitations, VR offers a practical advantage. It can simulate prosthetic control styles that are not yet widespread, allowing designers to test acceptance issues early in the development process.
