
Device- and system-independent personal touchless user interface for operating rooms

Overview of attention for an article published in International Journal of Computer Assisted Radiology and Surgery, March 2016

About this Attention Score

  • Average Attention Score compared to outputs of the same age

Mentioned by
1 X user (X/Twitter)

Citations
18 (Dimensions)

Readers
53 (Mendeley)

Title
Device- and system-independent personal touchless user interface for operating rooms
Published in
International Journal of Computer Assisted Radiology and Surgery, March 2016
DOI 10.1007/s11548-016-1375-6
Pubmed ID
Authors

Meng MA, Pascal Fallavollita, Séverine Habert, Simon Weidert, Nassir Navab

Abstract

In the modern operating room, the surgeon performs surgeries with the support of different medical systems that display patient information, physiological data, and medical images. Numerous interactions are required of the surgical team to control the corresponding medical system and retrieve the desired information. Joysticks and physical keys are still present in the operating room because of the disadvantages of the mouse, and surgeons often relay instructions to the surgical team when they require information from a specific medical system. In this paper, a novel user interface is developed that allows the surgeon to personally perform touchless interaction with the various medical systems and to switch effortlessly among them, all without modifying the systems' software or hardware. To achieve this, a wearable RGB-D sensor is mounted on the surgeon's head for inside-out tracking of his/her finger relative to any of the medical systems' displays. Android devices running a dedicated application are connected to the computers on which the medical systems run, each simulating a standard USB mouse and keyboard. When the surgeon interacts using pointing gestures, the desired cursor position on the targeted display and the gestures themselves are transformed into general events and sent to the corresponding Android device. The application running on that device then generates the matching mouse or keyboard events for the targeted medical system. To simulate an operating room setting, our user interface was tested by seven medical participants who performed several interactions with visualizations of CT, MRI, and fluoroscopy images at varying distances from the displays. Results from the System Usability Scale and the NASA-TLX workload index indicated a strong acceptance of our proposed user interface.
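
The abstract describes the pipeline only at a high level; the indented Python sketch below illustrates one plausible reading of the gesture-to-event step. It is a minimal sketch under stated assumptions: the per-display homography H, the JSON-over-TCP wire format, and the bridge address and port are all hypothetical and not taken from the paper.

    # Minimal sketch of the gesture-to-event pipeline outlined in the abstract.
    # Assumptions (not from the paper): a per-display homography H estimated
    # from the tracked screen corners, and a JSON-over-TCP link to the Android
    # bridge device that replays events as USB mouse/keyboard input.
    import json
    import socket

    import numpy as np

    # Homography from the head-mounted RGB-D camera image to the target
    # display's pixel coordinates; placeholder values for illustration.
    H = np.array([[1.2, 0.0, -40.0],
                  [0.0, 1.2, -25.0],
                  [0.0, 0.0,   1.0]])

    def finger_to_cursor(u, v, H):
        """Map the fingertip's image coordinates (u, v) to display pixels."""
        p = H @ np.array([u, v, 1.0])
        return int(p[0] / p[2]), int(p[1] / p[2])

    def send_general_event(sock, kind, x, y):
        """Ship a device-independent event to the Android bridge."""
        event = {"kind": kind, "x": x, "y": y}  # kind: "move" or "click"
        sock.sendall((json.dumps(event) + "\n").encode())

    # Example: a pointing gesture detected at image position (312, 198).
    x, y = finger_to_cursor(312.0, 198.0, H)
    with socket.create_connection(("192.168.0.42", 5555)) as sock:  # assumed address
        send_general_event(sock, "move", x, y)
        send_general_event(sock, "click", x, y)

On the receiving side, the Android application would translate each generic event into the equivalent USB mouse or keyboard input for the host computer, which is what lets the medical systems remain unmodified.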

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 53 Mendeley readers of this research output.

Geographical breakdown

Country     Count   As %
Germany         1     2%
Slovenia        1     2%
Unknown        51    96%

Demographic breakdown

Readers by professional status    Count   As %
Student > Ph.D. Student               8    15%
Student > Master                      8    15%
Researcher                            7    13%
Student > Bachelor                    6    11%
Student > Postgraduate                3     6%
Other                                 7    13%
Unknown                              14    26%

Readers by discipline             Count   As %
Computer Science                     13    25%
Medicine and Dentistry                7    13%
Nursing and Health Professions        4     8%
Engineering                           4     8%
Design                                3     6%
Other                                 5     9%
Unknown                              17    32%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 27 April 2016.
All research outputs: #15,369,653 of 22,865,319
Outputs from International Journal of Computer Assisted Radiology and Surgery: #497 of 847
Outputs of similar age: #179,131 of 300,035
Outputs of similar age from International Journal of Computer Assisted Radiology and Surgery: #17 of 24
Altmetric has tracked 22,865,319 research outputs across all sources so far. This one is in the 22nd percentile – i.e., 22% of other outputs scored the same or lower than it.
So far Altmetric has tracked 847 research outputs from this source. They receive a mean Attention Score of 3.1. This one is in the 30th percentile – i.e., 30% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 300,035 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 31st percentile – i.e., 31% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 24 others from the same source and published within six weeks on either side of this one. This one is in the 29th percentile – i.e., 29% of its contemporaries scored the same or lower than it.
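
For readers curious how the percentile figures above are derived, the rule stated in each paragraph ("X% scored the same or lower") corresponds to a simple rank computation. A minimal Python illustration follows; the scores are made up purely for illustration and are not Altmetric data.

    # Percentile rank as Altmetric describes it: the share of comparable
    # outputs whose score is the same or lower than this output's score.
    # The peer scores below are hypothetical placeholders.
    def percentile_rank(score, peer_scores):
        at_or_below = sum(1 for s in peer_scores if s <= score)
        return 100.0 * at_or_below / len(peer_scores)

    peers = [0, 0, 1, 1, 2, 3, 5, 9, 12, 25]  # hypothetical peer outputs
    print(percentile_rank(1, peers))  # -> 40.0: 40% scored the same or lower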