In this experiment, Dan Cheung Petersen, a student at the University of Copenhagen, uses an Arduino and solenoid valves to provide haptic (air-pressure) feedback while using a Leap Motion device.
Dan Cheung Petersen: This video demonstrates our attempt to create haptic feedback for mid-air interaction, using the Leap Motion system, an Arduino, solenoid valves, and a bit of creativity.
The project is our master's thesis at the University of Copenhagen, under the supervision of Professor MSO Kasper Hornbæk.
Read further for our full abstract:
Mid-air interaction has become more common and increasingly popular as smart TVs have brought it into our living rooms. Commercial mid-air products can transform regular TV sets into interactive systems that understand the gestures we make.
Furthermore, over the last decade there have been improvements
in the recognition capabilities of these products with regard to more refined movement, such as the hands and individual fingers. While mid-air interaction has permitted natural interaction
between humans and computers, it lacks the realism of touch that we encounter with
traditional interaction methods like the mouse and keyboard. Touch enables users to gain
information about their actions, but this information is absent in mid-air interaction. We therefore propose a method of reinstating the lost haptic feedback in mid-air interaction.
This thesis attempts to investigate how to provide haptic feedback for mid-air interaction;
the question is then how to create feedback when the interactions happen in mid-air and
there is no mouse to click or screen to touch. What both mouse and touchscreen interaction
have in common is that they both rely on touch as input. Many researchers have tried to produce haptic feedback, but their solutions often require the user to wear equipment. Other
researchers have shown that guidance in mid-air interaction improves performance; however,
the combination of these findings is a relatively unexplored area.
How can the modality of touch be incorporated into mid-air interaction without constraining users with gear that tediously has to be worn and that disrupts their freedom of movement?
In order to achieve an untethered integration of haptic feedback in mid-air interaction, a
prototype system capable of producing non-contact haptic feedback had to be developed. For
this purpose, a pneumatic and an ultrasound prototype were developed. Furthermore, the
LEAP Motion system was used, as it fulfils the requirement of being able to track the hand.
This thesis describes the study of related work, and the design proposals and design processes of the prototypes that we developed. The prototypes were inspired by related work and formed into sketch ideas. The sketch ideas were tested to assess the potential of each one; the most efficient ideas were refined and developed into two prototype concepts.
Furthermore, this thesis describes the design and analysis of empirical tests. In order
to test the problem statement, three tasks were developed. These tasks measured users’
precision, reaction time, and a combination of these factors in mid-air interaction.
The results from the addition of haptic feedback were compared with the results of tests with
visual feedback and tests with a combination of haptic and visual feedback.
The experiments confirmed our hypothesis that adding non-contact haptic feedback to visualisations improves performance.
Even though we addressed our problem statement and confirmed our hypothesis, our solution was not without flaws. We were only able to produce non-contact haptic feedback in one dimension, although we believe that the addition of multidimensionality might have increased performance even further.