Project

The influence of hand proximity on cognitive and emotional information processing with multi-touch interfaces

The use of interactive multi-touch displays continues to grow. Touch-based user interfaces such as multi-touch tables, tablets, and smartphones can therefore be found in many public facilities and private households today. These interfaces allow direct manipulation of external representations with the hands, without the need for additional, indirect interaction devices (e.g., a mouse). This raises the question of how manual interactions should be designed to support information processing.

Previous investigations using simple reaction-time tasks have revealed that certain stimulus categories are processed more efficiently near the hands than others. For instance, it has been demonstrated that visuospatial information (e.g., pictures) is processed better near the hands (Reed, Grubb, & Steele, 2006), whereas semantic information (e.g., words, sentences) is either processed less efficiently (Davoli, Du, Montana, & Garverick, 2010) or remains unaffected by hand proximity (Wang, Du, He, & Zhang, 2014), compared to conditions in which the information is presented farther away from the hands.

Within the framework of this project, various studies have expanded these investigations, examining the influence of hand proximity on the learning of visuospatial and verbal contents, on cognitive processing (e.g., attention, working memory), and on emotion regulation. Results indicate that visuospatial content from pictures is better remembered when learned near the hands, whereas hand proximity has no effect on learning verbal content from texts (Brucker, Brömme, Ehrmann, Edelmann, & Gerjets, 2021). Furthermore, presenting positive content near the hands leads to better emotion regulation than presenting the same content at a greater distance from the hands (Ruiz Fernández, Lachmair, Rahona, & Gerjets, 2016). Future studies will assess the generalizability of these findings in additional application areas.

  • Brucker, B., Brömme, R., Ehrmann, A., Edelmann, J., & Gerjets, P. (2021). Touching digital objects directly on multi-touch devices fosters learning about visual contents. Computers in Human Behavior, 119, Article 106708. https://doi.org/10.1016/j.chb.2021.106708
  • Brömme, R., Gottschling, S., Brucker, B., & Gerjets, P. (2017, March). Upside down: Hand proximity fosters attentional processing but not cognitive control in a visuospatial task-switching paradigm. 59th Conference of Experimental Psychologists (TeaP), Dresden, Germany. [Talk]
  • Ruiz Fernández, S., Lachmair, M., Rahona, J. J., & Gerjets, P. (2016, March). Grasping feelings: Proximity of hands to positive pictures boosts mood recovery. 58th Conference of Experimental Psychologists (TeaP), Heidelberg, Germany. [Talk]

Duration

01/2013 - ongoing

Funding

IWM budget resources
