Hand-selective visual regions represent how to grasp 3D tools: brain decoding during real actions

Research output: Contribution to journal › Article

Most neuroimaging experiments that investigate how tools and their actions are represented in the brain use visual paradigms in which tools or hands are displayed as 2D images and no real movements are performed. These studies discovered selective visual responses in occipito-temporal and parietal cortices for viewing pictures of hands or tools, which are assumed to reflect action processing, but this assumption has rarely been directly investigated. Here, we examined the responses of independently visually defined category-selective brain areas when participants grasped 3D tools (N=20; 9 females). Using real-action fMRI and multi-voxel pattern analysis, we found that grasp typicality representations (i.e., whether a tool is grasped appropriately for use) were decodable from hand-selective areas in occipito-temporal and parietal cortices, but not from tool-, object-, or body-selective areas, even where these areas partially overlapped. Importantly, these effects were specific to actions with tools and did not occur for biomechanically matched actions with control non-tools. In addition, grasp typicality decoding was significantly higher in hand-selective than tool-selective parietal regions. Notably, grasp typicality representations were automatically evoked even when there was no requirement for tool use and participants were naïve to object category (tools vs. non-tools). Finding a specificity for typical tool grasping in hand-selective, rather than tool-selective, regions challenges the long-standing assumption that activation for viewing tool images reflects sensorimotor processing linked to tool manipulation. Instead, our results show that typicality representations for tool grasping are automatically evoked in visual regions specialised for representing the human hand, the brain’s primary tool for interacting with the world.


Original language: English
Journal: The Journal of Neuroscience
Early online date: 10 May 2021
Publication status: E-pub ahead of print - 10 May 2021

Bibliographic note

Pre-print: https://www.biorxiv.org/content/10.1101/2020.10.14.339606v2
Open data: https://openneuro.org/datasets/ds003342/versions/1.0.0



Related by author
  1. Identifying and detecting facial expressions of emotion in peripheral vision

    Research output: Contribution to journal › Article

  2. Visuomotor control in the healthy and damaged brain

    Research output: Chapter in Book/Report/Conference proceeding › Chapter (peer-reviewed)

  3. Editorial for Special Issue on Neglect Rehabilitation

    Research output: Contribution to journal › Article
