Talk:Crossmodal and multisensory interactions between vision and touch


    The authors wrote an interesting article on touch-vision integration of object information in the brain. My main comment is that the title suggests a much broader review of the topic than is actually provided: rather than covering touch-vision integration in general, about three-quarters of the article deals with tactile activation in LOC and a model of haptic object processing. The authors acknowledge this in the introduction, but should then either adjust the title accordingly or widen their scope. There is an interesting body of work on (Bayesian) touch-vision integration by the group of Prof. Marc Ernst (e.g., "Optimal integration of shape information from vision and touch"), as well as fundamental work by Profs. Lederman and Klatzky, that deserves mention. It would also be interesting to discuss (1) feedforward convergence from lower-level, sensory-specific areas to higher-order, heteromodal areas versus (2) the processing of multimodal information in primary unisensory areas (e.g., M. Liang et al., "Primary sensory cortices contain distinguishable spatial patterns of activity for each sense", Nat. Commun., 2013).
