

Title: How Your Sense of Touch Can Change Your Brain
Your brain can be divided into various areas, one of which is responsible for your sense of touch. This part of your brain can be divided into even smaller areas that communicate with each body part. We can use a special map of the human body, called a sensory homunculus, to help us understand the various sizes of these parts of the brain. We will explain how this map was created and tell you about research showing how these brain areas can change. One study showed that brain areas can be recycled, meaning that the brain areas that no longer receive messages from the body can be used by other functioning brain areas. Another study showed that these changes can even occur within a single day! These studies can help scientists to better understand the brain and to help people who have problems with the sense of touch.
Award ID(s): 2145412
NSF-PAR ID: 10416806
Author(s) / Creator(s):
Date Published:
Journal Name: Frontiers for Young Minds
Volume: 10
ISSN: 2296-6846
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

    There is growing interest concerning the ways in which the human body, both one's own and that of others, is represented in the developing human brain. In two experiments with 7‐month‐old infants, we employed advances in infant magnetoencephalography (MEG) brain imaging to address novel questions concerning body representations in early development. Experiment 1 evaluated the spatiotemporal organization of infants’ brain responses to being touched. A punctate touch to infants’ hands and feet produced significant activation in the hand and foot areas of contralateral primary somatosensory cortex as well as in other parietal and frontal areas. Experiment 2 explored infant brain responses to visually perceiving another person's hand or foot being touched. Results showed significant activation in early visual regions and also in regions thought to be involved in multisensory body and self–other processing. Furthermore, observed touch of the hand and foot activated the infant's own primary somatosensory cortex, although less consistently than felt touch. These findings shed light on aspects of early social cognition, including action imitation, which may build, at least in part, on infant neural representations that map equivalences between the bodies of self and other.

     
  2.
    Individual Development Plans (IDPs) have been used to support the career and professional development of graduate students across disciplines. This interactive session focuses on how IDPs can help you build your skills, form your professional identity, and take control of your career through an iterative process of self‐assessment, career exploration, decision making, and goal setting. To provide an introduction to the IDP process, attendees will take a self‐assessment to learn about their particular strengths and begin to target those strengths toward their professional goals. Additionally, a co‐developer and researcher of IDP platforms will highlight the aspects that can make your IDP more effective.
  3. Face touch is an unconscious human habit. Frequent touching of sensitive/mucosal facial zones (eyes, nose, and mouth) increases health risks by passing pathogens into the body and spreading diseases. Furthermore, accurate monitoring of face touch is critical for behavioral intervention. Existing monitoring systems only capture objects approaching the face, rather than detecting actual touches. As such, these systems are prone to false positives upon hand or object movement in proximity to one's face (e.g., picking up a phone). We present FaceSense, an ear-worn system capable of identifying actual touches and distinguishing touches to sensitive/mucosal areas from touches to other facial areas. Following a multimodal approach, FaceSense integrates low-resolution thermal images and physiological signals. Thermal sensors sense the thermal infrared signal emitted by an approaching hand, while physiological sensors monitor impedance changes caused by skin deformation during a touch. Processed thermal and physiological signals are fed into a deep learning model (TouchNet) to detect touches and identify the facial zone of the touch. We fabricated prototypes using off-the-shelf hardware and conducted experiments with 14 participants while they performed various daily activities (e.g., drinking, talking). Results show a macro-F1-score of 83.4% for touch detection with leave-one-user-out cross-validation and a macro-F1-score of 90.1% for touch zone identification with a personalized model (see the evaluation sketch after this list).
  4. Publishing has always been a part of academic tradition, and there is increasing pressure on faculty, even those who carry heavy teaching loads, to publish. This article, based on a presentation at the IUBMB 2019 Education Conference session on Publishing in Education, contains suggestions on how to conduct educational research with an eye toward publishing your findings.
  5. Category selectivity is a fundamental principle of organization of perceptual brain regions. Human occipitotemporal cortex is subdivided into areas that respond preferentially to faces, bodies, artifacts, and scenes. However, observers need to combine information about objects from different categories to form a coherent understanding of the world. How is this multicategory information encoded in the brain? Studying the multivariate interactions between brain regions of male and female human subjects with fMRI and artificial neural networks, we found that the angular gyrus shows joint statistical dependence with multiple category-selective regions. Adjacent regions show effects for the combination of scenes and each other category, suggesting that scenes provide a context to combine information about the world. Additional analyses revealed a cortical map of areas that encode information across different subsets of categories, indicating that multicategory information is not encoded in a single centralized location, but in multiple distinct brain regions.

    SIGNIFICANCE STATEMENT: Many cognitive tasks require combining information about entities from different categories. However, visual information about different categorical objects is processed by separate, specialized brain regions. How is the joint representation from multiple category-selective regions implemented in the brain? Using fMRI movie data and state-of-the-art multivariate statistical dependence based on artificial neural networks, we identified the angular gyrus as encoding responses across face-, body-, artifact-, and scene-selective regions. Further, we showed a cortical map of areas that encode information across different subsets of categories. These findings suggest that multicategory information is not encoded in a single centralized location, but at multiple cortical sites that might contribute to distinct cognitive functions, offering insights into integration across a variety of domains.

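
As a rough illustration of the evaluation protocol described in item 3 (FaceSense), below is a minimal Python sketch of leave-one-user-out cross-validation with macro-F1 scoring using scikit-learn. The features, labels, and the RandomForestClassifier are hypothetical stand-ins: the actual system extracts fused thermal and physiological features and classifies them with the TouchNet deep learning model, neither of which is shown here.

    # Minimal sketch: leave-one-user-out cross-validation with macro-F1 scoring.
    # Features, labels, and the classifier are placeholders for the fused
    # thermal + physiological features and the TouchNet model in the abstract.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import f1_score
    from sklearn.model_selection import LeaveOneGroupOut

    rng = np.random.default_rng(0)
    n_windows, n_features, n_users = 1400, 32, 14
    X = rng.normal(size=(n_windows, n_features))      # per-window features (placeholder)
    y = rng.integers(0, 2, size=n_windows)            # 1 = touch, 0 = no touch (placeholder)
    users = rng.integers(0, n_users, size=n_windows)  # participant ID for each window

    scores = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=users):
        # Train on all participants except one, test on the held-out participant.
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        pred = clf.predict(X[test_idx])
        # Macro-F1 averages per-class F1, so the rarer "touch" class is weighted equally.
        scores.append(f1_score(y[test_idx], pred, average="macro"))

    print(f"mean leave-one-user-out macro-F1: {np.mean(scores):.3f}")

On random placeholder data this hovers near chance; the point of the sketch is the protocol: each participant's data is held out entirely from training, so the reported macro-F1 reflects generalization to unseen users rather than to unseen windows from familiar users.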