Auditory hallucinations are thought to arise through the misidentification of self-generated verbal material as alien. The neural mechanisms that normally mediate the differentiation of self-generated from nonself speech are unclear. We investigated this in healthy volunteers using functional MRI. Eleven healthy volunteers were scanned whilst listening to a series of prerecorded words. The source (self/nonself) and acoustic quality (undistorted/distorted) of the speech were varied across trials. Participants indicated via a button press whether the words were spoken in their own or another person's voice. Listening to self-generated words was associated with more activation in the left inferior frontal and right anterior cingulate cortex than listening to words in another person's voice, which in turn was associated with greater engagement of the lateral temporal cortex bilaterally. Listening to distorted speech was associated with activation in the inferior frontal and anterior cingulate cortex. There was an interaction between the effects of the source of the speech and of distortion on activation in the left temporal cortex. In the presence of distortion, participants were more likely to misidentify their own voice as that of another. This misattribution of self-generated speech was associated with reduced engagement of the cingulate and prefrontal cortices. The evaluation of auditory speech thus involves a network including the inferior frontal, anterior cingulate, and lateral temporal cortex, and the degree to which different areas within this network are engaged varies with the source and acoustic quality of the speech. Accurate identification of one's own speech appears to depend on cingulate and prefrontal activity.
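For readers unfamiliar with factorial fMRI designs, the source-by-distortion interaction reported above corresponds to a contrast of the following general form. This is a minimal sketch of a standard 2×2 interaction term, not the paper's specific analysis; the condition means here are assumed labels for the mean responses in each of the four trial types.

\[
\text{Interaction} = \left(\mu_{\text{self, distorted}} - \mu_{\text{self, undistorted}}\right) - \left(\mu_{\text{other, distorted}} - \mu_{\text{other, undistorted}}\right)
\]

A nonzero value of this contrast in a region (here, the left temporal cortex) indicates that the effect of acoustic distortion on activation differs depending on whether the voice is one's own or another person's.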

Original publication

DOI: 10.1002/hbm.20120
Type: Journal article
Journal: Hum Brain Mapp
Publication Date: 09/2005
Volume: 26
Pages: 44-53
Keywords: Acoustic Stimulation, Adult, Brain Mapping, Humans, Image Processing, Computer-Assisted, Magnetic Resonance Imaging, Male, Speech, Speech Perception