Abstracts

Psychology and mental health

The neural bases of strategy and skill in sentence-picture verification

Article Abstract:

This study used functional magnetic resonance imaging (fMRI) to examine patterns of cortical activation underlying linguistic and visual-spatial reasoning. Participants completed a sentence-picture verification task so that the cortical regions involved in different cognitive strategies could be identified. Findings indicate that language and visual-spatial processing are supported by partially separable networks of cortical regions.
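
In a sentence-picture verification trial, a participant reads a statement such as "The star is above the plus" and then judges whether a simple picture matches it; the linguistic and visual-spatial strategies contrasted in the study differ in whether the sentence is held as a verbal proposition or recoded into a spatial representation before the comparison. As a rough illustration only (the example stimuli and helper names below are invented, not taken from the article), a minimal sketch of the task logic might look like:

```python
from dataclasses import dataclass

@dataclass
class Trial:
    sentence: str        # e.g. "The star is above the plus"
    picture_top: str     # symbol drawn on top in the picture
    picture_bottom: str  # symbol drawn on the bottom

def parse_sentence(sentence: str):
    """Reduce 'The X is above/below the Y' to (figure, ground, figure_is_above)."""
    words = sentence.lower().replace("the ", "").split()
    figure, relation, ground = words[0], words[2], words[3]
    return figure, ground, relation == "above"

def verify(trial: Trial) -> bool:
    """Decide whether the picture matches the sentence."""
    figure, ground, figure_above = parse_sentence(trial.sentence)
    if figure_above:
        return trial.picture_top == figure and trial.picture_bottom == ground
    return trial.picture_bottom == figure and trial.picture_top == ground

print(verify(Trial("The star is above the plus", "star", "plus")))  # True
print(verify(Trial("The plus is above the star", "star", "plus")))  # False
```

In the verbal strategy the sentence is retained in linguistic form until the picture appears, whereas in the visual-spatial strategy it is recoded into an image-like form first; the fMRI question is which cortical networks each route engages.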

Author: Reichle, Erik D., Carpenter, Patricia A., Just, Marcel Adam
Publisher: Elsevier B.V.
Publication Name: Cognitive Psychology
Subject: Psychology and mental health
ISSN: 0010-0285
Year: 2000
Statistical Data Included, Usage, Physiological aspects, Brain, Visual perception, Cognition, Magnetic resonance imaging, Verbal ability, Localization (Brain function)

Tests of the E-Z reader model: Exploring the interface between cognition and eye-movement control

Article Abstract:

The E-Z Reader model is simultaneously tested and refined while the interface between visual processing, language processing, and eye-movement control in reading is explored. Although a parallel model may be able to duplicate the predictions of the E-Z Reader model, it is likely to be far less parsimonious and not as good a heuristic device for using eye movements to understand language processing in reading.
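
E-Z Reader is a serial model: lexical processing of the currently attended word gates both the programming of the next saccade (an early familiarity check) and the shift of attention (completion of lexical access). As a rough illustration only, with invented parameter values rather than the model's published estimates, a minimal sketch of that serial control loop might look like:

```python
import random

# Hypothetical parameters, not the published E-Z Reader estimates.
FAMILIARITY_BASE = 100  # ms, first stage: familiarity check
ACCESS_BASE = 50        # ms, second stage: completion of lexical access
SACCADE_PROGRAM = 125   # ms, saccade-programming time

def stage_time(base, frequency):
    """Both lexical stages finish faster for higher-frequency words."""
    return base * (1.0 - 0.5 * frequency) + random.gauss(0, 10)

def simulate_sentence(words):
    """Serial control loop: the familiarity check on word n launches the
    saccade program toward word n+1, and attention shifts only when
    lexical access of word n is complete."""
    fixations = []
    for word, frequency in words:
        l1 = stage_time(FAMILIARITY_BASE, frequency)  # triggers saccade programming
        l2 = stage_time(ACCESS_BASE, frequency)       # triggers the attention shift
        fixation_duration = l1 + SACCADE_PROGRAM      # fixation ends when the saccade fires
        fixations.append((word, round(fixation_duration), round(l1 + l2)))
    return fixations

sentence = [("the", 0.9), ("reader", 0.4), ("fixated", 0.2), ("briefly", 0.3)]
for word, fixation, access in simulate_sentence(sentence):
    print(f"{word:8s} fixation ~ {fixation} ms, lexical access ~ {access} ms")
```

Because the next saccade cannot be programmed until the current word passes its familiarity check, processing is strictly serial; a parallel competitor would let lexical processing of several words overlap, which is the extra machinery the abstract describes as less parsimonious.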

Author: Reichle, Erik D., Rayner, Keith, Pollatsek, Alexander
Publisher: Elsevier B.V.
Publication Name: Cognitive Psychology
Subject: Psychology and mental health
ISSN: 0010-0285
Year: 2006
Human behavior, Eye, Eye movements, Cognitive psychology, Reading, Modeling behavior

Phonological codes are automatically activated during reading: evidence from an eye movement priming paradigm

Article Abstract:

A study of the responses of 48 members of the University of Massachusetts to sentences containing homophones, words with the same pronunciation but different spellings, revealed that phonological codes are automatically activated during eye fixations while reading. Primes for a given target word were either identical to the target, a homophone of the target, a visually similar nonhomophone, or an unrelated word.
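
As a rough illustration only, with invented example items rather than the article's stimuli, the four prime conditions in such a paradigm might be laid out as follows:

```python
# Hypothetical stimulus set illustrating the four prime conditions for one
# target word; the actual items and counterbalancing are described in the article.
TARGET = "beach"

prime_conditions = {
    "identical":      "beach",  # the target word itself
    "homophone":      "beech",  # same pronunciation, different spelling
    "visual_control": "bench",  # visually similar nonhomophone
    "unrelated":      "sauce",  # shares neither spelling nor sound
}

# Design logic: if phonological codes are activated automatically during the
# brief prime fixation, the homophone prime should facilitate processing of
# the target more than the visually similar control does.
for condition, prime in prime_conditions.items():
    print(f"{condition:14s} prime: {prime} -> target: {TARGET}")
```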

Author: Rayner, Keith, Pollatsek, Alexander, Sereno, Sara C., Lesch, Mary F.
Publisher: Blackwell Publishers Ltd.
Publication Name: Psychological Science
Subject: Psychology and mental health
ISSN: 0956-7976
Year: 1995
Research, Phonetics, Oral interpretation

Subjects list: United States, Analysis
Similar abstracts:
  • Abstracts: The fallacy of reducing rape and violent recidivism by treating anger. Concurrent cross-validation of the self appraisal questionnaire: a tool for assessing violent and non-violent recidivism and institutional adjustment on a sample of North Carolina offenders
  • Abstracts: On the uses of history in psychiatry: Diagnostic implications for anorexia nervosa. Research training in anorexia nervosa
  • Abstracts: Mechanical properties of foods used in experimental studies of primate masticatory function. Agonistic behavior and dominance relationships in female Phayre's leaf monkeys-preliminary results
  • Abstracts: Change in maternal perception of sibling negativity: Within- and between-family influences. part 2 Commentary: Siblings in their families
  • Abstracts: False memories lack perceptual detail: Evidence from implicit word-stem completion and perceptual identification tests