Modern methodologies, developed primarily for brain-computer interfaces, provide a framework for classifying single-trial brain responses into one of several categories. Our goal was to associate brain responses with the category of visually presented objects. Subjects viewed image exemplars of five categories (Faces/Cars/Planes/Eggs/Watches) in a rapid serial visual presentation (RSVP, 10 Hz). EEG data were collected from 64 channels at high temporal resolution, yielding large spatio-temporal representations of single-trial brain activity. Subjects were instructed to look for targets of a given category; brain responses were collected for all presented categories. Trial-averaged event-related potentials (ERPs) to all exemplars of each category were computed per subject. We found that, within subjects, responses to same-category stimuli were highly reproducible. In contrast, different subjects’ ERPs to the same category were highly variable. We tested single-trial categorization performance with a self-developed algorithm for classifying event-related responses. We also used our method to explore the latencies of maximal category discrimination and the associated discriminating brain patterns. We found that maximal category discrimination occurs as early as ~150-200 ms post-stimulus and is located mostly over occipito-parietal regions, in line with imaging reports of object-selective areas adjacent to early visual cortex. In conclusion, we found early category-discriminative activity for a variety of object categories, beyond the well-established categorical response to faces (the N170). The within-subject brain response to each category is only mildly variable in time and space, but its precise distribution is subject-specific.
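The abstract does not specify the classification algorithm, so the following is only a minimal sketch of the general pipeline it describes: computing trial-averaged ERPs per category and then labeling single trials. All dimensions, the simulated data, and the nearest-template rule are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the abstract): trials x channels x timepoints
n_trials, n_channels, n_times = 50, 64, 100
categories = ["face", "car", "plane", "egg", "watch"]

# Simulated single-trial EEG epochs per category (stand-in for real recordings);
# each category is given a different mean offset so the classes are separable
epochs = {c: rng.normal(loc=2.0 * i, scale=5.0, size=(n_trials, n_channels, n_times))
          for i, c in enumerate(categories)}

# Trial-averaged ERP per category: the mean over trials, as in the abstract
erps = {c: x.mean(axis=0) for c, x in epochs.items()}

def classify(trial):
    """Nearest-template rule: assign the category whose trial-averaged ERP
    is closest in Euclidean distance. This is an illustrative stand-in for
    the abstract's unspecified single-trial classification algorithm."""
    return min(erps, key=lambda c: np.linalg.norm(trial - erps[c]))

# Label one simulated single trial drawn from the "face" condition
pred = classify(epochs["face"][0])
```

In practice one would also hold out the tested trial from the template average (cross-validation) and could restrict the time window to, e.g., 150-200 ms post-stimulus to probe the discrimination latencies the abstract reports.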
Title of host publication: Conference Abstract: XI International Conference on Cognitive Neuroscience (ICON XI). doi: 10.3389/conf.fnhum
State: Published - 2011