Scene Search Guidance under Salience-driven and Memory-driven Demands

Publication Year:
2017
Repository URL:
https://scholarcommons.sc.edu/etd/4283
Author(s):
Olejarczyk, Jenn Ha’aheo
Tags:
Scene Search; Memory-driven; Salience-driven; Experimental Analysis of Behavior; Psychology; Social and Behavioral Sciences
Abstract:
Visual search involves selecting relevant information while ignoring irrelevant information. Most search models predict which relevant features attract gaze, yet few consider search guidance from prior knowledge of scenes. This dissertation used eye movements to examine the guidance of attention when an immediate or delayed distractor appeared during novel and repeated searches.

The experiments showed efficient search for repeated scenes, a classic contextual-cueing result. During repeated searches, an immediate attentional bias was found toward distractors close to the target location. Automatic and controlled selective-attention processes, measured with the antisaccade task, were found within search behavior. The final experiment showed that an automatic mechanism accounted for implicit, rather than explicit, associative learning of a consistent target location within a repeated scene. Additionally, a controlled mechanism was related to successful identification of the search target.

Taken together, the findings support an immediate, implicit guidance of attention that biases initial scene searches. After enough time passes, explicit guidance can direct the eyes to a known target location. The early implicit bias from conceptual short-term memory, an abstraction of object-scene relationships, suggests that task demands prioritize objects relevant for efficient search in familiar scenes.