A probabilistic model of visual working memory: Incorporating higher-order regularities into working memory capacity estimates.
When remembering a real-world scene, people encode both detailed information about specific objects and higher-order information, like the overall gist of the scene. However, formal models of change detection, such as those used to estimate visual working memory capacity, assume observers encode only a simple memory representation that includes no higher-order structure and treats items independently of one another. We present a probabilistic model of change detection that attempts to bridge this gap by formalizing the role of perceptual organization and allowing for richer, more structured memory representations. Using either standard visual working memory displays or displays in which the items are purposefully arranged in patterns, we find that models that take into account perceptual grouping between items and the encoding of higher-order summary information are necessary to account for human change detection performance. Considering the higher-order structure of items in visual working memory will be critical for models to make useful predictions about observers' memory capacity and change detection abilities, both in simple displays and in more natural scenes.
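As an illustration of the item-independent capacity estimates the abstract contrasts with, the standard approach in single-probe change detection is Cowan's K, which converts hit and false-alarm rates into an estimated number of items held in memory while treating every item as stored independently. The sketch below is a minimal illustration of that formula, not the probabilistic model proposed in the paper; the function name and example rates are ours.

```python
def cowans_k(set_size: int, hit_rate: float, false_alarm_rate: float) -> float:
    """Cowan's K for single-probe change detection.

    Assumes each of the `set_size` items is either stored perfectly or
    not at all, independently of the others -- exactly the simplifying
    assumption the paper argues against for structured displays.
    """
    return set_size * (hit_rate - false_alarm_rate)


# Hypothetical example: 8-item display, 75% hits, 25% false alarms
k = cowans_k(8, 0.75, 0.25)
print(k)  # estimated capacity of 4.0 items
```

Because this estimate ignores perceptual grouping and summary statistics, displays whose items form patterns can yield apparent capacities that exceed what an independent-item model predicts, which is the gap the paper's probabilistic model is designed to close.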