Viewpoint-dependent recognition performance for 3-D objects has been taken as evidence for viewpoint-dependent object representations. We investigate whether these results can instead be explained by viewpoint and object-property information not being detected independently (i.e., being correlated) at a lower level, prior to object recognition. To test this idea, we introduce a combination of multidimensional signal detection theory and perturbation analysis.
In Study 1, we measured low-level correlations using a Yes/No discrimination task in which subjects were instructed not to abstract viewpoint. We established that the measured correlations can be larger than those present in the input image, as computed by a pixel-based observer. In Study 2, subjects categorized objects in a Yes/No task while abstracting viewpoint. We found that object recognition can only partially overcome the low-level correlations: viewpoint dependence was linearly related to the low-level correlations with a slope of 0.59, significantly different from both 0 and 1. Task or stimulus differences cannot account for this result, because a pixel-based observer predicted a slope of 1. We conclude that low-level correlations prior to object recognition, both in the input image and in the early visual system, offer an explanation for viewpoint effects on the discrimination of 3-D objects.
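To make the notion of a pixel-based observer concrete, the sketch below shows one simple way such an observer's viewpoint/object-property correlation could be quantified: as the cosine of the angle between the two pixel-difference vectors, under an assumed equal-variance Gaussian pixel-noise model. The function name and the toy 2x2 "images" are hypothetical illustrations, not the stimuli or the exact observer model used in the studies.

```python
import numpy as np

def pixel_observer_correlation(base, viewpoint_changed, property_changed):
    """Correlation between the viewpoint signal and the object-property
    signal for a pixel-based observer: the cosine of the angle between
    the two pixel-difference vectors (assuming independent, equal-variance
    Gaussian noise on each pixel)."""
    dv = (viewpoint_changed - base).ravel().astype(float)
    dp = (property_changed - base).ravel().astype(float)
    return float(dv @ dp / (np.linalg.norm(dv) * np.linalg.norm(dp)))

# Toy 2x2 "images": if a viewpoint change and a property change alter
# disjoint sets of pixels, the two signals are uncorrelated for this observer.
base = np.zeros((2, 2))
vp = base.copy(); vp[0, 0] = 1.0    # viewpoint change affects pixel (0, 0)
prop = base.copy(); prop[1, 1] = 1.0  # property change affects pixel (1, 1)
print(pixel_observer_correlation(base, vp, prop))  # → 0.0
```

A correlation of 0 means the observer can detect the object-property change without interference from the viewpoint change; values near 1 mean the two changes are nearly confounded at the pixel level.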