In visual perception, many seemingly unrelated tasks are, in fact, very similar in their structure. For example, stereo vision, optical flow estimation and texture analysis are all closely related to each other. This can be seen as follows. If one assumes, for simplicity, a static scene, the slightly disparate images recorded by the two eyes of a binocular animal can equally well be sampled by a camera moving slowly from the position of the left eye to that of the right eye (Fig. 5). In the process, the scene creates a characteristic space-time intensity pattern in the moving camera, with the texture pattern reflecting the arrangement of objects in the scene. More precisely, in this setup, the local orientation of the recorded space-time texture directly encodes the distances of the observed objects: nearby objects shift faster across the image as the camera moves, so their space-time traces are more steeply tilted than those of distant objects.
This local space-time texture orientation is usually called ``optical flow''. Stereo vision can thus be regarded as a slightly harder variant of the optical flow estimation problem, in which only two image slices of the full space-time texture are available. Optical flow estimation, in turn, is nothing more than the analysis of local texture directions in space-time patterns. In retrospect, it is therefore not too surprising that the neural models proposed for texture analysis [39,40], optical flow estimation and stereo vision are very similar in structure.
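The equivalence of flow and space-time orientation can be made concrete with a small numerical sketch (not from the paper; the velocity, texture, and estimator below are illustrative assumptions). A pattern translating at constant velocity $v$ produces an oriented space-time image $I(x,t) = f(x - vt)$, and the brightness-constancy constraint $I_x v + I_t = 0$ lets a gradient-based (Lucas–Kanade-style) least-squares fit read the orientation, i.e. the flow, directly off the derivatives:

```python
import numpy as np

rng = np.random.default_rng(0)
true_v = 1.5  # assumed velocity, in pixels per frame

# A smooth, periodic 1D texture f (sum of a few random sinusoids)
N = 400
x = np.arange(N, dtype=float)
f = np.zeros_like(x)
for k in range(1, 8):
    f += rng.normal() * np.sin(2 * np.pi * k * x / N + rng.uniform(0, 2 * np.pi))

def sample(positions):
    """Linearly interpolate the periodic texture at real-valued positions."""
    return np.interp(positions, x, f, period=N)

# Space-time intensity pattern: each row is one time step
# (equivalently, one position of the slowly moving camera)
T = 20
I = np.stack([sample(x - true_v * t) for t in range(T)])

# Central-difference derivatives in space (I_x) and time (I_t),
# cropped to a common interior region
I_x = (I[:, 2:] - I[:, :-2])[1:-1] / 2.0
I_t = (I[2:] - I[:-2])[:, 1:-1] / 2.0

# Least-squares solution of I_x * v + I_t = 0 for a single global velocity:
# this is exactly the local orientation of the space-time texture
v_est = -np.sum(I_x * I_t) / np.sum(I_x ** 2)
print(v_est)
```

In this picture, stereo corresponds to keeping only two rows of `I` (the two eye positions) and estimating the same orientation from that pair, which is why the two problems admit such similar neural architectures.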