
Insect motion vision

From Scholarpedia
Martin Egelhaaf (2009), Scholarpedia, 4(11):1671. doi:10.4249/scholarpedia.1671 revision #86325 [link to/cite this article]

Curator: Martin Egelhaaf

Dr. Martin Egelhaaf accepted the invitation on 1 October 2007 (self-imposed deadline: 1 March 2008).

Motion detection is an important computational task, since animals including humans rely heavily on visual motion information during locomotion. When an animal moves through a natural environment, the eyes typically receive a wildly fluctuating spatiotemporal pattern of image flow. It is the task of the motion detection mechanisms in the brain to interpret this complex spatiotemporal input and to extract information about the animal’s self-motion and its surroundings to guide behavior. Biological systems still outperform existing artificial motion vision systems in many respects in their capability to process retinal image flow. This is most remarkable for insect brains, with their small number of neurons and the extraordinary speed with which retinal image motion is processed. Because of the relative ease with which the nervous systems of insects can be approached by electrophysiological and imaging techniques, selected insect species have served as model systems for analyzing the mechanisms of processing of retinal image motion. The sophisticated genetic toolset that can be applied to one particular insect, Drosophila, is currently transforming the analysis of the neural circuitry underlying motion detection.

Since the optic flow elicited on the eyes by self-motion is characterized by its global features, the mechanisms extracting behaviorally relevant information from the retinal motion patterns need to combine local motion measurements from large areas of the visual field of one eye and even from both eyes. Accordingly, visual motion information is computed by a sequence of processing steps: (1) spatiotemporal filtering of the retinal brightness signals by local, retinotopically organized signal channels, (2) detection of local motion signals by interactions between neighboring signal channels, (3) spatial pooling of local motion signals over large parts of the visual field, and (4) interactions between the global motion representations of the two eyes.


Significance of visual motion information for behavioral control

Retinal image displacements are elicited when a moving object crosses the visual field (‘object motion’). However, even if the outside world is stationary, the retinal images are in continuous flow when the animal moves about in the environment. This so-called optic flow is a rich source of information about the path and speed of locomotion as well as the three-dimensional layout of the environment (Dahmen et al. 1997; Eckert and Zeil 2001; Gibson 1979; Koenderink 1986; Lappe 2000). For instance, during forward translation the optic flow across both eyes is directed backwards, with the apparent velocity of closer objects being larger than that of more distant ones; through translational movements an animal thus actively generates spatial information on its eyes that is not available when the animal is stationary. In contrast, during a pure rotation about the vertical body axis, optic flow is directed backward across one eye, but forward across the other. In this situation the retinal velocities are independent of the distance of objects from the animal. Given that animals often rotate and translate simultaneously and may change their velocity continually, the optic flow is likely to be much more complex in many behavioral situations. Moreover, flying animals have six degrees of freedom, three of rotation and three of translation, a feature that further increases the complexity of the resulting optic flow compared to that of animals moving on the ground.
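The distance dependence of translational flow and the distance independence of rotational flow can be illustrated with a toy geometric sketch (the functions and all parameter values below are illustrative assumptions, not taken from the article):

```python
import math

def translational_flow(speed, distance, view_angle_deg):
    """Angular image velocity (rad/s) at a viewing direction forming
    view_angle_deg with the direction of translation; closer objects
    move faster on the retina (speed/distance scaling)."""
    theta = math.radians(view_angle_deg)
    return (speed / distance) * math.sin(theta)

def rotational_flow(yaw_rate):
    """Angular image velocity during pure yaw rotation: the same for
    every object, regardless of its distance."""
    return yaw_rate

# A near (0.1 m) and a far (1.0 m) object seen sideways (90 deg)
# during 1 m/s forward flight:
near = translational_flow(1.0, 0.1, 90.0)   # 10 rad/s
far  = translational_flow(1.0, 1.0, 90.0)   #  1 rad/s
# During a pure rotation both objects would move at the same
# angular velocity on the retina.
```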

Visual motion has been shown to play an important role in controlling behavior in a wide range of contexts.

• Visual course control: Both the speed of straight flight and turning behaviour are controlled by the motion patterns on the two eyes. Balancing the optic flow on the two eyes has been shown to be a means by which bees and flies compensate for internal asymmetries and external disturbances to ensure straight flight (e.g. Götz 1975; Srinivasan et al. 1999; Kern and Egelhaaf 2000; Mronz and Lehmann 2008). Moreover, the translational velocity depends on the strength of the optic flow (David 1984; Srinivasan et al. 1991; Srinivasan et al. 1996). However, optic flow does not only help to maintain a straight course of locomotion, but may also elicit turns, for instance, to prevent collisions with obstacles.
Figure 1: Flying blowfly Calliphora as seen from above, shown at 12 ms time intervals. The rapid saccadic turn of about 90° (yellow frame) is executed within less than 50 ms.
When a fly approaches an obstacle, such as a textured wall, it generates sharp saccade-like turns that prevent it from crashing into the obstacle. Both the timing and the direction of these saccades have been concluded to depend on the characteristics of the retinal optic flow patterns (Schilstra and van Hateren 1999; van Hateren and Schilstra 1999; Tammero and Dickinson 2002b; Mronz and Lehmann 2008).

• Object detection: Motion cues can give the perceived world a third dimension. When an animal passes or approaches a nearby object, the object appears to move faster than its background. Several insect species, ranging from flies to bees and hawkmoths, have been shown to use relative motion very efficiently to detect objects and to infer their distance (Kimmerle et al. 1996; Kimmerle and Egelhaaf 2000a; Lehrer et al. 1988; Srinivasan et al. 1989). In doing so, they mainly use relative motion information at the edges of objects (Kern et al. 1997; Kimmerle et al. 1996; Srinivasan et al. 1990).

• Landing: Flying animals cannot always stay aloft, but have to come to the ground regularly. An approach directed perpendicularly to a potential landing site generates strong looming cues, i.e. the retinal image expands. Flies have been shown to use this information to initiate landing at a critical level of image expansion that is determined by spatiotemporal integration of the visual motion accompanying image expansion (Borst 1990; Tammero and Dickinson 2002a; Wagner 1982). However, looming cues are weak when the insect does not approach its landing site perpendicularly but lands on a flat surface. In this situation bees keep their speed roughly proportional to the height above ground. They do this by holding the retinal image velocity approximately constant while approaching the surface. This strategy guarantees smooth landing without requiring knowledge about the height above the ground (Srinivasan et al. 2000b).
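The bees' constant-image-velocity rule can be sketched in a minimal simulation (all numerical values are hypothetical; only the control law itself is from the text above):

```python
# Sketch of the bees' landing strategy: hold the angular velocity of
# the ground image (forward_speed / height) constant while descending
# at a fixed fraction of forward speed. Forward speed then decays in
# proportion to height and is near zero at touchdown.
omega = 2.0          # target image angular velocity (rad/s, assumed)
descent_ratio = 0.3  # vertical speed as fraction of forward speed (assumed)
h, dt = 2.0, 0.01    # start 2 m above ground, 10 ms time steps
heights, speeds = [h], []
while h > 0.001:
    forward = omega * h          # keeping image velocity constant
    h -= descent_ratio * forward * dt
    heights.append(h)
    speeds.append(forward)
# Speed falls smoothly toward zero without the bee ever measuring
# its height above ground explicitly.
```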

• Pursuit of moving targets: Many insects follow moving objects, potential prey or mates, and may eventually catch them. Dragonflies and tiger beetles pursue other insects to catch and eat them (Gilbert 1997; Olberg et al. 2000). The praying mantis sits in ambush and fixates targets by saccadic head and body movements (Rossel 1980; Poteser and Kral 1995). In the context of mating behavior, male flies chase females in acrobatic, visually controlled flight maneuvers (Collett and Land 1975; Land 1993; Land and Collett 1974; Wagner 1986a; Zeil 1983). The forward velocity of the chasing fly is controlled by the angular size of the target, whereas the turning velocity depends on the angle from which the target is seen as well as on its speed. During pursuit, catch-up saccades are observed when the target changes its direction too rapidly to allow the pursuer to follow smoothly (Boeddeker et al. 2003; Boeddeker and Egelhaaf 2003; Boeddeker and Egelhaaf 2005).

• Estimation of travelled distance: Bees need to acquire distance information on foraging excursions to be able to return to their hive or nest and to communicate the location of rewarding food sources to their fellow foragers. They gauge distance in terms of the optic flow experienced during the flight to a food source (Esch et al. 2001; Hrncir et al. 2003; Srinivasan et al. 2000a; Tautz et al. 2004). Since the optic flow generated during translational movements depends on the three-dimensional layout of the environment, distance information gathered in this way is ambiguous. These ambiguities do not lead to problems as long as the recruited bees tend to fly the same route as the forager and the environment does not change much between the flight of the forager and that of the recruited bees. Since natural environments rarely change much over the course of a few days, such a relatively simple mechanism of distance estimation is sufficient for the specific needs under normal behavioral conditions.

• Spatial information through active vision: Retinal image displacements depend to a large extent on the animal's own behavior. Various insect species acquire spatial information by active movements. Most insects do not have stereo vision (as primates do for near objects) and also have a relatively poor spatial resolution, which largely prohibits object detection by contour outline. Hence, in a way, motion vision is largely all they have to obtain spatial information. Locusts and mantids perform periodic sidewise body movements and use the resulting translational motion information to assess the distance to a target (Collett and Paterson 1991; Kral 1998; Poteser and Kral 1995; Rossel 1980; Sobel 1990). Bees and wasps perform distinct flight maneuvers when leaving their nest or a newly discovered food place. They do not depart on a straight course, but turn around to face the place they are leaving and fly backwards in a series of continually increasing arcs. The animals actively acquire information about the location of the goal relative to its surroundings, store this information, and use it to recognize the goal when they return (Collett and Zeil 1996; Zeil et al. 2007). Flying and walking blowflies, but also many other flying insects such as hoverflies, wasps and honeybees, shift their gaze during free flight by saccadic turns of body and head, keeping gaze basically fixed between saccades (Blaj and van Hateren 2004; Schilstra and van Hateren 1999; Tammero and Dickinson 2002b; van Hateren and Schilstra 1999). This active viewing strategy generates retinal image flow with characteristic dynamical features: it separates the image flow resulting from rotational movements of the animal from that resulting from translational movements. Since the translational flow component depends on the distance to environmental objects, the saccadic flight strategy may help the nervous system to extract information about the spatial layout of the environment.

• Gaze stabilisation: Many of the visually guided behaviours described above would not work properly if the animal did not maintain its gaze in a default orientation, i.e. with the eyes aligned with the external horizon. A level gaze ensures that wide-field optic flow is projected onto the eyes in a way that allows the neural machinery to efficiently analyse visual motion. Gaze stabilisation has been shown to rely on visual motion information, but also on mechanosensory systems (reviews: Hengstenberg 1993; Taylor and Krapp 2007). Multisensory convergence could be pinpointed at the level of head motor neurons that are involved in compensatory head movements and, thus, in gaze control (Huston and Krapp 2009).

Steps of visual motion computation

The behavioral significance of motion vision in insects is reflected in an abundance of motion-sensitive neurons in their nervous systems. Neurons responding specifically to visual motion have been found at all stages of the nervous system, ranging from the second visual neuropile to descending neurons connecting the brain with the motor control centers in the thoracic ganglia. Recently, even motor neurons controlling head movements have been shown to respond to visual motion in a direction-selective way (e.g. Huston and Krapp 2008).

The properties of motion-sensitive visual interneurons are elaborated along the visual motion pathway. Whereas motion-sensitive neurons in the peripheral visual system respond to motion only in a small area of the visual field, neurons at subsequent processing stages tend to have large receptive fields, which may even subserve both eyes. Some of these higher-order neurons have been concluded to respond preferentially to the complex optic flow patterns that are evoked in different behavioral situations. For instance, some neurons respond best during coherent wide-field motion, as may occur while an animal turns around a particular body axis. Others respond best to object motion, as may occur while the animal pursues a moving target or passes a stationary object in its environment.

Motion information is not explicitly given by the retinal input. Rather, it has to be computed by the nervous system from the pattern of brightness changes as sensed by the array of photoreceptors. Motion computation is possible, because the brightness does not vary randomly and independently at each photoreceptor. Instead, the retinal image is correlated in space and time as a consequence of both the structure of natural environments and the way animals move in the world. In the following, the discussion will focus on the processing steps that lead to direction selectivity and to the sensitivity of neurons to optic flow.

Computation of local motion information

The first explicit representation of visual motion is computed in parallel by arrays of motion detectors that cover the entire visual field. Motion detection is a local process which compares changes in light intensity at neighboring points in the visual field. Activation of only two neighboring photoreceptors in an appropriate temporal order is sufficient to evoke directionally selective behavioral and neuronal responses (Franceschini et al. 1989; Schuling et al. 1989).

Local motion detection is assumed to be accomplished in the second visual neuropile, the medulla. Specific representations of visual motion information are found in the two most proximal layers of the medulla. Most motion-sensitive medulla neurons that could be functionally characterized have small receptive fields, as is expected of neurons involved in local motion detection (review: Strausfeld and Douglass 2006). As a consequence of the small size of the neurons in this brain area and of the difficulty of recording their activity, conclusions concerning the cellular mechanisms underlying motion detection are still tentative. However, much progress is currently being made by applying the sophisticated repertoire of genetic and molecular approaches in Drosophila to dissect the neural circuits underlying motion detection (Borst 2009; Joesch et al. 2008; Katsov and Clandinin 2008; Rister et al. 2007).

Figure 2: Major processing steps of visual motion computation in insects. (A) Schematic of the visual motion pathway. Images of the moving environment are projected on the array of photoreceptors. The retinal input is spatially and temporally filtered before signals originating from neighboring points in visual space interact with each other. These interactions lead to local motion measurements. The outputs of many retinotopically organized local movement detectors are spatially pooled by LPTCs. (B) Local motion detector in its simplest form. It consists of two mirror-symmetrical subunits and receives input from neighboring points in visual space. In each subunit one of the inputs is delayed (tau), before it interacts with the undelayed signal of the neighboring input channel. A multiplication-like interaction (M) is the lowest order nonlinearity that is sufficient to explain many aspects of motion detection in insects. The subunit outputs contribute to the response of LPTCs with opposite polarity. (C) One of the LPTCs, a so-called FD1-cell, in the third visual neuropile of the blowfly filled with the fluorescent dye Lucifer yellow and visualized in a whole-mount preparation.

Many features of motion detection can be accounted for by a computational model, the so-called correlation-type movement detector. In its simplest form, a local movement detector is composed of two mirror-symmetrical subunits. In each subunit the signals of adjacent light-sensitive cells receiving appropriately filtered brightness signals from neighboring points in visual space are correlated after one of them has been delayed. The final detector response is obtained by subtracting the outputs of two such subunits with opposite preferred directions. Direction selectivity of the movement detection circuit is considerably enhanced by this subtraction-like processing step. Movement is signaled by a movement detector when the input elements report the same brightness value in immediate succession. During this process, each motion detector reacts with a large excitatory signal to movement in a given direction and with a negative, i.e. inhibitory signal to motion in the opposite direction (reviews: Borst and Egelhaaf 1989; Borst and Egelhaaf 1993; Clifford and Ibbotson 2003; Egelhaaf and Borst 1993b; Reichardt 1961). Various elaborations of this basic movement detection scheme have been proposed to account for the responses of insect motion sensitive neurons under a wide range of stimulus conditions including even natural optic flow as experienced under free-flight conditions.
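The correlation-type movement detector described above can be sketched in a few lines (a minimal discrete-time version; the first-order low-pass filter standing in for the delay stage and all parameter values are illustrative assumptions):

```python
import numpy as np

def reichardt_detector(s1, s2, dt, tau):
    """Correlation-type movement detector sketch: in each of the two
    mirror-symmetrical subunits the low-pass-filtered (delayed) signal
    of one input is multiplied with the undelayed signal of the
    neighboring input; subtracting the subunit outputs yields a
    direction-selective response."""
    alpha = dt / (tau + dt)          # first-order low-pass as delay stage
    d1 = np.zeros_like(s1)
    d2 = np.zeros_like(s2)
    for t in range(1, len(s1)):      # filter both input channels
        d1[t] = d1[t-1] + alpha * (s1[t] - d1[t-1])
        d2[t] = d2[t-1] + alpha * (s2[t] - d2[t-1])
    return d1 * s2 - d2 * s1         # opponent subtraction of subunits

# A grating drifting from input 1 to input 2 (s2 phase-lagged relative
# to s1) yields a positive mean response; the opposite direction of
# motion yields a negative one.
t = np.arange(0, 2, 0.001)
s1 = np.sin(2 * np.pi * 4 * t)
s2 = np.sin(2 * np.pi * 4 * t - np.pi / 2)    # preferred direction
resp_pref = reichardt_detector(s1, s2, 0.001, 0.02).mean()
resp_null = reichardt_detector(s2, s1, 0.001, 0.02).mean()
```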

The movement detection mechanism does not operate on an immediate representation of the retinal brightness values but on a spatiotemporally filtered version of them. This filtering takes place in the retina and the first visual neuropile, the lamina, and leads to an enhancement of changes in brightness at the expense of the background brightness (Juusola et al. 1996; Laughlin 1994; Srinivasan et al. 1982; van Hateren 1997). This neural filtering is thought to maximize the transfer of information about time-dependent retinal images.

Spatial pooling of local motion information

Since the optic flow induced during locomotion has a global structure, it cannot be evaluated by local mechanisms alone. Rather, local motion measurements from large parts of the visual field need to be combined. This is accomplished in the third visual neuropile, the lobula complex (in flies, the so-called lobula plate), by lobula plate tangential cells (LPTCs). They spatially pool on their large dendrites the outputs of many retinotopically arranged local motion-sensitive neurons and, accordingly, have large receptive fields. LPTCs are excited by motion in their preferred direction and are inhibited by motion in the opposite direction (reviews: Borst and Haag 2002; Egelhaaf 2006; Egelhaaf et al. 2002; Hausen and Egelhaaf 1989; Krapp 2000).

The local motion sensitive elements that synapse onto a given LPTC do not all have the same preferred direction. Rather, local preferred directions change gradually over the LPTC’s receptive field and have been concluded to coincide with the directions of the velocity vectors in particular optic flow fields. Hence, the spatial input organization of LPTCs makes them most sensitive to optic flow induced by particular self-motions (Krapp et al., 1998; Krapp et al., 2001).

Dendritic integration of local motion signals has various functional consequences for the neuronal representation of optic flow.

• Pattern dependence: Owing to their small receptive fields, the responses of the input elements of LPTCs are temporally modulated even when the stimulus pattern moves with a constant velocity. These modulations are a consequence of the texture of the environment. Since the signals of neighboring input elements are phase-shifted with respect to each other, their pooling by the dendrites of LPTCs reduces mainly those pattern-dependent response modulations that originate from the high spatial frequencies of the stimulus pattern (Single and Borst 1998; Straw et al. 2008). Nonetheless, some pattern-induced response modulations are still present.
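The effect of pooling phase-shifted local signals can be illustrated with a simple sketch (the signals and all numbers are illustrative, not a model of specific neurons):

```python
import numpy as np

# Each local detector responds to a drifting grating with a modulated
# signal whose phase depends on the detector's position in the
# receptive field. Summing many such phase-shifted signals across the
# dendrite averages out the modulation visible in any single detector.
t = np.linspace(0, 1, 1000)
n_detectors = 50
phases = np.linspace(0, 2 * np.pi, n_detectors, endpoint=False)
local = [1.0 + 0.8 * np.sin(2 * np.pi * 5 * t + p) for p in phases]
pooled = np.mean(local, axis=0)

local_mod = local[0].max() - local[0].min()   # strong modulation (~1.6)
pooled_mod = pooled.max() - pooled.min()      # modulation nearly cancels
```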

Figure 3: Simplified wiring diagrams of neuronal circuits extracting different aspects of optic flow information. (A) Circuit for coherent wide-field motion. Input organization of the HSE-cell of the blowfly. The HSE-cell receives input from the eye ipsilateral to its main dendrite from many retinotopic motion-sensitive elements. As a consequence of this input, the HSE-cell is depolarized by front-to-back motion and hyperpolarized by back-to-front motion. The HSE-cell receives additional excitatory input on its main dendrite from the H1-cell or close to its axon terminal from the H2-cell. The spike activity of H1 and H2 is increased during back-to-front motion in the contralateral visual field. The cells are sketched only schematically. (B) Circuit for object motion. The FD1-cell is one output element of this circuit. It receives retinotopic input from the ipsilateral eye to its main dendrite. To prevent it from firing during wide-field motion, it is inhibited by the VCH-cell via GABAergic synapses. The VCH-cell responds best to wide-field motion as indicated by the inset. It receives input from the contralateral eye from both the H1- and the H2-cell. There is an additional inhibitory input during contralateral front-to-back motion (not shown in the diagram). (C) Relationship of the two neuronal circuits sketched in (A) and (B). The cells are indicated by boxes. Excitatory and inhibitory synapses are indicated by triangles and circles, respectively. Note the reciprocal recurrent inhibitory connections between neurons in both halves of the visual system.

• Velocity dependence: LPTCs do not operate like speedometers: their mean responses increase with increasing velocity, reach a maximum, and then decrease again. The location of the velocity maximum depends on the textural properties of the moving stimulus pattern. If the spatial frequency of a sine-wave grating is shifted to lower values, the velocity optimum shifts to higher values. In terms of the correlation model, the location of the temporal frequency optimum is determined by the time constant of the delay filters in the local movement detectors (review: Egelhaaf and Borst 1993). The pattern dependence of the velocity tuning is reduced if the stimulus pattern has a broad spatial frequency spectrum, as is characteristic of natural scenery (Dror et al. 2001; Straw et al. 2008).
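For a correlation detector with a first-order low-pass delay filter, the mean steady-state response to a drifting sine grating is, up to sign and a constant factor, proportional to the imaginary part of the filter's frequency response. A small sketch (tau and the grating wavelengths are assumed values) illustrates why the optimum lies at a fixed temporal frequency rather than a fixed velocity:

```python
import math

def detector_tuning(temporal_freq, tau):
    """Temporal-frequency tuning of a correlation detector with a
    first-order low-pass delay filter, up to a constant factor:
    |Im{H(w)}| = w*tau / (1 + (w*tau)^2). This peaks at w*tau = 1,
    i.e. at f_opt = 1/(2*pi*tau), independent of the grating."""
    w = 2 * math.pi * temporal_freq
    return (w * tau) / (1 + (w * tau) ** 2)

tau = 0.05                          # 50 ms delay filter (assumed)
f_opt = 1 / (2 * math.pi * tau)     # ~3.2 Hz temporal-frequency optimum

# Velocity optimum = f_opt * wavelength: shifting the grating to a
# lower spatial frequency (longer wavelength) shifts the velocity
# optimum to higher values, as described above.
v_opt_fine   = f_opt * 0.1          # wavelength 0.1 (10 cycles/unit)
v_opt_coarse = f_opt * 0.2          # wavelength 0.2 ( 5 cycles/unit)
```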

• Time course of motion responses: The time course of LPTC responses is approximately proportional to the time-varying pattern velocity as long as the velocity changes are small. Then pattern velocity can be inferred from the LPTC response by linear operations. However, as a consequence of the computational structure of the local motion detection mechanism, LPTC responses do not only depend on pattern velocity, but also on higher-order temporal derivatives (Egelhaaf and Reichardt 1987). This is reflected, for instance, in the response transients to sudden changes in pattern velocity (Egelhaaf and Borst 1989; Warzecha et al. 1999).

Network interactions within the visual field of one eye and integration of motion information from both eyes

Despite the sophisticated patterns of preferred directions in the receptive fields of LPTCs, dendritic pooling of motion input is not sufficient to obtain specific responses during particular types of self-motion. Network interactions between LPTCs within one brain hemisphere and between both halves of the visual system are important for shaping their specific sensitivities for optic flow (reviews: Borst and Haag 2002; Egelhaaf et al. 2002; Egelhaaf 2006; Borst and Haag 2007).

The receptive field properties of some LPTCs are not only shaped by their retinotopic motion input but also by electrical coupling among several of them. Interactions between the two visual hemispheres are particularly relevant for enhancing the specificity of LPTCs for particular optic flow patterns. For instance, during forward translation the optic flow across both eyes is directed backward. In contrast, during a pure rotation about the animal's vertical axis, optic flow is directed backward across one eye, but forward across the other eye. The two types of optic flow can be distinguished if motion from both eyes is taken into account. This is accomplished by heterolateral interactions between fly LPTCs. Moreover, some LPTCs respond best to motion of small objects rather than to global optic flow patterns. This object sensitivity could be shown in some cases to be a consequence of inhibitory interactions with other LPTCs.
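At its core, the binocular disambiguation of forward translation and yaw rotation amounts to comparing the signs of horizontal flow on the two eyes; a schematic sketch (hypothetical function, not a model of specific neurons):

```python
def classify_self_motion(left_eye_flow, right_eye_flow):
    """Sketch of binocular optic-flow comparison. Inputs are signed
    horizontal flow per eye (positive = front-to-back). Forward
    translation drives front-to-back flow on BOTH eyes; yaw rotation
    drives front-to-back flow on one eye and back-to-front flow on
    the other."""
    if left_eye_flow > 0 and right_eye_flow > 0:
        return "translation"
    if left_eye_flow * right_eye_flow < 0:
        return "rotation"
    return "ambiguous"
```

Monocularly, front-to-back flow on one eye is compatible with both self-motions; only the comparison across hemispheres, as implemented by heterolateral LPTC interactions, resolves it.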

Although such heterolateral interactions may increase neuronal specificity for particular types of optic flow, this specificity is far from perfect and the neurons still respond to a wide range of ‘non-optimal’ optic flow stimuli, suggesting that behaviorally relevant motion information is encoded by the activity profile of the entire population of LPTCs rather than by the responses of individual cells.

Linearities and non-linearities in neuronal computation of motion information

Establishing neuronal wiring diagrams alone is not sufficient to understand how visual motion information is computed. One reason for this is that neurons are highly non-linear computing devices. There are only a few examples where the computational consequences of these non-linearities have been analyzed in the context of neuronal encoding of visual motion.

Figure 4: Consequences of dendritic integration for the representation of visual motion: Schematic of a fly LPTC with two branches of its dendrite, the axon and the axon terminal. The LPTC receives retinotopically organised input from local motion detectors (vertical lines terminating in synapses on the dendrite: red dots, excitatory; turquoise dots, inhibitory). (A) As a consequence of this input, the cell is excited by motion in its preferred direction (upper diagram) and inhibited by motion in the null direction (lower diagram). (B) Even when the velocity of motion is constant, the activity of the local input elements of an LPTC is modulated depending on the texture of the surround in the receptive fields of the local elements. Traces on the right indicate the time-dependent signals of three local input elements of the LPTC. By dendritic pooling of many local elements this pattern dependence in the time course of the responses is reduced (left trace). (C) Gain control in the LPTC makes its responses relatively independent of the number of activated input elements and, thus, of pattern size, while the response amplitude still depends on pattern velocity. Left: The enlargement illustrates that each point in visual space is subserved by a pair of input elements of the LPTC, one of them being cholinergic and excitatory, the other GABAergic and inhibitory. Right: Even during motion in the preferred direction both types of local input elements are activated, though to a different extent depending on the velocity of motion (red and turquoise columns). As a consequence, the membrane potential approaches different saturation levels for different velocities when the number of activated local input elements increases.

• Gain control by dendritic integration of antagonistic motion input: Dendritic integration of signals from local motion-sensitive elements by LPTCs is a highly non-linear process. When the signals of an increasing number of input elements are pooled, saturation non-linearities make the response largely independent of pattern size. However, the response saturates at different levels for different velocities. Hence LPTC responses are almost invariant against changes in pattern size, while they still encode velocity. This gain control can be explained on the basis of the passive membrane properties of LPTCs and the antagonistic nature of their motion input. Even during motion in the preferred direction both types of local input elements, i.e. the two mirror-symmetrical subunits of the movement detector, are activated, though to a different extent, depending on the velocity of motion. As a consequence, with increasing numbers of activated input elements the membrane potential approaches different saturation levels at different velocities (Borst et al. 1995; Single et al. 1997).
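This gain-control argument can be made concrete with a passive one-compartment membrane sketch (all conductance and reversal-potential values below are illustrative assumptions, not measured parameters):

```python
def membrane_potential(n_inputs, g_exc, g_inh,
                       e_exc=60.0, e_inh=-20.0, g_leak=1.0):
    """Passive one-compartment sketch of LPTC gain control: n pairs of
    antagonistic inputs with per-pair conductances g_exc and g_inh
    (reversal potentials in mV relative to rest; all values assumed).
    As n grows, V saturates at a level set by the RATIO g_exc/g_inh,
    which depends on stimulus velocity, not on pattern size."""
    ge = n_inputs * g_exc
    gi = n_inputs * g_inh
    return (ge * e_exc + gi * e_inh) / (g_leak + ge + gi)

# Fast vs slow preferred-direction motion = different exc/inh balance:
v_fast_small = membrane_potential(10,  1.0, 0.2)   # small pattern
v_fast_large = membrane_potential(100, 1.0, 0.2)   # large pattern
v_slow_large = membrane_potential(100, 0.5, 0.4)   # large, slower motion
# v_fast_small ~ v_fast_large (size invariance), but both differ
# clearly from v_slow_large (velocity is still encoded).
```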

• Voltage-dependent mechanisms: Although many computational consequences of dendritic pooling of local motion inputs can be explained on the basis of the passive membrane properties of LPTC dendrites, a wealth of active processes have been identified in the dendritic membranes of fly LPTCs (review: Borst and Haag 2002). Among the voltage-dependent currents, fast sodium currents have been found to underlie spike activity in some LPTCs. In addition, delayed rectifying potassium currents and fast sodium-dependent potassium currents were identified. Different LPTC types differ with respect to the expression of these currents and thus in their electrical signals. The functional significance of active processes for the encoding of visual motion is still not well understood. One exception is the voltage-sensitive sodium current that has been shown to boost high-frequency fluctuations of the membrane potential and thus increases the sensitivity of LPTCs to rapid velocity changes, which would otherwise be attenuated by the time constants involved in motion detection (Haag and Borst, 1996). In addition to sodium and potassium currents, voltage-sensitive calcium currents were also found in both the dendrite and the presynaptic terminal of blowfly LPTCs. These conductances show little or no inactivation. Again, different types of LPTCs differ with respect to the dynamics of the calcium channels. Calcium accumulates in the cytosol during visual motion stimulation (Borst and Egelhaaf 1992; Egelhaaf and Borst 1995; Haag and Borst 2000; Kurtz et al. 2001). Whereas in the presynaptic region the most likely function of calcium is to trigger transmitter release, the function of dendritic calcium accumulation is less clear.

• Synaptic signal transfer: Meaningful representations of optic flow are often only achieved by interactions between LPTCs. To be beneficial, these synaptic interactions need to be carefully adjusted to the natural operating range of the system. Otherwise, synaptic transmission may severely distort the information being transmitted. This is particularly critical as synaptic transmission is inherently noisy and the underlying biophysical processes have been found in many systems to be intrinsically non-linear. Moreover, the transformation of the postsynaptic potential into spike activity may also be non-linear. Combined electrophysiological and optical imaging experiments were performed to analyze the relationship between the presynaptic activity of a small group of LPTCs and the activity of a postsynaptic LPTC. Although it is still controversial whether this synaptic connection is chemical, electrical or combined chemical/electrical, it is clear that the entire range of presynaptic depolarization levels that can be elicited by motion in the preferred direction is transformed approximately linearly into the postsynaptic spike rate. Linearity characterizes transmission of membrane potential fluctuations up to frequencies of 10 Hz (Beckers et al. 2007, 2009; Haag and Borst 2008; Warzecha et al. 2003). Thus, the linear synaptic regime covers most of the dynamic range within which visual motion information is transmitted with a high gain (Haag and Borst 1997; Warzecha et al. 1998). As a consequence of the computational properties of the analyzed synapse, visual motion information is transmitted largely undistorted to the contralateral visual system. This ensures that the characteristic dependence of neural responses on stimulus parameters such as velocity or contrast is not affected by the intervening synapse.

• Transformation of postsynaptic potentials into spike trains: Spike generation is an inherently non-linear process, since spikes are generated only if the cell is sufficiently depolarized. Above threshold, the spike rate increases with depolarization of the cell and eventually approaches a saturation level that is mainly set by the refractory properties of the neuron. Whether these non-linearities become relevant in the context of neuronal computation depends on the operating range of the neuron during sensory stimulation. In fly LPTCs the relationship between the postsynaptic potential and the corresponding spike rate was concluded to be linear for preferred-direction motion. Whereas during null-direction motion LPTCs are hyperpolarized and spike activity is essentially suppressed, the spike rate increases approximately linearly within the entire range of depolarization that can be evoked by preferred-direction motion (Kretzberg et al. 2001; Warzecha et al. 2000). It has not been possible to reach the saturation level of the spiking mechanism by visual motion stimulation.
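A minimal sketch of such a threshold-linear rate function with refractory saturation; all parameter values are illustrative, not measured LPTC properties:

```python
def spike_rate(v, v_th=0.0, gain=25.0, t_ref=0.002):
    """Spike rate (spikes/s) as a function of membrane depolarization v
    (mV above resting potential). Below threshold no spikes are produced;
    above threshold the rate grows with depolarization and saturates at a
    level set by the refractory time t_ref. Parameters are illustrative."""
    drive = max(0.0, gain * (v - v_th))   # hyperpolarization -> no spikes
    return drive / (1.0 + drive * t_ref)  # capped below 1/t_ref = 500 Hz
```

For moderate depolarizations the rate is close to the linear term `gain * (v - v_th)`, mirroring the finding that motion-evoked depolarizations never reach the saturating part of the curve.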

Encoding of visual motion in real time

Sensory information may be encoded in two different ways, either by graded changes in membrane potential or by sequences of action potentials. In fly LPTCs the postsynaptic signals originating from the many retinotopic input elements superimpose and, depending on the direction of motion, the cell either depolarizes or hyperpolarizes in a graded fashion. In some LPTCs graded membrane potential changes in the cell’s output terminal are transmitted to the postsynaptic target, either electrically or via transmitter release. In other LPTCs, most notably in those that project to the other side of the brain, the input and output regions are too distant for this mode of signal conduction. The graded membrane potential changes are then transformed into spike trains that are actively transmitted to the cell’s output terminal. The graded and the spiking mode of transmission are, however, not mutually exclusive, since in many cells in which graded membrane potentials reach the output terminal, the graded signals are superimposed and modified by voltage-dependent signals (Beckers et al. 2007, 2009; Cuntz et al. 2007; Farrow et al. 2006; Haag and Borst 2008; Kalb et al. 2006, 2008; Kurtz et al. 2001; Warzecha et al. 2003). These different response modes have been investigated with respect to their consequences for the encoding of visual motion information.

• Accuracy of encoding visual motion: Even during constant-velocity motion the activity of cells which signal motion with graded membrane potential changes and of spiking cells fluctuates continually. Moreover, when the same stimulus is presented repeatedly to a neuron, the responses may vary considerably (Warzecha et al. 2000). From individual spike trains it is therefore not easy to distinguish stimulus-driven activity changes from those that are due to sources not associated with the stimulus ('noise'). Spike generation is generally thought to time-lock to rapid membrane potential fluctuations with millisecond precision. As a consequence, spikes are only time-locked precisely to the stimulus if the stimulus evokes membrane potential changes that are sufficiently fast and large relative to the membrane potential noise. In contrast, slow stimulus-induced membrane potential fluctuations mainly affect the spike rate and normally do not cause precise time-locking of spikes; the exact timing of spikes is then determined by the high-frequency components of the membrane potential noise. Since the computations underlying direction selectivity inevitably require time constants of some tens of milliseconds, they attenuate the neural representations of high-frequency velocity fluctuations (Haag and Borst, 1997; Warzecha et al., 1998). Hence, only when high-frequency velocity changes are large are the depolarizations resulting from motion computation pronounced enough to elicit spikes with millisecond precision. Otherwise the exact timing of most spikes is determined by membrane potential noise and visual motion is not represented faithfully by the temporal occurrence of spikes (Kretzberg et al. 2001). To what extent rapid and slow velocity changes, and thus the exact timing of spikes, are functionally significant is currently being resolved by taking into account the dynamics of natural retinal image displacements in different behavioral contexts.
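The attenuating effect of such time constants can be illustrated with a first-order low-pass filter, here used as a stand-in for the delay filter of the motion-detection circuit; the 35 ms time constant and the two test frequencies are illustrative choices:

```python
import math

def lowpass(signal, dt, tau):
    """First-order low-pass filter with time constant tau (seconds)."""
    y, out = 0.0, []
    a = dt / (tau + dt)
    for x in signal:
        y += a * (x - y)
        out.append(y)
    return out

dt, tau = 0.001, 0.035   # 35 ms: 'some tens of milliseconds'
t = [i * dt for i in range(2000)]
slow = [math.sin(2 * math.pi * 1.0 * ti) for ti in t]   # 1 Hz velocity drift
fast = [math.sin(2 * math.pi * 30.0 * ti) for ti in t]  # 30 Hz fluctuation
# Steady-state output amplitude (transient discarded):
gain_slow = max(lowpass(slow, dt, tau)[1000:])  # passed almost unchanged
gain_fast = max(lowpass(fast, dt, tau)[1000:])  # strongly attenuated
```

A 30 Hz velocity fluctuation of given amplitude thus produces a much smaller depolarization than a 1 Hz fluctuation of the same amplitude, so only large rapid fluctuations can drive precisely timed spikes.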
Given that neuronal responses are noisy, it takes some time to reliably infer relevant motion parameters from neuronal activity. Integration of neuronal activity over only 5 ms after response onset was found to be sufficient to decode the animal's self-motion from the neuronal response (Karmeier et al. 2005). For the fly, such short integration times are important for the control of rapid flight maneuvers.
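How a population read-out can reach a reliable decision within a few milliseconds can be sketched with a maximum-likelihood decoder under Poisson spike statistics. The tuning of the four model neurons and the two candidate self-rotations below are hypothetical, not the LPTC population analyzed by Karmeier et al.:

```python
import math, random

random.seed(1)

# Hypothetical tuning: expected rate (spikes/s) of four neurons
# for two candidate self-rotations (values purely illustrative).
rates = {"yaw_left": [150.0, 20.0, 120.0, 30.0],
         "yaw_right": [20.0, 150.0, 30.0, 120.0]}

def poisson_sample(lam):
    # Knuth's method; adequate for the small mean counts used here
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

def decode(counts, dt):
    """Pick the self-motion whose tuning maximizes the Poisson log-likelihood
    of the observed spike counts."""
    def loglik(r):
        return sum(n * math.log(ri * dt) - ri * dt for n, ri in zip(counts, r))
    return max(rates, key=lambda s: loglik(rates[s]))

dt, true = 0.005, "yaw_left"   # 5 ms read-out window
trials = 200
correct = sum(decode([poisson_sample(r * dt) for r in rates[true]], dt) == true
              for _ in range(trials))
accuracy = correct / trials
```

Even though each neuron fires at most a spike or two in 5 ms, pooling over the small population already yields a mostly correct read-out; accuracy grows further with population size.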

• Noise sources limiting the reliability of motion vision: Various noise sources within the nervous system constrain the reliability of neuronal responses, such as phototransduction, the stochastic nature of the ion channels that underlie all electrical activity of neurons, as well as synaptic transmission. In addition, the incoming visual signal is inherently noisy because of the quantum nature of light. In the responses of fly motion-sensitive neurons, single-photon events could be detected in the dark, although these neurons are several synapses away from the photoreceptors (Lillywhite and Dvorak 1981). Thus, at the sensitivity threshold of visual systems, the reliability of motion vision is limited by the physical limits of the visual input signal, i.e. photon noise. However, at least for the light-adapted eye, noise sources inherent in synaptic transmission between photoreceptors and 2nd-order neurons significantly affect the reliability with which visual information is signaled to higher-order processing stages (Borst and Haag 2001; Lewen et al. 2001; Ruyter van Steveninck and Laughlin 1996). Since flies are usually active during the day, the reliability of encoding visual motion information is, for most behaviorally relevant conditions, constrained by noise sources within the nervous system rather than by photon noise (Grewe et al. 2003; 2007).

Adaptation of the visual motion pathway to environmental conditions

Motion vision systems operate under a variety of dynamical conditions. For instance, during walking the retinal images may be displaced much more slowly than during flight. Even in flying animals the dynamics of image motion may differ depending on their specific way of locomotion, i.e. whether the animal is hovering in front of a flower or cruising through its habitat. In various studies, mainly on flies, adaptation mechanisms have been inferred to adjust the visual motion pathway to these different dynamical conditions. Although it is generally agreed that many features of LPTC responses depend on stimulus history and, thus, may be regarded as adaptive, neither the underlying mechanisms nor the functional significance of motion adaptation have been fully clarified. This is because adaptive processes have been studied by very different stimulus paradigms and conceptual approaches (reviews: Clifford and Ibbotson 2003; Egelhaaf 2006).

Nonetheless, a number of mechanisms have been proposed to be involved in visual motion adaptation. Some of these adaptation mechanisms operate locally and, thus, presynaptically to the LPTCs; they are concluded to be, to some extent, independent of the direction of motion. Other mechanisms, mainly those that depend on the direction of motion, originate after spatial pooling of local motion signals at the level of the LPTCs.

• Changes in time constants: The time constants involved at different computational stages of peripheral visual information processing and/or local movement detection were proposed to change in the context of motion adaptation (Borst and Egelhaaf 1987; Borst et al. 2003; Maddess et al. 1991; Maddess and Laughlin 1985; Ruyter van Steveninck et al. 1986; see however Harris et al. 1999).

• Gain changes: The gain of signal processing at one or several stages in the peripheral motion pathway and potentially at the level of LPTCs has been concluded to be adjusted depending on the contrast of the motion pattern (Harris et al. 2000).

• Adaptive membrane potential shifts: Motion that leads to an excitation of an LPTC shifts the membrane potential of the cell to a less depolarized state and leads to a prolonged hyperpolarization after motion offset, probably by opening of a potassium channel. In this way the operating range of the system is changed (Harris et al. 2000; Kurtz 2007; Kurtz et al. 2000).

The functional significance of motion adaptation is still not entirely clear. Several non-exclusive possibilities were proposed.

• Adjusting the dynamic range of motion sensitivity: The relation between time-varying motion input and LPTC response was concluded to be rescaled, on timescales ranging from milliseconds up to minutes, so as to match the dynamic range of the responses to that of the input (Brenner et al. 2000; Fairhall et al. 2001). It is remarkable that this seemingly complex property can be explained without assuming any adaptive parameter changes, simply as a consequence of the multidimensionality of the stimulus and the nonlinear nature of the elementary motion detection mechanism (Borst et al. 2005).
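Adaptive rescaling of this kind can be sketched by dividing the input by a running estimate of its own standard deviation; this is an illustrative sketch of the computational principle, not a model of the LPTC mechanism:

```python
import random

random.seed(0)

def rescaled_response(stimulus, tau_samples=200, eps=1e-6):
    """Divide each input by a running estimate of the stimulus standard
    deviation, so the output stays matched to a fixed response range
    whatever the input variance. All parameters are illustrative."""
    var, out, a = 1.0, [], 1.0 / tau_samples
    for s in stimulus:
        var += a * (s * s - var)           # running variance estimate
        out.append(s / (var ** 0.5 + eps))
    return out

def sd(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

low = [random.gauss(0.0, 1.0) for _ in range(2000)]    # weak modulation
high = [random.gauss(0.0, 10.0) for _ in range(2000)]  # tenfold stronger
resp = rescaled_response(low + high)
sd_low = sd(resp[1000:2000])   # output spread after adapting to sigma = 1
sd_high = sd(resp[3000:4000])  # output spread after adapting to sigma = 10
```

After adaptation, the output distribution has nearly the same spread in both epochs although the input variance differs by a factor of 100.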

• Saving energy: Since the response amplitude of LPTCs is reduced during motion adaptation without affecting the overall information conveyed by the neuronal responses, motion adaptation has been proposed to save energy without sacrificing the reliability with which behaviorally relevant information is encoded (Heitwerth et al. 2005).

• Increasing the sensitivity to environmental objects: Since, during maintained stimulation with artificial as well as with naturalistic motion sequences, the responses to discontinuities in retinal velocity and texture increase while the overall response amplitude decreases (Maddess and Laughlin 1985; Kurtz et al. 2009), motion adaptation has been concluded to enhance the saliency of environmental objects (Liang et al. 2008).

Processing of behaviorally relevant visual motion information

Knowing the wiring of a neuronal circuit is not sufficient to infer how efficiently and reliably information is processed and represented in natural behavioral situations. This is because, traditionally, visual information processing has been analyzed with stimuli that are much simpler in their spatial and dynamical features than the input an animal encounters during normal behavior. Because visual systems evolved in specific environments and behavioral contexts, the functional significance of the information being processed can only be assessed by analyzing neuronal performance under conditions that come close to natural situations.

• Reducing the complexity of natural motion signals: Segregation of rotational and translational optic flow components:

The dynamical properties of optic flow are largely determined by the dynamics of the animal's self-motion and the three-dimensional layout of the environment in which the animal moves. The characteristics of optic flow may differ greatly between species and between behavioral situations. For instance, some insects, such as hoverflies, dragonflies and hawkmoths, are able to hover almost stationary in midair or in front of a flower. From their current position in space these insects may rapidly accelerate and dart off at high velocity. The optic flow pattern and its dynamics may differ tremendously between the respective situations. Blowflies, but also other insects, such as hoverflies, honeybees or wasps, usually change their body and gaze direction rapidly by saccadic turns during flight or, one order of magnitude more slowly, while walking. Gaze direction is kept basically constant between saccades. As a consequence, the corresponding optic flow is either mainly rotational (during saccades) or mainly translational (during the intersaccadic intervals; see above Spatial information through active vision).
Figure 5: Schematic illustration of the consequences of rotational (upper diagram) or translational self-motion (bottom diagram) for the resulting optic flow. Superimposed images were generated either by rotating a camera around its vertical axis or by translating it forward. Rotational self-motion leads to optic flow vectors that have the same length irrespective of the distance of environmental objects from the observer. In contrast, the optic flow elicited by translational self-motion depends on the distance of objects from the observer. Hence, translational optic flow contains spatial information.
The translational optic flow component depends on the distance of environmental objects from the observer, whereas the rotational optic flow component is independent of distance. Thus, the behavioural segregation of rotational and translational self-motion enables insects to gather spatial information about the three-dimensional layout of their environment during intersaccadic flight segments by relatively simple computational means. Note, however, that the spatial information derived from translational optic flow is only qualitative rather than quantitative, because the flow depends on (i) the velocity of the animal, (ii) its distance to objects in the surroundings and (iii) the location of the objects in the visual field relative to the direction of translation.
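These distance dependencies follow directly from the standard optic-flow equation for a spherical eye (see e.g. Koenderink 1986). The sketch below evaluates it for a purely rotational and a purely translational case; the viewing direction and motion vectors are illustrative:

```python
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def optic_flow(d, omega, v, dist):
    """Image motion for viewing direction d (unit vector) on a spherical eye:
    p' = -omega x d - (v - (v.d) d) / dist,
    with angular velocity omega (rad/s), translation v (m/s), and object
    distance dist (m) along d."""
    vdotd = sum(vi * di for vi, di in zip(v, d))
    rot = [-c for c in cross(omega, d)]                      # no dist term
    trans = [-(vi - vdotd * di) / dist for vi, di in zip(v, d)]
    return [r + t for r, t in zip(rot, trans)]

def magnitude(f):
    return sum(c * c for c in f) ** 0.5

d = (1.0, 0.0, 0.0)      # lateral viewing direction
yaw = (0.0, 0.0, 1.0)    # rotation about the vertical axis, 1 rad/s
fwd = (0.0, 1.0, 0.0)    # forward translation, 1 m/s
rot_near = magnitude(optic_flow(d, yaw, (0, 0, 0), 1.0))
rot_far = magnitude(optic_flow(d, yaw, (0, 0, 0), 10.0))    # same as rot_near
trn_near = magnitude(optic_flow(d, (0, 0, 0), fwd, 1.0))
trn_far = magnitude(optic_flow(d, (0, 0, 0), fwd, 10.0))    # tenfold smaller
```

Rotational flow is identical at 1 m and 10 m, while translational flow scales with the inverse of the distance, which is exactly the cue available during intersaccadic flight.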

• Representation of spatial information by motion sensitive neurons: Since it is currently not possible to record from the neural circuits processing visual motion information during flight behavior, behavioral and neuronal analyses are done separately. Both types of analysis are linked by replaying, as visual stimuli in electrophysiological experiments, reconstructions of the image sequences that free-flying blowflies had previously seen during their virtuosic flight manoeuvres, as well as targeted manipulations of such sequences. With this approach it could be shown for blowflies that the population of LPTCs makes efficient use of the saccadic flight and gaze strategy to extract spatial information from the translational optic flow during the intersaccadic intervals (Karmeier et al. 2006; Kern et al. 2005, 2006; van Hateren et al. 2005). For instance, the intersaccadic depolarisation level of LPTCs depends on the distance of the blowfly to environmental structures. This finding is remarkable because LPTCs, which had previously been concluded on the basis of experiments with artificial stimuli to act mainly as detectors of self-rotation of the animal, provide spatial information during translatory locomotion in the intersaccadic intervals, while failing to encode faithfully even the most prominent turns of the animal, the saccades. This is because during saccades the visual motion system operates far beyond its linear range, whereas for retinal velocities normally encountered during the intersaccadic intervals it responds with increasing amplitudes to the increasing retinal velocities that result from a decreasing distance of the animal to environmental objects. These results suggest that the functional significance of neuronal mechanisms cannot be judged unless the system is probed under its natural operating conditions.

Insect-inspired motion computation in robots

Given the ability of many insects to perform extraordinary acrobatic flight manoeuvres, it is not surprising that there have been various attempts to implement insect-inspired optic flow processing in simulation models and on robotic platforms (Franz and Mallot 2000; Webb et al. 2004; Zufferey and Floreano 2006). Although these approaches usually employed simplified models of the visual motion pathway, in all of them the sensorimotor loop was closed. Most used optic flow information to stabilise the agent's path of locomotion against disturbances or to avoid collisions with walls. Only a few attempts in robotics also exploit the saccadic strategy of locomotion characteristic of many flying insects, and the implicit distance information present in the translatory optic flow between saccades, to implement obstacle avoidance (Franceschini et al. 1992; Reiser and Dickinson 2003; Sobey 1994; Zufferey and Floreano 2006).

That obstacle avoidance based on optic flow information is possible, at least to some extent, could be shown with a model of the blowfly, the CyberFly. The CyberFly is based on a saccadic controller that receives its sensory input from a model of the blowfly's visual motion pathway and takes the specific dynamic features of blowfly behaviour into account. The sensory model providing the input to the controller was calibrated against experimentally determined responses of a major LPTC in the blowfly's visual system to the visual input of flies in free-flight situations (Lindemann et al. 2005). The LPTCs provide the implicit distance information resulting from translatory movements between saccades. Although the CyberFly can avoid colliding with the walls of a flight arena, it is still limited by its strong dependence on the textural properties of the environment (Lindemann 2008).
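The core of such a saccadic controller can be sketched as follows; the signal names, threshold and turning gain are hypothetical and not the CyberFly's actual parameters:

```python
def saccade_command(left_lptc, right_lptc, threshold=1.0, k=40.0):
    """Minimal saccadic controller in the spirit of the CyberFly (all
    parameters hypothetical). Between saccades the agent flies straight;
    when the motion signal on one side exceeds a threshold, indicating a
    nearby obstacle on that side, a saccadic turn away from it is
    triggered, with an amplitude growing with the signal difference.
    Positive command = turn to the right (away from a stronger left signal)."""
    if max(left_lptc, right_lptc) < threshold:
        return 0.0                         # intersaccadic straight flight
    return k * (left_lptc - right_lptc)    # turn away from the stronger side
```

Because translatory flow grows as obstacles get closer, thresholding the side signals converts implicit distance information into discrete avoidance saccades, while straight intersaccadic flight keeps that distance information readable.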

Relevant review articles

Borst A (2009) Drosophila’s view on insect vision. Current Biol. 19, R36-R47

Borst A, Egelhaaf M (1989) Principles of visual motion detection. Trends Neurosci. 12, 297-306

Borst A, Egelhaaf M (1993) Detecting visual motion: Theory and models. In: Visual motion and its role in the stabilization of gaze, Miles, FA, Wallman J eds., Elsevier

Borst A, Haag J (2002) Neural networks in the cockpit of the fly. J.Comp.Physiol.A 188, 419-437

Borst A, Haag J (2007) Optic flow processing in the cockpit of the fly. In: Invertebrate neurobiology, North G, Greenspan RJ eds., CSHL-Press

Clifford CWG, Ibbotson MR (2003) Fundamental mechanisms of visual motion detection: models, cells and functions. Progress Neurobiol. 68, 409-437

Collett TS, Nalbach HO, Wagner H. (1993) Visual stabilization in arthropods. In Visual motion and its role in the stabilization of gaze, Miles FA, Wallman J eds., Elsevier

Collett TS, Zeil J (1996) Flights of Learning. Curr.Dir.Psychol.Sci. 5, 149-155

Dahmen HJ, Wüst RM, Zeil J (1997) Extracting egomotion parameters from optic flow: Principal limits for animals and machines. In: From living eyes to seeing machines, Srinivasan MV, Venkatesh S eds., Oxford University Press

Dickinson MH (2006) Insect Flight. Current Biology 16, 309-314

Douglass JK, Strausfeld NJ (2001) Pathways in dipteran insects for early visual motion processing. In: Motion vision: computational, neural, and ecological constraints, Zanker JM, Zeil J eds., Springer

Eckert MP, Zeil J (2001) Towards an ecology of motion vision. In: Motion vision: Computational, neural, and ecological constraints, Zanker JM, Zeil J, eds., Springer

Egelhaaf M (2006) The neural computation of visual motion information. In: Invertebrate vision, Warrant E, Nilsson DE eds., Cambridge University Press

Egelhaaf M, Borst A (1993) Movement detection in arthropods. In: Visual motion and its role in the stabilization of gaze, Wallman J Miles FA eds., Elsevier

Egelhaaf M, Grewe J, Karmeier K, Kern R, Kurtz R, Warzecha A-K (2005) Novel approaches to visual information processing in insects: Case studies on neuronal computations in the blowfly. In: New frontiers in insect neuroscience, CRC Press

Egelhaaf M, Kern R (2002). Vision in flying insects. Curr. Opin. Neurobiol. 12, 699-706

Egelhaaf M, Kern R, Kurtz R, Krapp H.G, Kretzberg J, Warzecha A-K (2002) Neural encoding of behaviourally relevant motion information in the fly. Trends Neurosci. 25, 96-102

Egelhaaf M, Kern R, Lindemann JP, Braun E, Geurten B (2009) Active vision in blowflies: strategies and mechanisms of spatial orientation. In: Flying insects and robots, Floreano D, Zufferey J-C, Srinivasan MV, Ellington C. eds., Springer

Franceschini N, Riehle A, Le Nestour A (1989) Directionally selective motion detection by insect neurons. In: Facets of vision, Stavenga D, Hardie R eds., Springer

Frye MA, Dickinson MH (2001) Fly flight: A model for the neural control of complex behavior. Neuron 32, 385-388

Hausen K, Egelhaaf M (1989) Neural mechanisms of visual course control in insects. In: Facets of vision, Stavenga D, Hardie RC eds., Springer

Juusola M, French AS, Uusitalo RO, Weckström M (1996). Information processing by graded-potential transmission through tonically active synapses. Trends Neurosci. 19, 292-297

Koenderink JJ (1986) Optic flow, Vision Res. 26, 161-179

Krapp HG (2000) Neuronal matched filters for optic flow processing in flying insects. In: Neuronal processing of optic flow, Lappe M ed., Academic Press

Kurtz R, Spalthoff C, Kalb J (2008) Examination of fly motion vision by functional fluorescence techniques. Frontiers Biosci. 13, 3009-3021

Reichardt W (1961) Autocorrelation, a principle for the evaluation of sensory information by the central nervous system. In: Sensory Communication, WA Rosenblith ed., M.I.T. Press and John Wiley & Sons

Reichardt W Poggio T (1976) Visual control of orientation behaviour in the fly. Part I. A quantitative analysis. Quart. Rev. Biophysics 9, 311-375

Srinivasan MV, Poteser M, Kral K. (1999). Motion detection in insect orientation and navigation. Vision Res. 39, 2749-2766

Srinivasan MV, Zhang SW. (2000). Visual navigation in flying insects. Internat. Rev. Neurobiol. 44, 67-92

Strausfeld NJ, Douglass J, Campbell H, Higgins C (2006) Parallel processing in the optic lobes of flies and the occurrence of motion computing circuits. In: Invertebrate vision, Warrant E, Nilsson DE eds., Cambridge University Press

Taylor GK, Krapp HG (2007) Sensory systems and flight stability: What do insects measure and why?, Advances in insect Physiol. 34, 231-316

Warzecha A-K, Egelhaaf M (2001) Neuronal encoding of visual motion in real-time. In: Motion vision: computational, neural, and ecological constraints, Zanker JM, Zeil J eds., Springer

Zeil J, Boeddeker N, Hemmi JM, Stürzl W (2007) Going wild: Toward an ecology of visual information processing. In: Invertebrate neurobiology, North G, Greenspan R eds., CSHL-Press

Selected references

Beckers U, Egelhaaf M, Kurtz, R (2007) Synapses in the fly motion-vision pathway cover a broad range of signal amplitudes and dynamics. J. Neurophysiol. 97, 2032-2041

Beckers U, Egelhaaf M, Kurtz R (2009) Precise timing in fly motion vision is mediated by fast components of combined graded and spike signals. Neurosci. 160, 639 - 650

Blaj G, van Hateren JH (2004) Saccadic head and thorax movements in freely walking blowflies. J. Comp. Physiol. A- 190, 861-868

Boeddeker N. Kern R, Egelhaaf M (2003) Chasing a dummy target: Smooth pursuit and velocity control in male blowflies. Proc.R.Soc.Lond.B, 270, 393-399

Boeddeker N, Egelhaaf M (2003). Steering a model fly: Simulations on visual pursuit in blowflies. Proc.R.Soc.Lond.B 270, 1971-1978

Boeddeker N, Egelhaaf M (2005). Chasing behaviour of blowflies: A smooth pursuit system generates saccades. J. Exp. Biol. 208, 1563-1572

Borst A (1990) How do flies land? From behavior to neuronal circuits. Biosci. 40, 292-299

Borst A (2009) Drosophila’s view on insect vision. Current Biol. 19, R36-R47

Borst A, Egelhaaf M (1987) Temporal modulation of luminance adapts time constant of fly movement detectors. Biol. Cybern. 56, 209-215

Borst A, Egelhaaf M (1989) Principles of visual motion detection. Trends Neurosci. 12, 297-306

Borst A, Egelhaaf M, Haag J (1995) Mechanisms of dendritic integration underlying gain control in fly motion-sensitive interneurons. J. Comput. Neurosci. 2, 5-18

Borst A, Flanagin V, Sompolinsky, H (2005) Adaptation without parameter change: Dynamic gain control in motion detection. PNAS 102, 6172-6176

Borst A, Haag J (2001) Effects of mean firing on neural information rate. J. Comput. Neurosci. 10, 213-221

Borst A, Haag J (2002). Neural networks in the cockpit of the fly. J. Comp. Physiol. A 188, 419-437

Borst A, Reisenman C, Haag J (2003) Adaptation to response transients in fly motion vision: II. Model studies. Vision Res. 43, 1309-1322

Brenner N, Bialek W, Ruyter van Steveninck R de (2000) Adaptive rescaling maximizes information transmission. Neuron 26, 695-702

Clifford CWG, Ibbotson MR (2003) Fundamental mechanisms of visual motion detection: models, cells and functions. Progress Neurobiol. 68, 409-437

Collett TS, Land MF (1975). Visual control of flight behaviour in the hoverfly Syritta pipiens L. J. Comp. Physiol. 99, 1-66

Collett TS, Paterson CJ (1991) Relative motion parallax and target localization in the locust, Schistocerca gregaria. J. Comp. Physiol. A 169, 615-621

Collett TS, Zeil J. (1996) Flights of Learning. Curr.Dir.Psychol.Sci. 5, 149-155

Cuntz H, Haag J, Foerstner F, Segev I, Borst A (2007) Robust coding of flow-field parameters by axo-axonal gap junctions between fly visual interneurons. PNAS 104: 10229-10233

David CT (1984) The dynamics of height stabilization in Drosophila. Physiological Entomology 9, 377-386

Egelhaaf M, Borst A (1989) Transient and steady-state response properties of movement detectors. J. Opt. Soc. Am. A 6, 116-127

Egelhaaf M, Borst A (1995) Calcium accumulation in visual interneurons of the fly: Stimulus dependence and relationship to membrane potential. J. Neurophysiol. 73, 2540-2552

Egelhaaf M, Borst A, Reichardt W (1989) Computational structure of a biological motion detection system as revealed by local detector analysis in the fly's nervous system. J. Opt. Soc. Am. A 6, 1070-1087

Egelhaaf M, Reichardt W (1987) Dynamic response properties of movement detectors: Theoretical analysis and electrophysiological investigation in the visual system of the fly. Biol. Cyber. 56, 69-87

Esch HE, Zhang S, Srinivasan MV, Tautz J (2001). Honeybee dances communicate distances measured by optic flow. Nature 411, 581-583

Fairhall AL, Lewen GD, Bialek W, Ruyter van Steveninck R de (2001) Efficiency and ambiguity in an adaptive neural code. Nature 412, 787-792

Farrow K, Haag J, Borst A (2006) Nonlinear, binocular interactions underlying flow field selectivity of a motion-sensitive neuron. Nature Neurosci 9, 1312-1320

Franceschini N, Pichon JM, Blanes C (1992) From insect vision to robot vision. Phil. Trans. Roy. Soc. B 337, 283-294

Franz MO, Mallot HA (2000) Biomimetic robot navigation. Robotics and Autonomous Systems, 133-153

Gibson,JJ (1979). The ecological approach to visual perception. Houghton Mifflin

Gilbert C (1997) Visual control of cursorial prey pursuit by tiger beetles (Cicindelidae). J. Comp. Physiol. A 181, 217-230

Götz KG (1975) The optomotor equilibrium of the Drosophila navigation system. Journal of Comparative Physiology 99, 187-210

Grewe J, Kretzberg J, Warzecha A-K, Egelhaaf M (2003) Impact of photon-noise on the reliability of a motion-sensitive neuron in the fly's visual system. J. Neurosci. 23, 10776-10783

Grewe J, Weckström M, Egelhaaf M, Warzecha A-K (2007) Information and discriminability as measures of reliability of sensory coding. PLoS ONE, 2(12):e1328

Haag J, Borst A (1996) Amplification of high frequency synaptic inputs by active dendritic membrane processes. Nature 379, 639-641

Haag J, Borst A (1997) Encoding of visual motion information and reliability in spiking and graded potential neurons. J. Neurosci. 17, 4809-4819

Haag J, Borst A (2000) Spatial distribution and characteristics of voltage-gated calcium signals within visual interneurons. J. Neurophysiol. 83, 1039-1051

Haag J, Borst A (2008) Electrical coupling of lobula plate tangential cells to a heterolateral motion-sensitive neuron in the fly. J. Neurosci. 28, 14435-14442

Harris RA, O'Carroll DC, Laughlin SB (1999) Adaptation and the temporal delay filter of fly motion detectors. Vision Res. 39, 2603-2613

Harris RA, O'Carroll DC, Laughlin SB (2000) Contrast gain reduction in fly motion adaptation. Neuron 28, 595-606

Heitwerth J, Kern R, van Hateren JH, Egelhaaf M (2005) Motion adaptation leads to parsimonious encoding of natural optic flow by blowfly motion vision system. J. Neurophysiol. 94, 1761-1769

Hengstenberg R (1993) Multisensory control in insect oculomotor systems. In: Visual motion and its role in the stabilization of gaze, Miles, FA, Wallman J eds., Elsevier

Hrncir M, Jarau S, Zucchi R, Barth FG (2003) A stingless bee (Melipona seminigra) uses optic flow to estimate flight distances. J. Comp. Physiol. A 189, 761-768

Huston SJ, Krapp HG (2008) Visuomotor transformation in the fly gaze stabilization system. PLoS Biol. 6, 1468-1478

Huston SJ, Krapp HG (2009) Nonlinear integration of visual and haltere inputs in fly neck motor neurons. J. Neurosci. 29, 13097-13105

Joesch M, Platt J, Borst A, Reiff DF (2008) Response properties of motion-sensitive visual interneurons in the lobula plate of Drosophila melanogaster. Current Biol. 18, 368-374

Juusola M, French AS, Uusitalo RO, Weckström M (1996) Information processing by graded-potential transmission through tonically active synapses. Trends Neurosci. 19, 292-297

Kalb J, Egelhaaf M, Kurtz, R (2006) Robust integration of motion information in the fly visual system revealed by single-cell photo-ablation. J. Neurosci. 26, 7898-7906

Kalb J, Egelhaaf M, Kurtz R (2008) Adaptation of velocity encoding in synaptically coupled neurons in the fly visual system. J. Neurosci. 28, 9183–9193

Karmeier K, Krapp HG, Egelhaaf M. (2005) Population coding of self-motion: Applying Bayesian Inference to a population of visual interneurons in the fly. J. Neurophysiol. 94, 2182-2194

Karmeier K, van Hateren JH, Kern R, Egelhaaf M (2006) Encoding of naturalistic optic flow by a population of blowfly motion sensitive neurons. J. Neurophysiol. 96, 1602-1614

Katsov AY, Clandinin TR (2008) Motion processing streams in Drosophila are behaviourally specialized. Neuron 59, 322-335

Kern R, Egelhaaf M (2000) Optomotor course control in flies with largely asymmetric visual input. J. Comp. Physiol. A 186, 45-55

Kern R, Egelhaaf M, Srinivasan MV (1997) Edge detection by landing honeybees: Behavioural analysis and model simulations of the underlying mechanism. Vision Res. 37, 2103-2117

Kern R, van Hateren JH, Egelhaaf M (2006) Representation of behaviourally relevant information by blowfly motion-sensitive visual interneurons requires precise compensatory head movements. J. Exp. Biol. 209, 1251-1260

Kern R, van Hateren JH, Michaelis C, Lindemann JP, Egelhaaf M (2005) Function of a fly motion-sensitive neuron matches eye movements during free flight. PLOS Biol. 3, 1130-1138

Kimmerle B, Egelhaaf M (2000) Detection of object motion by a fly neuron during simulated translatory flight. J. Comp. Physiol. A 186, 21-31

Kimmerle B, Srinivasan MV, Egelhaaf M (1996) Object detection by relative motion in freely flying flies. Naturwissenschaften 83, 380-381

Kral K (1998) Side-to-side movements to obtain motion depth cues: a short review of research on the praying mantis. Behav. Processes 43, 71-77

Krapp HG, Hengstenberg B, Hengstenberg R (1998) Dendritic structure and receptive-field organization of optic flow processing interneurons in the fly. J. Neurophysiol. 79, 1902-1917

Krapp HG, Hengstenberg R, Egelhaaf M (2001) Binocular contribution to optic flow processing in the fly visual system. J. Neurophysiol. 85, 724-734

Kretzberg J, Egelhaaf M, Warzecha A-K (2001) Membrane potential fluctuations determine the precision of spike timing and synchronous activity: A model study. J. Comput. Neurosci. 10, 79-97

Kurtz R (2007) Direction-selective adaptation in fly visual motion-sensitive neurons is generated by an intrinsic conductance-based mechanism. Neurosci. 146, 573-583

Kurtz R, Egelhaaf M, Meyer HG, Kern R (2009) Adaptation accentuates responses of fly motion-sensitive visual neurons to sudden stimulus changes. Proc. R. Soc. B 276, 3711-3719

Kurtz R, Warzecha A-K, Egelhaaf M (2001) Transfer of visual information via graded synapses operates linearly in the natural activity range. J. Neurosci. 21, 6957-6966

Land MF (1993) Chasing and pursuit in the dolichopodid fly Poecilobothrus nobilitatus. J. Comp. Physiol. A 173, 605-613

Land MF, Collett TS (1974) Chasing behaviour of houseflies (Fannia canicularis). A description and analysis. J. Comp. Physiol. 89, 331-357

Lappe M (Ed.) (2000) Neuronal processing of optic flow. (San Diego, San Francisco, New York: Academic Press)

Laughlin SB (1994) Matching coding, circuits, cells, and molecules to signals: General principles of retinal design in the fly's eye. Progr. in Retinal and Eye Res. 13, 165-196

Lehrer M, Srinivasan MV, Zhang SW, Horridge GA (1988) Motion cues provide the bee's visual world with a third dimension. Nature 332, 356-357

Lewen GD, Bialek W, de Ruyter van Steveninck R (2001) Neural coding of naturalistic motion stimuli. Network: Comput. Neural Syst. 12, 317-329

Liang P, Kern R, Egelhaaf M (2008) Motion adaptation enhances object-induced neural activity in three-dimensional virtual environment. J. Neurosci. 28, 11328-11332

Lillywhite PG, Dvorak DR (1981) Responses to single photons in a fly optomotor neuron. Vision Res. 21, 279-290

Lindemann JP, Kern R, van Hateren JH, Ritter H, Egelhaaf M (2005) On the computations analysing natural optic flow: Quantitative model analysis of the blowfly motion vision pathway. J. Neurosci. 25, 6435-6448

Lindemann JP, Weiss H, Möller R, Egelhaaf M (2007) Saccadic flight strategy facilitates collision avoidance: Closed-loop performance of a cyberfly. Biol. Cybern. 98, 213-227

Maddess T, DuBois R, Ibbotson MR (1991) Response properties and adaptation of neurones sensitive to image motion in the butterfly Papilio aegeus. J. Exp. Biol. 161, 171-199

Maddess T, Laughlin SB (1985) Adaptation of the motion-sensitive neuron H1 is generated locally and governed by contrast frequency. Proc. R. Soc. Lond. B 225, 251-275

Mronz M, Lehmann FO (2008) The free-flight response of Drosophila to motion of the visual environment. J.Exp.Biol. 211, 2026-2045

Olberg RM, Worthington AH, Venator KR (2000) Prey pursuit and interception in dragonflies. J. Comp. Physiol. A 186, 155-162

Poteser M, Kral K (1995) Visual distance discrimination between stationary targets in praying mantis: an index of the use of motion parallax. J. Exp. Biol. 198, 2127-2137

Reiser MB, Dickinson MH (2003) A test bed for insect-inspired robotic control. Phil. Trans. R. Soc. Lond. A 361, 2267-2285

Rister J, Pauls D, Schnell B, Ting CY, Lee CH, Sinakevitch I, Morante J, Strausfeld NJ, Ito K, Heisenberg M (2007) Dissection of the peripheral motion channel in the visual system of Drosophila melanogaster. Neuron 56, 155-170

Rossel S (1980) Foveal fixation and tracking in praying mantis. J. Comp. Physiol. 139, 307-331

Ruyter van Steveninck R de, Laughlin SB (1996) The rate of information transfer at graded-potential synapses. Nature 379, 642-645

Ruyter van Steveninck R de, Zaagman WH, Mastebroek HAK (1986) Adaptation of transient responses of a movement-sensitive neuron in the visual system of the blowfly, Calliphora erythrocephala. Biol. Cybern. 54, 223-236

Schilstra C, van Hateren JH (1999) Blowfly flight and optic flow. I. Thorax kinematics and flight dynamics. J. Exp. Biol. 202, 1481-1490

Schuling FH, Mastebroek HAK, Bult R, Lenting BPM (1989) Properties of elementary movement detectors in the fly Calliphora erythrocephala. J. Comp. Physiol. A 165, 179-192

Single S, Borst A (1998) Dendritic integration and its role in computing image velocity. Science 281, 1848-1850

Single S, Haag J, Borst A (1997) Dendritic computation of direction selectivity and gain control in visual interneurons. J. Neurosci. 17, 6023-6030

Sobel EC (1990) The locust's use of motion parallax to measure distance. J. Comp. Physiol. A 167, 579-588

Sobey PJ (1994) Active navigation with a monocular robot. Biol Cybern 71, 433-440

Srinivasan MV, Lehrer M, Horridge GA (1990) Visual figure-ground discrimination in the honeybee: the role of motion parallax at boundaries. Proc. R. Soc. Lond. B 238, 331-350

Srinivasan MV, Lehrer M, Kirchner WH, Zhang SW (1991) Range perception through apparent image speed in freely flying honeybees. Visual Neurosci. 6, 519-535

Srinivasan MV, Lehrer M, Zhang SW, Horridge GA (1989) How honeybees measure their distance from objects of unknown size. J. Comp. Physiol. A 165, 605-613

Srinivasan MV, Poteser M, Kral K (1999) Motion detection in insect orientation and navigation. Vision Res. 39, 2749-2766

Srinivasan MV, Zhang S, Altwein M, Tautz J (2000a) Honeybee navigation: Nature and calibration of the "odometer". Science 287, 851-853

Srinivasan MV, Zhang SW, Chahl JS, Barth E, Venkatesh S (2000b) How honeybees make grazing landings on flat surfaces. Biol. Cybern. 83, 171-183

Srinivasan MV, Zhang SW, Lehrer M, Collett TS (1996) Honeybee navigation en route to the goal: Visual flight control and odometry. J. Exp. Biol. 199, 237-244

Straw AD, Rainsford T, O'Carroll DC (2008) Contrast sensitivity of insect motion detectors to natural images. J. Vision 8, 1-9

Tammero LF, Dickinson MH (2002a) Collision-avoidance and landing responses are mediated by separate pathways in the fruit fly, Drosophila melanogaster. J. Exp. Biol. 205, 2785-2798

Tammero LF, Dickinson MH (2002b) The influence of visual landscape on the free flight behavior of the fruit fly Drosophila melanogaster. J. Exp. Biol. 205, 327-343

van Hateren JH (1997) Processing of natural time series of intensities by the visual system of the blowfly. Vision Res. 37, 3407-3416

van Hateren JH, Kern R, Schwertfeger G, Egelhaaf M (2005) Function and coding in the blowfly H1 neuron during naturalistic optic flow. J. Neurosci. 25, 4343-4352

van Hateren JH, Schilstra C (1999) Blowfly flight and optic flow. II. Head movements during flight. J. Exp. Biol. 202, 1491-1500

Wagner H (1982) Flow-field variables trigger landing in flies. Nature 297, 147-148

Wagner H (1986a) Flight performance and visual control of the flight of the free-flying housefly (Musca domestica). II. Pursuit of targets. Phil. Trans. R. Soc. Lond. B 312, 553-579

Warzecha A-K, Horstmann W, Egelhaaf M (1999) Temperature dependence of neuronal performance in the motion pathway of the blowfly Calliphora erythrocephala. J Exp Biol 202, 3161-3170

Warzecha A-K, Kretzberg J, Egelhaaf M (1998) Temporal precision of the encoding of motion information by visual interneurons. Current Biol. 8, 359-368

Warzecha A-K, Kretzberg J, Egelhaaf M (2000) Reliability of a fly motion-sensitive neuron depends on stimulus parameters. J. Neurosci. 20, 8886-8896

Warzecha A-K, Kurtz R, Egelhaaf M (2003) Synaptic transfer of dynamical motion information between identified neurons in the visual system of the blowfly. Neurosci. 119, 1103-1112

Webb B, Harrison RR, Willis MA (2004) Sensorimotor control of navigation in arthropod artificial systems. Arthropod Struct. Develop. 33, 301-329

Zeil J (1983) Sexual dimorphism in the visual system of flies: The free flight behaviour of male Bibionidae (Diptera). J Comp Physiol A 150, 395-412

Zufferey J-C, Floreano D (2006) Fly-inspired visual steering of an ultralight indoor aircraft. IEEE Trans. Robotics 22, 137-146

Internal references

  • Valentino Braitenberg (2007) Brain. Scholarpedia, 2(11):2918.
  • Olaf Sporns (2007) Complexity. Scholarpedia, 2(10):1623.
  • Eugene M. Izhikevich (2007) Equilibrium. Scholarpedia, 2(10):2014.
  • Keith Rayner and Monica Castelhano (2007) Eye movements. Scholarpedia, 2(10):3649.
  • Giovanni Gallavotti (2008) Fluctuations. Scholarpedia, 3(6):5893.
  • Tamas Freund and Szabolcs Kali (2008) Interneurons. Scholarpedia, 3(9):4720.
  • Bertil Hille (2008) Ion channels. Scholarpedia, 3(10):6051.
  • Rodolfo Llinas (2008) Neuron. Scholarpedia, 3(8):1490.
  • Jose-Manuel Alonso and Yao Chen (2009) Receptive field. Scholarpedia, 4(1):5393.
  • John Dowling (2007) Retina. Scholarpedia, 2(12):3487.
  • Arkady Pikovsky and Michael Rosenblum (2007) Synchronization. Scholarpedia, 2(12):1459.
  • Nicholas V. Swindale (2008) Visual map. Scholarpedia, 3(6):4607.