Talk:Insect motion vision

From Scholarpedia

    Alexander Borst Review

    I have read the review and found it an extremely well written and scholarly article. I thus have only a few very minor comments:

    1. When talking about object detection by relative motion cues, it might be worth pointing out that flies don't have stereo vision (as humans do for nearby objects) and also have poor spatial resolution, which largely prohibits object detection by contour outline. In a way, motion vision is all they have.
    2. In the part where the author talks about Drosophila, there is a typo: ' ... in Drosophila to beeak the neural circuits ...'
    3. In the section on synaptic signal transfer, it is not clear whether the author refers to chemical or electrical signal transmission.
    4. When talking about adaptation, adjusting the dynamic range of motion sensitivity, the mechanistic explanation of this phenomenon given by Borst et al, 2005, PNAS, might be worth mentioning, in particular since it also reveals an interesting and highly unexpected property of the Reichardt detector.
    5. In the part 'reducing the complexity ...', the wording 'information is only relative rather than metric' seems a bit cryptic. Maybe 'qualitative rather than quantitative' would make the point better?
    6. Finally, about the title: 'motion detection' to me refers to 'local motion detection', i.e. the Reichardt detector. However, the topic is much richer and the article indeed covers a broad range of aspects in this context. So, maybe, 'motion vision' would be more appropriate?
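    For readers less familiar with the Reichardt (Hassenstein-Reichardt) correlator that this comment turns on, a minimal sketch may help; the filter time constant and stimulus parameters below are purely illustrative:

```python
import numpy as np

def reichardt_detector(left, right, tau=5.0, dt=1.0):
    """Minimal Hassenstein-Reichardt correlator (illustrative parameters).

    Each arm low-pass filters (delays) the signal of one photoreceptor and
    multiplies it with the undelayed signal of its neighbour; the two
    mirror-symmetric arms are then subtracted, which yields a
    direction-selective output.
    """
    alpha = dt / (tau + dt)                # first-order low-pass coefficient
    lp_left = np.zeros_like(left)
    lp_right = np.zeros_like(right)
    for t in range(1, len(left)):          # delay lines
        lp_left[t] = lp_left[t - 1] + alpha * (left[t] - lp_left[t - 1])
        lp_right[t] = lp_right[t - 1] + alpha * (right[t] - lp_right[t - 1])
    return lp_left * right - lp_right * left   # opponent subtraction

# A sinusoidal grating drifting from 'left' to 'right': the mean output
# is positive for this direction and flips sign for the opposite one.
t = np.arange(500)
left = np.sin(0.1 * t)
right = np.sin(0.1 * (t - 3))
preferred = reichardt_detector(left, right).mean()   # positive
null = reichardt_detector(right, left).mean()        # negative
```

    Note that swapping the two inputs exactly negates the output; this antisymmetry is a direct consequence of the opponent subtraction stage of the model.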

    Prof. Dr. Alexander Borst, Director, Dept. of Systems and Computational Neurobiology, Max-Planck-Institute for Neurobiology

    Conveyed by --Pkatz 18:31, 25 September 2009 (EDT)

    Holger Krapp Review:

    This article is indeed an excellent summary on insect motion vision where the author illustrates general principles of visual information processing in the context of various behavioural tasks. The focus on flies makes perfect sense. These animals have been studied both in terms of behaviour and neurophysiology over more than half a century by now and the results, in combination with modelling approaches, have much improved our understanding of the functional organization of task-specific neural circuits.

    I have a few comments and suggestions for minor amendments/changes:

    In the paragraph on ‘Significance of visual motion information for behavioural control’, line 3, citations: Koenderink 1986 outlines several image transformations which are related to optic flow. The publication that is more directly concerned with self-motion induced optic flow would be Koenderink and van Doorn 1987: “Facts on optic flow” – which may be added to the list of citations.

    Another important behaviour that also relies heavily on visual motion processing is gaze stabilization. As a matter of fact, many of the visually guided behaviours listed in this article would not work properly if the animal did not maintain its gaze in a default orientation, i.e. with the eyes aligned with the external horizon and the dorsoventral orientation of the head in line with the gravity vector. A level gaze makes sure that wide-field optic flow is projected onto the eyes in a way that allows the neural machinery to efficiently analyse visual motion. What is more, compensatory head movements – whenever possible – reduce rotational components of optic flow, which do not provide any information on object distance. My suggestion is to add a small paragraph on gaze stabilization. The significance of this behaviour is fairly well demonstrated by the fact that gaze stabilization is a common feature among visually oriented animals. Pioneering behavioural work on flies by Hengstenberg et al. could be cited, along with more recent accounts addressing the neural connections between the motion vision pathways and the neck motor system (e.g. Huston and Krapp 2008).
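    The geometric point here – rotational optic flow is blind to distance, while translational flow is not – can be made concrete in a few lines; the speed, angular velocity and distances are arbitrary illustrative values:

```python
# For a point at distance d viewed at right angles to the direction of
# travel, translational image motion is approximately v / d, i.e. it
# shrinks with distance, whereas pure rotation moves every image point
# at the angular velocity omega, regardless of distance.
v, omega = 1.0, 1.0                       # speed (m/s), spin (rad/s); invented
distances = (0.5, 2.0, 8.0)               # metres, arbitrary
trans_flow = [v / d for d in distances]   # [2.0, 0.5, 0.125] rad/s
rot_flow = [omega for _ in distances]     # [1.0, 1.0, 1.0] rad/s
```

    This is why only the translational flow component that remains after compensatory head movements can be exploited for distance estimation.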

    In the paragraph on ‘Object detection’, line 3: “ ... and to infer on their distance ...” should be “ ... and to infer their distance ...”

    In the same paragraph, last sentence: why is it only at the edges of objects that relative motion could be detected? If the object has sufficient visual contrast all over, higher velocities compared to the background would become apparent at any part of the object. But I do see that the velocity contrast is most conspicuous at the edges.

    In the ‘Landing’ paragraph, last sentence: I suggest replacing “ ... landing site ...” with “ ... ground ...”, as for this strategy to work during descent, the optic flow needs to be kept constant throughout, even before the fly actually reaches the landing site.
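    The descent strategy referred to here – holding the ground-induced optic flow constant – can be sketched numerically. Since the perceived flow scales as forward speed over height, keeping it at a constant c forces dh/dt = -c·h, an exponential approach in which speed shrinks in proportion to height; the set-point and time step below are arbitrary:

```python
# Hold the ground-induced optic flow omega = v / h at a constant
# set-point c throughout the descent; speed then decays with height,
# giving an automatically soft touchdown.
c, dt = 0.5, 0.01      # hypothetical flow set-point (1/s) and time step (s)
h = 1.0                # initial height (arbitrary units)
heights = [h]
for _ in range(1000):  # 10 s of simulated descent
    v = c * h          # choose speed so that v / h == c
    h -= v * dt        # Euler step of dh/dt = -c * h
    heights.append(h)
# height decays exponentially towards zero but never goes negative
```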

    In the paragraph on ‘Pursuit of moving targets’, line 2: “ ... by saccadic head and body movements.” This comment certainly is a bit nitpicky. But “saccadic” implies rotational movements which would not allow for distance estimation. Maybe “translational peering movements” would be more accurate.

    In the paragraph on ‘Estimation of travelled distance’, first sentence: it would be good to add something along the lines: “ ... and to communicate the location of rewarding food sources to their fellow foragers.” This would clarify from the outset that bees have to solve two tasks: i.e. estimate distance and orientation of the food source relative to the hive in order to return on a direct route AND communicate this information to the other bees.

    In the paragraph on ‘Steps of visual motion computation’, line 2: recently, even motor neurons controlling head movements have been shown to respond to visual motion in a direction-selective way (e.g. Huston and Krapp 2008).

    General: I should leave this to the discretion of the author. But another interesting question in the context of coding self-motion information is: in which way are the outputs of LPTCs used at the next stage along the optomotor pathways? One of the examples so far concerns the neck motor system. Neck motor neurons integrate visual motion information from both eyes in order to gain higher specificity to rotational optic flow. This makes sense because most head movements serve to compensate rotational movements of the animal, so that the remaining translational optic flow can be analysed more efficiently to estimate the distance to objects in the surroundings.

    In the paragraph on ‘Linearities and non-linearities ...’, line 1, second sentence: “have” before “are” should be dropped.

    In the paragraph on ‘Voltage dependent mechanisms’, line 7: I would drop “So far, ...” at the beginning of the sentence. Next sentence: “One exception are ...” should probably be either “Exceptions are ...” or, better, “One exception is the voltage sensitive sodium current that has been shown ...” which would be compatible with “ increases” later in the sentence.

    A brief description of the work by Harris et al. 2000, who demonstrated contrast gain control mechanisms that change the contrast sensitivity in LPTCs in both directionally and non-directionally selective ways, should be added to this paragraph. I realize that Harris et al. 2000 is mentioned further below, but I think it would fit in this paragraph, too.

    In the paragraph on ‘Encoding of visual motion in real time’: Hengstenberg 1977 should be added to the list of citations at the end of the paragraph. He was the first, I guess, who described the “irregular spikes” in LPTCs.

    In the title “Accuracy of encoding of visual motion” the second “of” could go. Also, in the same paragraph, line 3, from bottom: “same time” should be replaced with “some time”.

    In the paragraph on ‘Insect-inspired motion computation in robots’, the work of Sean Humbert (e.g. Conroy et al. 2009) is missing; he provided a control engineering framework that applies the “matched filter” hypothesis of optic flow-based self-motion estimation (Franz and Krapp 2000) to control terrestrial and aerial robots (quadrotors and helicopters). It is one of the few examples where biological design principles have been directly translated into technical applications.
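    For readers unfamiliar with the “matched filter” idea mentioned here, the following toy sketch shows the underlying linear estimation step. The templates are random stand-ins for the structured roll/pitch/yaw flow fields of the actual hypothesis, and all numbers are invented:

```python
import numpy as np

# Each self-motion component has a template ("matched filter") flow
# field; projecting the measured flow onto the templates by least
# squares recovers an estimate of the current self-motion.
rng = np.random.default_rng(0)
n_directions = 200                                    # sampled viewing directions
templates = rng.standard_normal((n_directions, 3))    # stand-in filter fields
true_motion = np.array([0.2, -0.5, 0.1])              # invented roll/pitch/yaw rates
flow = templates @ true_motion + 0.01 * rng.standard_normal(n_directions)
estimate, *_ = np.linalg.lstsq(templates, flow, rcond=None)
# estimate closely recovers true_motion despite the measurement noise
```

    In control applications this estimate can then be fed back to stabilize the robot's attitude, which is the step the cited framework formalizes.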

    Finally, the author may also include a brief paragraph mentioning that most behaviours regarding flight and gaze stabilization also involve modalities other than vision. Because of its limited bandwidth the visual system alone would not be able to cope with very rapid changes in attitude. That is when other senses, including mechanoreceptive systems, kick in. A fairly comprehensive review addressing multisensory integration and also discussing the relationship between sensor and motor coordinate systems has been published recently (Taylor and Krapp 2007).

    Dr Holger G Krapp, Reader in Systems Neuroscience, Department of Bioengineering, Imperial College London.

    Conveyed by --Pkatz 08:36, 11 November 2009 (EST)
