Talk:Entropy

    The edits I have made in the text are very minor and mostly involve correcting unidiomatic usage of the English articles "the" and "a".

    However, I have a major problem with this article. It appears to have been written by a mathematician for an audience of mathematicians. It contains many terms and mathematical notations that will not be familiar to most neuroscientists. I suggest that a greater effort be made to define these terms and notations, given the target readership. Examples include the following: measurable sets and subsets, indicator function, sigma-field and sub-sigma-field, infimum and supremum, one-sided or two-sided generator, the symbol '#' used in equations, injectively.

    From the above comment, it will be apparent that this reviewer, while familiar with applications of entropy and information theory in neuroscience, is not familiar with some of the mathematical concepts discussed in the article. Therefore, I cannot comment on the accuracy of those statements, except to say that I did not detect any obvious errors; other reviewers should check the mathematical sections. It is certainly not clear to me why, in the discussion of Kolmogorov-Sinai entropy, T is defined as a transformation, yet T-inverse is used in all the equations. Should the author not just say that T must be an invertible transformation, and then use T in the equations?
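    For reference, preimages appear in the standard definition precisely because, for a measure-preserving transformation T, measure preservation is stated as mu(T^{-1}A) = mu(A), and the preimage of a measurable set is always measurable even when T is not invertible (forward images need not be). A standard formulation of the Kolmogorov-Sinai entropy, which may differ slightly from the article's notation, is

        \[ h_\mu(T) = \sup_{\mathcal{P}} \lim_{n\to\infty} \frac{1}{n}\, H_\mu\!\Big( \bigvee_{i=0}^{n-1} T^{-i}\mathcal{P} \Big), \]

    where the supremum runs over finite measurable partitions \(\mathcal{P}\). So T need not be invertible; T^{-1} here denotes the set-preimage, not an inverse map.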

    The examples alluded to in the text are nowhere to be found.

    Some authors insist that information is always a difference of two entropies (for example, the entropy of a signal minus the entropy of the noise contained in the signal), never a single entropy as it is treated in this article. The author needs to address this issue.
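    For reference, the "difference of two entropies" view corresponds to the standard identity for mutual information,

        \[ I(X;Y) = H(X) - H(X \mid Y) = H(X) + H(Y) - H(X,Y), \]

    so the single-entropy quantity H(X) treated in the article and the difference-based notion of information are consistent with each other.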

    The author should mention that logarithms other than those to base 2 are sometimes used in calculations involving entropy. In that case, the units are named differently, for example, 'nats' instead of 'bits' when the base is e. The base e is far more common in thermodynamics texts than the base 2 used here in the discussion of Boltzmann and Gibbs entropy.
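    As an illustration (not taken from the article), changing the base of the logarithm only rescales the entropy by a constant factor, H_b = H_2 / log_2(b); a minimal Python sketch:

        import math

        def shannon_entropy(probs, base=2.0):
            # Shannon entropy of a discrete distribution; the logarithm base
            # sets the unit (base 2 -> bits, base e -> nats).
            return -sum(p * math.log(p, base) for p in probs if p > 0)

        p = [0.5, 0.25, 0.25]
        h_bits = shannon_entropy(p)               # 1.5 bits
        h_nats = shannon_entropy(p, base=math.e)  # = h_bits * ln 2, about 1.04 nats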

    Author's response: Indeed, I overlooked this review before; in fact, I had no idea it existed. Now I know. I have made an effort to explain all mathematical terms, where possible and within reasonable limits. The Scholarpedia manual for authors says that one should not try to explain everything, only make links; so I would simply link to "sigma-field" or "measurable". On the other hand, these terms do not yet exist in Scholarpedia, so I did include brief explanations. It is impossible to give an exhaustive exposition here (the subject of the article would become "measure theory"), but it is also impossible to write about Kolmogorov-Sinai entropy while avoiding these terms. So please be understanding and accept that some parts of this article will not be completely understandable to non-mathematicians.

    I noticed that two of the subpages (Examples 1 and 2) have also been revised. See my edits and comments there. What about the other four ("Connections" and Examples 3-5)? Are they perfect, or just not revised yet?

    Black hole entropy

    (reviewer B) It occurred to me that a section 2.4 on black hole entropy (as introduced by Bekenstein and Hawking) should be added, if only a very brief one. I could make suggestions as to what to say there if the author agrees that there should be such a section. Here are some reasons in favor of such a section: It is a notion of entropy that is different enough from the other notions of entropy described in the article; it is discussed a lot in certain communities (astrophysics, cosmology, quantum gravity); its existence should be mentioned, and connections to other notions of entropy should be described.
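    For reference, the Bekenstein-Hawking entropy is proportional to the area A of the black hole's event horizon,

        \[ S_{\mathrm{BH}} = \frac{k_B\, c^3 A}{4 G \hbar}, \]

    which such a section would presumably state and then relate to the thermodynamic and information-theoretic entropies already covered in the article.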

    Kolmogorov-Sinai entropy

    (reviewer B) It seems that there are now two detailed descriptions of KS entropy, one in the main article on KS entropy and one here (at "entropy"). The one here is pretty long (more than one page in my printout). Is that a good idea? Why did the author do that? Alternatives could be: (a) only very brief description in "entropy"; (b) different style of description than in the main article on KS entropy, e.g., non-technical. Let me add that I'm not necessarily against keeping the section as it is, but the question arises naturally.
