你又在摸鱼了's notes on Memory and the Computational Brain (8)

Memory and the Computational Brain
  • Title: Memory and the Computational Brain
  • Author: C. R. Gallistel / Adam Philip King
  • Subtitle: Why Cognitive Science will Transform Neuroscience
  • Pages: 336
  • Publisher: Wiley-Blackwell
  • Publication date: 2009-4-20
  • Information

    hmmmmm

    One of neurobiology’s uncomfortable secrets – the sort of thing neurobiologists are not keen to talk about except among themselves – is that we do not understand the code that is being used in these communications.
    Quoted from Information

    Whenever we learn from experience the value of an empirical variable (for example, how long it takes to boil an egg, or how far it is from our home to our office), the range of a priori possible values for that variable is narrowed by our experience. The greater the range of a priori possible values for the variable (that is, the larger the set of possible messages) and the narrower the range after we have had an informative experience (that is, the more precisely we then know the value), the more informative the experience. That is the essence of Shannon’s definition of information.
    Quoted from Information
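
    A minimal way to see this definition at work (my own sketch, not from the book): if an empirical variable could a priori take any of N equiprobable values and experience narrows them down to n, the information gained is log2(N) - log2(n) bits.

```python
import math

def bits_gained(n_prior: int, n_posterior: int) -> float:
    """Information gained when N equiprobable possibilities shrink to n (uniform case)."""
    return math.log2(n_prior) - math.log2(n_posterior)

# Boiling time known a priori only to lie in one of 16 equally likely one-minute
# bins; experience pins it down to a single bin: 4 bits gained.
print(bits_gained(16, 1))   # 4.0
# A less informative experience narrows the 16 bins only to 4: 2 bits gained.
print(bits_gained(16, 4))   # 2.0
```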

    Shannon’s analysis says that the (average!) amount of information communicated is the (average) amount of uncertainty that the receiver had before the communication minus the amount of uncertainty that the receiver has after the communication. This implies that information itself is the reduction of uncertainty in the receiver. A reduction in uncertainty is, of course, an increase in certainty, but what is measured is the uncertainty.
    Quoted from Information
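
    The same accounting for non-uniform beliefs, as a hedged sketch with invented numbers: the information communicated is the entropy of the receiver's prior distribution over the possible messages minus the entropy of its posterior, both in bits.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

prior     = [0.25, 0.25, 0.25, 0.25]  # receiver's uncertainty before the signal
posterior = [0.70, 0.20, 0.05, 0.05]  # residual uncertainty after the signal

print(entropy(prior) - entropy(posterior))  # ~0.74 bits communicated
```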

    By subjectivity, we mean that the information communicated by a signal depends on the receiver’s (the subject’s) prior knowledge of the possibilities and their probabilities. Thus, the amount of information actually communicated is not an objective property of the signal from which the subject obtained it!
    Quoted from Information
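
    To make the subjectivity concrete (my own toy numbers, not the book's): two receivers get the same signal and end up with the same posterior, but because they started from different priors, the amount of information actually communicated differs.

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

posterior = [0.9, 0.1]          # both receivers end up here after the signal

naive_prior    = [0.5, 0.5]     # receiver who thought the two outcomes equally likely
informed_prior = [0.8, 0.2]     # receiver who already expected the first outcome

print(entropy(naive_prior)    - entropy(posterior))  # ~0.53 bits communicated
print(entropy(informed_prior) - entropy(posterior))  # ~0.25 bits: the signal told it less
```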

    the information that is communicated to a receiver by a signal is the receiver’s uncertainty about the state of the world before the signal was received (the receiver’s prior entropy) minus the receiver’s uncertainty after receiving the signal (the posterior entropy). Thus, its quantification depends on the changes that the signal effects in the receiver’s representation of the world. The information communicated from a source to a receiver by a signal is an inherently subjective concept; to measure it we must know the receiver’s representation of the source probabilities. That, of course, implies that the receiver has a representation of the source probabilities, which is itself a controversial assumption in behavioral neuroscience and cognitive psychology.
    Quoted from Information

    The continuous case

    where we went wrong in considering the applicability of Shannon’s analysis to the continuous case was in assuming that an analog signal from an analog source could give a receiver information with certainty; it cannot. The accuracy of analog signaling is always noise limited, and it must be so for deep physical reasons. Therefore, the receiver of an analog signal always has a residual uncertainty about the true value of the source variable. This a priori limit on the accuracy with which values within a given range may be known limits the number of values that may be distinguished one from another within a finite range. That is, it limits resolution. The limit on the number of distinguishable values together with the limits on the range of possible values makes the source entropy finite and the post-communication entropy of the receiver non-zero.
    Quoted from Information
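
    One way to picture why noise keeps the entropy finite (a rough sketch with invented numbers, not the book's derivation): if an analog variable ranges over R but noise of standard deviation sigma blurs values closer together than about 2*sigma, only about R / (2*sigma) values can be told apart, so the signal can carry only about log2(R / (2*sigma)) bits and the receiver is always left with a residual uncertainty of about sigma.

```python
import math

def usable_bits(value_range: float, noise_sd: float) -> float:
    """Rough bit capacity of an analog reading when noise blurs nearby values."""
    distinguishable_levels = value_range / (2 * noise_sd)
    return math.log2(distinguishable_levels)

# A quantity ranging over 100 units, read through noise with standard deviation 0.5:
print(usable_bits(100, 0.5))   # ~6.6 bits -- finite, not the infinity the naive analysis implied
```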

    In principle, this representation of the sound requires infinitely many different sinusoids; but in practice, there are limits on both the sensible range of sinusoidal frequencies and the frequency resolution within that range.
    Quoted from Information

    Thus, the space of possible broadcasts is the space defined by the range of hearable frequencies and attainable amplitudes and phases.
    Quoted from Information
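
    A small numerical illustration of that finite space (my own sketch, not the book's): once the sound is sampled, it is fully described by the amplitude and phase of a finite set of sinusoids, one per resolvable frequency up to the band limit.

```python
import numpy as np

fs = 8000                              # sampling rate (Hz); caps the highest resolvable frequency
t = np.arange(fs) / fs                 # one second of samples gives 1 Hz frequency resolution
sound = 0.6 * np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 1000 * t + 0.5)

spectrum = np.fft.rfft(sound)          # finite list of sinusoid coefficients
amplitudes = 2 * np.abs(spectrum) / len(sound)
phases = np.angle(spectrum)

# Only fs/2 + 1 = 4001 frequencies appear: the space of possible broadcasts is spanned
# by the attainable amplitude and phase at each of these frequencies.
print(len(spectrum))                   # 4001
```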

    Mutual information

    The mutual information between an information-conveying signal and its source is the entropy of the source plus the entropy of the signal minus the entropy of their joint distribution.
    Quoted from Information
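
    That formula transcribes directly into a toy computation (the joint probabilities below are invented for illustration): I(source; signal) = H(source) + H(signal) - H(source, signal).

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Invented joint distribution over (source message, received signal).
joint = {
    ('rain', 'wet'): 0.35, ('rain', 'dry'): 0.05,
    ('sun',  'wet'): 0.10, ('sun',  'dry'): 0.50,
}

p_source, p_signal = {}, {}
for (s, r), p in joint.items():
    p_source[s] = p_source.get(s, 0.0) + p
    p_signal[r] = p_signal.get(r, 0.0) + p

mutual_info = (entropy(p_source.values())
               + entropy(p_signal.values())
               - entropy(joint.values()))
print(mutual_info)   # ~0.39 bits shared between source and signal
```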

    The mutual information between source and signal sets the upper limit on the information that may be communicated to the receiver by that signal. There is no way that the receiver can extract more information about the source from the signal received than is contained in that signal. The information about the source contained in the signal is an objective property of the statistical relation between the source and the signal, namely, their joint distribution, the relative frequencies with which all possible combinations of source message and received signal occur. The information communicated to the receiver, by contrast, depends on the receiver’s ability to extract the information made available in the signals it receives (for example, the receiver’s knowledge of the code, which may be imperfect) and on the receiver’s representation of the possibilities and their probabilities.
    Quoted from Information

    Information and the Brain

    There is little reason to think that there are things that can only be computed by an analog computer. On the contrary, the general, if largely unspoken, assumption is that digital computation can accomplish anything that analog computation can, while the converse may not be the case. As a practical matter, it can usually accomplish it better. That is why there is no technological push to create better analog computers.
    Quoted from Information

    2018-01-31 12:38:02
  • Page 55: Representation

    The representing system and the represented system, together with the functions that map between them, constitute a representation, provided three conditions are met:

    1. The mapping from entities in the represented system to their symbols in the representing system is causal (as, for example, when light reflected off an object in the world acts on sensory receptors in an eye causing neural signals that eventuate in a percept of the object).
    2. The mapping is structure preserving: The mapping from entities in the represented system to their symbols is such that functions defined on the represented entities are mirrored by functions of the same mathematical form between their corresponding symbols. Structure-preserving mappings are called homomorphisms.
    3. Symbolic operations (procedures) in the representing systems are (at least sometimes) behaviorally efficacious: they control and direct appropriate behavior within, or with respect to, the represented system.
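
    A toy illustration of these conditions, my own rather than the book's: let durations in the world be represented by counts of clock ticks. Placing one interval after another in the world (concatenation) is then mirrored by adding the corresponding symbols, which is what condition 2's structure preservation amounts to here.

```python
TICKS_PER_SECOND = 10   # hypothetical resolution of the representing system

def symbol_for(duration_s: float) -> int:
    """Condition 1: a causal mapping from a world entity (a duration) to its symbol."""
    return round(duration_s * TICKS_PER_SECOND)

def concatenate(d1: float, d2: float) -> float:
    """A function in the represented system: one interval followed by another."""
    return d1 + d2

# Condition 2 (homomorphism): concatenating intervals in the world is mirrored by
# adding their symbols in the representing system.
a, b = 2.5, 4.0
assert symbol_for(concatenate(a, b)) == symbol_for(a) + symbol_for(b)

# Condition 3 (efficacy) would require that arithmetic on these tick counts actually
# controls behavior, e.g. that the system waits symbol_for(a) + symbol_for(b) ticks.
```
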
    2019-07-04 04:53:49
  • Page 73: Physical Properties of Good Symbols

    Distinguishability

    Constructability: A computing machine can only have a finite number of actual symbols in it, but it must be so constructed that the set of possible symbols from which those actual symbols come is essentially infinite.

    Compactness: Analog coding is daunting because the demand on physical resources grows in proportion to the number of durations that we want to distinguish. The same resources can be put to much better use by a symbol-construction scheme in which the number of distinguishable symbols grows exponentially with the physical resources required. (rate coding vs. place coding)
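
    The contrast is easy to put in numbers (my sketch): with n physical elements, a scheme that dedicates one element (or one step of a rate) to each distinguishable value can distinguish only n values, whereas a scheme that reads the n elements as binary digits distinguishes 2**n.

```python
def one_element_per_value(n_elements: int) -> int:
    """Analog-style coding: each distinguishable value costs its own physical resource."""
    return n_elements

def combinatorial_binary(n_elements: int) -> int:
    """Compact coding: the elements are read as binary digits of one symbol."""
    return 2 ** n_elements

for n in (8, 16, 32):
    print(n, one_element_per_value(n), combinatorial_binary(n))
# 8 8 256
# 16 16 65536
# 32 32 4294967296
```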

    Efficacy: Because symbols are the stuff of computation, they must be physically efficacious within the mechanisms that implement basic functions. That is, the outputs produced by computational mechanisms (the physical realization of functions) must be determined by their inputs.

    2019-07-10 01:46:17
  • Page 79: Symbol Taxonomy

    Atomic data: irreducible physical forms that can be constructed and distinguished in a representing system.

    Data strings: ordered forms composed of one or more of these atomic elements

    Nominal symbols: data strings that map to their referents in the represented system in an arbitrary way, a mapping that is not constrained by any generative principles.

    Encoding symbols: data strings related to their referents by some organized and generative principles.

    Data structures: often called expressions in the philosophical and logical literature, these are symbol strings that have referents by virtue of the referents of the symbols out of which they are composed and the arrangement of those symbols.
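
    A quick way to feel the difference between nominal and encoding symbols (my own toy example): a nominal mapping is just an arbitrary lookup table, while an encoding mapping is produced by a generative rule, so it extends to referents that were never listed explicitly.

```python
# Nominal symbols: an arbitrary, unprincipled mapping; knowing some pairs tells you
# nothing about any unlisted referent.
nominal = {'Monday': 'blue', 'Tuesday': 'kumquat', 'Wednesday': '17'}

# Encoding symbols: generated by a principle (here, ordinary binary numerals), so the
# symbol for any new referent can be constructed, and decoded, by rule.
def encode(n: int) -> str:
    return format(n, 'b')

print(nominal['Tuesday'])   # you simply had to store this; no rule produces it
print(encode(2019))         # '11111100011' -- produced by the generative principle
```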

    2019-07-10 03:25:39
  • Page 99: Procedures
    State memory is not only useful, it is necessary. All procedures require some state memory, just as they require some structure that is not a result of experience. This is a direct reflection of the fact that if any device is to receive information, it must have an a priori representation of the possible messages that it might receive.
    Quoted from Procedures

    memory, memory

    2019-07-10 03:42:48
  • Page 100: Two Senses of Knowing

    Symbolic knowing: transparent; the symbols carry information gleaned from experience forward in time in a manner that makes it accessible to computation. The information needed to inform behavior is either explicit in the symbols that carry it forward or may be made explicit by computations that take those symbols as inputs.

    Procedural "knowing": e.g., in the search tree implementation of fis_even. State 5 “knows” that the first bit in the input was a ‘0’ and the second bit was a ‘1’, not because it has symbols carrying this information but instead because the procedure would never have entered that state were that not the case. We, who are gods outside the procedure, can deduce this by scrutinizing the procedure, but the procedure does not symbolize these facts. It does not make them accessible to some other procedure.
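
    A minimal sketch of the contrast (my own code, not the book's fis_even search tree): a binary numeral names an even number exactly when its last bit is '0', so a two-state procedure can answer the question while storing no symbol for anything it has seen. Its current state "knows" the last bit only in the sense that the procedure could not be in that state otherwise.

```python
def is_even(bits: str) -> bool:
    """Tiny finite-state procedure over a binary numeral.
    State 'A': last bit seen was '0' (number is even); state 'B': last bit was '1'."""
    state = 'A'
    for b in bits:
        state = 'A' if b == '0' else 'B'
    # Being in state 'A' here procedurally "knows" that the final bit was '0', but that
    # fact is written nowhere as a symbol that some other procedure could consult.
    return state == 'A'

print(is_even('10110'))  # True
print(is_even('1011'))   # False
```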

    2019-07-10 03:56:16
  • Page 155: Data Structures

    The Turing machine architecture is not a plausible model for an efficient memory, as this sequential access (searching through all of memory to find what is needed) would simply be too slow. All modern computers use the random-access model of memory.
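
    The cost difference is the familiar one between scanning and indexing (a schematic sketch, not a model of any real memory): a Turing-machine-style tape has to be walked through cell by cell, while a random-access memory goes straight to the addressed cell.

```python
def sequential_read(tape, address):
    """Turing-machine-style access: step along the tape until the wanted cell is reached."""
    steps = 0
    for position, value in enumerate(tape):
        steps += 1
        if position == address:
            return value, steps          # cost grows with how far away the cell is

def random_access_read(memory, address):
    """Random-access model: the address selects the cell directly, in one step."""
    return memory[address], 1

tape = list(range(1_000_000))
print(sequential_read(tape, 999_999)[1])     # 1000000 steps
print(random_access_read(tape, 999_999)[1])  # 1 step
```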

    2019-07-12 03:09:39
  • Page 241: The Modularity of Learning
    Learning processes are modular because the structure of an information-extracting process must reflect the formal structure of the information that is to be extracted and the structure of the data from which it is to be extracted.
    Quoted from The Modularity of Learning

    3 examples:

    Learning by path integration (see the sketch after this list)

    Learning by fitting an innately specified function to observational data

    Pavlovian conditioning paradigm
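
    For the first of these, learning by path integration, the computation itself is short enough to sketch (invented numbers; a real animal integrates noisy self-motion signals): keep a running sum of the displacement vectors, and the negation of that sum is the home vector.

```python
import math

# Each leg of the outward journey: (heading in degrees, distance travelled).
legs = [(0, 10.0), (90, 4.0), (135, 6.0)]

x = y = 0.0
for heading_deg, dist in legs:
    h = math.radians(heading_deg)
    x += dist * math.cos(h)     # running sum of displacements: the path integral
    y += dist * math.sin(h)

home_distance = math.hypot(x, y)
home_bearing = math.degrees(math.atan2(-y, -x)) % 360   # direction pointing back to the start
print(round(home_distance, 2), round(home_bearing, 1))  # roughly 10.05 units away, bearing ~235 degrees
```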

    2019-07-13 02:15:43