Neurons As Information

Neurons As Truth Functions

It can be useful to consider the activation function as a truth function of sorts. It essentially answers the question "How true is this idea currently, compared to how true it typically is?" For a given set of inputs, the neuron learns the typical combined value.
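As a minimal sketch of this view, the neuron below tracks its typical combined input as a running average and reports how far the current input deviates from it. The class name `TruthNeuron`, the fixed weights, and the `decay` rate are illustrative assumptions, not details from this wiki.

```python
import numpy as np

class TruthNeuron:
    """Answers: how true is this idea now, relative to how true it typically is?"""

    def __init__(self, n_inputs, decay=0.01):
        self.w = np.full(n_inputs, 0.5)  # fixed weights, for the sketch only
        self.typical = 0.0               # learned long-term average
        self.decay = decay

    def activate(self, x):
        combined = self.w @ x                   # combined input value
        deviation = combined - self.typical     # 'truth' relative to typical
        self.typical += self.decay * deviation  # update the running average
        return deviation

neuron = TruthNeuron(n_inputs=4)
for _ in range(1000):                           # let it learn the typical value
    neuron.activate(np.ones(4))
print(round(neuron.activate(np.ones(4)), 3))      # ~0.0: as true as usual
print(round(neuron.activate(2 * np.ones(4)), 3))  # ~2.0: 'more true' than usual
```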

Neurons As Definitions

Another way to look at it is that each neuron represents an idea defined by the ideas of its inputs. These neurons display two kinds of definitions: intensional definitions and extensional definitions.

Intensional Definitions

Intensional definitions rely upon the logic of 'and'. An intensional definition represents an idea as being defined by a set of features: if those features are mostly true of a particular thing, then that thing is an example of the defined idea. For example, a 'toy' is a small object that is used for fun. Breaking this definition down, for something to be a 'toy' it must be 'small' and an 'object' and 'used for fun'.
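A toy threshold unit makes the 'and' logic concrete: with one unit of weight per feature, a threshold near the number of features means the idea fires only when (nearly) all features hold. The feature names and threshold value here are illustrative assumptions.

```python
# Intensional ('and'-like) definition as a threshold unit: the high
# threshold requires every defining feature to be true.
def is_toy(small, is_object, used_for_fun):
    features = [small, is_object, used_for_fun]
    return sum(features) >= 3   # high threshold: all features must hold

print(is_toy(True, True, True))    # True: small, an object, used for fun
print(is_toy(True, True, False))   # False: not used for fun
```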

Extensional Definitions

Extensional definitions rely upon the logic of 'or'. An extensional definition represents an idea as being defined by a set of possible variants. If something is an example of any one of these variants, then it is an example of the defined idea. For example, an 'additive primary color' is one of 'red', 'green', or 'blue'. These are the three colors which together form the class of 'additive primary colors'.
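The same threshold-unit framing covers the 'or' logic: keeping the threshold at one means any single variant is enough to fire. Again, the names and threshold are illustrative.

```python
# Extensional ('or'-like) definition as a threshold unit: the low
# threshold fires if any one variant is present.
def is_additive_primary(red, green, blue):
    variants = [red, green, blue]
    return sum(variants) >= 1   # low threshold: any single variant suffices

print(is_additive_primary(False, True, False))   # True: 'green' is a variant
print(is_additive_primary(False, False, False))  # False: no variant present
```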

Duality

These two kinds of definitions are duals of each other. In theory, anything defined one way can be defined the other way. In practice, the information available to form a particular concept may lend itself to one kind of definition over the other.
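This duality is the familiar De Morgan relationship from logic (a standard identity, not something specific to this wiki): a conjunctive, intensional definition can always be rewritten as the negation of a disjunctive, extensional one.

```latex
A = f_1 \land f_2 \land \dots \land f_n
  = \neg\left(\neg f_1 \lor \neg f_2 \lor \dots \lor \neg f_n\right)
```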

Neurons As Relation Measures

The activation function implies a relationship between its inputs. The stronger the implied relation is, the smaller the neuron's average output will tend to be. Because the neural net tends towards minimal outputs, this implies a trend towards increasing accuracy in neurons representing relationships. Typically, the deeper into the neural net, the less likely it is for a neuron to show activity, where 'activity' simply means deviating from the long-term estimated average learned by the neuron. Therefore, the inputs of a neuron will tend to have their activity synchronized in opposing directions from the average, which allows their combined input to remain near the average during typical activity. If the inputs weren't synchronized, the activity would increase. Since the activity of the neural network is minimized over time, the synchronized option is favored.
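The synchronization claim can be illustrated numerically. In the sketch below, two inputs whose deviations oppose each other keep the neuron's combined input pinned to its average, while two independent inputs produce noticeably more activity. The weights and the signal statistics are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
signal = rng.normal(size=10_000)   # deviations from each input's average

# Synchronized in opposing directions: when one input deviates above its
# average, the other deviates below, so the combined input stays put.
a_sync, b_sync = signal, -signal
# Unsynchronized: two independent deviations.
a_ind, b_ind = signal, rng.normal(size=10_000)

w = np.array([1.0, 1.0])
sync_activity = np.abs(w[0] * a_sync + w[1] * b_sync)
ind_activity = np.abs(w[0] * a_ind + w[1] * b_ind)

print(sync_activity.mean())  # 0.0: combined input never leaves the average
print(ind_activity.mean())   # ~1.13: substantially more activity
```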

Such a synchronized set of inputs could have multiple subsets, whereby each subset tends to be internally synchronized in the same direction, while externally there tends to be a balance of subsets on either side of the average. For larger numbers of inputs, there can be additional sets of inputs with synchronized activity. Not all the inputs have to be synchronized as a whole, as long as every input is synchronized with at least some other inputs. This implies that a single neuron can represent many ideas. The less activity the inputs tend to have, the more ideas a neuron can represent before becoming saturated. However, this only accounts for typical activity; there will often be outlier activity, and the more outlier activity there is, the fewer ideas the neuron can hold without interference.
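Continuing the sketch above, the subsets can be made concrete: below, one neuron's four inputs form two internally synchronized subsets, and each 'idea' swings one subset above its average and the other below, so the neuron stays quiet for both ideas. An unbalanced outlier pattern, by contrast, produces activity. The specific patterns are illustrative assumptions.

```python
import numpy as np

baseline = np.ones(4)                 # each input's typical value
idea_a = baseline + np.array([+0.5, +0.5, -0.5, -0.5])  # subset {x0,x1} up
idea_b = baseline + np.array([-0.5, -0.5, +0.5, +0.5])  # subset {x2,x3} up
outlier = baseline + np.array([+0.5, +0.5, 0.0, 0.0])   # unbalanced pattern

w = np.ones(4)
typical = w @ baseline                # the neuron's learned average
for name, x in [("idea_a", idea_a), ("idea_b", idea_b), ("outlier", outlier)]:
    print(name, "activity:", abs(w @ x - typical))
# idea_a and idea_b: 0.0 (balanced subsets), outlier: 1.0 (interference)
```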

There is an additional relation, though: the synchronized sets would, together, form an extensional definition, as these sets are not active at the same time. However, whereas the neurons within a particular set have a more direct relation, the relationship across sets can be as weak as "these don't tend to be true at the same time." There are countless ideas that don't apply to particular situations, and not all possible ideas are useful. If a neuron is built up of unrelated definitions, then its output will be less likely to synchronize with the inputs of other neurons. Neurons that aren't inputs to other neurons will need to break their input connections until they reach the point where they are kept as inputs to other neurons.
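A hedged sketch of that last pruning rule, under the assumption that 'breaking connections' means dropping the weakest input first; the data structures and function name are inventions for illustration, not details from this wiki.

```python
def prune_step(connections, fan_out):
    """One pruning step: each neuron that feeds no other neuron
    drops its weakest input connection.
    connections: dict neuron -> list of (source, weight) pairs
    fan_out:     dict neuron -> count of neurons it feeds"""
    for neuron, inputs in connections.items():
        if fan_out.get(neuron, 0) == 0 and inputs:
            weakest = min(inputs, key=lambda sw: abs(sw[1]))
            inputs.remove(weakest)
    return connections

conns = {"n3": [("n1", 0.9), ("n2", 0.05)]}
print(prune_step(conns, {"n3": 0}))   # {'n3': [('n1', 0.9)]}
```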