mattfaw

03 Where in the brain does Experience happen?

Updated: Apr 3

In our model, Subjective Experience is a brain-wide activation. The Immediate Action Network (please see the 3 brain systems post) of the neocortex receives sensory input and motivates behavioral output. It then sends signals of its conclusions to the hippocampus, to be encoded into the next moment's new episodic memory. Those signals are organized in the entorhinal cortex, filled in using the pattern-matching abilities of hippocampal field CA3, and encoded within a theta wave output in field CA1, before traveling through the subiculum and entorhinal cortex, back out to the rest of the brain.

It is the activation of the rest of the brain by the hippocampal theta signal that gives rise to Experience. The perception of smell is the activation of the olfactory bulb by the hippocampal signal, just as the perception of vision is the activation of the visual cortices by that same signal. The parts of the brain that evolved to make sense of particular sensory stimuli from the outside world are the same parts that are employed in creating perceptual Experience.

Video cameras make fairly useful technological analogies for the episodic memory system (even if memories are never as accurate as video). When light rays enter the lens of a camera, they activate pixels in the camera's CCD or other imaging sensor. Those individual pixels send their signals to the camera's image processor, where they are combined into one signal, which can be recorded and then played back later.

Upon playback, the recorded master signal is interpreted by the image processor, activating a different array of pixels: those on a video monitor. When those pixels are activated to correspond to the input pixels, what appears on the monitor is a representation of the original scene.

In-pixels are activated, leading to encoding and information storage. Information recall (i.e. playback) leads to decoding and representations on the out-pixels. Those representations, playing back on the monitor, create a simulation of the original scene and can help educate the viewer about what was happening.
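For readers who think in code, here is a minimal toy sketch of that record/playback loop. The names (`encode`, `decode`, `in_pixels`) are purely illustrative, not any real camera's API:

```python
# Toy sketch of the camera analogy: in-pixels -> master signal -> out-pixels.
# All names here are illustrative; no real camera API is being modeled.

def encode(in_pixels):
    """Combine individual pixel signals into one recordable master signal."""
    return list(in_pixels)  # a real encoder would compress/serialize

def decode(master_signal):
    """Interpret the master signal back into an array of output pixels."""
    return list(master_signal)

# Recording: light activates the in-pixels, which are encoded and stored.
scene = [0.2, 0.9, 0.5, 0.1]   # brightness hitting four sensor pixels
recording = encode(scene)       # the stored master signal

# Playback: decoding activates a *different* array of pixels (the monitor).
monitor = decode(recording)
assert monitor == scene         # the playback represents the original scene
```

The point of the sketch is only that the pixels doing the representing at playback time are not the pixels that did the sensing.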

Likewise, sensory stimuli activate the brain's version of in-pixels: not just the rods and cones of the retina, but all the brain regions activated by vision (plus the input regions of each of the other senses). Each in-pixel of the brain sends its code to the hippocampus, to be combined into a master signal and recorded for future playback.

Upon memory recall, the master memory signal is reconstructed by the hippocampus and sent back to all the contributing nodes of the Immediate Action Network. It is the activation of those nodes/pixels that creates Experience. The theta wave serves as a timing signal, so that all receiving brain parts can agree on when the 'now' of experience is.

Immediate experience also works like a video camera. Even when a camera is not recording but is otherwise active, it sends a live video feed to the viewfinder, activating those pixels into a representation of what's happening right now. That viewfinder allows the camera operator to respond to the input, panning and tilting to keep the image in view, for example.

Likewise, Subjective Experience is the live feedback of the hippocampal signal. Even if no part of this experience ever gets recalled again, the in-pixels are still sending information to the hippocampus, which is replying with the theta wave memory, causing those same pixels to act as out-pixels, creating Qualia, creating Experience.
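That live feedback loop can also be sketched as toy code. Everything here is a cartoon of the model, not of real neural signaling, and names like `hippocampal_echo` are invented for illustration:

```python
# Toy sketch of the model's live feedback loop: each region acts as an
# in-pixel (sending to the hub) and then as an out-pixel (re-activated
# by the hub's reply). Names like `hippocampal_echo` are illustrative.

def hippocampal_echo(inputs):
    """Stand-in for the hippocampus: combine the region signals into one
    master signal and broadcast it back to every contributing region."""
    master = dict(inputs)                          # combine into one signal
    return {region: master for region in inputs}   # broadcast the reply

# Regions report what they are sensing right now (the in-pixel role).
inputs = {"visual_cortex": "red ball", "olfactory_bulb": "coffee"}

# The echoed master signal re-activates the same regions (the out-pixel
# role); in the model, this re-activation is the moment's Experience.
replies = hippocampal_echo(inputs)
experience = {region: replies[region][region] for region in inputs}
assert experience == inputs  # each region re-represents its own input
```

The design point the cartoon captures: nothing is recorded or recalled here, yet the same regions that sent the input are activated again by the combined reply, which is the model's claim about where Experience happens.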

The widespread activation of out-pixels creates a communication challenge for the Core Network (please see the 3 brain systems post), because it means the Core Network has to 'observe pixels' all over the brain. And indeed, if we look at a wiring map of the Core Network, we see that it has afferents from all over the brain, coming together to be processed in a few local hubs and parts of the prefrontal cortex. The Core Network observes the out-pixel activations caused by the hippocampal signal's impact on the IAN, allowing it to interpret meaning from the immediate moment, and to imagine elements beyond the current input of the senses.

I think it's worthwhile mentioning another technological analogy to this process. Microphones and earphones, for example, are input and output devices, respectively, that can sometimes take each other's place. If you plug a microphone into an earphone jack, you can hear some of the audio output that you would expect from an earphone. Likewise, an earphone plugged into a microphone jack can actually transmit some audio into the machine, the way a microphone would. In the same way, the brain's input devices can also act as parallel output devices, creating a representation of the original input.
