Robert Epstein: Your brain is not a computer

18 Replies, 2617 Views

Robert Epstein is a senior research psychologist at the American Institute for Behavioral Research and Technology in California. He is the author of 15 books, and the former editor-in-chief of Psychology Today.

In Aeon:

The empty brain:
Your brain does not process information, retrieve knowledge or store memories. In short: your brain is not a computer


Quote:(H)ere is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.

We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not.

Computers, quite literally, process information – numbers, letters, words, formulas, images. The information first has to be encoded into a format computers can use, which means patterns of ones and zeroes (‘bits’) organised into small chunks (‘bytes’). On my computer, each byte contains 8 bits, and a certain pattern of those bits stands for the letter d, another for the letter o, and another for the letter g. Side by side, those three bytes form the word dog. One single image – say, the photograph of my cat Henry on my desktop – is represented by a very specific pattern of a million of these bytes (‘one megabyte’), surrounded by some special characters that tell the computer to expect an image, not a word.
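The byte-level encoding described above can be checked directly. A minimal sketch (assuming a plain ASCII encoding for simplicity; a file on disk may use UTF-8, which agrees with ASCII for these letters):

```python
# How a computer represents the word "dog": three bytes, one per letter,
# each byte a pattern of 8 bits.
word = "dog"
data = word.encode("ascii")            # the three bytes b'dog'
print(list(data))                      # [100, 111, 103] -- the byte values
print([format(b, "08b") for b in data])
# ['01100100', '01101111', '01100111'] -- 8 bits per byte
```

A one-megapixel photograph is the same idea at scale: roughly a million such bytes, preceded by header bytes that tell the computer to interpret what follows as an image.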

Computers, quite literally, move these patterns from place to place in different physical storage areas etched into electronic components. Sometimes they also copy the patterns, and sometimes they transform them in various ways – say, when we are correcting errors in a manuscript or when we are touching up a photograph. The rules computers follow for moving, copying and operating on these arrays of data are also stored inside the computer. Together, a set of rules is called a ‘program’ or an ‘algorithm’. A group of algorithms that work together to help us do something (like buy stocks or find a date online) is called an ‘application’ – what most people now call an ‘app’.
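The "move, copy and transform" operations the paragraph describes can be sketched in a few lines (the typo-fixing example here is made up for illustration):

```python
# A stored pattern of bytes can be copied to another location and then
# transformed in place -- e.g. correcting an error in a manuscript.
original = bytearray(b"teh cat")   # a "manuscript" containing a typo
draft = original.copy()            # copy the pattern elsewhere in storage
draft[0:3] = b"the"                # transform the copy: fix the error
print(draft.decode())              # the cat
print(original.decode())           # teh cat -- the original is untouched
```

The rules that carried out the copy and the edit are themselves stored in the computer, which is exactly the author's point about programs and algorithms.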

Forgive me for this introduction to computing, but I need to be clear: computers really do operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.

Humans, on the other hand, do not – never did, never will. Given this reality, why do so many scientists talk about our mental life as if we were computers?
[-] The following 8 users Like Ninshub's post:
  • Sciborg_S_Patel, Obiwan, nbtruthman, tim, The King in the North, Laird, Valmar, Doug
The flipside. 
Quote:His claim is that the brain does not “contain”. Furthermore, the foundational assumption of cognitive science and psychology is a horribly mistaken “contains + processes” thesis about the brain. Epstein aggressively argues that your brain is not a computer - in reality, or in metaphor.

But does he provide a single good reason why the brain as an information processor is a mistaken metaphor?

http://lukependergrass.work/blog/the-information-processing-brain



I'd like to know how autistic savants such as this young man fit into his idea?  https://youtu.be/phkNgC8Vxj4
Here he is drawing a cityscape mentioned in the beginning of the first video. https://youtu.be/iXrvL7IlEtw

Does Robert also factor in people who have Highly Superior Autobiographical Memory, or HSAM? For example, Marilu Henner.
See also Valmar's thread about this same article, [Article] Your brain does not process information, and it is not a computer.
[-] The following 2 users Like Laird's post:
  • tim, Ninshub
Oops - sorry there Valmar.
Epstein is rather vague about exactly how the brain (physically) achieves consciousness. His big problem is that he still seems to assume, like almost all neuroscientists, that the physical neurons and synapses of the brain somehow create consciousness. He says that the way it does that is not through the Turing machine information processing model, but is coy about the specifics of how. This is probably because he has no idea of that other than some notion based on the interesting baseball pitch example. 

This basic materialist/reductionist assumption still runs squarely into the famous "hard problem". How can the physical interaction of the brain's neurons and synapses (the electronic and chemical parameters of which are in principle detectable and quantifiable through high tech machines) be one and the same as qualia - the stuff of consciousness? The properties of consciousness are clearly in another ontological existential category than the physical properties of the basic elementary particles and fields making up matter on up to the size range of cells and their organelles.       

It seems to me that the only meaningful point he makes is about the digital information processing model of human consciousness being just another failed model based on (then) current technology in a long series going back into antiquity. That does seem to be the case. Unfortunately he won't even suggest that the most likely explanation for this is the basic category error at the root of the "hard problem". Of course he won't go there because he is a good well credentialed and influential materialist reductionist scientist and wants to remain one.
(This post was last modified: 2018-09-17, 06:19 AM by nbtruthman.)
[-] The following 6 users Like nbtruthman's post:
  • Sciborg_S_Patel, Ninshub, tim, Typoz, Valmar, Doug
(2018-09-17, 06:13 AM)nbtruthman Wrote: Epstein is rather vague about exactly how the brain (physically) achieves consciousness. His big problem is that he still seems to assume, like almost all neuroscientists, that the physical neurons and synapses of the brain somehow create consciousness. He says that the way it does that is not through the Turing machine information processing model, but is coy about the specifics of how. This is probably because he has no idea of that other than some notion based on the interesting baseball pitch example. 

This basic materialist/reductionist assumption still runs squarely into the famous "hard problem". How can the physical interaction of the brain's neurons and synapses (the electronic and chemical parameters of which are in principle detectable and quantifiable through high tech machines) be one and the same as qualia - the stuff of consciousness? The properties of consciousness are clearly in another ontological existential category than the physical properties of the basic elementary particles and fields making up matter on up to the size range of cells and their organelles.       

It seems to me that the only meaningful point he makes is about the digital information processing model of human consciousness being just another failed model based on (then) current technology in a long series going back into antiquity. That does seem to be the case. Unfortunately he won't even suggest that the most likely explanation for this is the basic category error at the root of the "hard problem". Of course he won't go there because he is a good well credentialed and influential materialist reductionist scientist and wants to remain one.

I think there is probably a similar issue with assumptions about memory. It must be very tempting to think of memory in computing terms: storage locations somewhere in the brain, events stored in digital format, some form of indexing for later retrieval. But I somehow doubt that is how human memory works. I get the impression that a category error is also common when people talk about memory: recall is meant when the word memory is used. The two are, of course, entirely different, but we all tend to use phrases like "memory fails me" when we mean "I can't recall".

Just as a bit of a diversion, I awoke one morning recently with an idea about memory which really surprised me as I have no clue what prompted it. The idea was that memory is not a matter of storage but a matter of accessing (or viewing) events that are ever present  - somehow timeless. So we don't have some form of representation of an event encoded and filed in some neurological circuitry but we perform a kind of remote viewing of the event we are trying to remember; an event which still exists in the true timelessness of reality. Remote viewing is far from perfect and can often result in mistakes and misinterpretations. In the same way, recall could also be considered less than reliable. Likewise, imagination may intrude on viewed places or events in the same way that memories could be partially or wholly imagined.

It isn't a well thought through idea yet and I have not read much about memory and recall in science books or journals so my thoughts on the matter amount to speculative musing.
I do not make any clear distinction between mind and God. God is what mind becomes when it has passed beyond the scale of our comprehension.
Freeman Dyson
[-] The following 6 users Like Kamarling's post:
  • Ninshub, stephenw, tim, Doug, Typoz, Valmar
(2018-09-17, 08:14 AM)Kamarling Wrote: Just as a bit of a diversion, I awoke one morning recently with an idea about memory which really surprised me as I have no clue what prompted it. The idea was that memory is not a matter of storage but a matter of accessing (or viewing) events that are ever present  - somehow timeless. So we don't have some form of representation of an event encoded and filed in some neurological circuitry but we perform a kind of remote viewing of the event we are trying to remember; an event which still exists in the true timelessness of reality. Remote viewing is far from perfect and can often result in mistakes and misinterpretations. In the same way, recall could also be considered less than reliable. Likewise, imagination may intrude on viewed places or events in the same way that memories could be partially or wholly imagined.

It isn't a well thought through idea yet and I have not read much about memory and recall in science books or journals so my thoughts on the matter amount to speculative musing.
I strongly agree with your intuition.  I understand, in a qualified way, the point of Epstein.

I have a problem with Epstein's grasp of information processing.

Quote:The faulty logic of the IP metaphor is easy enough to state. It is based on a faulty syllogism – one with two reasonable premises and a faulty conclusion. Reasonable premise #1: all computers are capable of behaving intelligently. Reasonable premise #2: all computers are information processors. Faulty conclusion: all entities that are capable of behaving intelligently are information processors.

Setting aside the formal language, the idea that humans must be information processors just because computers are information processors is just plain silly, and when, some day, the IP metaphor is finally abandoned, it will almost certainly be seen that way by historians, just as we now view the hydraulic and mechanical metaphors to be silly.
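The invalidity of the syllogism can be made concrete with a toy counterexample model: a domain in which both premises hold while the conclusion fails. (The "octopus" standing in for a non-computer intelligence is purely a hypothetical, chosen for illustration.)

```python
# Model the three claims as subset relations over a made-up domain.
computers = {"laptop"}
intelligent = {"laptop", "octopus"}   # something intelligent that is not a computer
info_processors = {"laptop"}

premise1 = computers <= intelligent          # all computers behave intelligently
premise2 = computers <= info_processors      # all computers are information processors
conclusion = intelligent <= info_processors  # all intelligent entities process information

print(premise1, premise2, conclusion)        # True True False
```

Because one consistent model makes the premises true and the conclusion false, the conclusion does not follow from the premises, which is Epstein's formal point.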

Computers do not behave intelligently - they have logic-structured outcomes.  As soon as you say "behave", there is an implication of intent.  AI can mimic intent - but most applications on a computer do not.  There is no intentional input coming from information processing, only hollow structures that, in the hands of an agent, can lead to intelligent action.

Further, the assertion that living things acting as agents do not process information is simply not scientifically true.  Agents' mental work output is the context for intentional behavior in many cases.  Mental work directly changes probabilities in the real-world environments in which they are active.  These changes in probability from mental work are measurable in terms of outcomes.

The idea that our probing for affordances is not done as information processing is crazy.  It is done by manipulating symbols that are meaning-loaded.  The problem is that formal information processing in a computer carries no meaningful understanding of affordances on the machine's part.

The idea that there are timeless events is not understood in today's context of Materialism.  On the other hand, thinking in terms of information objects that evolve from real natural structure (formal information) can explain why an event that is not in the here and now can bring forward real meaningful guidance that can change our current view of the past and our view of the future.

Physics is always measured in the here and now.  Information structures from the past can be brought to the present and make a real-world impact.  Likewise, future events can be predicted from mental work, and these predictions can have meaningful and intentional effects on the course of unfolding events.  Information objects are real and they change outcomes in reality.  Information objects are not confined to the here and now, as physical events are.  In some respects they are not bound by time.
(This post was last modified: 2018-09-19, 02:15 PM by stephenw.)
[-] The following 2 users Like stephenw's post:
  • Max_B, Ninshub
(2018-09-19, 02:13 PM)stephenw Wrote: Further, the assertion that living things acting as agents do not process information is simply not scientifically true. 

If I'm getting the gist of your post (which I may not be), then what you wrote in the quote above is mistaken.  The author's own words implied to me that humans are not exclusively information processors.  While we regularly do process information, what makes up our experience is not reducible exclusively to information processing.
That's certainly my best guess as well.  I just don't see how binary, algorithm-based processes could ever become conscious as we humans are.
[-] The following 2 users Like Silence's post:
  • Ninshub, Obiwan
This post has been deleted.
(2018-10-06, 05:36 PM)Max_B Wrote: There is very little doubt in my mind that we are processing information, just like a computer, just not a computer that we could build today, or tomorrow. We are horrendously predictable, and so easy to manipulate. The majority are like sheep, fenced in a field, and the balance are little better, they spend their time making sure none get beyond the fence. It’s utterly nutty.

Absolutely. Any healthy animal, human included, is programmed to think and act in certain ways. The bulk of this conditioning happens in the earliest months and years of life. Suggestion, nudge, expectation, reward, penalty, etc.: it’s easy to see life as a prolonged kind of hypnosis. Yet it has to be this way; we’re born primed to adapt, survive and thrive in the universe we’re shoved into, and that is the real wonder of biology.
(This post was last modified: 2018-10-06, 07:15 PM by malf.)
