Artificial Consciousness Is Impossible


David Hsing

Quote:Conscious machines are staples of science fiction that are often taken for granted as articles of future fact, but they are not possible.

I feel like this holds for Turing Machines, not sure if it would for some distant future where we make androids like Data from Star Trek?

[Note I am not talking about synthetic brains being producers of consciousness but rather Idealist Alters, Panpsychic Combinations or Decombinations, etc.]

Quote:Learning Rooms

Machines never actually learn, partly because the mind isn’t just a physical information processor. The direct result of a machine’s complete lack of any possible genuine comprehension and understanding is that machines can only be Learning Rooms: they appear to learn but never actually do. Considering this, “machine learning” is a widely misunderstood and arguably oft-abused term.

AI textbooks readily admit that the “learning” in “machine learning” isn’t referring to learning in the usual sense of the word[8]:

Quote:“For example, a database system that allows users to update data entries would fit our definition of a learning system: it improves its performance at answering database queries based on the experience gained from database updates. Rather than worry about whether this type of activity falls under the usual informal conversational meaning of the word “learning,” we will simply adopt our technical definition of the class of programs that improve through experience.”

Note how the term “experience” isn’t used in the usual sense of the word, either, because experience isn’t just data collection. The Knowledge Argument shows how the mind doesn’t merely process information about the physical world[9].
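The textbook's technical definition can be made concrete with a toy sketch (my own illustration, not from the quoted textbook): in this technical sense, a "learning system" is simply any program whose measured performance at a task improves as it accumulates data, with no comprehension involved anywhere.

```python
# Toy illustration of the technical sense of "learning": a program
# whose query accuracy improves with accumulated "experience" (data
# updates) -- it tallies observations, nothing more.

class FrequencyGuesser:
    """Predicts the most frequently seen label for each key."""

    def __init__(self):
        self.counts = {}  # key -> {label: count}

    def update(self, key, label):
        """Gain 'experience' from one observation."""
        self.counts.setdefault(key, {})
        self.counts[key][label] = self.counts[key].get(label, 0) + 1

    def query(self, key):
        """Answer with the best guess given experience so far."""
        labels = self.counts.get(key)
        if not labels:
            return None
        return max(labels, key=labels.get)

g = FrequencyGuesser()
g.update("sky", "blue")
g.update("sky", "blue")
g.update("sky", "grey")
print(g.query("sky"))  # the most frequent label seen so far
```

By the textbook's definition this counts as learning: performance at answering queries improves through experience. Whether that deserves the everyday word "learning" is exactly the point under dispute.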

Possessing only physical information and doing so without comprehension, machines hack the activity of learning by engaging in ways that defy the experiential context of the activity. A good example is how a computer artificially adapts to a video game with brute force instead of learning anything[10].
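The brute-force adaptation described above can be caricatured in a few lines (a hypothetical toy, not the actual system cited in [10]): the program blindly mutates candidate inputs and keeps whichever scores higher, "adapting" to the game without representing anything about it.

```python
import random

def play(presses):
    """Stand-in scoring function for a 'game': here it simply rewards
    pressing button 1. (Hypothetical; a real system would read the
    game's actual score.)"""
    return sum(presses)

random.seed(0)
best = [0] * 10          # candidate button-press sequence
for _ in range(500):     # blind trial and error
    trial = [random.randint(0, 1) for _ in range(10)]
    if play(trial) > play(best):
        best = trial     # keep whatever scored higher

print(play(best))        # score climbs with no grasp of "why"
```

The score rises steadily, yet no step of the loop involves anything one could call comprehension of the game.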
'Historically, we may regard materialism as a system of dogma set up to combat orthodox dogma...Accordingly we find that, as ancient orthodoxies disintegrate, materialism more and more gives way to scepticism.'

- Bertrand Russell


(2025-01-17, 03:59 PM)Sciborg_S_Patel Wrote: Artificial Consciousness Is Impossible

Machines Can't Think

David Hsing

Quote:Machines can't think. My argument is as follows:
  • There is no thought without a subject (what is even a thought without a subject?)
  • Machines never deal with subjects and could never deal with subjects

Three supporting observations...


(2025-02-04, 07:59 PM)Sciborg_S_Patel Wrote: Machines Can't Think

David Hsing

I would agree, except my concept is a little different: computing machines can never have consciousness (that is, subjective awareness) for two basic reasons.

The first is that a computer can do nothing but compute algorithms: at base, the logical process of executing one machine instruction after another. This is logical manipulation of digital bits (however complicated and rapid), and it is entirely a physical process involving material things. But the Hard Problem of mind has shown that conscious awareness is immaterial and belongs to an existentially different realm of existence.

Therefore the computer system is fundamentally unable to exhibit conscious awareness, because its very basic nature is limited to material mechanisms, physical "things", which have no immaterial mental dimension.

The second reason, related to the first: computing machines can do nothing but manipulate material signals and digital bits in a data processor, and this process can carry no meaning whatsoever. This is because the meaning of something is what it expresses or represents, and "expression", "representation", and especially "meaning" are inherently mental properties, not intrinsic to material things like computer processors or data processing. Example: the word "flight" has two different meanings, a plane trip and the act of running away. Absolutely nothing about the machine's processing of the digital bits coding for the word "flight" carries either meaning intrinsically; in other words, meaning is exclusively a property of immaterial mind, conscious awareness, and thought.
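The "flight" point can be seen at the level of representation (a trivial sketch of my own): whichever meaning a human intends, the machine receives and manipulates exactly the same bit pattern, and nothing in the bits selects one meaning over the other.

```python
# The same bit pattern encodes "flight" whether a human means a plane
# trip or the act of running away; nothing in the bits picks a meaning.
word = "flight"
bits = " ".join(f"{byte:08b}" for byte in word.encode("ascii"))
print(bits)
# The processor's operations on these bits are identical either way;
# disambiguation, if any, is supplied by human readers of the output.
```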
