Maybe AI just needs a bigger truck

(2019-03-29, 04:18 PM)Mediochre Wrote: So come on, justify it. What's the difference?

Well the structural argument is metaphysically neutral.

The reason a Turing Machine doesn't have consciousness is that the meanings of programs depend on human users. This is something you seem to agree on for the most part, as far as I can tell, because you said you think only a small class of all possible programs can make a computer conscious.
'Historically, we may regard materialism as a system of dogma set up to combat orthodox dogma...Accordingly we find that, as ancient orthodoxies disintegrate, materialism more and more gives way to scepticism.'

- Bertrand Russell


(2019-03-29, 04:26 PM)Sciborg_S_Patel Wrote: Well the structural argument is metaphysically neutral.

The reason a Turing Machine doesn't have consciousness is that the meanings of programs depend on human users. This is something you seem to agree on for the most part, as far as I can tell, because you said you think only a small class of all possible programs can make a computer conscious.

The "Meaning" of programs... of course you'd say that. Once again you have no argument. I described a process that exists in both humans and some programs and all you have to come back with is philosophy.
"The cure for bad information is more information."
(2019-03-29, 04:19 PM)Sciborg_S_Patel Wrote: As I said before, I can wonder about whether my fellow humans are conscious while still being confident that a Turing Machine - which could be instantiated by a group of humans - is no more conscious than a book or an abacus.

No you can't. I described equivalent processes across both architectures, so explain how they're different. Either both are conscious or neither is.

Quote:Regarding the second paragraph's first sentence, I am basing it on structural similarity to my own being that possesses consciousness - which is why I said a synthetic entity that produces the empirically verified necessary/sufficient structures would be a conscious entity.

So then it's all produced in the brain, like neuroscience says. I mean, why else would the structures matter if that's not the case?

Quote:Regarding the italicized second sentence: are you making an argument for panpsychism? That all substances or at least processes are conscious, including the running of Turing Machines? Because it seemed you were saying it takes a specific kind of program to make a non-conscious physical instantiation of a Turing Machine conscious?

Yeah, and human bodies stop acting conscious too when they have no electrical activity, among other things. Yes, I'm sure someone will try to make an OBE argument, but really that changes nothing: the body's still unconscious and the rest is just different sensors and a different medium. So once again, demonstrate logically how the machine is actually different.
Quote:Perhaps I've misunderstood, but my reading of your argument is that a certain kind of program - specifically deep learning or possibly any class of machine learning programs - can make a computer - thus a physical instantiation of a Turing Machine - conscious. However, this same computer is not conscious before it runs the programming, and possibly not conscious when it runs other programs.

Again, dead bodies have no consciousness, and drunken stupors and other drug-related and non-drug-related states can leave someone unconscious while their body is still performing behaviours. So once again, describe how humans aren't just machines. Describe what makes the processes so different.
"The cure for bad information is more information."
(2019-03-29, 04:50 PM)Mediochre Wrote: The "Meaning" of programs... of course you'd say that. Once again you have no argument. I described a process that exists in both humans and some programs and all you have to come back with is philosophy.

You asked for logic, so I'm unsure where the line falls between the logical argument you want and the philosophy you don't.

I just meant that ultimately the bit strings of 0s and 1s have to be interpreted, and even pre-compilation the program's purpose has to be decided by humans.

Whether a bug in a program is a mistake or an intentional act of sabotage can only be settled by asking the programmer. The stored data, ultimately bit strings themselves I believe, can be interpreted as something else by another computer. So any group of stored bit strings could represent different forms of knowledge - perhaps not any form of knowledge, but at least a countably infinite number of them.
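
To make that concrete, here is a rough Python sketch (my own toy illustration, nothing you posted): one fixed bit string reads as text, as an integer, or as a float, depending entirely on how a user decides to interpret it.

Code:
import struct

# one fixed sequence of 32 bits
bits = b"\x48\x69\x21\x00"

as_text = bits[:3].decode("ascii")       # "Hi!" - read as ASCII characters
as_uint = struct.unpack("<I", bits)[0]   # 2189640 - read as a little-endian unsigned integer
as_float = struct.unpack("<f", bits)[0]  # ~3.07e-39 - read as an IEEE 754 float

print(as_text, as_uint, as_float)

Nothing in the bytes themselves picks one of those readings over the others; the choice sits with whoever wrote or runs the interpreting program.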

Also, you mentioned data and rewriting source. But again, I am unclear what it means to rewrite source so that a program is conscious - what is the threshold between programs that don't make a computer conscious and programs that do?
'Historically, we may regard materialism as a system of dogma set up to combat orthodox dogma...Accordingly we find that, as ancient orthodoxies disintegrate, materialism more and more gives way to scepticism.'

- Bertrand Russell


(2019-03-29, 05:03 PM)Sciborg_S_Patel Wrote: I just meant that ultimately the bit strings of 0s and 1s have to be interpreted, and even pre-compilation the program's purpose has to be decided by humans.

Irrelevant. The program can run, and mathematically it could've come about through a random number generator. The amount of time it takes for that to happen doesn't matter, just that it could. All the human did was speed up the process.
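
If you want that spelled out, here's a toy Python sketch (my own, assuming nothing beyond the standard library): it just keeps drawing short random character strings until one of them happens to be a syntactically valid program. A human author only shortcuts that search.

Code:
import random

attempts = 0
while True:
    attempts += 1
    # draw a few random printable ASCII characters
    candidate = "".join(chr(random.randrange(32, 127)) for _ in range(3))
    try:
        compile(candidate, "<random>", "exec")  # syntax check only
        break  # a syntactically valid program, by pure chance
    except (SyntaxError, ValueError):
        continue

print(f"valid program after {attempts} random draws: {candidate!r}")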

Quote:Also, you mentioned data and rewriting source. But again, I am unclear what it means to rewrite source so that a program is conscious - what is the threshold between programs that don't make a computer conscious and programs that do?

What's the threshold for consciousness in general, then? I mean, if you're going to ask that, then I guess humans aren't conscious either, since hey, it's all just electrochemical reactions causing all those neural rewirings. How many synapses do you need for consciousness? Or can you stop being facetious and focus on the raw capability of being able to change behaviour?

So far you're effectively arguing that a Raspberry Pi can't run Cataclysm: Dark Days Ahead because it's an ARM-based platform whereas my AMD64 laptop can. I'm demonstrating that I can run it on both my Raspberry Pi and my laptop, but you're stating that no, somehow, someway, the Pi isn't actually running it, because it's too radically different, because of "reasons".
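
And to put the Pi example in code terms, here's a minimal Python sketch (my own illustration, obviously not the game itself): the exact same source gives the exact same answer on the Pi and on the laptop; the only thing that differs is which instruction set happens to sit underneath.

Code:
import platform

def fibonacci(n: int) -> int:
    # identical deterministic computation whether the CPU is ARM or AMD64
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print("architecture:", platform.machine())  # e.g. 'aarch64' on the Pi, 'x86_64' on the laptop
print("fibonacci(30) =", fibonacci(30))     # 832040 on either machine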
"The cure for bad information is more information."
