Robert J. Marks discussing his book, Non-Computable You


C-Span asks Marks: How Can AI Be Made Sentient? Innovative?

by "News"

Quote:So this begs the question: are there aspects of us which are non-computable? Right? And I think the low-hanging fruit there are things such as emotions, such as love, compassion, empathy and such — but the deeper ones, as you mentioned, are our creativity, understanding and sentience.
These look to be non-computable. And the thing is that if they were non-computable in the thirties, they're non-computable now. Non-computable means non-computable. And they're also going to be non-computable in the future. Even if we get super-duper computers, it's still going to be something which is out of the reach of artificial intelligence and computers in general.
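
The "thirties" reference is to Turing's 1936 undecidability result. For readers who want the reasoning step Marks is leaning on, here is a minimal sketch of the diagonal argument in Python; the names `halts` and `diagonal` are invented for illustration and nothing here comes from Marks's book.

Code:
# Sketch of why the halting problem is non-computable (the 1930s result
# alluded to above). `halts` is hypothetical: the argument shows that no
# correct implementation of it can exist.

def halts(program, argument):
    """Pretend this returned True iff program(argument) eventually halts."""
    raise NotImplementedError  # no algorithm can do this in general

def diagonal(program):
    # Do the opposite of whatever `halts` predicts for a program run on itself.
    if halts(program, program):
        while True:        # loop forever if `halts` says it would halt
            pass
    return                 # halt if `halts` says it would loop forever

# diagonal(diagonal) halts exactly when halts(diagonal, diagonal) says it
# doesn't, so no `halts` can be correct -- on any hardware, now or later.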

See also:

Marks: AI Looks Very Intelligent — While Following Set Rules

Quote:Eventually, another sacrificial Dweeb was identified, and the process repeated. The new sacrificial Dweeb kept the Bullies running around in circles while the remaining Dweebs cowered in the corner. The sacrificial Dweeb result was unexpected. A complete surprise. There was nothing written in the evolutionary computer code explicitly calling for these sacrificial Dweebs. Is this an example of AI doing something we hadn't programmed it to do? Did it pass the Lovelace test? Absolutely not.

We had programmed the computer to sort through millions of strategies that would maximize the life of the Dweeb swarm, and that’s what the computer did. It evaluated options and chose the best one. The result was a surprise, but does not pass the Lovelace test for creativity. The program did exactly what it was written to do, and the seemingly frightened Dweebs were not in reality shaking with fear. Humans tend to project human emotions onto non-sentient things. The Dweebs were rapidly adjusting to stay as far away as possible from the closest Bully. They were programmed to do this.
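
As described, each Dweeb's per-step behaviour is just "move directly away from the nearest Bully." A rough sketch of such a rule is below, assuming 2-D positions and a fixed step size; the names and numbers are illustrative, not Marks's actual code.

Code:
import math

def flee_step(dweeb, bullies, step=1.0):
    """Move a Dweeb one step directly away from the closest Bully.
    Positions are (x, y) tuples; `step` is an assumed speed."""
    nearest = min(bullies, key=lambda b: math.dist(dweeb, b))
    dx, dy = dweeb[0] - nearest[0], dweeb[1] - nearest[1]
    norm = math.hypot(dx, dy) or 1.0   # avoid dividing by zero if they touch
    return (dweeb[0] + step * dx / norm,
            dweeb[1] + step * dy / norm)

However surprising the swarm looks, every Dweeb is only ever executing a rule like this one.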

Quote:Here’s an example from the predator–prey swarm example. The Lovelace test would be passed if some Dweebs became aggressive and started attacking and killing lone Bullies, a potential action we didn’t program into the suite of possible strategies. But that didn’t happen. And because the ability of a Dweeb to kill a Bully is not written into the code, it will never happen… But remember, the AlphaGo software as written couldn’t even provide an explanation of its own programmed behavior, the game of Go.

Note: An excerpt from Chapter One is also available here, as read by Larry Nobles (October 6, 2022). A transcript is available there as well.
'Historically, we may regard materialism as a system of dogma set up to combat orthodox dogma... Accordingly we find that, as ancient orthodoxies disintegrate, materialism more and more gives way to scepticism.'

- Bertrand Russell


This is yet another output from the Discovery Institute (DI), which focuses on ways to demonstrate that life cannot have evolved by natural selection, and on related issues such as the idea that our minds must be external to our brains.

I haven't read his book, and I'd be interested to learn more about it if anyone here has read it.

As I have said before, the DI likes to keep its technical output separate from its evangelising - which is good since I am not a Christian!
(2022-12-19, 06:15 PM)David001 Wrote: This is yet another output from the Discovery Institute (DI), which focuses on ways to demonstrate that life cannot have evolved by natural selection, and on related issues such as the idea that our minds must be external to our brains.

I haven't read his book, and I'd be interested to learn more about it if anyone here has read it.

As I have said before, the DI likes to keep its technical output separate from its evangelising - which is good since I am not a Christian!

I am still trying to find out a bit more about it. I feel there are some basic arguments related to the Hard Problem - how computers don't have feelings.

However, what I'd want from a book is something more, like the example he gives of the AI "Dweebs" he wrote. Further-developed arguments such as that one, I think, present something beyond the basics.


