3 Substances in 3 Environments


(2024-12-20, 07:40 PM)Sciborg_S_Patel Wrote: I'd say the keyword is "simulated".

I don't believe LLMs understand anything semantically.

LLMs have never once been shown to comprehend anything semantically. Never mind that the very computational nature of LLMs excludes any possibility of semantic reasoning: it's abstractions and metaphors all the way down to the purely physical and chemical level...
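
To make the point concrete: at bottom, next-token prediction is nothing but arithmetic over vectors. Here is a minimal sketch, with a toy vocabulary and made-up weights (none of these numbers come from any real model), showing that the whole operation is dot products, exponentials, and a random draw, with no "meaning" anywhere in the mechanism:

```python
import math
import random

# Toy "language model": the vocabulary and all weights below are
# invented for illustration only.
vocab = ["the", "cat", "sat", "mat"]

# Hypothetical embedding of the current context, and one weight
# vector per candidate next token.
context = [0.2, -0.1, 0.4]
weights = {
    "the": [0.1, 0.3, -0.2],
    "cat": [0.5, -0.4, 0.1],
    "sat": [-0.3, 0.2, 0.6],
    "mat": [0.0, 0.1, 0.2],
}

# Logit = dot product of context with each token's weights;
# probability = softmax over the logits. Pure arithmetic.
logits = {t: sum(c * w for c, w in zip(context, weights[t])) for t in vocab}
z = sum(math.exp(v) for v in logits.values())
probs = {t: math.exp(v) / z for t, v in logits.items()}

# The "next word" is sampled from that probability distribution.
next_token = random.choices(vocab, weights=[probs[t] for t in vocab])[0]
print(probs, "->", next_token)
```

Scale that up by billions of parameters and it is still the same kind of operation: numbers in, numbers out, a token sampled at the end.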
“Everything that irritates us about others can lead us to an understanding of ourselves.”
~ Carl Jung


I agree and have nothing to add, except that it's very puzzling that LLMs are capable of producing novel, insightful, and apparently meaningful statements without understanding what they're saying.
