(Yesterday, 07:40 PM)Sciborg_S_Patel Wrote: I'd say the keyword is "simulated".
I don't believe LLMs understand anything semantically.
LLMs have never once been shown to comprehend anything semantically. Never mind that the very computational nature of LLMs excludes any possibility of semantic reasoning; it's abstractions and metaphors all the way down to the purely physical and chemical level...
“Everything that irritates us about others can lead us to an understanding of ourselves.”
~ Carl Jung