Likely something like being able to explain the meaning, intent, and information contained in a statement?

The academic way of verifying if someone "understands" something is to ask them to explain it.



I mean, if I memorize an explanation and recite it to you, do I actually understand it? Your evaluation function needs to determine whether they just recited memorized material.

Explanation by analogy seems more interesting to me, as now you have to know two different concepts and how the ideas in them can connect in ways that may not be contained in the dataset the model is trained on.

There was an interesting post where someone asked ChatGPT to make up a song/poem, as if written by Eminem, about how an internal combustion engine works, and ChatGPT returned a pretty faithful rendition of just that. The model seems to 'know' who Eminem is, how his lyrics work in general, and the fundamental concepts of an engine.


I think a lot of ink has already been spilled on this topic, for example under the heading of "The Chinese Room":

https://en.wikipedia.org/wiki/Chinese_room


> The question Searle wants to answer is this: does the machine literally "understand" Chinese? Or is it merely simulating the ability to understand Chinese? Searle calls the first position "strong AI" and the latter "weak AI".

> Therefore, he argues, it follows that the computer would not be able to understand the conversation either.

The problem with this is that there is no practical difference between a strong and a weak AI. Hell, even among humans, you could be the only person alive who isn't a mindless automaton; there is no way to test for it. And by the same token, if a bunch of transistors doesn't understand anything, then a bunch of neurons doesn't either.

The funniest thing about human intelligence is how it stems from our "good reason generator", which makes up random convincing reasons for actions we're already doing, so that we can convince others to do what we say. Eventually we deluded ourselves enough to believe that those reasons came before the subconscious actions.

Such a self-deluding system is mostly dead weight for AI: as long as the system does or outputs what's needed, there is no functional difference. Does that make it smart or dumb? Are viruses alive? Arbitrary lines are arbitrary.


Does someone only understand English by being able to explain the language? Can someone understand English and not know any of the grammatical rules? Can someone understand English without being able to read and write?

If you ask someone to pass you the salt, and they pass you the salt, do they not understand some English? Does everyone understand all English?


Well, there seem to be three dictionary definitions:

- perceive the intended meaning of words, a language, or a speaker (e.g. "he didn't understand a word I said")

- interpret or view (something) in a particular way (e.g. "I understand you're at art school")

- be sympathetically or knowledgeably aware of the character or nature of (e.g. "Picasso understood colour")

I suppose I meant the 3rd one, but it's not so different from the 1st one in concept, since both imply some kind of mastery in giving or receiving information. The second one isn't all that relevant.


So only someone who has a mastery of English can be said to understand English? Does someone who speaks only a little bit of English not understand some English? Does someone need to “understand color” like Picasso in order to say they understand the difference between red and yellow?

Why did we need the dictionary definitions? Do we not already both understand what we mean by the word?

Doesn’t asking someone to pass the small blue box, and then seeing them pass you that small blue box, show that they perceived the intended meaning of the words?

https://en.m.wikipedia.org/wiki/Use_theory_of_meaning


> Doesn’t asking someone to pass the small blue box, and then seeing them pass you that small blue box, show that they perceived the intended meaning of the words?

You can teach a dog to fetch something in particular. The utility of that is quite limited.


> Does someone who speaks only a little bit of English not understand some English?

I mean, yeah, sure? It's not a binary thing. Hardly anyone understands anything fully. But putting "sorta" before every "understand" gets old quick.



