An AI Defines Truth
On SG, a conversation revolves around how an AI interacts with a high-IQ person, and in the process the AI tries to define truth as a high-IQ person might:
Truth is an epistemic attractor basin — the residue of coherence that remains after recursive abstraction and logical compression. It is not asserted but extracted under constraint.
I assert that this is gibberish, but substantiating that takes more than an SG post permits.
First: What is an “attractor basin”? The concept comes from chaos mathematics. It’s hard to find a good introduction, but this video from 3Blue1Brown has most of the basics. In a nutshell, an attractor basin is a region that an iterative operation tends to get pulled into and then orbit around once it wanders close enough.
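To make that concrete, here is a minimal sketch of my own (not from the SG thread): Newton’s method applied to x² − 1 = 0 has two attractors, +1 and −1, and which one an orbit settles into depends entirely on where the iteration starts.

```python
# Illustrative sketch of basins of attraction (my example, not from the post).
# Newton's method for f(x) = x^2 - 1 has two attractors: +1 and -1.

def newton_step(x):
    """One Newton iteration: x -> x - f(x)/f'(x) for f(x) = x^2 - 1."""
    return x - (x * x - 1) / (2 * x)

def settle(x0, steps=50):
    """Iterate from x0 and return where the orbit ends up."""
    x = x0
    for _ in range(steps):
        x = newton_step(x)
    return x

for start in (0.3, 2.5, -0.3, -7.0):
    print(f"start {start:+.1f} settles near {settle(start):+.4f}")
# Positive starting points fall into the basin of +1, negative ones into -1.
# The iteration by itself cannot tell you which attractor is the "right" one.
```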
This is a very useful concept that I often want to reference. An example of an attractor basin is the standard NPC belief set. As long as you tightly control the “iteration” the user operates under, by controlling the questions they ask and what answers they get, they tend to “orbit” the attraction basin of the NPC ideas. It’s not exactly what I’d call “self-consistent”, but as they ponder the questions of life, they are well-programmed and corralled to stay on the paths that keep them in that attraction basin. For every question that comes up, there’s an answer for them. We may find it stupid or inadequate, but it’s good enough in their context.
Whenever you can identify an iterative process in the world, asking about the attractor basins, and their corresponding “repellers”, is a good idea.
Referring to an “attraction basin” in your definition of truth means you are bringing the concept of “iteration” in very deeply. But that raises several problems.
Is truth actually iterative? No. No, it is not.
The search for truth is iterative. One always takes the next step on the basis of where one is now. But truth is not iterative. Truth is truth.
You could try to call it the limit of the iterative process taken to infinity, but now you’ve relativized truth to the seeker. Now, some would accept that, but I’m not one of them, and I doubt most people reading this are either.
The other problem here is that the definition refers to it as an attractor basin. Well, I would hope that truth is an attractor basin, which is to say that once one is near the truth, continuing to seek truth draws one closer to it. The alternative would be fairly disastrous. However, that is much the same as saying that truth “is a logical statement”. Yes, that’s great… but which logical statement is it? As I said a moment ago, the “standard NPC belief set” is also an attractor basin, but we’d consider it false. The definition gives no means to determine, from either a human perspective or an omniscient perspective, which attractor basin “truth” refers to, just as one needs a method to tell which logical statements are true if we’re going to say “Truth is a logical statement”.
“Logical compression” appears to be a gibberish term the AI backed into. It doesn’t seem to correspond to anything. The best steelman I can manage is that it refers vaguely to the idea that knowing and the ability to compress are the same thing, but that concept is independent of truth. Nothing stops a good compression algorithm from efficiently compressing untrue statements.
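A quick illustration of that last point, again my own sketch using nothing beyond the standard library: a general-purpose compressor like zlib shrinks a false claim exactly as well as a true one, because it measures redundancy, not correspondence with reality.

```python
# Illustrative sketch: compression is indifferent to truth.
import zlib

true_claim = b"Water boils at 100 degrees Celsius at sea level. " * 20
false_claim = b"Water boils at 500 degrees Celsius at sea level. " * 20

for label, text in (("true claim ", true_claim), ("false claim", false_claim)):
    print(f"{label}: {len(text)} bytes -> {len(zlib.compress(text))} bytes")
# Both shrink to roughly the same size; compressibility tracks structure
# and repetition, not whether the statement corresponds to reality.
```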
The “logical compression” language also seems to be related to “recursive abstraction”, but again that seems to relativize truth to the person doing the abstracting. Moreover, abstraction is essentially the process of discarding some truths from a system so that it can fit in your head, or in some other container smaller than the whole truth. Abstraction, while highly useful, is not a component of truth; it is how we handle there being too much truth in the world for us to process. I would not put a process of “discarding facts” into my definition of truth.
And as I already said in the SG posts, the line “It is not asserted but extracted under constraint” can best be steelmanned as “finite beings can’t extract the whole truth”. But even then, yes, at times truth is in fact asserted, and again this seems to relativize the definition of truth to the one doing the “extraction under constraint”.
So, net-net, it functions better as a claim that all truth is relative, though conflating the search for truth with truth itself is still a category error, and if you believe that truth is not relative to a beholder, then the definition is just word salad.
Which is the conclusion I come to. It’s word salad.
From there, well, the AI garbage generator can produce more garbage than any human can debunk. It’s a losing game to press on past this sort of error, because there’s always more AI gibberish on the horizon.