The only thing logic can ever prove is internal consistency, not fact.
Yes, and being able to build structures with internal consistency would be an advantage.
Nobody says you can prevent any “AI” oracle from saying things that aren’t true.
But a tool that takes a natural-language input, breaks it into statements about objects with statistical dependencies, and then generates a tree of possible logical conclusions from them could be useful.
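The idea above can be sketched as a tree expansion: each node holds a statement with a confidence value, and hypothetical inference rules grow child conclusions whose probabilities multiply along the branch. All statements, rules, and probabilities here are invented placeholders, not part of any real system:

```python
from dataclasses import dataclass, field

@dataclass
class Conclusion:
    statement: str   # a statement about some object
    prob: float      # confidence that the statement holds
    children: list["Conclusion"] = field(default_factory=list)

# Hypothetical rules: premise -> list of (conclusion, conditional probability)
RULES = {
    "the road is wet": [("it rained recently", 0.7),
                        ("a street cleaner passed", 0.2)],
    "it rained recently": [("the ground is soft", 0.8)],
}

def expand(node: Conclusion, depth: int) -> Conclusion:
    """Grow a tree of possible conclusions, multiplying
    probabilities along each branch."""
    if depth == 0:
        return node
    for stmt, cond_p in RULES.get(node.statement, []):
        child = Conclusion(stmt, node.prob * cond_p)
        node.children.append(expand(child, depth - 1))
    return node

root = expand(Conclusion("the road is wet", 1.0), depth=2)
```

Each branch's probability reflects how much trust survives the chain of inferences, so weakly supported conclusions can be pruned below a threshold.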