> a broader dichotomy between the people-persuasion-plane and the real-world-facts-plane
This right here is the real thing which AI is deployed to upset.
The Enlightenment values which brought us the Industrial Revolution imply that the disparity between the people-persuasion-plane and the real-world-facts-plane should naturally decrease.
The implicit expectation here is that as civilization as a whole learns more about how the universe works, people would naturally become more rational, and thus more persuadable by reality-compliant arguments and less persuadable by reality-denying ones.
That's... not really what I've been seeing. That's not really what most of us have been seeing. Like, devastatingly not so.
My guess is that something became... saturated? I'd place it sometime around the 1970s, around when Bretton Woods ended and the productivity/wages gap began to grow. Something pertaining to the shared-culture-plane. Maybe there's only so much "informed" people can become before some sort of phase shift occurs, and the driving force behind decisions becomes some vague, ethically unaccountable ingroup intuition ("vibes", yo), rather than the kind of explicit, systematic reasoning which actually is available to any human, except for the weird fact that nobody seems to trust it very much anymore.
> people would naturally become more rational, and thus more persuadable by reality-compliant arguments and less persuadable by reality-denying ones.
Likely not. Our natural state, tuned by evolution, is that of an emotional creature persuaded by pleasing rhetoric - like a bird responding to another bird's call.
What's irrational about a bird responding to another bird's call, though?
I always figured that, unlike human speech, bird song contained only truth - a 100% real-time factual representation of reproductive fitness/compatibility, 0% fractal bullshitting (such as arguing about definitions of abstract notions, or endless rumination and reflection, or command hierarchies built to leak, or...).
Although who knows, really! I'm just guessing here. Maybe what we oughtta do is ask some actual ornithologists to ask an actual parrot to translate for us the songs of its distant relatives. Sounds crazy enough to work -- though probably not in captivity.
Overall I see your point, and I see many people sharing that perspective; personally, I find it rather disheartening. Tbh I'm not even sure what would be a convincing argument one way or the other.