A sudden collapse, say a deepfake swinging an election or an algorithm sparking a market crash, would be visible and dramatic. Drift is more dangerous because it is slow and largely invisible. Think of geology. The Colorado River did not carve the Grand Canyon in a single flood. It wore away rock grain by grain until, millions of years later, the canyon yawned wide. Epistemic drift works the same way. Each small shift in what people trust, share or doubt seems trivial. Yet over time the ground beneath societies is altered. By the time the change is obvious, it is too late to reverse.
Signs that the drift is already under way:
Misinformation inflation. With AI, almost any claim can be backed up with convincing “proof”—fabricated images, fake quotes, synthetic witnesses. The line between truth and invention blurs.
Trust outsourcing. Algorithms decide what is worth seeing. The old habit of checking credibility for oneself begins to fade.
Synthetic memory. Platforms already curate collective memory by choosing what to highlight and what to bury. AI will edit the archive even more aggressively.
Artificial intimacy. People are already forming relationships with AI therapists, companions and influencers. These voices shape emotions and norms, even though they are not part of human society.
Each of these shifts looks manageable in isolation. Taken together, they amount to a reconstruction of reality itself. The effects stretch far beyond culture. Markets rely on reliable anchors: earnings reports, supply-chain data, consumer demand. If AI can produce not just fake reports but entire synthetic datasets, those anchors may drift too.