AI Redistributes Epistemic Authority
AI systems aren’t neutral tools; they’re socio-technical infrastructure that redistributes epistemic authority and encodes normative commitments.
When an organization adopts AI for decision-making, questions of “who knows” and “whose knowledge counts” are restructured. Concentration around a few foundation models creates dependencies in which consequential decisions may effectively be delegated to unaccountable private entities. Algorithmic curation of information environments shapes what knowledge reaches whom.
This is why purely technical approaches to AI governance fail: they treat AI as a tool to be constrained rather than as infrastructure that reshapes power relations. Governance frameworks that ignore this constitutive politicality produce legitimacy deficits regardless of their technical sophistication.
Related: 05-atom—expertocracy-problem, 02-atom—format-shapes-cognition