Episode 12: Wikipedia as a Model for Worldbuilding | 10 February 2026
Stephen and Trent sing Wikipedia’s praises (and dread the death of truth) in an impassioned discussion about The Worldbuilding Workshop’s tenth chapter, “Wikipedia as a Model for Worldbuilding,” including:
Wikipedia as an open-source public good;
Whether teachers should police learners' use of Wikipedia;
Co-construction of reality and meaningful generalization;
The shared framework of Wikipedia pages versus the diversity of their contents;
Contested history and the challenges of crowdsourcing truth;
Treating imperfections in collaborative writing as opportunities for learners to “go meta”;
Instructors as community experts who can facilitate reflection on “good” versus “bad” consensus building;
Wikipedia's approach to maintaining article accuracy and diversity of opinion;
Steven Pruitt (the most prolific editor of the English Wikipedia) as an exemplary Wikipedian;
Neutral Point of View and egalitarian editing;
Why tech oligarchs hate Wikipedia (spoiler: they can't control it);
Making, changing, and shifting capital-T “Truth”;
Negotiating meaning through nuance rather than “defeating” alternative points of view;
“Strategic inefficiency” as a way to avoid rushed judgment and better understand the “real” world;
Generative AI (GenAI), large language models (LLMs), and John Searle's “Chinese Room” thought experiment;
Why GenAI and LLMs cannot replace human-driven collaborative discussion about thoughts, ideas, and creative endeavors;
Wikipedia as a starting point for manual research versus LLMs as hallucinating curators;
The ability of instructors to shape learner thinking and behavior via on-the-fly interaction, intentional complication, and reflection;
GenAI and LLMs as “highways to mediocrity”;
Public transparency of Wikipedia editing versus the “black box” of GenAI and LLM response construction;
The messiness of authentic, human-driven collaborative writing as the primary goal of good teaching and learning;
The critical importance of decentralizing power and ensuring knowledge generation remains as small-d “democratic” as possible;
Whether truth can survive in a world where algorithms drive human perception;
The unreality of Rush Limbaugh, Fox News, and the war on knowledge-generating institutions;
Separating what we know, what we feel, and what we intend to do about it; and
Consequences of corporate media consolidation, Steve Bannon's “flood the zone” mentality, and the transition away from journalistic contextualization to verbatim transcription of bad-faith arguments.
Episode References:
Inskeep, S. (2012, October 3). Wikipedia policies limit editing Haymarket bombing. NPR. https://www.npr.org/2012/10/03/162203092/wikipedia-politicizes-landmark-historical-event
Giles, J. (2005). Internet encyclopaedias go head to head. Nature, 438(7070), 900–901. https://doi.org/10.1038/438900a
Searle, J. (1980). Minds, brains and programs. Behavioral and Brain Sciences, 3(3), 417–457. https://doi.org/10.1017/S0140525X00005756
Klein, E. (Host). (2025, July 8). How the attention economy is devouring Gen Z — and the rest of us [Audio podcast episode featuring Kyla Scanlon]. In The Ezra Klein Show. The New York Times. https://www.nytimes.com/2025/07/08/opinion/ezra-klein-podcast-kyla-scanlon.html