Reframing GenAI as “digital plastic” opens up new ways for educators to build the critical literacy skills students urgently need in an AI-saturated world.
Key Takeaways for Teachers
- Like plastic, GenAI is versatile and democratizing in some contexts while polluting and inequitable in others – students need frameworks to navigate this complexity.
- Metaphors are powerful pedagogical tools that help students grasp the hidden social, cultural and ethical dimensions of AI systems.
- Critical AI literacy must be taught equitably and globally – those most harmed by GenAI’s inequities often have the least access to tools for critically engaging with it.
What the Research Study Is About
This conceptual paper introduces the metaphor of “digital plastic” as a framework for teaching Critical AI Literacy (CAIL) within a multiliteracies approach. Drawing on Conceptual Metaphor Theory, the authors map the parallels between synthetic plastic and GenAI-generated content: both are human-made, malleable, widely accessible, potentially useful and capable of causing long-term environmental and systemic damage. The paper argues that this metaphor can help educators and students think more critically and concretely about what GenAI does to digital knowledge ecosystems, who it benefits and who it leaves behind.
Key Findings
GenAI mirrors plastic: useful but potentially toxic
- Scalable and low-cost, but prone to flooding ecosystems with homogeneous, biased and low-quality content.
- Training AI models on AI-generated data degrades output quality over time, a feedback loop sometimes called “model collapse.”
- Instead of relying on unreliable detection software, educators should help students learn to notice the “footprints” or “AI slop” that synthetic media leaves behind in the digital environment.
GenAI deepens inequities, especially for learners in the Global South
- Models built on Western-centric data create technological dependency for under-resourced contexts.
- Students without premium access or epistemic familiarity with AI’s cultural assumptions face compounded disadvantage.
- It is not enough to simply give students tools; teachers must provide the epistemic access required to critique the cultural and ideological assumptions of AI.
Metaphor is a powerful tool for developing CAIL
- Mapping AI onto familiar phenomena helps students engage with otherwise opaque algorithmic systems.
- The “digital plastic” framing invites students to ask whose knowledge is amplified, who benefits and what is lost when meaning-making is automated.
- Metaphor positions students as critical analysts of AI, not just users of it.
CAIL must be embedded in multiliteracies pedagogy
- The multiliteracies framework needs updating to explicitly address AI’s power structures and systemic risks.
- CAIL should be developed equitably across geographies and cultures, not treated as a literacy for the privileged few.
- Educators should help students recognize when a human artistic voice is more effective, and when deliberately choosing not to use AI better conveys the intended meaning.
This research summary was generated by Claude AI and has been reviewed by the authors.