Abstract
As a prerequisite for the use of ChatGPT in writing classes, instructors should scaffold students’ (critical) digital literacy of the technology. Part of such scaffolding should include the exploration of relevant frameworks for conceptualizing ChatGPT, including the use of multiple metaphors, like tool and collaborator. By analyzing recent scholarly and news discourse regarding ChatGPT, prompts and outputs from ChatGPT, and the author's own writing process, the essay illustrates the limitations of the tool and collaborator metaphors while emphasizing the value of multiple metaphors. In particular, the tool metaphor fails to account for ChatGPT's human components, namely its repurposing of thousands of authors’ writing and ideas, from which it draws with no transparency about sources. While the collaborator metaphor appears to address the need to cite ideas that are not one's own, ChatGPT fails to provide the accountability of a human author, even as it produces biased output derived from its training corpus and again fails to identify original sources. Medical and surgical metaphors highlight the ways that ChatGPT acts upon both the enormous corpus, or body, of human writing on which it was trained and the social body of our academic communities and beyond.
| Original language | English |
| --- | --- |
| Article number | 102778 |
| Journal | Computers and Composition |
| Volume | 68 |
| State | Published - Jun 2023 |
| Externally published | Yes |
Scopus Subject Areas
- General Computer Science
- Language and Linguistics
- Education
- Linguistics and Language
Keywords
- Artificial intelligence
- Bias
- ChatGPT
- Corpus
- Digital literacy
- Embodiment
- Ethics
- Metaphor
- Writing as extended mind
- Writing processes