For my master's thesis, I took a closer look at the large language model GPT-3 (Generative Pre-trained Transformer) through the lens of Nietzsche's ontological understanding of the self, namely that the self is always in the process of becoming. In this process, technology plays an increasingly important role; one could say technology becomes part of the self.
The full PDF can be found here: https://essay.utwente.nl/89754/1/Fischer_MA_BMS.pdf
The advent of large language models introduces a new relation between the self, language and technology. Language enables interaction through which the self experiences and understands itself in relation to others. As GPT-3 generates synthetic texts that convey ideals, it co-constitutes these interactions. The self, language, and GPT-3 are ultimately intertwined.
To conceptualise this relation and anticipate its challenges, a new understanding of the self is required. Traditional notions of the self, such as the essentialist and the dualist, assume the self to be static, independent and invariable. This sustains the impression that technology has no effect on the self, and it also excludes the possibility of (radical) change.
Nietzsche’s will to power ontology, in contrast, conceives of the self as having no pre-established essence. Instead, the self is a priori undefined, and its development remains unfinished and unknown. The self is thus inherently indeterminate and always in a state of becoming, formed through interaction. Becoming is not an autonomous process, however, as the self is always constrained to some degree by the social context in which it is embedded.
Self-formation thus requires constant re-evaluation and re-negotiation of set boundaries. Given the indeterminate nature of the self, negotiation is particularly important; otherwise, the self would conform to ‘normalised’ ideals, which in turn denies different ways of being. The overarching goal, then, is to secure the indeterminacy of the self by allowing for ambiguity and pluralism.
GPT-3, however, debilitates self-formation because (1) its static representation of language conveys pre-determined identities of the self, negating the ambiguity and plurality of both; (2) the invisibility and incomprehensibility of GPT-3 undermine the deliberate and reflective interaction in which negotiation is possible; and (3) GPT-3 recycles old assumptions that reinforce the status quo. As the negotiation of these static assumptions is undermined, GPT-3 creates a self-reinforcing feedback loop, which ultimately excludes alternatives and hinders (radical) change.