Tokens: From arcade machines to AI

Lefteris
Nov 17, 2024


Token. A word so captivating that it became a symbol, a token of meaning, long before we even realized it had always been a token.

The word token carries a fascinating versatility, taking on different meanings across contexts. In linguistics, it refers to a specific instance of a word or phrase, while in computer science, it serves as a fundamental unit in parsing and processing language data. Beyond these abstract definitions, a token also exists as a tangible object — a placeholder, such as the tokens used in amusement parks or arcade machines. This duality, spanning the concrete and the conceptual, hints at the token’s role in bridging the physical and the symbolic.

‘Token’. Even its sound carries a certain musicality when spoken, a rhythmic quality that echoes its function as a discrete unit of meaning. This phonetic aspect adds another layer to its significance, intertwining the auditory experience with its conceptual depth. The sound of ‘token’ itself becomes a token of language, embodying the very concept it represents.

The fuss about tokens

The now-famous Transformer architecture, the T in GPT, is most probably the reason the word token has risen in frequency of use. Tokenization, the process of breaking text down into smaller units, is fundamental to how these models process and understand language. In the context of Large Language Models (LLMs) like GPT, tokens are the basic units of text the model works with, allowing it to analyze and generate human-like text with remarkable efficiency.
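
To make the idea concrete, here is a minimal sketch of a tokenizer in Python. The tiny hand-made vocabulary and the greedy longest-match rule are illustrative assumptions, not the actual GPT tokenizer; real LLM tokenizers such as GPT’s byte-pair encoding learn a vocabulary of tens of thousands of subwords from data. The contract, though, is the same: text in, a sequence of token IDs out.

```python
# A toy tokenizer: greedy longest-match against a tiny, hand-made vocabulary.
# Real LLM tokenizers (e.g. byte-pair encoding) learn their vocabulary from
# data, but the input/output contract is identical: text in, token IDs out.

VOCAB = {"token": 0, "iza": 1, "tion": 2, "s": 3, " ": 4, "are": 5, "fun": 6}

def tokenize(text: str) -> list[int]:
    ids = []
    i = 0
    while i < len(text):
        # Find the longest vocabulary entry that matches at position i.
        match = None
        for piece, piece_id in VOCAB.items():
            if text.startswith(piece, i) and (match is None or len(piece) > len(match[0])):
                match = (piece, piece_id)
        if match is None:
            raise ValueError(f"No token covers {text[i]!r} at position {i}")
        ids.append(match[1])
        i += len(match[0])
    return ids

print(tokenize("tokenizations are fun"))  # [0, 1, 2, 3, 4, 5, 4, 6]
```

Notice how a single word like “tokenizations” splits into several subword tokens — exactly why the token count of a text rarely matches its word count.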

Beyond modern natural language processing and computer vision architectures, the word token has carried various meanings and has been used in quite different contexts. For instance, in finance, tokens represent digital assets or units of value in blockchain systems. In security, physical tokens are used for two-factor authentication. Even in board games, tokens serve as placeholders or representations of game elements. These diverse applications highlight the versatility of the concept across different fields.

Tokens in self-attention

Returning to computer science, and to AI in particular: in the self-attention mechanism, tokens are the fundamental units of information that the model processes. Each token, whether it represents a word, subword, or character, is treated as a distinct entity within the attention mechanism. These tokens interact with each other through the self-attention process, allowing the model to weigh the importance of different parts of the input sequence when generating outputs. This token-based approach enables the model to capture complex relationships and dependencies within the text, contributing to the powerful language understanding and generation capabilities of transformer-based architectures.
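
The mechanics of that interaction are compact enough to sketch. Below is a minimal single-head version of scaled dot-product self-attention in NumPy. The random matrices stand in for a real model’s learned projection weights, and the tiny dimensions are purely illustrative; production models add multiple heads, masking, and much larger sizes.

```python
import numpy as np

def self_attention(x: np.ndarray, w_q, w_k, w_v) -> np.ndarray:
    """Single-head scaled dot-product self-attention over a token sequence.

    x has shape (seq_len, d_model): one embedding vector per token.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v           # project tokens into query/key/value spaces
    scores = q @ k.T / np.sqrt(k.shape[-1])        # how strongly each token attends to every other
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax over attention scores
    return weights @ v                              # each output is a weighted mix of all tokens

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                             # four tokens, toy embedding size
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8): one contextualized vector per token
```

The key point is visible in the last line of the function: every output vector is a blend of all the input tokens, weighted by how relevant each one is — the “interaction” between tokens described above.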

Tokens as primitives of human perception

Since the endeavour to develop human-like intelligence is, by definition, inspired by human perception and processing, the question unavoidably arises: do we humans process information in tokens? This question delves into the heart of cognitive science and neuroscience, exploring whether our mental processes mirror the tokenization seen in AI models. While human cognition is vastly more complex and nuanced than current AI systems, there are intriguing parallels. Our ability to break complex information down into manageable chunks, our focus on specific words or phrases in a conversation, and our capacity to rapidly shift attention between different elements of our environment all suggest that some form of tokenization might be at play in human cognition.

What happens when tokens entangle

But do we really break the world down into tokens? Is our perception driven by fundamental primitives before composing a more complex interpretation of the world we experience? This question becomes more complex when we consider experiences that seem to defy easy categorization or tokenization. For instance, the perception of a beautiful sunset or the emotional impact of a piece of music might not be easily broken down into discrete units. These holistic experiences challenge the notion of tokenized perception and suggest that human cognition might operate on multiple levels simultaneously, combining both discrete and continuous processing mechanisms.

One could argue that these kinds of experiences require a complicated entanglement function between tokens. This entanglement could be seen as a complex interplay of sensory inputs, memories, and emotions, creating a rich tapestry of experience that transcends simple tokenization. The challenge for AI systems, then, becomes not just processing discrete tokens, but understanding and replicating these intricate, interconnected experiences. This raises intriguing questions about the future development of AI and its potential to capture the full spectrum of human perception and cognition.
