AI Dose

[r/ML] [D] Lossless tokenizers lose nothing and add nothing — trivial observation or worth formalizing?

Impact: 4/10

Summary

This discussion argues that lossless tokenization, viewed information-theoretically, neither limits the expressiveness of language models nor introduces unavoidable redundancy. The key observation is that any distribution over strings can be exactly induced by a distribution over token sequences, so the original entropy is preserved. While this makes tokenization theoretically neutral, practical models may spread probability mass across multiple tokenizations of the same string, producing slight deviations from the canonical tokenization.
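The entropy-preservation claim can be illustrated with a minimal sketch (assumptions: a hypothetical byte-level tokenizer where each UTF-8 byte is a token, and a toy string distribution; neither comes from the thread). Because the canonical encoding is injective with a decoder as its left inverse, pushing a string distribution through it yields a token-sequence distribution with identical probabilities, hence identical entropy:

```python
import math

# Hypothetical minimal lossless tokenizer: each UTF-8 byte is one token.
def encode(s: str) -> list[int]:
    return list(s.encode("utf-8"))

def decode(tokens: list[int]) -> str:
    return bytes(tokens).decode("utf-8")

# Losslessness: decode is a left inverse of encode, so no string information is lost.
assert all(decode(encode(s)) == s for s in ["hello", "héllo", "日本語"])

def entropy_bits(p: dict) -> float:
    """Shannon entropy of a finite distribution, in bits."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Toy string distribution (illustrative values, not from the discussion).
p_str = {"aa": 0.5, "ab": 0.25, "b": 0.25}

# Injectivity of encode means the induced token-sequence distribution
# assigns the same probabilities, so entropy is exactly preserved.
p_tok = {tuple(encode(s)): q for s, q in p_str.items()}
assert math.isclose(entropy_bits(p_str), entropy_bits(p_tok))
```

The "slight deviations" in practice arise when a trained model also assigns mass to non-canonical token sequences that decode to the same string; the bijection argument above only covers the canonical encoding.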


Explore related coverage about community news and adjacent AI developments: [r/ML] [D] MYTHOS-INVERSION STRUCTURAL AUDIT, [r/LocalLLaMA] karpathy / autoresearch, [r/ML] [R] Agentic AI and Occupational Displacement: A Multi-Regional Task Exposure Analysis (236 occupations, 5 US metros), [r/ML] Building behavioural response models of public figures using Brain scan data (Predict their next move using psychological modelling) [P].

