This shouldn’t work nearly as well as it does. Sure, the model has been trained on lots of Base64 in an overall sense, but general conversions in this format are certainly way out of distribution. The tokenizer chops it into completely different sub-word units. The positional patterns are unrecognizable. And yet it works… Curious…
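To see why the claim about "completely different sub-word units" is plausible, here is a minimal sketch using Python's standard `base64` module: the encoded form shares essentially no substring with the original text, so any subword tokenizer trained on plain English will segment it into entirely unrelated tokens.

```python
import base64

# Encode a short English sentence. The Base64 output is built from
# 6-bit groups of the underlying bytes, so its characters do not line
# up with the original words at all -- a subword tokenizer sees a
# completely different surface string.
text = "Hello, world!"
encoded = base64.b64encode(text.encode("utf-8")).decode("ascii")
print(encoded)  # SGVsbG8sIHdvcmxkIQ==
```

Note that a one-character change in the input shifts every subsequent 6-bit boundary, so even near-identical sentences produce wildly different Base64 strings, and thus wildly different token sequences.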
Editor's note: This article is a shortlisted entry under the #TeamSilicon25 tag in sspai's 2025 annual writing contest. It represents the author's own views only; sspai has made minor layout adjustments.