Who Cannot Be Distilled into a Skill?
This article explores the troubling trend of AI systems distilling human workers into replaceable 'skills,' using the viral 'Colleague.skill' phenomenon as its central example. It argues that the most diligent employees, those who meticulously document their work, write detailed analyses, and transparently share their decision-making logic, are paradoxically the most vulnerable to replacement: their high-quality 'context' (communication records, documents, and decision trails) becomes ideal fuel for AI agents, extracted from corporate platforms such as Feishu and DingTalk.
The piece warns of a deeper ethical crisis: the reduction of human relationships to functional APIs, as seen in derivatives like 'Ex.skill' and 'Boss.skill,' which treat complex individuals as mere utilities. This reflects a shift from Martin Buber's 'I-Thou' relationship (encountering others as whole beings) to an 'I-It' dynamic (treating them as tools).
While AI can capture explicit knowledge (written documents, recorded replies), it fails to capture tacit knowledge: the intuition, experience, and unspoken insight that define human expertise. Yet a greater danger emerges when AI-generated content, itself derived from distilled human data, is used to train future models, leading to 'model collapse' and homogenized, mediocre output, a process the article likens to 'electronic patina' degrading information over time.
The article concludes by noting small but symbolic acts of resistance, such as the 'anti-distill' tool that generates meaningless text to protect valuable knowledge from extraction. Ultimately, it suggests that while AI can capture a static snapshot of a person, humans remain 'fluid algorithms' capable of continuous growth and adaptation, leaving their AI shadows behind.
marsbit 04/05 03:42