Zico Kolter - Antidistillation Sampling [Alignment Workshop]
Author: FAR.AI
Uploaded: 2025-06-17
Views: 156
Description:
Zico Kolter introduces "Antidistillation Sampling," a computationally efficient method to generate text from closed-source models while rendering the outputs ineffective as training data for other models.
Highlights:
Antidistillation sampling prevents student model improvement
Technique blocks learning from generated samples
Method hinders replication of closed-source models
Optimizes token selection for reduced knowledge transfer
Efficient computation using directional derivatives (see the illustrative sketch after this list)
Achieves strong degradation of student model performance
Only moderately degrades the sampled (teacher) model's own performance
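
The talk itself does not include code; the sketch below is only a minimal illustration of the idea summarized in the highlights: adjust the teacher's next-token logits by a penalty estimated via a finite-difference directional derivative on a student model, so that sampled tokens are less useful as distillation data. It assumes Hugging Face style causal LMs, and the function name, the penalty form, and the hyperparameters (`lam`, `eps`, `student_perturbed`) are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative sketch only -- NOT the reference implementation from the talk/paper.
# Shows the general shape of "penalize tokens that would most help a student,"
# with the penalty approximated by a finite-difference directional derivative.

import torch
import torch.nn.functional as F


@torch.no_grad()
def antidistillation_logits(teacher, student, student_perturbed,
                            input_ids, lam=1.0, eps=1e-3):
    """Adjust the teacher's next-token logits so sampled tokens are less
    useful as distillation training data for a student model.

    Assumptions (for illustration): `teacher`, `student`, and
    `student_perturbed` are HF-style causal LMs; `student_perturbed` is a
    copy of `student` whose weights were nudged along a proxy-loss gradient
    direction, so the difference in the two students' log-probs approximates
    a directional derivative of that proxy loss per candidate token.
    """
    teacher_logits = teacher(input_ids).logits[:, -1, :]                        # [B, V]
    logp_student = F.log_softmax(student(input_ids).logits[:, -1, :], dim=-1)   # [B, V]
    logp_perturbed = F.log_softmax(
        student_perturbed(input_ids).logits[:, -1, :], dim=-1)                  # [B, V]

    # Finite-difference proxy score per candidate next token.
    per_token_score = (logp_perturbed - logp_student) / eps                     # [B, V]

    # Penalize tokens with high estimated distillation benefit to the student.
    return teacher_logits - lam * per_token_score


# Usage sketch: sample the next token from the adjusted distribution, e.g.
#   adjusted = antidistillation_logits(teacher, student, student_perturbed, ids)
#   next_token = torch.multinomial(F.softmax(adjusted, dim=-1), num_samples=1)
```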