Knowledge Distillation

DUET: Distilled LLM Unlearning from an Efficiently Contextualized Teacher

LLM unlearning is a technique for removing the effects of undesirable knowledge from a model without retraining it from scratch, a capability indispensable for trustworthy AI. Existing unlearning methods face significant limitations: conventional …