Mohamad Amin Mohamadi*, Wonho Bae*, Danica J. Sutherland
The University of British Columbia
NeurIPS 2022
Active learning: reducing the amount of labelled data required to train ML models by allowing the model to "actively request annotation of specific datapoints".
We focus on Pool-Based Active Learning: at each round, the learner selects which points to annotate from a large pool of unlabelled data.
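Below is a minimal, runnable sketch of the generic pool-based active-learning loop, purely for illustration: the model, dataset, batch size, and the entropy-based acquisition score are stand-ins, not the paper's method.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
labelled = list(range(20))                          # small initial labelled set
pool = [i for i in range(len(X)) if i not in set(labelled)]

for _ in range(10):                                 # acquisition rounds
    model = LogisticRegression(max_iter=1000).fit(X[labelled], y[labelled])

    # Acquisition: score every pool point (here, by predictive entropy)
    # and request labels for the highest-scoring batch.
    probs = model.predict_proba(X[pool])
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    batch = [pool[j] for j in np.argsort(-entropy)[:10]]

    labelled.extend(batch)                          # the "oracle" reveals y[batch]
    pool = [i for i in pool if i not in set(batch)]
```

The only piece that changes from method to method is the acquisition score; the rest of the loop stays fixed.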
Most acquisition functions proposed for deep active learning fall into two branches: criteria computed from the current model alone (e.g. uncertainty or diversity heuristics), and look-ahead criteria that assess how the model would change if a candidate point were labelled and the model retrained.
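To make the cost of the second branch concrete, here is a hedged sketch of one generic look-ahead criterion (expected validation-loss reduction): scoring each candidate requires one retraining per candidate per hypothesised label, which is what makes naive look-ahead impractical for deep networks. The specific criterion, model, and function names below are illustrative, not the paper's.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

def look_ahead_scores(X, y, labelled, pool, X_val, y_val):
    """Score each pool index by the (negated) expected validation loss after retraining."""
    base = LogisticRegression(max_iter=1000).fit(X[labelled], y[labelled])
    classes = base.classes_
    scores = []
    for i in pool:                                   # one pass over the whole pool ...
        p = base.predict_proba(X[i:i + 1])[0]        # current belief over the label of x_i
        expected_loss = 0.0
        for k, c in enumerate(classes):              # ... times one retraining per label
            y_hyp = np.concatenate([y[labelled], [c]])
            m = LogisticRegression(max_iter=1000).fit(X[labelled + [i]], y_hyp)
            expected_loss += p[k] * log_loss(y_val, m.predict_proba(X_val), labels=classes)
        scores.append(-expected_loss)                # prefer candidates expected to lower loss
    return np.array(scores)
```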
Our Motivation: Making look-ahead acquisition functions feasible in deep active learning.
[Figure: Retraining Engine, the proposed approximation to model retraining]
Retraining Time: The proposed retraining approximation is much faster than retraining with SGD.
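As a rough illustration of how retraining might be approximated without SGD, the sketch below applies the standard linearized-network (NTK) solution, f_lin(x) = f_0(x) + K(x, D) K(D, D)^{-1} (Y - f_0(D)), in plain NumPy. The function name, the ridge regulariser, and the way the empirical NTK matrices are obtained are assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np

def approx_retrained_preds(f0_test, f0_train, Y_train, K_test_train, K_train_train, reg=1e-6):
    """Approximate post-retraining predictions via the linearized (NTK) solution.

    f0_test:       (n_test, n_out)    current network outputs on test/pool points
    f0_train:      (n_train, n_out)   current network outputs on the training points
    Y_train:       (n_train, n_out)   one-hot targets (incl. any hypothesised label)
    K_test_train:  (n_test, n_train)  empirical NTK between test and training points
    K_train_train: (n_train, n_train) empirical NTK among training points
    """
    A = K_train_train + reg * np.eye(len(K_train_train))      # small ridge term for stability
    correction = K_test_train @ np.linalg.solve(A, Y_train - f0_train)
    return f0_test + correction
```

Under this view, scoring a candidate would amount to appending it (with a hypothesised label) to the training block and reusing cached kernel rows, rather than rerunning SGD for every candidate.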
Experiments: The proposed querying strategy attains performance similar to or better than the best prior pool-based AL methods on several datasets.