Humans are frequently able to learn new skills from just a few examples. In contrast, modern learning algorithms can be tremendously data-hungry. We have been exploring ways to overcome this shortcoming of machine learning through a combination of symbolic and statistical techniques.
As a concrete example, some of our recent work uses program synthesis to automatically compose a set of previously learned neural library modules. The composite models are fine-tuned on new tasks, and this fine-tuning requires far fewer examples than learning from scratch. Our longer-term goals include scaling such compositional program synthesis to larger libraries and much larger modules (think GPT-3), and discovering libraries in an unsupervised manner.
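The idea of composing library modules via program synthesis can be illustrated with a toy sketch. In HOUDINI the library entries are typed neural modules; here, as an assumption for illustration, plain Python functions stand in for them, and a simple enumerative search over pipelines finds a composition consistent with a few input/output examples (the module names `inc`, `double`, `neg` and the functions `compositions`/`synthesize` are hypothetical, not part of the actual system):

```python
# Toy sketch of compositional program synthesis over a "library" of
# previously learned modules. Real modules would be neural networks with
# typed interfaces; plain functions stand in for them here.
from itertools import product

# Hypothetical library: each entry is (name, function).
LIBRARY = [
    ("inc",    lambda x: x + 1),
    ("double", lambda x: x * 2),
    ("neg",    lambda x: -x),
]

def compositions(depth):
    """Yield every pipeline of library modules up to the given depth."""
    for d in range(1, depth + 1):
        for combo in product(LIBRARY, repeat=d):
            names = [n for n, _ in combo]
            def run(x, fs=[f for _, f in combo]):
                for f in fs:
                    x = f(x)
                return x
            yield names, run

def synthesize(examples, depth=3):
    """Return the first composition consistent with all examples."""
    for names, run in compositions(depth):
        if all(run(x) == y for x, y in examples):
            return names, run
    return None

# Examples consistent with f(x) = 2x + 1, i.e. double then inc.
names, run = synthesize([(0, 1), (1, 3), (5, 11)])
print(names)  # ['double', 'inc']
```

In the real setting, the search is over typed compositions, the "fit" check is replaced by fine-tuning the composite network on the new task's examples, and only the newly introduced parameters need substantial training data.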
HOUDINI: Lifelong Learning as Program Synthesis (Inproceedings)
In: Advances in Neural Information Processing Systems 31: Annual Conference on Neural Information Processing Systems 2018, NeurIPS 2018, 3-8 December 2018, Montréal, Canada, pp. 8701–8712, 2018.