
Data-Efficient Learning

Humans are frequently able to learn new skills from just a few examples. In contrast, modern learning algorithms can be tremendously data-hungry. We have been exploring ways to overcome this shortcoming of machine learning through a combination of symbolic and statistical techniques.

As a concrete example, some of our recent work uses program synthesis to automatically compose a set of previously learned neural library modules. The composite models are then fine-tuned on new tasks, and this fine-tuning requires far fewer examples than training from scratch; a rough sketch of the pipeline appears below. Our longer-term goals include scaling such compositional program synthesis to larger libraries and much larger modules (think GPT-3), and discovering libraries in an unsupervised manner.
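To make the compose-then-fine-tune idea concrete, here is a minimal illustrative sketch. The module names, shapes, and the single hard-coded composition are hypothetical stand-ins for what a synthesizer would actually search over and produce; the point is only the shape of the pipeline, namely pretrained library modules wired into one model, then fine-tuned on a small labeled set.

```python
import torch
import torch.nn as nn

# Hypothetical pretrained library modules (names and architectures are
# illustrative, not the actual library from our work).
class EdgeDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 8, kernel_size=3, padding=1)

    def forward(self, x):
        return torch.relu(self.conv(x))

class ShapeClassifier(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        self.head = nn.Linear(8 * 28 * 28, num_classes)

    def forward(self, x):
        return self.head(x.flatten(1))

# A "program" produced by the synthesizer: an ordered composition of
# library modules. Here one candidate composition is hard-coded.
composite = nn.Sequential(EdgeDetector(), ShapeClassifier(num_classes=5))

# Fine-tune the composite on a handful of examples from the new task
# (random tensors stand in for the few-shot training set).
few_shot_x = torch.randn(16, 1, 28, 28)
few_shot_y = torch.randint(0, 5, (16,))

optimizer = torch.optim.Adam(composite.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(composite(few_shot_x), few_shot_y)
    loss.backward()
    optimizer.step()
```

In practice the synthesizer would enumerate and score many such compositions rather than accept a single fixed one, and the library modules would carry weights learned on earlier tasks rather than fresh initializations.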

