TAFE-Net: Task-Aware Feature Embeddings for Low Shot Learning

Xin Wang, Fisher Yu, Ruth Wang, Trevor Darrell, Joseph E. Gonzalez
CVPR 2019

Abstract

Learning good feature embeddings for images often requires substantial training data. As a consequence, in settings where training data is limited (e.g., few-shot and zero-shot learning), we are typically forced to use a generic feature embedding across various tasks. Ideally, we want to construct feature embeddings that are tuned for the given task. In this work, we propose Task-Aware Feature Embedding Networks (TAFE-Nets) to learn how to adapt the image representation to a new task in a meta learning fashion. Our network is composed of a meta learner and a prediction network. Based on a task input, the meta learner generates parameters for the feature layers in the prediction network so that the feature embedding can be accurately adjusted for that task. We show that TAFE-Net is highly effective in generalizing to new tasks or concepts and evaluate the TAFE-Net on a range of benchmarks in zero-shot and few-shot learning. Our model matches or exceeds the state-of-the-art on all tasks. In particular, our approach improves the prediction accuracy of unseen attribute-object pairs by 4 to 15 points on the challenging visual attribute-object composition task.
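
The core mechanism described in the abstract (a meta learner that maps a task description to the parameters of the feature layers in the prediction network) can be illustrated with a short PyTorch sketch. This is a minimal illustration, not the paper's exact architecture: the layer sizes, the single generated linear layer, and the binary "image matches task" score are assumptions made for brevity, and the full model uses additional machinery (e.g., factorized weight generation) to keep the meta learner tractable.

# Minimal PyTorch sketch of the idea in the abstract: a meta learner maps a
# task embedding (e.g., an attribute or class description) to the weights of
# a feature layer in the prediction network, yielding a task-aware feature
# embedding of the image. All dimensions and the single generated layer are
# illustrative assumptions, not the published architecture.
import torch
import torch.nn as nn

class TAFESketch(nn.Module):
    def __init__(self, img_dim=512, task_dim=128, feat_dim=256):
        super().__init__()
        # Meta learner: generates the weight and bias of one task-aware feature layer.
        self.weight_gen = nn.Linear(task_dim, feat_dim * img_dim)
        self.bias_gen = nn.Linear(task_dim, feat_dim)
        # Prediction head scoring whether an image matches the given task.
        self.classifier = nn.Linear(feat_dim, 1)
        self.img_dim, self.feat_dim = img_dim, feat_dim

    def forward(self, img_feat, task_emb):
        # img_feat: (B, img_dim) generic image features from a fixed backbone
        # task_emb: (task_dim,)  embedding of the task description
        w = self.weight_gen(task_emb).view(self.feat_dim, self.img_dim)
        b = self.bias_gen(task_emb)
        tafe = torch.relu(img_feat @ w.t() + b)    # task-aware feature embedding
        return self.classifier(tafe).squeeze(-1)   # per-image match score

# Example usage with random stand-ins for backbone features and a task embedding.
model = TAFESketch()
scores = model(torch.randn(4, 512), torch.randn(128))

Note that generating a full weight matrix scales with img_dim x feat_dim; keeping this affordable is one motivation for factorizing the generated weights, which the sketch above omits.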

Paper

Code

github.com/ucbdrive/tafe-net

Citation

@inproceedings{wang2019tafe,
  title={TAFE-Net: Task-Aware Feature Embeddings for Low Shot Learning},
  author={Wang, Xin and Yu, Fisher and Wang, Ruth and Darrell, Trevor and Gonzalez, Joseph E},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  pages={1831--1840},
  year={2019}
}

Related


Tracking Every Thing in the Wild

ECCV 2022 We introduce a new metric, Track Every Thing Accuracy (TETA), and a Track Every Thing tracker (TETer), which performs association using Class Exemplar Matching (CEM).


Fast Hierarchical Learning for Few-Shot Object Detection

IROS 2022 We pose few-shot detection as a hierarchical learning problem, where the novel classes are treated as the child classes of existing base classes and the background class.


Frustratingly Simple Few-Shot Object Detection

ICML 2020 A state-of-the-art few-shot detection method based on simply fine-tuning the last layers of an existing detector on the novel classes.


Few Shot Object Detection via Feature Reweighting

ICCV 2019 We develop a few-shot object detector that can learn to detect novel objects from only a few annotated examples.


Deep Mixture of Experts via Shallow Embedding

UAI 2019 We explore a mixture of experts (MoE) approach to deep dynamic routing, which activates certain experts in the network on a per-example basis.


SkipNet: Learning Dynamic Routing in Convolutional Networks

ECCV 2018 We introduce SkipNet, a modified residual network, that uses a gating network to selectively skip convolutional blocks based on the activations of the previous layer.


IDK Cascades: Fast Deep Learning by Learning not to Overthink

UAI 2018 We introduce the “I Don’t Know” (IDK) prediction cascades framework to accelerate inference without a loss in prediction accuracy.


Deep Layer Aggregation

CVPR 2018 Oral We augment standard architectures with deeper aggregation to better fuse information across layers.


Dilated Residual Networks

CVPR 2017 We show that dilated residual networks (DRNs) outperform their non-dilated counterparts in image classification without increasing the model’s depth or complexity.