Publications

Synthetic Gradient Methods with Virtual Forward-Backward Networks

ICLR 2017 Workshop

By: Takeru Miyato, Daisuke Okanohara, Shin-ichi Maeda, Masanori Koyama

Abstract

The concept of the synthetic gradient introduced by Jaderberg et al. (2016) provides an avant-garde framework for asynchronous learning of neural networks. Their model, however, has a structural weakness: their synthetic gradient bears little relation to the objective function of the target task. In this paper we introduce virtual forward-backward networks (VFBN). VFBN is a model that produces synthetic gradients whose structure is analogous to the actual gradient of the objective function. VFBN is the first of its kind that succeeds in decoupling deep networks like ResNet-110 (He et al., 2016) without compromising performance.
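To make the idea behind synthetic gradients concrete, here is a minimal toy sketch of the framework of Jaderberg et al. (2016): a small auxiliary model predicts the gradient of the loss with respect to an intermediate activation, so the lower layer can update without waiting for the full backward pass. This is an illustrative assumption-laden example on a linear toy problem; it does not implement the VFBN architecture itself, and all names and shapes are invented for illustration.

```python
import numpy as np

# Toy illustration of a synthetic gradient (Jaderberg et al., 2016).
# A linear module M predicts dL/dh at a layer boundary from the activation h,
# decoupling the lower layer's update from the true backward pass.
# This is a hedged sketch, not the VFBN model from the paper.

rng = np.random.default_rng(0)

# Toy regression task: Y = X @ W_true
W_true = rng.normal(size=(4, 1))
X = rng.normal(size=(64, 4))
Y = X @ W_true

# Two-layer linear "network", decoupled at the boundary h = X @ W1
W1 = rng.normal(size=(4, 3)) * 0.1
W2 = rng.normal(size=(3, 1)) * 0.1
M = np.zeros((3, 3))  # linear synthetic-gradient module: predicts dL/dh from h

loss_init = 0.5 * np.mean((X @ W1 @ W2 - Y) ** 2)

lr = 0.02
for step in range(2000):
    h = X @ W1
    y_hat = h @ W2
    err = y_hat - Y              # dL/dy_hat for 0.5 * MSE
    true_dh = err @ W2.T         # true gradient at the boundary
    syn_dh = h @ M               # synthetic-gradient prediction

    # Lower layer updates with the *synthetic* gradient (decoupled update).
    W1 -= lr * X.T @ syn_dh / len(X)
    # Upper layer updates with the ordinary gradient.
    W2 -= lr * h.T @ err / len(X)
    # The synthetic-gradient module regresses toward the true gradient.
    M -= lr * h.T @ (syn_dh - true_dh) / len(X)

loss_final = 0.5 * np.mean((X @ W1 @ W2 - Y) ** 2)
print(f"loss: {loss_init:.4f} -> {loss_final:.4f}")
```

In the full method the synthetic-gradient module is a neural network and the layers run asynchronously; the abstract's point is that its output should be structured like the actual gradient of the task objective, which this toy linear regressor only crudely approximates.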
