Salah Rifai, Yann N. Dauphin, Pascal Vincent, Yoshua Bengio, Xavier Muller
We combine three important ideas present in previous work for building classifiers: the semi-supervised hypothesis (the input distribution contains information about the classifier), the unsupervised manifold hypothesis (data density concentrates near low-dimensional manifolds), and the manifold hypothesis for classification (different classes correspond to disjoint manifolds separated by regions of low density). We exploit a novel algorithm for capturing manifold structure (higher-order contractive auto-encoders) and show how it builds a topological atlas of charts, each chart being characterized by the principal singular vectors of the Jacobian of a representation mapping. This representation learning algorithm can be stacked to yield a deep architecture, and we combine it with a domain-knowledge-free version of the TangentProp algorithm to encourage the classifier to be insensitive to input changes along the local directions of the manifold. Record-breaking classification results are obtained.
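The charts mentioned above are obtained from the singular value decomposition of the encoder's Jacobian: the leading right singular vectors span the input-space directions to which the learned representation is most sensitive, which serve as an estimate of the manifold's tangent plane at a point. A minimal sketch of this extraction step, using a toy one-layer sigmoid encoder with random placeholder weights standing in for a trained contractive auto-encoder (the names `encoder_jacobian` and `tangent_directions` are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 10, 5                  # input and hidden dimensions (arbitrary)
W = rng.standard_normal((d_h, d_in))
b = rng.standard_normal(d_h)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def encoder_jacobian(x):
    # For h(x) = sigmoid(W x + b), the Jacobian is dh/dx = diag(h * (1 - h)) @ W.
    h = sigmoid(W @ x + b)
    return (h * (1.0 - h))[:, None] * W      # shape (d_h, d_in)

def tangent_directions(x, k):
    # Top-k right singular vectors of the Jacobian: the input directions the
    # representation is most sensitive to, i.e. an estimated basis for the
    # tangent plane of the data manifold at x (one chart of the atlas).
    J = encoder_jacobian(x)
    _, _, Vt = np.linalg.svd(J, full_matrices=False)
    return Vt[:k]                            # shape (k, d_in), orthonormal rows

x = rng.standard_normal(d_in)
B = tangent_directions(x, k=3)
print(B.shape)  # (3, 10)
```

A TangentProp-style penalty would then discourage the classifier's output from varying along the rows of `B`, without requiring any hand-specified invariances.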