Part of Advances in Neural Information Processing Systems 5 (NIPS 1992)
Guo-Zheng Sun, Hsing-Hen Chen, Yee-Chun Lee
We proposed a model of Time Warping Invariant Neural Networks (TWINN)
to handle time-warped continuous signals. Although TWINN is a simple modification of the well-known recurrent neural network, our analysis shows that TWINN completely removes time warping and is able to handle difficult classification problems. We also show that TWINN has certain advantages over the currently available sequential processing schemes: Dynamic Programming (DP)[1], Hidden Markov Models (HMM), Time Delayed Neural Networks (TDNN), and Neural Network Finite Automata (NNFA).
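The core idea of a time-warping-invariant recurrent update can be sketched as follows: scale each state increment by the norm of the input increment, so that the state trajectory depends on the input's path through feature space rather than on its time parameterization. This is a minimal illustrative sketch, not the paper's exact formulation; the weight matrices `W_s`, `W_x`, the `tanh` nonlinearity, and the function name are assumptions chosen for clarity.

```python
import numpy as np

def twinn_states(inputs, W_s, W_x, s0):
    """Hypothetical sketch of a time-warping-invariant recurrent update.

    Instead of the usual s[t+1] = f(W_s s[t] + W_x x[t]), each update is
    scaled by the input increment ||x[t+1] - x[t]||, so repeated (or very
    densely sampled) input points contribute little to the state change.
    """
    s = np.asarray(s0, dtype=float)
    states = [s]
    for t in range(len(inputs) - 1):
        x = inputs[t]
        dl = np.linalg.norm(inputs[t + 1] - x)  # arc-length increment
        s = s + dl * np.tanh(W_s @ s + W_x @ x)  # warp-invariant update
        states.append(s)
    return np.array(states)
```

With this update, duplicating a sample (an extreme time warp with zero arc-length increment) leaves the final state unchanged, whereas a conventional recurrent update would drift with every extra time step.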
We also analyzed the time continuity employed in TWINN and pointed out that
this kind of structure can memorize a longer input history than Neural Network Finite Automata (NNFA). This may help explain the well-accepted fact that, when learning grammatical inference with NNFA, one has to start with very short strings in the training set.
The numerical example we used is a trajectory classification problem. This
problem, which features variable sampling rates, internal states, continuous dynamics, heavily time-warped data, and deformed phase-space trajectories, is shown to be difficult for the other schemes. With TWINN this problem was learned in 100 iterations. As a benchmark we also trained the exact same problem with TDNN, which failed completely, as expected.