Part of Advances in Neural Information Processing Systems 36 (NeurIPS 2023) Main Conference Track
Jin-Hui Wu, Shao-Qun Zhang, Yuan Jiang, Zhi-Hua Zhou
Complex-valued neural networks potentially possess better representations and performance than their real-valued counterparts when dealing with complicated tasks such as acoustic analysis and radar image classification. Despite these empirical successes, it remains theoretically unknown when and to what extent complex-valued neural networks outperform real-valued ones. We take one step in this direction by comparing the learnability of real-valued neurons and complex-valued neurons via gradient descent. We show that a complex-valued neuron can efficiently learn the function expressed by any single real-valued neuron and by any single complex-valued neuron, with convergence rates O(t^{-3}) and O(t^{-1}), respectively, where t is the iteration index of gradient descent; in contrast, a two-layer real-valued neural network of finite width cannot learn a single non-degenerate complex-valued neuron. We further prove that a complex-valued neuron learns a real-valued neuron at rate Ω(t^{-3}), exponentially slower than the O(e^{-ct}) rate, for some constant c > 0, at which a real-valued neuron learns another real-valued neuron. We verify and extend these results via simulation experiments in more general settings.
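As an informal illustration of the setting (not code from the paper), the sketch below trains a single complex-valued neuron by gradient descent to fit the output of a single real-valued ReLU neuron, in the spirit of the abstract's simulation experiments. The modReLU-style activation, real-part readout, and all hyperparameters are illustrative assumptions; the paper's formal setup may differ.

    import torch

    torch.manual_seed(0)
    d, n = 5, 2000

    # Target: a single real-valued ReLU neuron, y = max(w . x, 0).
    w_true = torch.randn(d)
    X = torch.randn(n, d)
    y = torch.relu(X @ w_true)

    # Learner: a single complex-valued neuron. The modReLU-style
    # activation and real-part readout are assumptions for this sketch.
    v = torch.randn(d, dtype=torch.cfloat, requires_grad=True)  # complex weights
    b = torch.zeros(1, requires_grad=True)                      # real modReLU bias

    opt = torch.optim.SGD([v, b], lr=1e-2)
    for t in range(1, 5001):
        z = X.to(torch.cfloat) @ v             # complex pre-activation
        mag = z.abs().clamp_min(1e-8)          # guard against division by zero
        act = torch.relu(mag + b) * (z / mag)  # modReLU(z) = ReLU(|z| + b) * z/|z|
        loss = ((act.real - y) ** 2).mean()    # squared loss on the real part
        opt.zero_grad()
        loss.backward()                        # Wirtinger gradient for complex v
        opt.step()
        if t % 1000 == 0:
            print(t, loss.item())

Tracking how loss decays with the iteration index t in such runs, and comparing against a real-valued learner on the same target, is the kind of empirical check the abstract refers to; the printed trajectory does not by itself establish the stated rates.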