Sun Dec 8 through Sat Dec 14, 2019, at Vancouver Convention Center
This paper adds static linear surround modulation to deep convolutional networks. The authors show that this improves both the training speed and the performance of the networks. They also show that it, like batch normalization, increases the sparsity of neural activity. The performance gains appear to diminish on larger problems, which limited the projected significance of and excitement about the paper. Restricting suppression to identical features only was also seen as a limitation. Reviewers did, however, recognize that training on full ImageNet is computationally expensive.
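To make the mechanism under review concrete, a static linear surround term restricted to identical features can be sketched as a fixed (non-learned) center-surround kernel applied depthwise, so each channel is modulated only by its own activity. The difference-of-Gaussians form, kernel size, and widths below are illustrative assumptions, not parameters taken from the paper:

```python
import numpy as np

def surround_modulation(x, size=5, sigma_c=0.6, sigma_s=1.5):
    """Add a static, linear surround term to each feature map.

    x: activations of shape (channels, H, W). A single fixed
    difference-of-Gaussians kernel (excitatory center, inhibitory
    surround) is applied depthwise, so suppression comes only from
    the same feature. All parameters here are hypothetical.
    """
    ax = np.arange(size) - size // 2
    yy, xx = np.meshgrid(ax, ax)
    d2 = xx ** 2 + yy ** 2
    center = np.exp(-d2 / (2 * sigma_c ** 2))
    surround = np.exp(-d2 / (2 * sigma_s ** 2))
    # Normalize each Gaussian so the kernel integrates to zero.
    dog = center / center.sum() - surround / surround.sum()

    pad = size // 2
    c, h, w = x.shape
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad)))
    out = np.empty_like(x)
    for ch in range(c):          # depthwise: no cross-channel mixing
        for i in range(h):
            for j in range(w):
                out[ch, i, j] = np.sum(xp[ch, i:i + size, j:j + size] * dog)
    return x + out               # linear additive modulation

x = np.ones((2, 8, 8))
y = surround_modulation(x)
print(y.shape)  # (2, 8, 8); interior of a uniform map is unchanged
```

Because the kernel sums to zero, a spatially uniform feature map is left untouched away from the borders, while local activity peaks are sharpened relative to their surround.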