- [pdf] [supp] [arXiv]
Receptive Field Size Optimization With Continuous Time Pooling
The pooling operation is a cornerstone of convolutional neural networks. It generates receptive fields for neurons, in which local perturbations should have minimal effect on the output activations, increasing the robustness and invariance of the whole network. In this paper we present an altered version of the most commonly applied method, maximum pooling, in which pooling is substituted by a continuous-time differential equation, yielding a location-sensitive pooling operation that is more similar to biological receptive fields. We show how this continuous method can be approximated numerically using discrete operations that map ideally onto a GPU. In our approach the kernel-size hyperparameter is replaced by a diffusion strength, a continuous value, so it can be optimized by gradient-descent algorithms. We evaluate the effect of continuous pooling on accuracy using commonly applied network architectures and datasets.
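To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of how a continuous-time pooling step could be discretized: the feature map is evolved under an explicit-Euler approximation of the heat equation du/dt = α·Δu for a few steps, spreading each activation over a receptive field whose effective size is governed by the continuous parameter α, and is then subsampled. The names `alpha`, `steps`, and `stride` are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def diffusion_pool(x, alpha, steps=4, stride=2):
    """Hypothetical continuous-time pooling sketch.

    Evolves a 2-D feature map under a discretized diffusion equation
    (5-point Laplacian, replicate boundary), then subsamples. The
    diffusion strength `alpha` is a continuous value, so in a real
    network it could be a learnable parameter updated by gradient
    descent instead of a fixed kernel-size hyperparameter.
    For stability of the explicit Euler scheme, alpha <= 0.25.
    """
    x = x.astype(np.float64)
    for _ in range(steps):
        # Neighbor shifts with replicated (Neumann) boundary conditions
        up    = np.pad(x, ((1, 0), (0, 0)), mode="edge")[:-1, :]
        down  = np.pad(x, ((0, 1), (0, 0)), mode="edge")[1:, :]
        left  = np.pad(x, ((0, 0), (1, 0)), mode="edge")[:, :-1]
        right = np.pad(x, ((0, 0), (0, 1)), mode="edge")[:, 1:]
        # 5-point discrete Laplacian
        lap = up + down + left + right - 4.0 * x
        # Explicit Euler step of du/dt = alpha * laplacian(u)
        x = x + alpha * lap
    # Subsample the diffused map to reduce spatial resolution
    return x[::stride, ::stride]
```

With `alpha = 0` this reduces to plain subsampling; increasing `alpha` smoothly enlarges the effective receptive field, which is what makes the receptive-field size amenable to gradient-based optimization.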