SMPConv: Self-moving Point Representations for Continuous Convolution

04/05/2023
by Sanghyeon Kim, et al.

Continuous convolution has recently gained prominence due to its ability to handle irregularly sampled data and model long-term dependencies. In addition, promising experimental results with large convolutional kernels have catalyzed the development of continuous convolution, since it can construct large kernels very efficiently. Leveraging neural networks, more specifically multilayer perceptrons (MLPs), is by far the most prevalent approach to implementing continuous convolution. However, this approach has a few drawbacks, such as high computational cost, complex hyperparameter tuning, and the limited descriptive power of its filters. This paper suggests an alternative approach to building continuous convolution without neural networks, resulting in greater computational efficiency and improved performance. We present self-moving point representations, in which weight parameters move freely and interpolation schemes are used to implement continuous functions. When used to construct convolutional kernels, our representation shows improved performance as a drop-in replacement in existing frameworks. Owing to its lightweight structure, we are the first to demonstrate the effectiveness of continuous convolution in a large-scale setting, e.g., ImageNet, showing improvements over prior art. Our code is available at https://github.com/sangnekim/SMPConv.

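The abstract describes the core idea only at a high level: a small set of weight points whose positions are learnable ("self-moving"), with an interpolation scheme turning them into a continuous kernel that can be sampled at any resolution. The following is a minimal, hypothetical sketch of that idea, not the authors' reference implementation (see the linked repository for that); the 1-D setting, the hat-function (linear) interpolation, and names such as SelfMovingPointKernel1d, num_points, and radius are illustrative assumptions.

```python
# Hypothetical sketch of a self-moving point kernel (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfMovingPointKernel1d(nn.Module):
    """Continuous 1-D kernel built from learnable (position, weight) points."""

    def __init__(self, num_points: int = 16, radius: float = 0.2):
        super().__init__()
        # Point positions in the normalized kernel domain [-1, 1]; learnable,
        # so the points "move" freely during training.
        self.positions = nn.Parameter(torch.empty(num_points).uniform_(-1.0, 1.0))
        # Weight carried by each point.
        self.weights = nn.Parameter(torch.randn(num_points) * 0.1)
        # Radius of each point's influence (assumed fixed here for simplicity).
        self.radius = radius

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        # coords: (K,) locations at which the continuous kernel is evaluated.
        dist = (coords[None, :] - self.positions[:, None]).abs()   # (P, K)
        # Hat-function (linear) interpolation: influence decays to zero at `radius`.
        influence = (1.0 - dist / self.radius).clamp(min=0.0)      # (P, K)
        return (influence * self.weights[:, None]).sum(dim=0)      # (K,)


# Usage: sample the continuous kernel at an arbitrary resolution, then convolve.
kernel_fn = SelfMovingPointKernel1d(num_points=16, radius=0.2)
coords = torch.linspace(-1.0, 1.0, steps=31)        # a 31-tap discrete kernel
kernel = kernel_fn(coords).view(1, 1, -1)
signal = torch.randn(1, 1, 128)
out = F.conv1d(signal, kernel, padding=kernel.shape[-1] // 2)
```

Because only the point positions, weights, and a sampling grid define the kernel, the same representation can be rendered as a small or very large discrete kernel at negligible extra parameter cost, which is what makes it attractive as a drop-in replacement for MLP-based continuous kernels.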
Related research

Dynamic Convolution: Attention over Convolution Kernels (12/07/2019)
Light-weight convolutional neural networks (CNNs) suffer performance deg...

PSConv: Squeezing Feature Pyramid into One Compact Poly-Scale Convolutional Layer (07/13/2020)
Despite their strong modeling capacities, Convolutional Neural Networks ...

Dilated convolution with learnable spacings (12/07/2021)
Dilated convolution is basically a convolution with a wider kernel creat...

Exploiting Redundancy: Separable Group Convolutional Networks on Lie Groups (10/25/2021)
Group convolutional neural networks (G-CNNs) have been shown to increase...

CKConv: Continuous Kernel Convolution For Sequential Data (02/04/2021)
Conventional neural architectures for sequential data present important ...

Contextual Convolutional Neural Networks (08/17/2021)
We propose contextual convolution (CoConv) for visual recognition. CoCon...

Neural Field Convolutions by Repeated Differentiation (04/04/2023)
Neural fields are evolving towards a general-purpose continuous represen...
