SnowflakeNet: Point Cloud Completion by Snowflake Point Deconvolution with Skip-Transformer

08/10/2021
by Peng Xiang, et al.

Point cloud completion aims to predict a complete shape with high accuracy from a partial observation. However, previous methods usually suffer from the discrete nature of point clouds and the unstructured prediction of points in local regions, which makes it hard to reveal fine local geometric details on the complete shape. To resolve this issue, we propose SnowflakeNet with Snowflake Point Deconvolution (SPD) to generate complete point clouds. SnowflakeNet models the generation of a complete point cloud as the snowflake-like growth of points in 3D space, where child points are progressively generated by splitting their parent points after each SPD. Our insight for revealing detailed geometry is to introduce a skip-transformer in SPD to learn the point splitting patterns that best fit local regions. The skip-transformer leverages an attention mechanism to summarize the splitting patterns used in the previous SPD layer and to produce the splitting in the current SPD layer. The locally compact and structured point clouds generated by SPD precisely capture the structural characteristics of the 3D shape in local patches, which enables the network to predict highly detailed geometries, such as smooth regions, sharp edges and corners. Experimental results show that SnowflakeNet outperforms state-of-the-art point cloud completion methods on widely used benchmarks. Code will be available at https://github.com/AllenXiangX/SnowflakeNet.
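To make the point-splitting idea concrete, below is a minimal, hypothetical sketch (not the authors' released implementation) of one SPD-style upsampling step in PyTorch: each parent point is duplicated up_factor times and nudged by a learned per-child displacement, so N points grow into N * up_factor points. In SnowflakeNet these displacements are conditioned on a skip-transformer over the previous layer's splitting features; here a plain MLP with a learned child embedding stands in for it, and all module and parameter names are assumptions for illustration.

```python
import torch
import torch.nn as nn


class PointSplittingSketch(nn.Module):
    """Hypothetical single point-splitting (upsampling) step, loosely in the
    spirit of Snowflake Point Deconvolution: duplicate parents, then offset."""

    def __init__(self, feat_dim=128, up_factor=2):
        super().__init__()
        self.up_factor = up_factor
        # Learned per-child embedding so the k children of one parent receive
        # different offsets (a simple stand-in for the skip-transformer).
        self.child_embed = nn.Parameter(0.01 * torch.randn(up_factor, feat_dim))
        # Maps a child feature to a 3D displacement.
        self.offset_mlp = nn.Sequential(
            nn.Linear(feat_dim, 64),
            nn.ReLU(inplace=True),
            nn.Linear(64, 3),
        )

    def forward(self, points, feats):
        # points: (B, N, 3) parent coordinates
        # feats:  (B, N, feat_dim) per-parent features
        B, N, _ = points.shape
        k = self.up_factor

        # Duplicate every parent point and its feature k times.
        child_base = points.repeat_interleave(k, dim=1)   # (B, N*k, 3)
        child_feats = feats.repeat_interleave(k, dim=1)   # (B, N*k, feat_dim)

        # Make the k copies of each parent distinguishable, then predict a
        # displacement that "splits" the parent into its children.
        child_feats = child_feats + self.child_embed.repeat(N, 1)
        offsets = self.offset_mlp(child_feats)             # (B, N*k, 3)
        return child_base + offsets                        # (B, N*k, 3)


if __name__ == "__main__":
    coarse = torch.rand(2, 512, 3)      # coarse point cloud
    features = torch.rand(2, 512, 128)  # per-point features
    children = PointSplittingSketch(feat_dim=128, up_factor=2)(coarse, features)
    print(children.shape)               # torch.Size([2, 1024, 3])
```

Stacking several such steps with increasing up_factor yields the coarse-to-fine growth described in the abstract, with each layer refining the structure produced by the previous one.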


Related research

02/18/2022
Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Most existing point cloud completion methods suffered from discrete natu...

05/08/2020
Point Cloud Completion by Skip-attention Network with Hierarchical Folding
Point cloud completion aims to infer the complete geometries for missing...

07/21/2022
SeedFormer: Patch Seeds based Point Cloud Completion with Upsample Transformer
Point cloud completion has become increasingly popular among generation ...

12/10/2021
Attention-based Transformation from Latent Features to Point Clouds
In point cloud generation and completion, previous methods for transform...

07/17/2023
SVDFormer: Complementing Point Cloud via Self-view Augmentation and Self-structure Dual-generator
In this paper, we propose a novel network, SVDFormer, to tackle two spec...

11/23/2022
Completing point cloud from few points by Wasserstein GAN and Transformers
In many vision and robotics applications, it is common that the captured...

06/28/2023
Tensorformer: Normalized Matrix Attention Transformer for High-quality Point Cloud Reconstruction
Surface reconstruction from raw point clouds has been studied for decade...
