
What Makes a Good Pre-Training on Point Clouds?

Open · hansen7 opened this issue 5 years ago · 1 comment

General Interpretability:

  • Interpretable ML book, specifically the sections on learned features, Shapley values, and influential instances
  • "Network dissection: Quantifying interpretability of deep visual representations", CVPR 2017
  • "Feature Visualization", Olah et al., Distill 2017
  • Bolei's Portfolio
  • Chiyuan's Portfolio (also, transfer learning)

General Pre-Training:

  • "Rethinking ImageNet Pre-training", ICCV 2019
  • "Rethinking Pre-training and Self-training", NeurIPS 2020
  • "What is being transferred in transfer learning?", NeurIPS 2020
  • "What Makes Instance Discrimination Good for Transfer Learning?", ICLR 2021 submission

Ideas from Contrastive Learning:

Point Cloud Specific:

  • "Rotation Invariant Convolutions for 3D Point Clouds Deep Learning", 3DV 2019
  • "Quaternion Equivariant Capsule Networks for 3D Point Clouds", ECCV 2020
  • "Label-Efficient Learning on Point Clouds using Approximate Convex Decompositions", ECCV 2020
  • "On the Universality of Rotation Equivariant Point Cloud Networks", ICLR 2021 submission

Extensions:

  • "Neural Similarity Learning", NeurIPS 2019

hansen7 · Nov 16 '20

Hello, here are some other point cloud self-supervised learning papers:

  • "Info3D: Representation Learning on 3D Objects using Mutual Information Maximization and Contrastive Learning", ECCV 2020
  • "Global-Local Bidirectional Reasoning for Unsupervised Representation Learning of 3D Point Clouds", CVPR 2020

GengxinLiu · Jan 12 '21