Meta-Learning Sparse Implicit Neural Representations. To address this issue, we propose to leverage a meta-learning approach in combination with network compression under a sparsity constraint, such that it renders a well-initialized sparse parameterization that evolves quickly to represent a set of unseen signals in the subsequent training.
Meta-learning Sparse Implicit Neural Representations. We call this procedure Meta-SparseINR (Meta-learning Sparse Implicit Neural Representation). For completeness, we provide preliminaries on the meta-learning procedure we use in Section 4.1, and give a full description of the Meta-SparseINR algorithm in Section 4.2.
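As a rough illustration of the meta-learning phase, the sketch below performs one MAML meta-update for a coordinate-based INR in PyTorch. The architecture, maml_step, inner_lr, and inner_steps are illustrative assumptions for this sketch, not names from the paper's official implementation.

    # Minimal MAML meta-update sketch (all names are assumptions).
    import torch
    import torch.nn as nn
    from torch.func import functional_call

    class INR(nn.Module):
        # Small coordinate MLP mapping (x, y) -> pixel value; the paper
        # uses SIREN-style networks, a ReLU MLP stands in here.
        def __init__(self, hidden=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(2, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 1))

        def forward(self, coords):
            return self.net(coords)

    def maml_step(model, signals, inner_lr=1e-2, inner_steps=2):
        # Adapt the shared initialization to each signal in the inner
        # loop, then return the outer (meta) loss evaluated at the
        # adapted parameters; gradients flow back to the initialization.
        params = dict(model.named_parameters())
        meta_loss = 0.0
        for coords, targets in signals:  # one (coords, values) pair per signal
            adapted = params
            for _ in range(inner_steps):
                pred = functional_call(model, adapted, (coords,))
                loss = ((pred - targets) ** 2).mean()
                grads = torch.autograd.grad(loss, list(adapted.values()),
                                            create_graph=True)
                adapted = {k: v - inner_lr * g
                           for (k, v), g in zip(adapted.items(), grads)}
            pred = functional_call(model, adapted, (coords,))
            meta_loss = meta_loss + ((pred - targets) ** 2).mean()
        return meta_loss / len(signals)

A meta-optimizer over model.parameters() would then repeatedly call loss = maml_step(model, batch_of_signals), loss.backward(), and opt.step().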
GitHub - jaeho-lee/MetaSparseINR: Meta-Learning Sparse Implicit Neural . . . Official PyTorch implementation of "Meta-learning Sparse Implicit Neural Representations" (NeurIPS 2021) by Jaeho Lee*, Jihoon Tack*, Namhoon Lee, and Jinwoo Shin. TL;DR: We develop a scalable method to learn sparse neural representations for a large set of signals.
Meta-Learning Sparse Implicit Neural Representations. This paper presents a method that alternates between meta-learning an implicit neural representation (INR) for a set of signals with MAML and sparsifying the INR with global magnitude-based pruning.
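The pruning half of that alternation can be sketched as follows; global_magnitude_prune, the weight-only filter, and the keep_ratio value are assumptions made for illustration, not taken from the official code.

    # Global magnitude-based pruning sketch (all names are assumptions).
    import torch

    @torch.no_grad()
    def global_magnitude_prune(model, keep_ratio):
        # Rank every weight entry across all layers by absolute value
        # and zero out everything below the single global threshold.
        weights = [p for name, p in model.named_parameters()
                   if "weight" in name]  # biases are left dense here
        scores = torch.cat([w.abs().flatten() for w in weights])
        k = max(1, int(keep_ratio * scores.numel()))
        threshold = torch.topk(scores, k).values.min()
        masks = []
        for w in weights:
            mask = (w.abs() >= threshold).to(w.dtype)
            w.mul_(mask)        # prune in place
            masks.append(mask)  # reapply after each meta-update so
                                # pruned weights stay at zero
        return masks

    # Alternating schedule (sketch): repeat meta-training and pruning
    # until the target sparsity is reached, e.g.
    #   for _ in range(num_rounds):
    #       ...meta-train with maml_step above...
    #       masks = global_magnitude_prune(model, keep_ratio=0.8)

Keeping a fixed fraction of the surviving weights each round gives a geometric sparsity schedule, a common choice in iterative magnitude pruning.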