%A Xie, Yiheng [Brown University, Unity Technologies]
%A Takikawa, Towaki [University of Toronto, NVIDIA]
%A Saito, Shunsuke [Meta Reality Labs Research]
%A Litany, Or [NVIDIA]
%A Yan, Shiqin [Brown University]
%A Khan, Numair [Brown University]
%A Tombari, Federico [Google, Technical University of Munich]
%A Tompkin, James [Brown University]
%A Sitzmann, Vincent [Massachusetts Institute of Technology]
%A Sridhar, Srinath [Brown University, Equal advising]
%D 2022
%I Wiley-Blackwell
%J Computer Graphics Forum; Volume: 41; Issue: 2
%M OSTI ID: 10367766
%T Neural Fields in Visual Computing and Beyond
%X Abstract

Recent advances in machine learning have led to increased interest in solving visual computing problems using methods that employ coordinate-based neural networks. These methods, which we call neural fields, parameterize physical properties of scenes or objects across space and time. They have seen widespread success in problems such as 3D shape and image synthesis, animation of human bodies, 3D reconstruction, and pose estimation. Rapid progress has led to numerous papers, but a consolidation of the discovered knowledge has not yet emerged. We provide context, mathematical grounding, and a review of over 250 papers in the literature on neural fields. In Part I, we focus on neural field techniques by identifying common components of neural field methods, including different conditioning, representation, forward map, architecture, and manipulation methods. In Part II, we focus on applications of neural fields to different problems in visual computing, and beyond (e.g., robotics, audio). Our review shows the breadth of topics already covered in visual computing, both historically and in current incarnations, and highlights the improved quality, flexibility, and capability brought by neural field methods. Finally, we present a companion website that acts as a living database that can be continually updated by the community.

%0 Journal Article