State of the Art on Neural Rendering
Abstract: Efficient rendering of photo-realistic virtual worlds is a long-standing effort of computer graphics. Modern graphics techniques have succeeded in synthesizing photo-realistic images from hand-crafted scene representations. However, the automatic generation of shape, materials, lighting, and other aspects of scenes remains a challenging problem that, if solved, would make photo-realistic computer graphics more widely accessible. Concurrently, progress in computer vision and machine learning has given rise to a new approach to image synthesis and editing, namely deep generative models. Neural rendering is a new and rapidly emerging field that combines generative machine learning techniques with physical knowledge from computer graphics, e.g., by the integration of differentiable rendering into network training. With a plethora of applications in computer graphics and vision, neural rendering is poised to become a new area in the graphics community, yet no survey of this emerging field exists. This state-of-the-art report summarizes the recent trends and applications of neural rendering. We focus on approaches that combine classic computer graphics techniques with deep generative models to obtain controllable and photo-realistic outputs. Starting with an overview of the underlying computer graphics and machine learning concepts, we discuss critical aspects of neural rendering approaches. Specifically, our emphasis is on the type of control, i.e., how the control is provided, which parts of the pipeline are learned, explicit vs. implicit control, generalization, and stochastic vs. deterministic synthesis. The second half of this state-of-the-art report is focused on the many important use cases for the described algorithms, such as novel view synthesis, semantic photo manipulation, facial and body reenactment, relighting, free-viewpoint video, and the creation of photo-realistic avatars for virtual and augmented reality telepresence.
Finally, we conclude with a discussion of the social implications of such technology and investigate open research problems.