Matches in SemOpenAlex for { <https://semopenalex.org/work/W3214712237> ?p ?o ?g. }
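
The quad pattern above can be reproduced over the standard SPARQL protocol. Below is a minimal sketch in Python, assuming the public SemOpenAlex SPARQL endpoint lives at https://semopenalex.org/sparql (the endpoint URL is an assumption; only the quad pattern appears in this listing):

    # Fetch all (predicate, object, graph) matches for the work, as in the
    # listing below. The endpoint URL is an assumption, not part of the listing.
    import requests

    ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
    QUERY = """
    SELECT ?p ?o ?g WHERE {
      GRAPH ?g { <https://semopenalex.org/work/W3214712237> ?p ?o . }
    }
    """

    resp = requests.get(
        ENDPOINT,
        params={"query": QUERY},
        headers={"Accept": "application/sparql-results+json"},
        timeout=30,
    )
    resp.raise_for_status()
    for row in resp.json()["results"]["bindings"]:
        print(row["p"]["value"], row["o"]["value"], row["g"]["value"])

The GRAPH clause is needed because the original pattern matches quads rather than triples; the "@default." suffix on each line below is the graph binding ?g.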
- W3214712237 endingPage "735" @default.
- W3214712237 startingPage "703" @default.
- W3214712237 abstract "Abstract Synthesizing photo‐realistic images and videos is at the heart of computer graphics and has been the focus of decades of research. Traditionally, synthetic images of a scene are generated using rendering algorithms such as rasterization or ray tracing, which take specifically defined representations of geometry and material properties as input. Collectively, these inputs define the actual scene and what is rendered, and are referred to as the scene representation (where a scene consists of one or more objects). Example scene representations are triangle meshes with accompanied textures (e.g., created by an artist), point clouds (e.g., from a depth sensor), volumetric grids (e.g., from a CT scan), or implicit surface functions (e.g., truncated signed distance fields). The reconstruction of such a scene representation from observations using differentiable rendering losses is known as inverse graphics or inverse rendering. Neural rendering is closely related, and combines ideas from classical computer graphics and machine learning to create algorithms for synthesizing images from real‐world observations. Neural rendering is a leap forward towards the goal of synthesizing photo‐realistic image and video content. In recent years, we have seen immense progress in this field through hundreds of publications that show different ways to inject learnable components into the rendering pipeline. This state‐of‐the‐art report on advances in neural rendering focuses on methods that combine classical rendering principles with learned 3D scene representations, often now referred to as neural scene representations. A key advantage of these methods is that they are 3D‐consistent by design, enabling applications such as novel viewpoint synthesis of a captured scene. In addition to methods that handle static scenes, we cover neural scene representations for modeling non‐rigidly deforming objects and scene editing and composition. While most of these approaches are scene‐specific, we also discuss techniques that generalize across object classes and can be used for generative tasks. In addition to reviewing these state‐of‐the‐art methods, we provide an overview of fundamental concepts and definitions used in the current literature. We conclude with a discussion on open challenges and social implications." @default.
- W3214712237 created "2021-11-22" @default.
- W3214712237 creator A5016061808 @default.
- W3214712237 creator A5017813263 @default.
- W3214712237 creator A5020640739 @default.
- W3214712237 creator A5020664641 @default.
- W3214712237 creator A5024117395 @default.
- W3214712237 creator A5026499843 @default.
- W3214712237 creator A5036392768 @default.
- W3214712237 creator A5044917356 @default.
- W3214712237 creator A5050419964 @default.
- W3214712237 creator A5054383242 @default.
- W3214712237 creator A5061519579 @default.
- W3214712237 creator A5068089881 @default.
- W3214712237 creator A5071514078 @default.
- W3214712237 creator A5075797855 @default.
- W3214712237 creator A5080103406 @default.
- W3214712237 creator A5083849742 @default.
- W3214712237 creator A5088583491 @default.
- W3214712237 date "2022-05-01" @default.
- W3214712237 modified "2023-10-14" @default.
- W3214712237 title "Advances in Neural Rendering" @default.
- W3214712237 cites W1776042733 @default.
- W3214712237 cites W183071939 @default.
- W3214712237 cites W1938204631 @default.
- W3214712237 cites W1967554269 @default.
- W3214712237 cites W1977758817 @default.
- W3214712237 cites W1993244675 @default.
- W3214712237 cites W2009422376 @default.
- W3214712237 cites W2020681231 @default.
- W3214712237 cites W2071906076 @default.
- W3214712237 cites W2081579113 @default.
- W3214712237 cites W2085905957 @default.
- W3214712237 cites W2098362450 @default.
- W3214712237 cites W2099940712 @default.
- W3214712237 cites W2122572959 @default.
- W3214712237 cites W2137983211 @default.
- W3214712237 cites W2144199676 @default.
- W3214712237 cites W2150128927 @default.
- W3214712237 cites W2156598602 @default.
- W3214712237 cites W2293372129 @default.
- W3214712237 cites W2294343556 @default.
- W3214712237 cites W2301937176 @default.
- W3214712237 cites W2331128040 @default.
- W3214712237 cites W2556802233 @default.
- W3214712237 cites W2698857938 @default.
- W3214712237 cites W2737368828 @default.
- W3214712237 cites W2760103357 @default.
- W3214712237 cites W2798510813 @default.
- W3214712237 cites W2805658037 @default.
- W3214712237 cites W2808492412 @default.
- W3214712237 cites W2810610794 @default.
- W3214712237 cites W2902812770 @default.
- W3214712237 cites W2903299432 @default.
- W3214712237 cites W2933283236 @default.
- W3214712237 cites W2942074357 @default.
- W3214712237 cites W2943445277 @default.
- W3214712237 cites W2949657144 @default.
- W3214712237 cites W2949825757 @default.
- W3214712237 cites W2951159596 @default.
- W3214712237 cites W2963527086 @default.
- W3214712237 cites W2963627347 @default.
- W3214712237 cites W2963739349 @default.
- W3214712237 cites W2963850211 @default.
- W3214712237 cites W2963876827 @default.
- W3214712237 cites W2963926543 @default.
- W3214712237 cites W2964288609 @default.
- W3214712237 cites W2968257580 @default.
- W3214712237 cites W2977371611 @default.
- W3214712237 cites W2981657250 @default.
- W3214712237 cites W2981978060 @default.
- W3214712237 cites W2982058372 @default.
- W3214712237 cites W2984210651 @default.
- W3214712237 cites W2986023562 @default.
- W3214712237 cites W2989630530 @default.
- W3214712237 cites W2990173985 @default.
- W3214712237 cites W2997387466 @default.
- W3214712237 cites W3006889321 @default.
- W3214712237 cites W3012056513 @default.
- W3214712237 cites W3034259269 @default.
- W3214712237 cites W3034395814 @default.
- W3214712237 cites W3034700465 @default.
- W3214712237 cites W3034801905 @default.
- W3214712237 cites W3034964128 @default.
- W3214712237 cites W3034968345 @default.
- W3214712237 cites W3035163517 @default.
- W3214712237 cites W3035291735 @default.
- W3214712237 cites W3035515538 @default.
- W3214712237 cites W3035575130 @default.
- W3214712237 cites W3035591705 @default.
- W3214712237 cites W3043139608 @default.
- W3214712237 cites W3044532657 @default.
- W3214712237 cites W3048402306 @default.
- W3214712237 cites W3084311597 @default.
- W3214712237 cites W3095473874 @default.
- W3214712237 cites W3095682719 @default.
- W3214712237 cites W3097792222 @default.
- W3214712237 cites W3106569148 @default.
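
The abstract above names truncated signed distance fields as one example of an implicit surface representation. The following is a toy sketch of evaluating a TSDF on a voxel grid (not from the report; the sphere geometry, grid resolution, and truncation width are arbitrary choices for illustration):

    # A truncated signed distance field (TSDF) clips the signed distance to a
    # narrow band around the surface; the surface itself is the zero level set.
    import numpy as np

    def sphere_sdf(points, center=np.zeros(3), radius=1.0):
        """Signed distance to a sphere: negative inside, positive outside."""
        return np.linalg.norm(points - center, axis=-1) - radius

    def tsdf(points, trunc=0.1):
        """Truncate the signed distance to a band of width +/- trunc."""
        return np.clip(sphere_sdf(points), -trunc, trunc)

    # Evaluate on a 64^3 voxel grid spanning [-1.5, 1.5]^3.
    axes = [np.linspace(-1.5, 1.5, 64)] * 3
    grid = np.stack(np.meshgrid(*axes), axis=-1)
    values = tsdf(grid.reshape(-1, 3)).reshape(64, 64, 64)
    print("voxels strictly inside the truncation band:",
          int((np.abs(values) < 0.1).sum()))

Truncation discards distances far from the surface, keeping only the narrow band that matters for surface extraction, which is why TSDF grids are widely used for fusing depth-sensor observations.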