Alternative Representations and Techniques for Accelerated Realistic Image Synthesis
Author: Jerry Jinfeng Guo
Promotor(s): Prof.dr. E. Eisemann / Dr. J.J. Billeter
University: Delft University of Technology
Year of publication: 2022
Link to repository: TU Delft Research Repository
Realism has always been a major goal in visual content creation: from oil painting to motion pictures, from graphic arts to scientific data visualization. Computer graphics creates virtual realities from digital representations. Trading off accuracy against speed, realistic rendering either produces photorealistic images that follow strict physical laws or approximates them with interactive alternatives.
The two approaches have their distinctive strengths and constraints. In this thesis, we work from both ends towards realistic rendering. We study the highly accurate process of physically based rendering and introduce three novel methods dedicated to making it more efficient. We also investigate real-time depth-of-field rendering and develop a new model for an effect that is missing from state-of-the-art systems.
Part i concerns sampling in numerical integration; two methods are presented in Chapter 2 and Chapter 3. First, we propose performing path guiding in primary sample space, resulting in an effective and efficient scheme that is easy to plug into existing rendering pipelines. Second, we record visibility relations in a matrix-like table and use it to steer the sampling process, which improves tasks such as selecting visible light samples and connecting light subpaths.
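Primary sample space treats every random decision a path tracer makes as one coordinate of a point in the unit hypercube, so guiding can be applied by warping those coordinates before they reach the renderer. The sketch below illustrates this general idea with a hypothetical 1D histogram guide; it is not the method of Chapter 2, and the class name and parameters are illustrative assumptions:

```python
class PrimarySpaceGuide:
    """Minimal 1D illustration: learn a histogram over primary sample
    space [0,1) and warp uniform samples through its inverse CDF, so
    more samples land where past contributions were high.
    (Hypothetical sketch, not the thesis's algorithm.)"""

    def __init__(self, bins=16):
        self.bins = bins
        self.weights = [1.0] * bins  # start with a uniform guide

    def record(self, u, contribution):
        # accumulate the observed contribution in the bin containing u
        idx = min(int(u * self.bins), self.bins - 1)
        self.weights[idx] += contribution

    def warp(self, u):
        # invert the piecewise-constant CDF of the learned histogram;
        # returns (warped sample, its pdf) so the estimator stays unbiased
        total = sum(self.weights)
        target, acc = u * total, 0.0
        for i, w in enumerate(self.weights):
            if acc + w >= target:
                frac = (target - acc) / w
                return (i + frac) / self.bins, (w / total) * self.bins
            acc += w
        return 1.0, (self.weights[-1] / total) * self.bins
```

Because the warp also returns the sample's pdf, the renderer can divide contributions by it, which is what makes such a scheme easy to plug into an existing pipeline.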
Part ii tackles the step that follows sampling in numerical integration. In Chapter 4, we present a new integration scheme that assigns each sample a weight based on its adjacency to other samples while remaining unbiased. With plain uniform random samples, the method delivers performance similar to that obtained with costly low-discrepancy sequences.
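For context on that comparison, the sketch below contrasts plain uniform sampling with a base-2 van der Corput low-discrepancy sequence on a toy 1D integral. It illustrates the gap the adjacency-weighted scheme aims to close, not the scheme itself; the function names and constants are illustrative:

```python
import random

def radical_inverse(n, base=2):
    """van der Corput radical inverse: mirror the base-b digits of n
    about the radix point, yielding a low-discrepancy point in [0,1)."""
    inv, f = 0.0, 1.0 / base
    while n > 0:
        inv += (n % base) * f
        n //= base
        f /= base
    return inv

def estimate(f, samples):
    # plain Monte Carlo average over [0,1]
    return sum(f(x) for x in samples) / len(samples)

f = lambda x: x * x            # true integral over [0,1] is 1/3
N = 1024
rng = random.Random(1)
uniform_pts = [rng.random() for _ in range(N)]
ld_pts = [radical_inverse(i + 1) for i in range(N)]
err_uniform = abs(estimate(f, uniform_pts) - 1 / 3)
err_ld = abs(estimate(f, ld_pts) - 1 / 3)
```

Uniform random error decays only as O(N^-1/2), while the well-spaced low-discrepancy points converge much faster for smooth integrands; matching the latter behavior while keeping cheap uniform samples is the appeal of the weighted scheme.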
In Part iii, this thesis revisits the optics behind depth-of-field effects and models the distortion and shrinking effects that are missing from modern real-time rendering solutions. We deliver effects similar to ray-traced references at a fraction of the cost.
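The classical starting point for such depth-of-field models is the thin-lens circle of confusion, sketched below. This is standard optics rather than the extended model of Part iii (which additionally captures distortion and shrinking), and the function name and parameters are illustrative:

```python
def circle_of_confusion(depth, focus_dist, focal_len, aperture_d):
    """Thin-lens circle-of-confusion diameter on the sensor, in the
    same units as the inputs. Standard thin-lens geometry; an
    illustrative sketch, not the model developed in Part iii."""
    # thin-lens equation 1/f = 1/do + 1/di gives the image-side
    # distances of the focus plane and of the point at `depth`
    img_focus = focal_len * focus_dist / (focus_dist - focal_len)
    img_pt = focal_len * depth / (depth - focal_len)
    # similar triangles through the aperture yield the blur diameter
    # where the defocused cone crosses the sensor plane
    return abs(aperture_d * (img_pt - img_focus) / img_pt)
```

Real-time pipelines typically evaluate such a per-pixel blur radius and then filter the image accordingly; what this simple model cannot reproduce is exactly the kind of distortion and shrinking the thesis targets.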
Each chapter includes a detailed description and evaluation that help readers better understand the methods and gain insight into their applications.