Indirect lighting drastically increases the realism of rendered scenes, but it has traditionally been very expensive to calculate. This has long precluded its use in real-time rendering applications such as video games, which have mere milliseconds to respond to user input and produce a final image. As hardware power continues to increase, however, recently developed algorithms have begun to bring real-time indirect lighting closer to reality. Of specific interest to this paper, cloud-based rendering systems add indirect lighting to real-time scenes by splitting the rendering pipeline between a server and one or more connected clients. Thus far, however, these systems have been limited to static scenes and/or have required expensive precomputation steps, which limits their utility in game-like environments. In this paper we present a system capable of providing real-time indirect lighting to fully dynamic environments. We accomplish this by modifying the light-gathering step of previous systems to be more resilient to changes in scene geometry and by providing indirect light information in multiple forms, depending on the type of geometry being lit. We deploy the system in several scenes to measure its performance, both in terms of speed and visual quality, and show that it produces high-quality images with minimal impact on the client machine.
Identifier | oai:union.ndltd.org:BGMYU2/oai:scholarsarchive.byu.edu:etd-8031 |
Date | 01 November 2018 |
Creators | Zabriskie, Nathan Andrew |
Publisher | BYU ScholarsArchive |
Source Sets | Brigham Young University |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | Theses and Dissertations |
Rights | http://lib.byu.edu/about/copyright/ |