Clip space sample culling for motion blur and defocus blur

Yi Jeng Wu, Der Lor Way, Yu Ting Tsai, Zen-Chung Shih

Research output: Article, peer-reviewed

2 citations (Scopus)


Motion blur and defocus blur are two common visual effects for rendering realistic camera images. This paper presents a novel clip-space culling method for stochastic rasterization that renders motion-blur and defocus-blur effects. The proposed algorithm reduces sample coverage using clip-space information in the camera-lens domain (UV) and the time domain (T). First, in stage I, samples outside the camera lens are culled using the linear relationship between the lens position and the vertex position. Second, in stage II, samples outside the time bounds are culled using triangle similarity in clip space to find the intersection times. Each pixel is evaluated against the two linear bounds only once. Our method achieves good sample test efficiency at low computational cost for real-time stochastic rasterization. Finally, the proposed method is demonstrated through various experiments and compared with previous works. Our algorithm handles both blur effects simultaneously and outperforms prior methods.
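The two-stage structure described above can be illustrated with a minimal sketch. The bound representations here (axis-aligned lens intervals and a single time interval per pixel) are assumptions for illustration, not the paper's exact clip-space formulation; function and parameter names are hypothetical.

```python
def cull_samples(samples, lens_bounds, time_bounds):
    """Keep only stochastic samples that may hit the primitive.

    samples:     iterable of (u, v, t) tuples in the lens/time sample domain
    lens_bounds: ((u_min, u_max), (v_min, v_max)) lens-domain bounds (stage I)
    time_bounds: (t_min, t_max) intersection-time interval (stage II)
    """
    (u_min, u_max), (v_min, v_max) = lens_bounds
    t_min, t_max = time_bounds
    survivors = []
    for u, v, t in samples:
        # Stage I: cull samples whose lens position lies outside the bounds.
        if not (u_min <= u <= u_max and v_min <= v <= v_max):
            continue
        # Stage II: cull samples whose time lies outside the time bounds.
        if not (t_min <= t <= t_max):
            continue
        survivors.append((u, v, t))
    return survivors


# Example: only the first sample passes both culling stages.
samples = [(0.0, 0.0, 0.5), (0.9, 0.0, 0.5), (0.0, 0.0, 0.9)]
kept = cull_samples(samples, ((-0.5, 0.5), (-0.5, 0.5)), (0.2, 0.8))
```

Only samples surviving both stages are then shaded, which is the source of the sample-test-efficiency gain the abstract refers to.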

Pages (from-to): 1071-1084
Journal: Journal of Information Science and Engineering
Publication status: Published - 1 May 2015

