An Algorithm for Rendering Generalized Depth of Field Effects Based on Simulated Heat Diffusion

Todd Jerome Kosloff and Brian A. Barsky

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2007-19

January 24, 2007

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-19.pdf

Depth of field is the swath through a 3D scene that is imaged in acceptable focus through an optics system, such as a camera lens. Control over depth of field is an important artistic tool that can be used to emphasize the subject of a photograph. In a real camera, the control over depth of field is limited by the laws of physics and by physical constraints. The depth of field effect has been simulated in computer graphics, but with the same limited control as found in real camera lenses. In this report, we use anisotropic diffusion to generalize depth of field in computer graphics by allowing the user to independently specify the degree of blur at each point in three-dimensional space. Generalized depth of field provides a novel tool to emphasize an area of interest within a 3D scene, to pick objects out of a crowd, and to render a busy, complex picture more understandable by focusing only on relevant details that may be scattered throughout the scene. Our algorithm operates by blurring a sequence of nonplanar layers that form the scene. Choosing a suitable blur algorithm for the layers is critical; thus, we develop appropriate blur semantics such that the blur algorithm will properly generalize depth of field. We found that anisotropic diffusion is the process that best suits these semantics.
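To illustrate the kind of blur process the abstract describes, the following is a minimal sketch of blurring one layer by simulated heat diffusion with a spatially varying, user-specified blur field. It is not the report's actual formulation: the function name, the explicit-Euler time stepping, the clamped image borders, and the [0, 1] blur field are assumptions made for this example, and the report additionally handles a sequence of nonplanar layers and develops blur semantics beyond the simple per-pixel diffusivity used here.

import numpy as np

def diffusion_blur_layer(image, blur_field, dt=0.2, steps=50):
    # Hypothetical sketch, not the report's implementation.
    # Blur one layer (a 2D float array, one channel) by explicit-Euler
    # heat diffusion. blur_field is a 2D array in [0, 1] giving the
    # user-requested amount of blur at each pixel: intensity diffuses
    # quickly where much blur is requested and hardly at all where the
    # layer should stay in focus.
    u = image.astype(np.float64).copy()
    for _ in range(steps):
        # Four-neighbour Laplacian with replicated (clamped) borders.
        up    = np.pad(u, ((1, 0), (0, 0)), mode='edge')[:-1, :]
        down  = np.pad(u, ((0, 1), (0, 0)), mode='edge')[1:, :]
        left  = np.pad(u, ((0, 0), (1, 0)), mode='edge')[:, :-1]
        right = np.pad(u, ((0, 0), (0, 1)), mode='edge')[:, 1:]
        laplacian = up + down + left + right - 4.0 * u
        # Spatially varying diffusivity: more requested blur means faster
        # diffusion; dt <= 0.25 keeps the explicit scheme stable.
        u += dt * blur_field * laplacian
    return u

Running more diffusion steps in regions where the blur field is large plays the role of a larger circle of confusion in a conventional camera model, while pixels with a zero blur field are left untouched, which is what lets the blur amount be specified independently at each point.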


BibTeX citation:

@techreport{Kosloff:EECS-2007-19,
    Author= {Kosloff, Todd Jerome and Barsky, Brian A.},
    Title= {An Algorithm for Rendering Generalized Depth of Field Effects Based on Simulated Heat Diffusion},
    Year= {2007},
    Month= {Jan},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-19.html},
    Number= {UCB/EECS-2007-19},
    Abstract= {Depth of field is the swath through a 3D scene that is imaged in acceptable focus through an optics system, such as a camera lens. Control over depth of field is an important artistic tool that can be used to emphasize the subject of a photograph. In a real camera, the control over depth of field is limited by the laws of physics and by physical constraints. The depth of field effect has been simulated in computer graphics, but with the same limited control as found in real camera lenses. In this report, we use anisotropic diffusion to generalize depth of field in computer graphics by allowing the user to independently specify the degree of blur at each point in three-dimensional space. Generalized depth of field provides a novel tool to emphasize an area of interest within a 3D scene, to pick objects out of a crowd, and to render a busy, complex picture more understandable by focusing only on relevant details that may be scattered throughout the scene. Our algorithm operates by blurring a sequence of nonplanar layers that form the scene. Choosing a suitable blur algorithm for the layers is critical; thus, we develop appropriate blur semantics such that the blur algorithm will properly generalize depth of field. We found that anisotropic diffusion is the process that best suits these semantics.},
}

EndNote citation:

%0 Report
%A Kosloff, Todd Jerome 
%A Barsky, Brian A. 
%T An Algorithm for Rendering Generalized Depth of Field Effects Based on Simulated Heat Diffusion
%I EECS Department, University of California, Berkeley
%D 2007
%8 January 24
%@ UCB/EECS-2007-19
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-19.html
%F Kosloff:EECS-2007-19