Image-Driven Navigation of Analytical BRDF Models
Specifying the parameters of analytic BRDF models is a difficult task: these parameters are often not intuitive for artists, and their effect on appearance can be non-uniform. Ideally, a given step in the parameter space should produce a predictable and perceptually uniform change in the rendered image. Systems that employ psychophysics have produced important advances in this direction; however, the requirement of user studies limits the scalability of these approaches. In this work, we propose a new and intuitive method for designing material appearance. First, we define a computational metric between BRDFs that is based on rendered images of a scene under natural illumination. We show that our metric produces results that agree with previous perceptual studies. Next, we propose a user interface that allows navigation in the remapped parameter space of a given BRDF model. For the current settings of the BRDF parameters, we display a choice of variations along the various parameter directions, each corresponding to a uniform step according to our metric. In addition to parametric navigation within a single model, we also support neighborhood navigation in the space of all models. By clustering a large number of neighbors and removing those that are close to the current model, the user can easily visualize alternative effects that can only be expressed with other models. We show that our interface is simple and intuitive. Furthermore, visual navigation in BRDF space, both within a single model and across the union of models, is an effective approach to reflectance design.
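The core idea of an image-driven metric and of uniform steps in the remapped parameter space can be sketched in a few lines. The following is a minimal illustration only, not the paper's actual pipeline: `phong_brdf`, `render_proxy`, `image_metric`, and `uniform_step` are hypothetical names, the "rendering" is a stand-in evaluation of a Phong-style BRDF over fixed sample directions rather than a scene rendered under natural illumination, and the parameter vector (diffuse, specular, shininess) is an assumed example.

```python
import numpy as np

def phong_brdf(n_dot_l, n_dot_h, kd, ks, shininess):
    """Simple analytic BRDF (Phong-style): diffuse term plus a
    normalized specular lobe. kd, ks, shininess are the parameters
    being navigated. (Illustrative choice, not the paper's models.)"""
    return kd / np.pi + ks * (shininess + 2) / (2 * np.pi) * n_dot_h ** shininess

def render_proxy(params, n_samples=256, seed=0):
    """Proxy 'rendered image': the BRDF evaluated over a fixed set of
    light/half-vector cosines. A fixed seed keeps the sample set
    identical across parameter settings, so differences come only
    from the parameters."""
    rng = np.random.default_rng(seed)
    n_dot_l = rng.uniform(0.0, 1.0, n_samples)
    n_dot_h = rng.uniform(0.0, 1.0, n_samples)
    kd, ks, shininess = params
    return phong_brdf(n_dot_l, n_dot_h, kd, ks, shininess) * n_dot_l

def image_metric(params_a, params_b):
    """Computational distance between two BRDF parameter settings:
    L2 difference of the proxy renderings (image-space distance)."""
    return np.linalg.norm(render_proxy(params_a) - render_proxy(params_b))

def uniform_step(params, direction, target_dist, lo=0.0, hi=1.0, iters=40):
    """Scale a step along `direction` by bisection so that the
    image-space distance equals target_dist -- one 'uniform step'
    according to the metric. Assumes the distance grows monotonically
    with the step size and reaches target_dist within [lo, hi]."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if image_metric(params, params + mid * direction) < target_dist:
            lo = mid
        else:
            hi = mid
    return params + 0.5 * (lo + hi) * direction
```

With such a metric, a navigation interface can present, for each parameter direction, the variation whose image-space distance from the current setting equals a chosen step size, instead of stepping uniformly in the raw (perceptually non-uniform) parameters.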