A Generative Framework for Image-based Editing of Material Appearance using Perceptual Attributes
Abstract
Single-image appearance editing is a challenging task, traditionally requiring the estimation of additional scene properties such as geometry or illumination. Moreover, the exact interaction of light, shape and material reflectance that elicits a given perceptual impression is still not well understood. We present an image-based editing method that allows the material appearance of an object to be modified by increasing or decreasing high-level perceptual attributes, using a single image as input. Our framework relies on a two-step generative network, where the first step drives the change in appearance and the second produces an image with high-frequency details. For training, we augment an existing material appearance dataset with perceptual judgements of high-level attributes, collected through crowd-sourced experiments, and build upon training strategies that circumvent the cumbersome need for original-edited image pairs. We demonstrate the editing capabilities of our framework on a variety of inputs, both synthetic and real, using two common perceptual attributes (Glossy and Metallic), and validate the perception of appearance in our edited images through a user study.
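The exact architecture is described in the paper and released code; as a rough illustration of the two-step idea above (a first network that drives the attribute change, followed by a second network that restores high-frequency detail), here is a minimal PyTorch sketch. The module names `AppearanceEditor` and `DetailRefiner`, the layer choices, and the way the attribute change is broadcast as an extra input channel are all assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class AppearanceEditor(nn.Module):
    """Hypothetical first stage: coarse edited image from the input image,
    its normal map, and the desired change of a perceptual attribute
    (e.g. +0.5 "Glossy")."""

    def __init__(self, channels=64):
        super().__init__()
        # Image (3) + normals (3) + attribute delta broadcast to one plane (1).
        self.net = nn.Sequential(
            nn.Conv2d(3 + 3 + 1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, image, normals, attr_delta):
        b, _, h, w = image.shape
        delta_map = attr_delta.view(b, 1, 1, 1).expand(b, 1, h, w)
        return self.net(torch.cat([image, normals, delta_map], dim=1))


class DetailRefiner(nn.Module):
    """Hypothetical second stage: adds high-frequency detail to the coarse
    edit, guided by the original input image."""

    def __init__(self, channels=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3 + 3, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, coarse, image):
        return self.net(torch.cat([coarse, image], dim=1))


# Example: increase the "Glossy" attribute on a single 256x256 image.
image = torch.rand(1, 3, 256, 256)
normals = torch.rand(1, 3, 256, 256)
delta = torch.tensor([0.5])  # desired increase of the perceptual attribute
coarse = AppearanceEditor()(image, normals, delta)
edited = DetailRefiner()(coarse, image)
```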
Downloads and code
- Code (github repo)
- Trained weights: weights of the generative networks for the attributes "Metallic" and "Glossy", used to produce the results in the paper.
- Training dataset: renderings, normal maps, and perceptual attribute ratings for the 9,100 training images.
- Normal prediction: trained weights for the normal prediction network.
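For readers who want a starting point for working with the released dataset, the snippet below sketches how one image, its normal map, and its attribute rating might be loaded. The folder names, file names, and CSV layout are assumptions; consult the dataset's own documentation for the actual structure.

```python
import csv
from pathlib import Path

from PIL import Image


def load_training_sample(root, name, attribute="glossy"):
    """Load one sample: rendering, normal map, and perceptual attribute score.
    Paths and the attributes.csv layout below are hypothetical."""
    root = Path(root)
    rendering = Image.open(root / "renderings" / f"{name}.png").convert("RGB")
    normals = Image.open(root / "normals" / f"{name}.png").convert("RGB")

    # Assumed: one CSV with a row per image and one column per attribute.
    with open(root / "attributes.csv", newline="") as f:
        scores = {row["name"]: float(row[attribute]) for row in csv.DictReader(f)}

    return rendering, normals, scores[name]
```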
BibTeX
@Article{delanoy2022generativematerials,
  author  = {Delanoy, Johanna and Lagunas, Manuel and Gutierrez, Diego and Masia, Belen},
  title   = {A Generative Framework for Image-based Editing of Material Appearance using Perceptual Attributes},
  journal = {Computer Graphics Forum},
  volume  = {41},
  number  = {1},
  pages   = {453-464},
  year    = {2022},
  url     = {https://onlinelibrary.wiley.com/doi/10.1111/cgf.14446},
  doi     = {https://doi.org/10.1111/cgf.14446}
}