HairStyle Editing via Parametric Controllable Strokes

IEEE Trans Vis Comput Graph. 2023 Feb 3:PP. doi: 10.1109/TVCG.2023.3241894. Online ahead of print.

Abstract

In this work, we propose a stroke-based hairstyle editing network, dubbed HairstyleNet, allowing users to conveniently change the hairstyle of an image in an interactive fashion. Unlike previous works, we simplify the hairstyle editing process: users can manipulate local regions or the entire hairstyle by adjusting the parameterized hair regions. Our HairstyleNet consists of two stages: a stroke parameterization stage and a stroke-to-hair generation stage. In the stroke parameterization stage, we first introduce parametric strokes to approximate the hair wisps, where the stroke shape is controlled by a quadratic Bézier curve and a thickness parameter. Since rendering strokes with thickness onto an image is not differentiable, we opt to leverage a neural renderer to construct the mapping from stroke parameters to a stroke image. Thus, the stroke parameters can be directly estimated from hair regions in a differentiable way, enabling us to flexibly edit the hairstyles of input images. In the stroke-to-hair generation stage, we design a hairstyle refinement network that first encodes coarsely composed images of hair strokes, face, and background into latent representations and then generates high-fidelity face images with desirable new hairstyles from the latent codes. Extensive experiments demonstrate that our HairstyleNet achieves state-of-the-art performance and allows flexible hairstyle manipulation.
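
To make the stroke parameterization concrete, the sketch below illustrates the general idea of a stroke defined by a quadratic Bézier curve plus a thickness parameter, rasterized with a smooth falloff so the result stays differentiable with respect to the parameters. This is not the authors' code: the paper uses a learned neural renderer, whereas the soft distance-based mask here is only a stand-in, and names such as `sample_bezier` and `soft_stroke_mask` are hypothetical.

```python
import torch

def sample_bezier(p0, p1, p2, n=64):
    """Sample n points on the quadratic Bézier curve
    B(t) = (1-t)^2 p0 + 2t(1-t) p1 + t^2 p2, with control points of shape (2,)."""
    t = torch.linspace(0.0, 1.0, n).unsqueeze(-1)                      # (n, 1)
    return (1 - t) ** 2 * p0 + 2 * t * (1 - t) * p1 + t ** 2 * p2      # (n, 2)

def soft_stroke_mask(p0, p1, p2, thickness, size=256, n=64):
    """Render the stroke as a soft mask: pixel intensity decays with distance
    from the curve, scaled by `thickness`. The exponential falloff keeps the
    mask differentiable w.r.t. all stroke parameters (a rough proxy for the
    neural renderer described in the abstract)."""
    pts = sample_bezier(p0, p1, p2, n)                                  # (n, 2), coords in [0, 1]
    ys, xs = torch.meshgrid(torch.linspace(0, 1, size),
                            torch.linspace(0, 1, size), indexing="ij")
    grid = torch.stack([xs, ys], dim=-1).reshape(-1, 2)                 # (size*size, 2)
    d = torch.cdist(grid, pts).min(dim=1).values.reshape(size, size)    # distance to curve
    return torch.exp(-(d / thickness) ** 2)                             # soft stroke image

# Example: one stroke bending from the upper left toward the right of the image.
p0 = torch.tensor([0.2, 0.2])
p1 = torch.tensor([0.5, 0.1])
p2 = torch.tensor([0.8, 0.4])
mask = soft_stroke_mask(p0, p1, p2, thickness=torch.tensor(0.03))
```

Because every operation above is differentiable, gradients of an image-space loss can flow back to the control points and thickness, which mirrors why the paper replaces non-differentiable thick-stroke rasterization with a neural renderer when estimating stroke parameters from hair regions.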