AgentDress: Realtime Clothing Synthesis for Virtual Agents using Plausible Deformations

IEEE Trans Vis Comput Graph. 2021 Nov;27(11):4107-4118. doi: 10.1109/TVCG.2021.3106429. Epub 2021 Oct 27.

Abstract

We present a CPU-based real-time cloth animation method for dressing virtual humans of various shapes and poses. Our approach formulates the clothing deformation as a high-dimensional function of body shape parameters and pose parameters. To accelerate the computation, our formulation factorizes the clothing deformation into two independent components: the deformation introduced by body pose variation (Clothing Pose Model) and the deformation from body shape variation (Clothing Shape Model). Furthermore, we sample and cluster the poses spanning the entire pose space and use those clusters to efficiently calculate the anchoring points. We also introduce a sensitivity-based distance measurement to both find nearby anchoring points and evaluate their contributions to the final animation. Given a query shape and pose of the virtual agent, we synthesize the resulting clothing deformation by blending the Taylor expansion results of nearby anchoring points. Compared to previous methods, our approach is general and able to add the shape dimension to any clothing pose model. Furthermore, we can animate clothing represented with tens of thousands of vertices at 50+ FPS on a CPU. We also conduct a user evaluation and show that our method can improve a user's perception of dressed virtual agents in an immersive virtual environment (IVE) compared to a real-time linear blend skinning method.
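The blending step described above can be pictured with a short sketch. The following Python snippet is a minimal illustration under stated assumptions, not the authors' implementation: the anchor data layout, the diagonal per-parameter sensitivity weights, the inverse-distance blending weights, and all names (blend_taylor_anchors and so on) are hypothetical choices made for exposition.

```python
import numpy as np

def blend_taylor_anchors(query_pose, anchors, sensitivity, k=4, eps=1e-8):
    """Synthesize clothing vertex positions for a query pose by blending
    first-order Taylor expansions at the k nearest anchoring points.

    query_pose  : (P,) pose parameters of the virtual agent
    anchors     : list of dicts, each with keys
                    'pose'     : (P,) pose parameters at the anchor
                    'vertices' : (V, 3) clothing vertices at the anchor
                    'jacobian' : (V, 3, P) d(vertices)/d(pose) at the anchor
    sensitivity : (P,) weights approximating how strongly each pose
                  parameter affects the cloth (assumed diagonal here)
    """
    # Sensitivity-weighted distance from the query pose to each anchor pose.
    dists = np.array([
        np.sqrt(np.sum(sensitivity * (query_pose - a['pose']) ** 2))
        for a in anchors
    ])

    # Keep the k nearest anchors and weight them by inverse distance
    # (a hypothetical weighting; the paper's scheme may differ).
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + eps)
    weights /= weights.sum()

    # Blend the first-order Taylor expansions of the nearby anchors.
    result = np.zeros_like(anchors[0]['vertices'])
    for w, i in zip(weights, nearest):
        a = anchors[i]
        delta = query_pose - a['pose']                     # (P,)
        expansion = a['vertices'] + a['jacobian'] @ delta  # (V, 3)
        result += w * expansion
    return result
```

Because each query only touches a few precomputed anchors and evaluates linear expansions, this style of synthesis avoids any physical simulation at runtime, which is consistent with the CPU-only 50+ FPS figure reported in the abstract.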

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Computer Graphics*
  • Humans