Autocomplete 3D Sculpting
Digital sculpting is a popular means to create 3D models but remains a challenging task for many users. Recent advances in data-driven and procedural modeling can alleviate this difficulty, albeit bounded by the underlying data and procedures. We propose a 3D sculpting system that assists users in freely creating models without predefined scope. With a brushing interface similar to common sculpting tools, our system silently records and analyzes users' workflows, and predicts what they might or should do in the future to reduce input labor or enhance output quality. Users can accept, ignore, or modify the suggestions and thus maintain full control and individual style. They can also explicitly select and clone past workflows over output model regions. Our key idea is to consider how a model is authored via dynamic workflows in addition to what shape it takes in static geometry, for more accurate analysis of user intentions and more general synthesis of shape structures. The workflows contain potential repetitions for analysis and synthesis, including user inputs (e.g. pen strokes on a pressure-sensing tablet), model outputs (e.g. extrusions on an object surface), and camera viewpoints. We evaluate our method via user feedback and authored models.
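The loop described above — record brush operations, detect repetitions in the workflow, and suggest a next operation that the user may accept, ignore, or modify — can be illustrated with a toy sketch. This is a minimal illustration under assumed names (`BrushOp`, `suggest_next`) and a deliberately simple two-stroke extrapolation; it is not the paper's algorithm, which also analyzes full stroke geometry and camera viewpoints.

```python
# Hypothetical sketch of repetition-based suggestion over a recorded
# sculpting workflow. Names and logic are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class BrushOp:
    """One recorded sculpting operation."""
    center: Tuple[float, float, float]  # where the stroke hit the surface
    radius: float                       # brush size
    pressure: float                     # averaged pen pressure
    tool: str                           # e.g. "extrude", "smooth"

def _similar(a: BrushOp, b: BrushOp, tol: float = 0.25) -> bool:
    """Treat two ops as a repetition if tool and brush parameters roughly match."""
    return (a.tool == b.tool
            and abs(a.radius - b.radius) <= tol * max(a.radius, b.radius)
            and abs(a.pressure - b.pressure) <= tol)

def suggest_next(history: List[BrushOp]) -> Optional[BrushOp]:
    """If the last two ops look like a repetition, extrapolate a third suggestion."""
    if len(history) < 2:
        return None
    prev, last = history[-2], history[-1]
    if not _similar(prev, last):
        return None
    # Repeat the spatial offset between the last two ops.
    dx, dy, dz = (l - p for p, l in zip(prev.center, last.center))
    cx, cy, cz = last.center
    return BrushOp(center=(cx + dx, cy + dy, cz + dz),
                   radius=last.radius, pressure=last.pressure, tool=last.tool)

if __name__ == "__main__":
    ops = [BrushOp((0.0, 0.0, 0.0), 1.0, 0.8, "extrude"),
           BrushOp((1.0, 0.0, 0.0), 1.0, 0.8, "extrude")]
    print(suggest_next(ops))  # suggested extrusion near (2.0, 0.0, 0.0)
```

In the actual system, such suggestions are rendered as previews that the user can accept, ignore, or edit, preserving full manual control.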