Tangible Interaction + Graphical Interpretation:
Bringing Clay Models to Life
Published at SIGGRAPH 2000:
Anderson, Frankel, Marks, Agarwala, Beardsley, Hodgins,
Leigh, Ryall, Sullivan, and Yedidia, "Tangible Interaction +
Graphical Interpretation: A New Approach to 3D Modeling,"
Proceedings of SIGGRAPH 2000, Annual Conference Series, July 2000.
Our vision for this project was to give children tangible
access to 3D modeling using simple clay, and then to bring those clay models
to life. Children were asked to create clay models from a family
of common types: a dog, a person, a car, a table, and so on.
We then scanned these clay figures into 3D volumetric models. By
optimizing a set of parameterized object templates to best fit
each volumetric model, I was able to identify which type of clay figure
the child had made and to parse the model into its constituent parts.
I then fed the descriptions of these parts into a motion controller
that generated realistic motion for the child's clay model.
In this way, a clay dog made with the child's own hands could be brought
to life in a 3D animation. The top row of figures above shows this
process for a dog model, and the bottom row shows the parses
for a variety of clay models.
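The recognition step can be thought of as fitting each candidate template to the scanned volume and keeping the family whose best fit scores highest. The following is only a toy sketch of that idea, not the paper's actual algorithm: the sphere and box "templates," the coarse grid search, and the intersection-over-union fit score are all stand-ins I chose for illustration.

```python
import numpy as np

def sphere_occupancy(shape, center, radius):
    """Voxel occupancy of a sphere -- a stand-in for one parameterized template."""
    zz, yy, xx = np.indices(shape)
    cz, cy, cx = center
    dist = np.sqrt((xx - cx) ** 2 + (yy - cy) ** 2 + (zz - cz) ** 2)
    return dist <= radius

def box_occupancy(shape, center, half):
    """Voxel occupancy of an axis-aligned cube -- a second toy template family."""
    zz, yy, xx = np.indices(shape)
    cz, cy, cx = center
    return ((np.abs(xx - cx) <= half) &
            (np.abs(yy - cy) <= half) &
            (np.abs(zz - cz) <= half))

def fit_template(volume, render, param_grid):
    """Coarse grid search over template parameters; returns (best_params, best_score).

    The score is intersection-over-union between the scanned volume and the
    rendered template occupancy (higher is a better fit).
    """
    best_params, best_score = None, -1.0
    for params in param_grid:
        occ = render(volume.shape, *params)
        inter = np.logical_and(volume, occ).sum()
        union = np.logical_or(volume, occ).sum()
        score = inter / union if union else 0.0
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

def classify(volume, templates):
    """Pick the template family whose best fit scores highest."""
    scores = {name: fit_template(volume, render, grid)[1]
              for name, (render, grid) in templates.items()}
    return max(scores, key=scores.get), scores

# A synthetic "scan": a ball of clay, rasterized into a 16^3 voxel grid.
volume = sphere_occupancy((16, 16, 16), (8, 8, 8), 5)

templates = {
    "ball":  (sphere_occupancy, [((8, 8, 8), 4), ((8, 8, 8), 5)]),
    "block": (box_occupancy,    [((8, 8, 8), 5)]),
}

label, scores = classify(volume, templates)
```

In the real system the templates were articulated, multi-part models, so the best-fitting template's parts also yielded the parse of the volume into limbs, torso, and so on; this sketch shows only the whole-object classification step.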
I developed all the algorithms for the volumetric model
recognition, parsing, and animation. This work formed one of the two
case studies in the paper above; the other demonstrated simple tangible
modeling coupled with graphical interpretation of those models. I did
this work during a research internship with Joe
Marks at MERL.