In this paper, we explore texture mapping as a unified representation for enabling realistic multimodal interaction with finely detailed surfaces. We first present a novel approach to collision handling between textured rigid-body objects; we then show how normal maps and relief maps can be adopted as unified representations to synthesize complex sound effects from long-lasting collisions and to render haptic textures. The resulting multimodal display system allows a user to see, hear, and feel complex interactions with textured surfaces. By using texture representations for seamlessly integrated multimodal interaction, instead of the complex triangular meshes otherwise required, this work achieves up to a 25-fold performance speedup and reduces memory storage by up to six orders of magnitude. We further validate the results through user studies demonstrating the effectiveness of texture representations for integrated multimodal display; in particular, subjects were able to correctly identify the material texture of a surface through interaction with its normal map.
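For intuition only, the following is a minimal sketch of the core idea: sampling a tangent-space normal map at a contact point and using the perturbed normal for contact response, so that the same texture lookup can drive both collision handling and haptic force rendering. The function names, the nearest-neighbor lookup, the penalty-force model, and the stiffness value are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sample_normal_map(normal_map, u, v):
    """Nearest-neighbor lookup of a tangent-space normal encoded in [0, 255]."""
    h, w, _ = normal_map.shape
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    n = normal_map[y, x].astype(np.float64) / 255.0 * 2.0 - 1.0  # decode to [-1, 1]
    return n / np.linalg.norm(n)

def perturbed_contact_normal(normal_map, u, v, tangent, bitangent, face_normal):
    """Rotate the sampled tangent-space normal into world space via the TBN frame."""
    tbn = np.column_stack((tangent, bitangent, face_normal))
    n_world = tbn @ sample_normal_map(normal_map, u, v)
    return n_world / np.linalg.norm(n_world)

def penalty_force(penetration_depth, n_world, stiffness=800.0):
    """Illustrative penalty force (stiffness in N/m, depth in m) along the
    texture-perturbed normal; a real haptic loop would run this at ~1 kHz."""
    return stiffness * penetration_depth * n_world

# Example: a geometrically flat patch whose perceived bumpiness comes
# entirely from the normal map (synthetic data, for demonstration only).
rng = np.random.default_rng(0)
normal_map = np.full((256, 256, 3), 128, dtype=np.uint8)
normal_map[..., 2] = 255                                      # base normal near (0, 0, 1)
normal_map[..., :2] = rng.integers(96, 160, size=(256, 256, 2), dtype=np.uint8)

n = perturbed_contact_normal(normal_map, 0.5, 0.5,
                             tangent=np.array([1.0, 0.0, 0.0]),
                             bitangent=np.array([0.0, 1.0, 0.0]),
                             face_normal=np.array([0.0, 0.0, 1.0]))
print(penalty_force(0.001, n))  # force for 1 mm of penetration
```

Because the surface detail lives in the texture rather than in the mesh, the same lookup can replace a dense triangulation in the collision, sound, and haptic pipelines, which is the source of the memory and speed savings claimed above.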