Tactile mesh saliency

Lau, Manfred and Dev, Kapil and Shi, Weiqi and Dorsey, Julie and Rushmeier, Holly (2016) Tactile mesh saliency. ACM Transactions on Graphics, 35 (4): a52. ISSN 0730-0301

PDF: TactileMeshSaliency_SIGGRAPH.pdf - Accepted Version (12MB)
Abstract

While the concept of visual saliency has been previously explored in the areas of mesh and image processing, saliency detection also applies to other sensory stimuli. In this paper, we explore the problem of tactile mesh saliency, where we define salient points on a virtual mesh as those that a human is more likely to grasp, press, or touch if the mesh were a real-world object. We solve the problem of taking as input a 3D mesh and computing the relative tactile saliency of every mesh vertex. Since it is difficult to manually define a tactile saliency measure, we introduce a crowdsourcing and learning framework. It is typically easier for humans to provide relative rankings of saliency between vertices than absolute values. We thereby collect crowdsourced data of such relative rankings and take a learning-to-rank approach. We develop a new formulation to combine deep learning and learning-to-rank methods to compute a tactile saliency measure. We demonstrate our framework with a variety of 3D meshes and various applications, including material suggestion for rendering and fabrication.
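The abstract describes combining deep learning with a learning-to-rank objective over crowdsourced pairwise comparisons of vertices. As a rough illustration only, and not the paper's actual formulation or architecture, the sketch below shows a RankNet-style pairwise loss applied to a small scoring network over hypothetical per-vertex features; the feature dimension, layer sizes, and training data are assumptions introduced for the example.

```python
# Minimal sketch (assumed setup, not the authors' network) of pairwise
# learning-to-rank for per-vertex saliency, written with PyTorch.
import torch
import torch.nn as nn

class SaliencyScorer(nn.Module):
    """Maps a per-vertex feature vector to a scalar saliency score."""
    def __init__(self, feat_dim=64):  # feat_dim is a placeholder choice
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def pairwise_rank_loss(score_a, score_b):
    """RankNet-style loss: vertex a was judged more salient than vertex b,
    so the loss is small when score_a exceeds score_b."""
    return nn.functional.softplus(score_b - score_a).mean()

# Toy usage with random stand-ins for crowdsourced pairs (a_i ranked above b_i).
feat_dim, n_pairs = 64, 256
feats_a = torch.randn(n_pairs, feat_dim)  # features of higher-ranked vertices
feats_b = torch.randn(n_pairs, feat_dim)  # features of lower-ranked vertices

model = SaliencyScorer(feat_dim)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    loss = pairwise_rank_loss(model(feats_a), model(feats_b))
    optim.zero_grad()
    loss.backward()
    optim.step()

# At inference time, model(vertex_features) yields a relative saliency score
# per vertex, which can be normalized across the mesh for visualization.
```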

Item Type:
Journal Article
Journal or Publication Title:
ACM Transactions on Graphics
Uncontrolled Keywords:
ASJC 1704 (Computer Graphics and Computer-Aided Design)
Subjects:
saliency; deep learning; perception; crowdsourcing; fabrication material suggestion; computer graphics and computer-aided design
ID Code:
80006
Deposited On:
10 Jun 2016 11:02
Refereed?:
Yes
Published?:
Published
Last Modified:
03 Jan 2024 00:19