
dc.contributor.author: Xu, Z
dc.contributor.author: Zhang, Q
dc.date.accessioned: 2016-07-26T13:36:49Z
dc.date.issued: 2016-01-01
dc.date.submitted: 2016-06-13T14:46:48.318Z
dc.identifier.isbn: 9783319276700
dc.identifier.issn: 0302-9743
dc.identifier.uri: http://qmro.qmul.ac.uk/xmlui/handle/123456789/13697
dc.description.abstract: © Springer International Publishing Switzerland 2016. In this paper, we propose a symmetry-aware human shape correspondence extraction method. We address the symmetric flip problem that arises when establishing correspondences for intrinsically symmetric models, and improve the accuracy of the final corresponding pairs. To achieve this goal, we extend a state-of-the-art approach by using skeleton information to further remove symmetrically flipped shape correspondences. Traditional approaches that rely only on surface geometry can hardly discriminate between symmetric surface points. With the advent of inexpensive RGB-D cameras, such as the Kinect, skeleton information can be easily obtained along with the mesh. Therefore, after the initial correspondences are obtained, we extend the candidate set for each point on the template, and then use the skeleton to remove symmetrically flipped false candidates. From the remaining candidates, final correspondences are selected by choosing those with minimum geodesic distortion relative to a base vertex set, which is formed by sampling the mesh. Experiments demonstrate that the proposed method effectively removes all symmetrically flipped candidates. Moreover, the final correspondence pairs are more accurate than those of the state of the art.
dc.format.extent: 632 - 641
dc.rights: "The final publication is available at http://link.springer.com/chapter/10.1007/978-3-319-27671-7_53"
dc.title: Symmetry-aware human shape correspondence using skeleton
dc.type: Conference Proceeding
dc.identifier.doi: 10.1007/978-3-319-27671-7_53
pubs.notes: Not known
pubs.organisational-group: /Queen Mary University of London
pubs.organisational-group: /Queen Mary University of London/Faculty of Science & Engineering
pubs.organisational-group: /Queen Mary University of London/Faculty of Science & Engineering/Electronic Engineering and Computer Science - Electronic Engineering - Research Students
pubs.organisational-group: /Queen Mary University of London/Faculty of Science & Engineering/Electronic Engineering and Computer Science - Staff
pubs.organisational-group: /Queen Mary University of London/Faculty Reporting - Research Students
pubs.organisational-group: /Queen Mary University of London/Faculty Reporting - Research Students/Faculty of Science & Engineering PGRs
pubs.organisational-group: /Queen Mary University of London/REF
pubs.organisational-group: /Queen Mary University of London/REF/REF - S&E - EECS UoA12
pubs.organisational-group: /Queen Mary University of London/REF/REF - UoA 12
pubs.publication-status: Published
pubs.volume: 9516
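The pipeline summarized in the abstract (extend the candidate set per template point, prune symmetric flips using skeleton information, then keep the candidate with minimum geodesic distortion against a sampled base vertex set) can be sketched roughly as below. This is an illustrative sketch only, not the authors' implementation: the names (`MIRROR`, `filter_symmetric_flips`, `best_candidate`), the skeleton-label mirror test, and the use of Euclidean distance in place of geodesic distance are all assumptions made for brevity.

```python
# Illustrative sketch (not the paper's code) of the two filtering stages the
# abstract describes. Points carry a hypothetical skeleton-segment label;
# Euclidean distance stands in for geodesic distance.
import math

# Assumed mirror map between symmetric skeleton segments.
MIRROR = {"L_hand": "R_hand", "R_hand": "L_hand", "torso": "torso"}

def filter_symmetric_flips(template_label, candidates):
    """Drop candidates attached to the mirrored skeleton segment."""
    flipped = MIRROR[template_label]
    return [c for c in candidates
            if not (c["label"] == flipped and flipped != template_label)]

def best_candidate(template_pt, candidates, base_template, base_target):
    """Pick the candidate whose distances to the base vertex set best
    preserve the template point's distances (a distortion proxy)."""
    def distortion(c):
        return sum(abs(math.dist(template_pt, bt) - math.dist(c["pos"], bs))
                   for bt, bs in zip(base_template, base_target))
    return min(candidates, key=distortion)
```

For example, a template point on the left hand with one candidate on each hand of the target: the skeleton filter removes the right-hand (symmetrically flipped) candidate before the distortion test runs, which is the key step surface geometry alone cannot perform.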

