Raw Data Library
Deeply Supervised Subspace Learning for Cross-Modal Material Perception of Known and Unknown Objects

Article · English (en) · 2022 · Vol. 19 (2)
DOI: 10.1109/tii.2022.3195171
Datasets: 0 · Files: 0

Authors

Pengwen Xiong, Junjie Liao, MengChu Zhou, Aiguo Song, Peter Liu

Abstract

In order to help robots understand and perceive an object's properties during noncontact robot-object interaction, this article proposes a deeply supervised subspace learning method. In contrast to previous work, it takes advantage of the low noise and fast response of noncontact sensors and extracts novel contactless features to retrieve cross-modal information, so as to estimate and infer the material properties of both known and unknown objects. Specifically, a deeply supervised subspace cross-modal material retrieval model is trained to learn a common low-dimensional feature representation that captures the clustering structure among the different modal features of the same object class. Meanwhile, unknown objects are accurately perceived by an energy-based model, which forces an unlabeled novel object's features to be mapped outside the common low-dimensional feature space. The experimental results show that our approach is effective in comparison with other advanced methods.
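The abstract describes two coupled mechanisms: per-modality encoders, trained with deep supervision, that project features into a shared low-dimensional subspace where same-class samples cluster across modalities, and an energy-based score that pushes unknown objects outside that subspace. The following is a minimal PyTorch sketch of that pipeline, not the authors' implementation: the two-layer encoders, the loss weighting, the MSE alignment term, and the logsumexp free-energy rejection rule (a standard energy-based out-of-distribution score used here as a stand-in for the paper's energy-based model) are all illustrative assumptions, and names such as SubspaceEncoder, training_step, and perceive are hypothetical.

```python
# Minimal sketch only; architecture, losses, and hyperparameters are assumptions,
# not the method as published.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SubspaceEncoder(nn.Module):
    """Hypothetical encoder: maps one modality into the common low-dimensional
    subspace, with an auxiliary classifier on the hidden layer (deep supervision)."""
    def __init__(self, in_dim: int, hid_dim: int, sub_dim: int, n_classes: int):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hid_dim)
        self.fc2 = nn.Linear(hid_dim, sub_dim)
        self.aux_head = nn.Linear(hid_dim, n_classes)  # intermediate supervision
        self.cls_head = nn.Linear(sub_dim, n_classes)  # final supervision

    def forward(self, x):
        h = F.relu(self.fc1(x))
        z = self.fc2(h)  # embedding in the common subspace
        return z, self.cls_head(z), self.aux_head(h)

def energy(logits: torch.Tensor) -> torch.Tensor:
    # Free-energy score: low for confidently classified (known) inputs,
    # high for inputs mapped outside the known-class clusters.
    return -torch.logsumexp(logits, dim=-1)

def training_step(enc_a, enc_b, xa, xb, y, lam: float = 0.5):
    """One step over a paired batch (the same objects sensed by two noncontact
    modalities). Deep supervision = classification losses at both depths."""
    za, logits_a, aux_a = enc_a(xa)
    zb, logits_b, aux_b = enc_b(xb)
    loss_cls = (F.cross_entropy(logits_a, y) + F.cross_entropy(logits_b, y)
                + F.cross_entropy(aux_a, y) + F.cross_entropy(aux_b, y))
    # Pull paired cross-modal embeddings together so same-class features
    # from both modalities cluster in the shared subspace.
    loss_align = F.mse_loss(za, zb)
    return loss_cls + lam * loss_align

@torch.no_grad()
def perceive(enc, x, threshold: float):
    """Predict material classes; flag high-energy samples as unknown objects."""
    z, logits, _ = enc(x)
    pred = logits.argmax(dim=-1)
    pred[energy(logits) > threshold] = -1  # -1 marks an unknown object
    return z, pred
```

Under the same assumptions, cross-modal retrieval at inference would amount to nearest-neighbor search over the z embeddings across modalities, and the rejection threshold would be calibrated on the energies of known-class validation samples.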

How to cite this publication

Pengwen Xiong, Junjie Liao, MengChu Zhou, Aiguo Song, Peter Liu (2022). Deeply Supervised Subspace Learning for Cross-Modal Material Perception of Known and Unknown Objects. IEEE Transactions on Industrial Informatics, 19(2). DOI: https://doi.org/10.1109/tii.2022.3195171

Publication Details

Type: Article
Year: 2022
Authors: 5
Datasets: 0
Total Files: 0
Language: English (en)
DOI: https://doi.org/10.1109/tii.2022.3195171
