ZHANG JUN

Paper Publications

Relevance Units Latent Variable Model and Nonlinear Dimensionality Reduction

Paper Type: Journal paper

First Author: Junbin Gao

Co-authors: David Tien, ZHANG JUN

Journal: IEEE Transactions on Neural Networks

Indexed in: SCI

Volume: 21

Issue: 1

Page Number: 123-135

Key Words: Dimensionality reduction, Relevance units machine (RUM), Relevance vector machine (RVM), Gaussian process latent variable model (GPLVM)

DOI number: 10.1109/TNN.2009.2034964

Date of Publication: 2010-01-11

Abstract: This paper proposes a new dimensionality reduction method, the relevance units latent variable model (RULVM). RULVM is closely linked to the framework of the Gaussian process latent variable model (GPLVM) and originates from a recently developed sparse kernel model, the relevance units machine (RUM). RUM follows the idea of the relevance vector machine (RVM) under the Bayesian framework but relaxes the constraint that relevance vectors (RVs) must be selected from the input vectors; instead, it treats relevance units (RUs) as parameters to be learned from the data. As a result, RUM retains all the advantages of RVM while offering superior sparsity. RULVM inherits this sparseness from RUM, and experimental results show that the RULVM algorithm has considerable computational advantages over the GPLVM algorithm.
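The abstract describes RULVM only at a high level; the paper itself should be consulted for the full Bayesian treatment. As a rough illustration of the core idea (both the latent points and a small set of relevance units are learned so that the data can be reconstructed through a sparse kernel mapping), here is a minimal numerical sketch. Everything in it is a hypothetical simplification for illustration, not the authors' algorithm: the RBF kernel, the ridge solve for the weights, the names fit_rulvm_sketch and rbf_kernel, and all parameter values are assumptions.

```python
# Hypothetical sketch of an RULVM-style model (NOT the paper's algorithm):
# high-dimensional data Y (N x D) is reconstructed from low-dimensional
# latents X (N x q) through a small set of learned relevance units U (M x q)
# via kernel regression, Y ~= K(X, U) W.  X, U, and W are all learned.
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(A, B, lengthscale=1.0):
    """RBF kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def fit_rulvm_sketch(Y, q=2, M=5, lam=1e-2, seed=0):
    N, D = Y.shape
    rng = np.random.default_rng(seed)
    theta0 = rng.normal(scale=0.1, size=N * q + M * q)  # initial X and U

    def unpack(theta):
        X = theta[:N * q].reshape(N, q)
        U = theta[N * q:].reshape(M, q)
        return X, U

    def loss(theta):
        X, U = unpack(theta)
        K = rbf_kernel(X, U)                              # N x M, M << N
        # ridge-regression solve for the weights given X and U
        W = np.linalg.solve(K.T @ K + lam * np.eye(M), K.T @ Y)
        resid = Y - K @ W
        return (resid ** 2).sum() + lam * (W ** 2).sum()

    # L-BFGS with finite-difference gradients; fine for this toy scale
    res = minimize(loss, theta0, method="L-BFGS-B")
    return unpack(res.x)

if __name__ == "__main__":
    # toy data: a noisy one-dimensional curve embedded in 10 dimensions
    t = np.linspace(0, 1, 60)[:, None]
    Y = np.hstack([np.sin(3 * t), np.cos(3 * t)] * 5) + 0.05 * np.random.randn(60, 10)
    X, U = fit_rulvm_sketch(Y, q=1, M=4)
    print("latent shape:", X.shape, "relevance units:", U.shape)
```

In the actual model, the mapping and the latent variables are handled within a Bayesian GPLVM-style framework rather than by penalized least squares; the sketch only conveys how learned relevance units (M much smaller than N) keep the kernel expansion sparse.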