Jun Zhang (张钧)

Personal Information

Associate Professor, Supervisor of Master's Students

Gender: Male

Employment Status: In Service

Affiliation: School of Artificial Intelligence and Automation

Education: Graduate (Doctoral)

Degree: Doctor of Engineering (PhD)

Alma Mater: Huazhong University of Science and Technology

Discipline: Pattern Recognition and Intelligent Systems

Relevance Units Latent Variable Model and Nonlinear Dimensionality Reduction
Posted: 2021-04-11

Paper Type: Journal Article
First Author: Junbin Gao (高俊斌)
Co-authors: David Tien, Jun Zhang
Journal: IEEE Trans. on Neural Networks
Indexed In: SCI
Volume: 21
Issue: 1
Pages: 123-135
Keywords: Dimensionality reduction, Relevance units machine (RUM), Relevance vector machine (RVM), Gaussian process latent variable model (GPLVM)
DOI: 10.1109/TNN.2009.2034964
Publication Date: 2010-01-11
Abstract: A new dimensionality reduction method, called the relevance units latent variable model (RULVM), is proposed in this paper. RULVM is closely linked to the framework of the Gaussian process latent variable model (GPLVM) and originates from a recently developed sparse kernel model called the relevance units machine (RUM). RUM follows the idea of the relevance vector machine (RVM) under the Bayesian framework but relaxes the constraint that relevance vectors (RVs) must be selected from the input vectors; instead, RUM treats relevance units (RUs) as parameters to be learned from the data. As a result, RUM retains all the advantages of RVM while offering superior sparsity. RULVM inherits the sparseness offered by RUM, and experimental results show that the RULVM algorithm possesses considerable computational advantages over the GPLVM algorithm.
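The GPLVM framework that RULVM builds on fits latent coordinates by maximising the Gaussian-process marginal likelihood of the observed data. The sketch below is a minimal illustrative toy of that underlying GPLVM objective, not the paper's RULVM: the function names, the fixed RBF kernel hyperparameters, and the finite-difference L-BFGS optimiser are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist


def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix over latent points X (N x q)."""
    sq = cdist(X, X, 'sqeuclidean')
    return variance * np.exp(-0.5 * sq / lengthscale ** 2)


def gplvm_nll(x_flat, Y, q, noise=1e-2):
    """Negative GP log marginal likelihood of data Y (N x D) given latent X."""
    N, D = Y.shape
    X = x_flat.reshape(N, q)
    K = rbf_kernel(X) + noise * np.eye(N)      # kernel + observation noise
    _, logdet = np.linalg.slogdet(K)           # stable log-determinant
    Kinv_Y = np.linalg.solve(K, Y)             # K^{-1} Y without explicit inverse
    # D/2 * (N log 2pi + log|K|) + 1/2 * tr(K^{-1} Y Y^T)
    return 0.5 * D * (N * np.log(2 * np.pi) + logdet) + 0.5 * np.sum(Y * Kinv_Y)


def fit_gplvm(Y, q=2, iters=100, seed=0):
    """Fit latent positions X by minimising the GPLVM objective over X."""
    rng = np.random.default_rng(seed)
    N = Y.shape[0]
    x0 = 0.1 * rng.standard_normal(N * q)      # small random latent init
    res = minimize(gplvm_nll, x0, args=(Y, q),
                   method='L-BFGS-B', options={'maxiter': iters})
    return res.x.reshape(N, q), res.fun
```

For example, embedding noisy points sampled from a circle in 2-D down to a 1-D latent space: `X, nll = fit_gplvm(Y, q=1)`. RULVM's contribution, per the abstract, is to make this kind of model sparse by learning a small set of relevance units as extra parameters, which this toy does not attempt.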