
Almost autonomous training of mixtures of principal component analyzers

Date

2004

Publisher

Elsevier Science BV

Abstract

In recent years, a number of mixtures of local PCA models have been proposed. Most of these models require the user to set both the number of submodels (local models) in the mixture and the dimensionality of the submodels (i.e., the number of PCs). To free the model of these parameters, we propose a greedy expectation-maximization algorithm that finds a suboptimal number of submodels. For a given retained variance ratio, the proposed algorithm estimates, for each submodel, the dimensionality that retains that ratio of the variability. We test the proposed method on two classification problems: handwritten digit recognition and two-class ionosphere data classification. The results show that the proposed method performs well.
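The paper's full greedy EM procedure is not reproduced here, but the per-submodel dimensionality-selection step the abstract describes can be sketched as follows. This is a minimal illustration, assuming the hypothetical helper name `retained_dim` and a simple eigenvalue-based criterion: pick the smallest number of principal components whose eigenvalues account for the given retained variance ratio.

```python
import numpy as np

def retained_dim(X, ratio=0.9):
    """Smallest number of PCs whose eigenvalues retain `ratio` of the variance.

    Hypothetical helper illustrating the dimensionality-selection step of the
    abstract; in the mixture, this would be applied per submodel to the data
    (softly) assigned to it.
    """
    Xc = X - X.mean(axis=0)                        # center the data
    cov = np.cov(Xc, rowvar=False)                 # sample covariance matrix
    eig = np.sort(np.linalg.eigvalsh(cov))[::-1]   # eigenvalues, descending
    cum = np.cumsum(eig) / eig.sum()               # cumulative variance fraction
    return int(np.searchsorted(cum, ratio) + 1)    # first index reaching `ratio`

# Usage: 3-D data where one axis carries nearly all the variance
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) * np.array([10.0, 1.0, 0.1])
d = retained_dim(X, ratio=0.9)
```

With a single variance ratio as the only user-set parameter, each submodel can select its own dimensionality, which is what makes the training "almost autonomous".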

Keywords

PCA Mixture Model, EM Algorithm, Regularization

Citation

Musa, M. E. M.; de Ridder, D.; Duin, R. P. W.; Atalay, V., "Almost autonomous training of mixtures of principal component analyzers", Pattern Recognition Letters, Vol. 25, No. 9, pp. 1085-1095, (2004).

Source

Pattern Recognition Letters

Volume

25

Issue

9

Start Page

1085

End Page

1095