Adaptive Latent Feature Sharing for Piecewise Linear Dimensionality Reduction
Authors
Farooq, Adam; Raykov, Yordan; Raykov, Petar; Little, Max
Abstract
Linear Gaussian exploratory tools such as principal component analysis (PCA) and factor analysis (FA) are widely used for exploratory analysis, pre-processing, data visualization, and related tasks. Because the linear-Gaussian assumption is restrictive, in very high-dimensional problems they are often replaced by robust, sparse extensions or by more flexible discrete-continuous latent feature models. Discrete-continuous latent feature models specify a dictionary of features, each dependent on a subset of the data, and then infer the likelihood that each data point shares any of these features. This is often achieved using rich-get-richer assumptions about the feature allocation process, which couple a feature's frequency with the portion of total variance it explains. In this work, we propose an alternative approach that allows for better control over the feature-to-data-point allocation. This new approach is based on two-parameter discrete distribution models which decouple feature sparsity from dictionary size, hence capturing both common and rare features in a parsimonious way. The new framework is used to derive a novel adaptive variant of factor analysis (aFA), as well as an adaptive probabilistic principal component analysis (aPPCA), capable of flexible structure discovery and dimensionality reduction in a wide variety of scenarios. We derive both a standard Gibbs sampler and efficient expectation-maximisation approximations that converge orders of magnitude faster to a reasonable point estimate. The utility of the proposed aPPCA and aFA models is demonstrated on standard tasks such as feature learning, data visualization, and data whitening. We show that aPPCA and aFA can extract interpretable, high-level features from raw MNIST and COIL-20 images, and from autoencoder features.
We also demonstrate that replacing common PCA pre-processing pipelines in the analysis of functional magnetic resonance imaging (fMRI) data with aPPCA leads to more robust and better-localised blind source separation of neural activity.
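The aPPCA model described above builds on standard probabilistic PCA. As background only, a minimal EM fit for classical PPCA (Tipping and Bishop's formulation, which aPPCA extends with an adaptive feature-allocation prior) can be sketched as follows; `ppca_em` and its parameters are illustrative names, not code from the paper, and the adaptive two-parameter allocation machinery is not reproduced here:

```python
import numpy as np

def ppca_em(X, q, n_iter=200, seed=0):
    """Fit classical probabilistic PCA by EM (background sketch only).

    X: (n, d) data matrix; q: latent dimensionality.
    Returns the loading matrix W (d, q) and isotropic noise variance sigma2.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xc = X - X.mean(axis=0)              # centre the data
    W = rng.standard_normal((d, q))      # random initial loadings
    sigma2 = 1.0
    for _ in range(n_iter):
        # E-step: posterior moments of the latent variables z_n
        M = W.T @ W + sigma2 * np.eye(q)          # (q, q)
        Minv = np.linalg.inv(M)
        Ez = Xc @ W @ Minv                        # (n, q) posterior means
        sumEzz = n * sigma2 * Minv + Ez.T @ Ez    # sum_n E[z_n z_n^T]
        # M-step: update loadings, then the noise variance
        W = (Xc.T @ Ez) @ np.linalg.inv(sumEzz)
        sigma2 = (np.sum(Xc ** 2)
                  - 2.0 * np.sum(Ez * (Xc @ W))
                  + np.trace(sumEzz @ (W.T @ W))) / (n * d)
    return W, sigma2
```

In standard PPCA every data point loads on all q latent dimensions; the paper's contribution is to let the set of active features vary per data point under a two-parameter discrete prior, which this sketch does not implement.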
Citation
Farooq, A., Raykov, Y., Raykov, P., & Little, M. (2024). Adaptive Latent Feature Sharing for Piecewise Linear Dimensionality Reduction. Journal of Machine Learning Research, 25, Article 135.
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Mar 15, 2024 |
| Online Publication Date | Mar 1, 2025 |
| Publication Date | Mar 31, 2024 |
| Deposit Date | Feb 7, 2025 |
| Publicly Available Date | Feb 7, 2025 |
| Journal | Journal of Machine Learning Research |
| Print ISSN | 1532-4435 |
| Electronic ISSN | 1533-7928 |
| Publisher | Journal of Machine Learning Research |
| Peer Reviewed | Peer Reviewed |
| Volume | 25 |
| Article Number | 135 |
| Public URL | https://nottingham-repository.worktribe.com/output/45042063 |
| Publisher URL | https://www.jmlr.org/papers/v25/21-0146.html |
Files
Adaptive Latent Feature Sharing for Piecewise Linear Dimensionality Reduction
(6 MB)
PDF
Publisher Licence URL
https://creativecommons.org/licenses/by/4.0/