TY - GEN
T1 - Parametric Dimension Reduction by Preserving Local Structure
AU - Lai, Chien Hsun
AU - Kuo, Ming Feng
AU - Lien, Yun Hsuan
AU - Su, Kuan An
AU - Wang, Yu Shuen
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - We extend a well-known dimension reduction method, t-distributed stochastic neighbor embedding (t-SNE), from non-parametric to parametric by training neural networks. The main advantage of a parametric technique is its ability to generalize to new data, which is beneficial for streaming data visualization. Whereas previous parametric methods require either network pre-training with a restricted Boltzmann machine or intermediate results obtained from traditional non-parametric t-SNE, we found that recent network training techniques enable direct optimization of the t-SNE objective function. Accordingly, our method achieves high embedding quality while retaining generalization. Thanks to mini-batch network training, our parametric dimension reduction method is also highly efficient. For evaluation, we compared our method to several baselines on a variety of datasets. Experimental results demonstrate the feasibility of our method. The source code is available at https://github.com/a07458666/parametric_dr.
AB - We extend a well-known dimension reduction method, t-distributed stochastic neighbor embedding (t-SNE), from non-parametric to parametric by training neural networks. The main advantage of a parametric technique is its ability to generalize to new data, which is beneficial for streaming data visualization. Whereas previous parametric methods require either network pre-training with a restricted Boltzmann machine or intermediate results obtained from traditional non-parametric t-SNE, we found that recent network training techniques enable direct optimization of the t-SNE objective function. Accordingly, our method achieves high embedding quality while retaining generalization. Thanks to mini-batch network training, our parametric dimension reduction method is also highly efficient. For evaluation, we compared our method to several baselines on a variety of datasets. Experimental results demonstrate the feasibility of our method. The source code is available at https://github.com/a07458666/parametric_dr.
KW - Computing methodologies
KW - Dimensionality reduction and manifold learning
KW - Human-centered computing
KW - Visualization toolkits
UR - http://www.scopus.com/inward/record.url?scp=85145578188&partnerID=8YFLogxK
U2 - 10.1109/VIS54862.2022.00024
DO - 10.1109/VIS54862.2022.00024
M3 - Conference contribution
AN - SCOPUS:85145578188
T3 - Proceedings - 2022 IEEE Visualization Conference - Short Papers, VIS 2022
SP - 75
EP - 79
BT - Proceedings - 2022 IEEE Visualization Conference - Short Papers, VIS 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2022 IEEE Visualization Conference, VIS 2022
Y2 - 16 October 2022 through 21 October 2022
ER -