Deep Contrastive Representation Learning With Self-Distillation
Xiao, Zhiwen; Xing, Huanlai; Zhao, Bowen; Qu, Rong; Luo, Shouxi; Dai, Penglin; Li, Ke; Zhu, Zonghai
Authors
Zhiwen Xiao
Huanlai Xing
Bowen Zhao
Professor Rong Qu (rong.qu@nottingham.ac.uk), Professor of Computer Science
Shouxi Luo
Penglin Dai
Dr Ke Li (Ke.Li2@nottingham.ac.uk), Assistant Professor
Zonghai Zhu
Abstract
Recently, contrastive learning (CL) has emerged as a promising way of learning discriminative representations from time series data. In the representation hierarchy, the semantic information extracted at lower levels forms the basis of that captured at higher levels. Low-level semantic information is therefore essential and should be considered in the CL process. However, existing CL algorithms mainly focus on the similarity of high-level semantic information; considering the similarity of low-level semantic information as well may improve the performance of CL. To this end, we present deep contrastive representation learning with self-distillation (DCRLS) for the time series domain. DCRLS gracefully combines data augmentation, deep contrastive learning, and self-distillation. Our data augmentation provides different views of the same sample as the input of DCRLS. Unlike most CL algorithms, which concentrate on high-level semantic information only, our deep contrastive learning also considers the contrastive similarity of low-level semantic information between peer residual blocks. Our self-distillation promotes knowledge flow from high-level to low-level blocks to help regularize DCRLS during knowledge transfer. The experimental results demonstrate that DCRLS-based structures achieve excellent performance on classification and clustering across 36 UCR2018 datasets.
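The idea sketched in the abstract — two augmented views of each sample, contrastive terms at every block level (not just the deepest), and a distillation term pulling shallow blocks toward the deepest one — can be illustrated with a toy NumPy sketch. All function names, the InfoNCE-style contrastive form, and the L2 distillation term here are illustrative assumptions, not the paper's actual implementation or loss:

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """Simplified InfoNCE-style contrastive loss between two views.

    z1, z2: (batch, dim) embeddings of two augmented views of the same
    samples. Matching rows are positives; all other rows act as negatives.
    This is a generic stand-in, not the paper's exact objective.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                       # (batch, batch) similarities
    sim = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))          # positive pairs on diagonal

def dcrls_style_loss(blocks_v1, blocks_v2, alpha=0.5):
    """Hypothetical sketch of a DCRLS-style objective.

    blocks_v1 / blocks_v2: per-block embeddings for view 1 and view 2,
    ordered shallow -> deep. Views are contrasted at *every* block (low-
    and high-level), and an L2 "self-distillation" term pulls each
    shallow block toward the deepest (teacher) block.
    """
    contrast = sum(info_nce(a, b) for a, b in zip(blocks_v1, blocks_v2))
    teacher = blocks_v1[-1]
    distill = sum(np.mean((z - teacher) ** 2) for z in blocks_v1[:-1])
    return contrast + alpha * distill

rng = np.random.default_rng(0)
v1 = [rng.normal(size=(8, 16)) for _ in range(3)]  # 3 residual blocks, view 1
v2 = [rng.normal(size=(8, 16)) for _ in range(3)]  # same blocks, view 2
loss = dcrls_style_loss(v1, v2)
print(loss)
```

Because both the contrastive and distillation terms are non-negative by construction, the combined loss is a single non-negative scalar that a training loop could minimize end to end.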
Citation
Xiao, Z., Xing, H., Zhao, B., Qu, R., Luo, S., Dai, P., Li, K., & Zhu, Z. (2024). Deep Contrastive Representation Learning With Self-Distillation. IEEE Transactions on Emerging Topics in Computational Intelligence, 8(1), 3-15. https://doi.org/10.1109/tetci.2023.3304948
| Field | Value |
| --- | --- |
| Journal Article Type | Article |
| Acceptance Date | Aug 10, 2023 |
| Online Publication Date | Aug 29, 2023 |
| Publication Date | Feb 2024 |
| Deposit Date | Sep 2, 2023 |
| Publicly Available Date | Sep 13, 2023 |
| Journal | IEEE Transactions on Emerging Topics in Computational Intelligence |
| Electronic ISSN | 2471-285X |
| Publisher | Institute of Electrical and Electronics Engineers |
| Peer Reviewed | Peer Reviewed |
| Volume | 8 |
| Issue | 1 |
| Pages | 3-15 |
| DOI | https://doi.org/10.1109/tetci.2023.3304948 |
| Keywords | Artificial Intelligence, Computational Mathematics, Control and Optimization, Computer Science Applications |
| Public URL | https://nottingham-repository.worktribe.com/output/24870240 |
| Publisher URL | https://ieeexplore.ieee.org/document/10233880 |
Files
TETCI2023 (PDF, 8.6 MB)