Self-Bidirectional Decoupled Distillation for Time Series Classification
Xiao, Zhiwen; Xing, Huanlai; Qu, Rong; Li, Hui; Feng, Li; Zhao, Bowen; Yang, Jiayi
Authors
Zhiwen Xiao
Huanlai Xing
Professor Rong Qu (rong.qu@nottingham.ac.uk), Professor of Computer Science
Hui Li
Li Feng
Bowen Zhao
Jiayi Yang
Abstract
Over the years, many deep learning algorithms have been developed for time series classification (TSC). A learning model’s performance usually depends on the quality of the semantic information extracted at the lower and higher levels of its representation hierarchy, so efficiently promoting mutual learning between these levels during training is vital to enhancing performance. To this end, we propose a self-bidirectional decoupled distillation (Self-BiDecKD) method for TSC. Unlike most self-distillation algorithms, which usually transfer the target-class knowledge from higher to lower levels, Self-BiDecKD encourages the output of the output layer and the output of each lower-level block to form a bidirectional decoupled knowledge distillation (KD) pair. The bidirectional decoupled KD promotes mutual learning between lower- and higher-level semantic information and extracts the knowledge hidden in both the target and non-target classes, helping Self-BiDecKD capture rich representations from the data. Experimental results show that, compared with a number of self-distillation algorithms, Self-BiDecKD wins on 35 of the 85 UCR2018 datasets and achieves the smallest AVG_rank score, namely 3.2882. In particular, compared with a non-self-distillation Baseline, Self-BiDecKD achieves 58 wins, 8 ties, and 19 losses.
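The record only provides the abstract, so the sketch below is a minimal, assumption-laden illustration of what one bidirectional decoupled KD pair between a lower-level block and the output layer could look like in PyTorch. The target-class/non-target-class split follows the standard decoupled-KD formulation; the function names, the use of detached targets, the temperature, and the loss weights are placeholders rather than the paper's exact method.

```python
# Minimal sketch of a bidirectional decoupled KD pair (assumed PyTorch setup).
# Hyperparameters and the detach() choices are illustrative, not from the paper.
import torch
import torch.nn.functional as F


def decoupled_kd(student_logits, teacher_logits, target, alpha=1.0, beta=1.0, T=4.0):
    """Decoupled KD loss: target-class term (TCKD) + non-target-class term (NCKD)."""
    gt = F.one_hot(target, num_classes=student_logits.size(1)).float()

    p_s = F.softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)

    # Target-class KD: KL divergence between binary (target vs. rest) distributions.
    p_s_bin = torch.stack([(p_s * gt).sum(1), (p_s * (1 - gt)).sum(1)], dim=1)
    p_t_bin = torch.stack([(p_t * gt).sum(1), (p_t * (1 - gt)).sum(1)], dim=1)
    tckd = F.kl_div(p_s_bin.clamp_min(1e-8).log(), p_t_bin,
                    reduction="batchmean") * T ** 2

    # Non-target-class KD: KL divergence over the non-target classes only
    # (the target logit is suppressed with a large negative offset).
    log_p_s_nt = F.log_softmax(student_logits / T - 1000.0 * gt, dim=1)
    p_t_nt = F.softmax(teacher_logits / T - 1000.0 * gt, dim=1)
    nckd = F.kl_div(log_p_s_nt, p_t_nt, reduction="batchmean") * T ** 2

    return alpha * tckd + beta * nckd


def bidirectional_decoupled_kd(block_logits, final_logits, target):
    """Bidirectional pair: the lower-level block distils from the output layer
    and the output layer distils from the block, so target- and non-target-class
    knowledge flows in both directions."""
    low_from_high = decoupled_kd(block_logits, final_logits.detach(), target)
    high_from_low = decoupled_kd(final_logits, block_logits.detach(), target)
    return low_from_high + high_from_low
```

In a full model, each lower-level block would need some auxiliary classifier head (hypothetical here) to produce `block_logits`; the total training loss would then combine the usual cross-entropy on the final output with one such bidirectional term per block.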
Citation
Xiao, Z., Xing, H., Qu, R., Li, H., Feng, L., Zhao, B., & Yang, J. (2024). Self-Bidirectional Decoupled Distillation for Time Series Classification. IEEE Transactions on Artificial Intelligence, https://doi.org/10.1109/tai.2024.3360180
Journal Article Type | Article |
---|---|
Acceptance Date | Jan 28, 2024 |
Publication Date | Feb 11, 2024 |
Deposit Date | Apr 2, 2024 |
Publicly Available Date | Apr 8, 2024 |
Journal | IEEE Transactions on Artificial Intelligence |
Electronic ISSN | 2691-4581 |
Publisher | Institute of Electrical and Electronics Engineers |
Peer Reviewed | Peer Reviewed |
DOI | https://doi.org/10.1109/tai.2024.3360180 |
Keywords | Feature extraction, Semantics, Time series analysis, Data mining, Brain modeling, Deep learning, Heuristic algorithms, Convolutional Neural Network, Deep Learning, Data Mining, Knowledge Distillation, Time Series Classification |
Public URL | https://nottingham-repository.worktribe.com/output/31435533 |
Additional Information | ©2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. |
Files
TAI2024 (6.2 MB, PDF)