An Efficient Federated Distillation Learning System for Multitask Time Series Classification

Xing, Huanlai; Xiao, Zhiwen; Qu, Rong; Zhu, Zonghai; Zhao, Bowen

Authors

Huanlai Xing

Zhiwen Xiao

Rong Qu rong.qu@nottingham.ac.uk
Professor of Computer Science

Zonghai Zhu

Bowen Zhao



Abstract

This paper proposes an efficient federated distillation learning system (EFDLS) for multi-task time series classification (TSC). EFDLS consists of a central server and multiple mobile users, where different users may run different TSC tasks. EFDLS has two novel components: a feature-based student-teacher (FBST) framework and a distance-based weights matching (DBWM) scheme. For each user, the FBST framework transfers knowledge from its teacher's hidden layers to its student's hidden layers via knowledge distillation, where the teacher and student have identical network structures. For each connected user, the weights of its student model's hidden layers are uploaded to the EFDLS server periodically. The DBWM scheme is deployed on the server, with the least square distance used to measure the similarity between the weights of two given models. This scheme finds a partner for each connected user such that the user's and its partner's weights are the closest among all the weights uploaded. The server swaps the user's and its partner's weights and sends them back to these two users, which then load the received weights into their teachers' hidden layers. Experimental results show that, compared with a number of state-of-the-art federated learning algorithms, the proposed EFDLS wins on 20 out of 44 standard UCR2018 datasets and achieves the highest mean accuracy (70.14%) on these datasets. In particular, compared with a single-task Baseline, EFDLS obtains 32/4/8 in terms of 'win'/'tie'/'lose' and improves mean accuracy by approximately 4%.
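The server-side DBWM matching step described in the abstract can be illustrated with a minimal sketch. The code below pairs each connected user with the user whose uploaded hidden-layer weights are closest in least-square (squared Euclidean) distance. The function names, the flattening of layer weights into a single vector, and the per-user nearest-neighbour search are assumptions made for illustration only, not the paper's exact implementation.

```python
import numpy as np

def flatten(weights):
    # Concatenate a list of per-layer weight arrays into one vector.
    return np.concatenate([w.ravel() for w in weights])

def match_partners(uploaded):
    """Assumed sketch of DBWM: for each user, find the other user whose
    uploaded hidden-layer weights have the smallest squared distance.

    uploaded: dict mapping user_id -> list of weight arrays
    returns:  dict mapping user_id -> partner user_id
    """
    vecs = {uid: flatten(w) for uid, w in uploaded.items()}
    partners = {}
    for u in vecs:
        best, best_dist = None, float("inf")
        for v in vecs:
            if v == u:
                continue
            dist = float(np.sum((vecs[u] - vecs[v]) ** 2))
            if dist < best_dist:
                best, best_dist = v, dist
        partners[u] = best
    return partners

# Hypothetical usage: three users uploading small weight lists.
uploads = {
    "user_a": [np.array([0.1, 0.2]), np.array([[0.3]])],
    "user_b": [np.array([0.1, 0.25]), np.array([[0.28]])],
    "user_c": [np.array([0.9, 0.8]), np.array([[0.7]])],
}
print(match_partners(uploads))  # e.g. {'user_a': 'user_b', ...}
```

After matching, the server would swap each user's weights with its partner's and return them, so that each user loads the received weights into its teacher's hidden layers, as the abstract describes.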

Citation

Xing, H., Xiao, Z., Qu, R., Zhu, Z., & Zhao, B. (2022). An Efficient Federated Distillation Learning System for Multitask Time Series Classification. IEEE Transactions on Instrumentation and Measurement, 71, https://doi.org/10.1109/TIM.2022.3201203

Journal Article Type Article
Acceptance Date Aug 13, 2022
Online Publication Date Aug 24, 2022
Publication Date Aug 24, 2022
Deposit Date Aug 30, 2022
Publicly Available Date Sep 2, 2022
Journal IEEE Transactions on Instrumentation and Measurement
Print ISSN 0018-9456
Electronic ISSN 1557-9662
Publisher Institute of Electrical and Electronics Engineers (IEEE)
Peer Reviewed Peer Reviewed
Volume 71
DOI https://doi.org/10.1109/TIM.2022.3201203
Keywords Electrical and Electronic Engineering, Instrumentation
Public URL https://nottingham-repository.worktribe.com/output/10367872
Publisher URL https://ieeexplore.ieee.org/document/9865987
