Xinyu Fu
CRNN: A Joint Neural Network for Redundancy Detection
Fu, Xinyu; Ch'ng, Eugene; Aickelin, Uwe; See, Simon
Authors
Eugene Ch'ng
Uwe Aickelin
Simon See
Abstract
This paper proposes a novel framework for detecting redundancy in supervised sentence categorisation. Unlike traditional singleton neural networks, our model combines a character-aware convolutional neural network (Char-CNN) with a character-aware recurrent neural network (Char-RNN) to form a convolutional recurrent neural network (CRNN). Our model benefits from the Char-CNN in that only salient features are selected and fed into the integrated Char-RNN, which effectively learns long-sequence semantics via a sophisticated update mechanism. We compare our framework against state-of-the-art text classification algorithms on four popular benchmark corpora. Our model achieves competitive precision, recall, and F1 scores on the Google-news data set. On the twenty-news-groups data stream, our algorithm obtains the best precision, recall, and F1 score. On the Brown Corpus, our framework obtains the best F1 score and nearly matches the top competitor on precision and recall. On the question classification collection, CRNN produces the best recall and F1 score and comparable precision. We also analyse the impact of three different RNN hidden recurrent cells on performance and runtime efficiency, and observe that the MGU achieves the best runtime with performance comparable to the GRU and LSTM. For TF-IDF based algorithms, we experiment with word2vec, GloVe, and sent2vec embeddings and report their performance differences.
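The pipeline the abstract describes — a character-aware CNN whose max-pooled (salient) features feed a gated recurrent cell such as the MGU — can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: all shapes, weight names, and hyperparameters are assumptions, and the MGU equations follow the Minimal Gated Unit formulation of Zhou et al. (2016).

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def char_cnn(char_embeds, filters, window=3):
    """1-D convolution over character embeddings + max-over-time pooling.

    char_embeds: (seq_len, embed_dim); filters: (n_filters, window * embed_dim).
    Returns one pooled feature vector of shape (n_filters,) — only the most
    salient response per filter survives the pooling step.
    """
    seq_len, embed_dim = char_embeds.shape
    windows = np.stack([char_embeds[i:i + window].ravel()
                        for i in range(seq_len - window + 1)])
    conv = np.tanh(windows @ filters.T)   # (n_windows, n_filters)
    return conv.max(axis=0)               # max-over-time pooling

def mgu_step(h_prev, x, Wf, Wh):
    """Minimal Gated Unit step: a single forget gate drives the update."""
    f = sigmoid(np.concatenate([h_prev, x]) @ Wf)            # forget gate
    h_tilde = np.tanh(np.concatenate([f * h_prev, x]) @ Wh)  # candidate state
    return (1.0 - f) * h_prev + f * h_tilde

# Toy run: 20 characters, 8-dim embeddings, 16 CNN filters, 16-dim hidden state.
embed_dim, n_filters, hidden = 8, 16, 16
chars = rng.normal(size=(20, embed_dim))               # stand-in char embeddings
filters = rng.normal(scale=0.1, size=(n_filters, 3 * embed_dim))
Wf = rng.normal(scale=0.1, size=(hidden + n_filters, hidden))
Wh = rng.normal(scale=0.1, size=(hidden + n_filters, hidden))

features = char_cnn(chars, filters)                    # salient n-gram features
h = mgu_step(np.zeros(hidden), features, Wf, Wh)       # one recurrent update
print(features.shape, h.shape)
```

The single forget gate is what makes the MGU cheaper than the GRU (two gates) and the LSTM (three gates plus a cell state), which is consistent with the runtime advantage reported in the abstract.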
Citation
Fu, X., Ch’ng, E., Aickelin, U., & See, S. (2017, May). CRNN: A Joint Neural Network for Redundancy Detection. Presented at 3rd IEEE International Conference on Smart Computing (Smartcomp 2017), Hong Kong, China
| Presentation Conference Type | Edited Proceedings |
| --- | --- |
| Conference Name | 3rd IEEE International Conference on Smart Computing (Smartcomp 2017) |
| Start Date | May 29, 2017 |
| End Date | May 31, 2017 |
| Acceptance Date | Apr 27, 2017 |
| Online Publication Date | Jun 15, 2017 |
| Publication Date | 2017 |
| Deposit Date | May 3, 2017 |
| Publicly Available Date | Jun 15, 2017 |
| Peer Reviewed | Peer Reviewed |
| Pages | 1-8 |
| Book Title | 2017 IEEE International Conference on Smart Computing (SMARTCOMP) |
| ISBN | 978-1-5090-6518-9 |
| DOI | https://doi.org/10.1109/SMARTCOMP.2017.7946996 |
| Keywords | Logic gates, Training, Redundancy, Recurrent neural networks, Benchmark testing, Computational modeling |
| Public URL | https://nottingham-repository.worktribe.com/output/865787 |
| Publisher URL | http://ieeexplore.ieee.org/document/7946996/ |
| Related Public URLs | http://www.smart-comp.org/ http://ima.ac.uk/wp-content/uploads/2017/04/SMARTCOMP_paper_17_rev.pdf |
| Additional Information | ISBN 978-1-5090-6517-2 © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. |
| Contract Date | May 3, 2017 |
Files
Dietrich et al post-print.pdf
(8.5 Mb)
PDF