Jing Liu
PhD Candidate, Monash University
Verified email at monash.edu
Title
Cited by
Year
Discrimination-aware channel pruning for deep neural networks
Z Zhuang, M Tan, B Zhuang, J Liu, Y Guo, Q Wu, J Huang, J Zhu
Advances in Neural Information Processing Systems 31, 2018
613 · 2018
Scalable vision transformers with hierarchical pooling
Z Pan, B Zhuang, J Liu, H He, J Cai
Proceedings of the IEEE/CVF International Conference on Computer Vision, 377-386, 2021
89 · 2021
Generative low-bitwidth data free quantization
S Xu, H Li, B Zhuang, J Liu, J Cao, C Liang, M Tan
Computer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, August 23 …, 2020
77 · 2020
Discrimination-aware Network Pruning for Deep Model Compression
J Liu, B Zhuang, Z Zhuang, Y Guo, J Huang, J Zhu, M Tan
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI) 1, 1-15, 2021
60 · 2021
Effective training of convolutional neural networks with low-bitwidth weights and activations
B Zhuang, M Tan, J Liu, L Liu, I Reid, C Shen
IEEE Transactions on Pattern Analysis and Machine Intelligence 44 (10), 6140 …, 2021
30 · 2021
Less is more: Pay less attention in vision transformers
Z Pan, B Zhuang, H He, J Liu, J Cai
Proceedings of the AAAI Conference on Artificial Intelligence 36 (2), 2035-2043, 2022
28 · 2022
AQD: Towards accurate quantized object detection
P Chen, J Liu, B Zhuang, M Tan, C Shen
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2021
22 · 2021
Pruning self-attentions into convolutional layers in single path
H He, J Liu, Z Pan, J Cai, J Zhang, D Tao, B Zhuang
arXiv preprint arXiv:2111.11802, 2021
16 · 2021
Sharpness-aware quantization for deep neural networks
J Liu, J Cai, B Zhuang
arXiv preprint arXiv:2111.12273, 2021
9 · 2021
Conditional automated channel pruning for deep neural networks
Y Liu, Y Guo, J Guo, L Jiang, J Chen
IEEE Signal Processing Letters 28, 1275-1279, 2021
9 · 2021
Deep transferring quantization
Z Xie, Z Wen, J Liu, Z Liu, X Wu, M Tan
Computer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, August 23 …, 2020
9 · 2020
Mesa: A memory-saving training framework for transformers
Z Pan, P Chen, H He, J Liu, J Cai, B Zhuang
arXiv preprint arXiv:2111.11124, 2021
8 · 2021
Ecoformer: Energy-saving attention with linear complexity
J Liu, Z Pan, H He, J Cai, B Zhuang
NeurIPS Spotlight, 2022
7 · 2022
Single-path bit sharing for automatic loss-aware model compression
J Liu, B Zhuang, P Chen, C Shen, J Cai, M Tan
TPAMI, 2023
4* · 2023
A Survey on Efficient Training of Transformers
B Zhuang, J Liu, Z Pan, H He, Y Weng, C Shen
arXiv preprint arXiv:2302.01107, 2023
1 · 2023
Dynamic Focus-aware Positional Queries for Semantic Segmentation
H He, J Cai, Z Pan, J Liu, J Zhang, D Tao, B Zhuang
CVPR 2023, 2022
1 · 2022
Elastic Architecture Search for Diverse Tasks with Different Resources
J Liu, B Zhuang, M Tan, X Liu, D Phung, Y Li, J Cai
arXiv preprint arXiv:2108.01224, 2021
1 · 2021
FocusFormer: Focusing on What We Need via Architecture Sampler
J Liu, J Cai, B Zhuang
arXiv preprint arXiv:2208.10861, 2022
2022
Downscaling and Overflow-aware Model Compression for Efficient Vision Processors
H Li, J Liu, L Jia, Y Liang, Y Wang, M Tan
2022 IEEE 42nd International Conference on Distributed Computing Systems …, 2022
2022