Siyu Ren
Title · Cited by · Year
Multi-turn response selection using dialogue dependency relations
Q Jia, Y Liu, S Ren, KQ Zhu, H Tang
EMNLP 2020, 2020
Cited by 39 · 2020
Knowledge-driven distractor generation for cloze-style multiple choice questions
S Ren, KQ Zhu
Proceedings of the AAAI Conference on Artificial Intelligence 35 (5), 4339-4347, 2021
Cited by 36 · 2021
Symbol-LLM: Towards foundational symbol-centric interface for large language models
F Xu, Z Wu, Q Sun, S Ren, F Yuan, S Yuan, Q Lin, Y Qiao, J Liu
arXiv preprint arXiv:2311.09278, 2023
Cited by 10 · 2023
Context Compression for Auto-regressive Transformers with Sentinel Tokens
S Ren, Q Jia, KQ Zhu
EMNLP 2023, 2023
Cited by 8* · 2023
Taxonomy of abstractive dialogue summarization: scenarios, approaches, and future directions
Q Jia, Y Liu, S Ren, KQ Zhu
ACM Computing Surveys 56 (3), 1-38, 2023
Cited by 7 · 2023
Zero-shot faithfulness evaluation for text summarization with foundation language model
Q Jia, S Ren, Y Liu, KQ Zhu
arXiv preprint arXiv:2310.11648, 2023
Cited by 6 · 2023
On the efficacy of eviction policy for key-value constrained generative language model inference
S Ren, KQ Zhu
arXiv preprint arXiv:2402.06262, 2024
Cited by 4 · 2024
Leaner and Faster: Two-Stage Model Compression for Lightweight Text-Image Retrieval
S Ren, KQ Zhu
NAACL 2022, 2022
Cited by 4 · 2022
EMO: Earth Mover Distance Optimization for Auto-Regressive Language Modeling
S Ren, Z Wu, KQ Zhu
arXiv preprint arXiv:2310.04691, 2023
Cited by 2 · 2023
Low-rank prune-and-factorize for language model compression
S Ren, KQ Zhu
arXiv preprint arXiv:2306.14152, 2023
Cited by 2 · 2023
Pruning pre-trained language models with principled importance and self-regularization
S Ren, KQ Zhu
arXiv preprint arXiv:2305.12394, 2023
Cited by 2 · 2023
Specializing Pre-trained Language Models for Better Relational Reasoning via Network Pruning
S Ren, KQ Zhu
Findings of the Association for Computational Linguistics: NAACL 2022, 2195-2207, 2022
Cited by 2 · 2022
Combating Short Circuit Behavior in Natural Language Reasoning: Crossover and Mutation Operations for Enhanced Robustness
S Huang, S Ren, KQ Zhu
2023