Chao Lou
Verified email at shanghaitech.edu.cn
Title · Cited by · Year
Nested named entity recognition as latent lexicalized constituency parsing
C Lou, S Yang, K Tu
arXiv preprint arXiv:2203.04665, 2022
36 · 2022
Seqgpt: An out-of-the-box large language model for open domain sequence understanding
T Yu, C Jiang, C Lou, S Huang, X Wang, W Liu, J Cai, Y Li, Y Li, K Tu, ...
Proceedings of the AAAI Conference on Artificial Intelligence 38 (17), 19458 …, 2024
10 · 2024
Unsupervised vision-language parsing: Seamlessly bridging visual scene graphs with language structures via dependency relationships
C Lou, W Han, Y Lin, Z Zheng
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2022
10 · 2022
AMR Parsing with Causal Hierarchical Attention and Pointers
C Lou, K Tu
arXiv preprint arXiv:2310.11964, 2023
1 · 2023
Improving Grammar-based Sequence-to-Sequence Modeling with Decomposition and Constraints
C Lou, K Tu
arXiv preprint arXiv:2306.02671, 2023
1 · 2023
Sparser is Faster and Less is More: Efficient Sparse Attention for Long-Range Transformers
C Lou, Z Jia, Z Zheng, K Tu
arXiv preprint arXiv:2406.16747, 2024
· 2024
Spa: On the Sparsity of Virtual Adversarial Training for Dependency Parsing
C Lou, W Han, K Tu
Findings of the Association for Computational Linguistics: AACL-IJCNLP 2022 …, 2022
· 2022
Dependency Transformer Grammars: Integrating Dependency Structures into Transformer Language Models
Y Zhao, C Lou, K Tu