Yanzhe Zhang
Enhanced Visual Instruction Tuning for Text-Rich Image Understanding
Y Zhang, R Zhang, J Gu, Y Zhou, N Lipka, D Yang, T Sun
NeurIPS 2023 Workshop on Instruction Tuning and Instruction Following, 2023
Cited by 132*
Continual Learning for Text Classification with Information Disentanglement Based Regularization
Y Huang*, Y Zhang*, J Chen, X Wang, D Yang
NAACL 2021, 2736–2746, 2021
Cited by 76
A Dynamic LLM-Powered Agent Network for Task-Oriented Agent Collaboration
Z Liu, Y Zhang, P Li, Y Liu, D Yang
First Conference on Language Modeling, 2024
Cited by 56*
Continual Sequence Generation with Adaptive Compositional Modules
Y Zhang, X Wang, D Yang
ACL 2022, 3653–3667, 2022
Cited by 38
Best practices and lessons learned on synthetic data for language models
R Liu, J Wei, F Liu, C Si, Y Zhang, J Rao, S Zheng, D Peng, D Yang, ...
First Conference on Language Modeling, 2024
Cited by 33*
Design2Code: How Far Are We From Automating Front-End Engineering?
C Si*, Y Zhang*, Z Yang, R Liu, D Yang
arXiv preprint arXiv:2403.03163, 2024
Cited by 21*
Auditing gender presentation differences in text-to-image models
Y Zhang, L Jiang, G Turk, D Yang
EAAMO 2024, 2024
Cited by 18
Bounding the Capabilities of Large Language Models in Open Text Generation with Prompt Constraints
A Lu, H Zhang, Y Zhang, X Wang, D Yang
EACL 2023 (Findings), 1937–1963, 2023
Cited by 18
Robustness of Demonstration-based Learning Under Limited Data Scenario
H Zhang, Y Zhang, R Zhang, D Yang
EMNLP 2022, 1769–1782, 2022
Cited by 11
Leveraging Expert Guided Adversarial Augmentation For Improving Generalization in Named Entity Recognition
A Reich, J Chen, A Agrawal, Y Zhang, D Yang
ACL 2022 (Findings), 1947–1955, 2022
Cited by 6
TRINS: Towards Multimodal Language Models that Can Read
R Zhang, Y Zhang, J Chen, Y Zhou, J Gu, C Chen, T Sun
CVPR 2024, 22584–22594, 2024
Cited by 2
Distilling an End-to-End Voice Assistant Without Instruction Training Data
W Held, E Li, M Ryan, W Shi, Y Zhang, D Yang
arXiv preprint arXiv:2410.02678, 2024
Articles 1–12