Ryokan Ri
LY Corporation / SB Intuitions
No verified email
Title · Cited by · Year
mLUKE: The power of entity representations in multilingual pretrained language models
R Ri, I Yamada, Y Tsuruoka
Proceedings of the 60th Annual Meeting of the Association for Computational …, 2021
47 · 2021
Pretraining with artificial language: Studying transferable knowledge in language models
R Ri, Y Tsuruoka
Proceedings of the 60th Annual Meeting of the Association for Computational …, 2022
36 · 2022
EASE: Entity-Aware Contrastive Learning of Sentence Embedding
S Nishikawa, R Ri, I Yamada, Y Tsuruoka, I Echizen
Proceedings of the 2022 Conference of the North American Chapter of the …, 2022
31 · 2022
Designing the business conversation corpus
M Rikters, R Ri, T Li, T Nakazawa
arXiv preprint arXiv:2008.01940, 2020
25 · 2020
Document-aligned Japanese-English conversation parallel corpus
M Rikters, R Ri, T Li, T Nakazawa
arXiv preprint arXiv:2012.06143, 2020
11 · 2020
Japanese–English conversation parallel corpus for promoting context-aware machine translation research
M Rikters, R Ri, T Li, T Nakazawa
Journal of Natural Language Processing 28 (2), 380-403, 2021
10 · 2021
Revisiting the context window for cross-lingual word embeddings
R Ri, Y Tsuruoka
Proceedings of the 58th Annual Meeting of the Association for Computational …, 2020
8 · 2020
Zero-pronoun data augmentation for Japanese-to-English translation
R Ri, T Nakazawa, Y Tsuruoka
arXiv preprint arXiv:2107.00318, 2021
6 · 2021
Data augmentation with unsupervised machine translation improves the structural similarity of cross-lingual word embeddings
S Nishikawa, R Ri, Y Tsuruoka
arXiv preprint arXiv:2006.00262, 2020
6 · 2020
Data Augmentation for Learning Bilingual Word Embeddings with Unsupervised Machine Translation
S Nishikawa, R Ri, Y Tsuruoka
arXiv preprint arXiv:2006.00262, 2020
4 · 2020
Emergent communication with attention
R Ri, R Ueda, J Naradowsky
arXiv preprint arXiv:2305.10920, 2023
3 · 2023
Finding sub-task structure with natural language instruction
R Ri, Y Hou, R Marinescu, A Kishimoto
Proceedings of the First Workshop on Learning with Natural Language …, 2022
2 · 2022
Self-preference bias in LLM-as-a-judge
K Wataoka, T Takahashi, R Ri
arXiv preprint arXiv:2410.21819, 2024
1 · 2024
Self-translate-train: Enhancing cross-lingual transfer of large language models via inherent capability
R Ri, S Kiyono, S Takase
arXiv preprint arXiv:2407.00454, 2024
1 · 2024
Self-Translate-Train: A Simple but Strong Baseline for Cross-lingual Transfer of Large Language Models
R Ri, S Kiyono, S Takase
arXiv preprint arXiv:2407.00454, 2024
1 · 2024
LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation
I Yamada, R Ri
arXiv preprint arXiv:2402.11485, 2024
1 · 2024
Modeling Target-side Inflection in Placeholder Translation
R Ri, T Nakazawa, Y Tsuruoka
arXiv preprint arXiv:2107.00334, 2021
1 · 2021
Large Vocabulary Size Improves Large Language Models
S Takase, R Ri, S Kiyono, T Kato
arXiv preprint arXiv:2406.16508, 2024
· 2024
The University of Tokyo’s Submissions to the WAT 2020 Shared Task
M Rikters, T Nakazawa, R Ri
Proceedings of the 7th Workshop on Asian Translation, 147-153, 2020
· 2020