Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention A Katharopoulos, A Vyas, N Pappas, F Fleuret International Conference on Machine Learning (ICML), 2020 | 1639 | 2020 |
Not all samples are created equal: Deep learning with importance sampling A Katharopoulos, F Fleuret International Conference on Machine Learning (ICML), 2018 | 594 | 2018 |
Fast transformers with clustered attention A Vyas, A Katharopoulos, F Fleuret Neural Information Processing Systems (NeurIPS), 2020 | 174 | 2020 |
Neural Parts: Learning Expressive 3D Shape Abstractions with Invertible Neural Networks D Paschalidou, A Katharopoulos, A Geiger, S Fidler Proc. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), 2021 | 116 | 2021 |
Biased importance sampling for deep neural network training A Katharopoulos, F Fleuret arXiv preprint arXiv:1706.00043, 2017 | 95 | 2017 |
Processing megapixel images with deep attention-sampling models A Katharopoulos, F Fleuret International Conference on Machine Learning (ICML), 2019 | 71 | 2019 |
MLX: Efficient and flexible machine learning on Apple silicon A Hannun, J Digani, A Katharopoulos, R Collobert | 12 | 2023 |
Controllable music production with diffusion models and guidance gradients M Levy, B Di Giorgi, F Weers, A Katharopoulos, T Nickson arXiv preprint arXiv:2311.00613, 2023 | 11 | 2023 |
Masked autoencoding does not help natural language supervision at scale F Weers, V Shankar, A Katharopoulos, Y Yang, T Gunter Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2023 | 11 | 2023 |
Fast supervised LDA for discovering micro-events in large-scale video datasets A Katharopoulos, D Paschalidou, C Diou, A Delopoulos Proceedings of the 24th ACM international conference on Multimedia, 332-336, 2016 | 6 | 2016 |
Specialized Language Models with Cheap Inference from Limited Domain Data D Grangier, A Katharopoulos, P Ablin, A Hannun arXiv preprint arXiv:2402.01093, 2024 | 4 | 2024 |
Stop Wasting my FLOPS: Improving the Efficiency of Deep Learning Models A Katharopoulos EPFL, 2022 | 2 | 2022 |
Learning local feature aggregation functions with backpropagation A Katharopoulos, D Paschalidou, C Diou, A Delopoulos 2017 25th European Signal Processing Conference (EUSIPCO), 748-752, 2017 | 1 | 2017 |
No Need to Talk: Asynchronous Mixture of Language Models A Filippova, A Katharopoulos, D Grangier, R Collobert arXiv preprint arXiv:2410.03529, 2024 | | 2024 |
Projected Language Models: A Large Model Pre-Segmented Into Smaller Ones D Grangier, A Katharopoulos, P Ablin, A Hannun ICML 2024 Workshop on Foundation Models in the Wild, 2024 | | 2024 |
Processing Megapixel Images with Deep Attention-Sampling Models A Katharopoulos, F Fleuret Idiap-RR-07-2019 | | 2019 |
Segmenting the Unknown: Discrete Diffusion Models for Non-Deterministic Segmentation E Courdier, A Katharopoulos, F Fleuret | | |