Publications
2024
HELMET: How to Evaluate Long-Context Language Models Effectively and Thoroughly
Howard Yen, Tianyu Gao, Minmin Hou, Ke Ding, Daniel Fleischer, Peter Izsak, Moshe Wasserblat and Danqi Chen
Preprint
Paper | Repo
RAG Foundry: A Framework for Enhancing LLMs for Retrieval Augmented Generation
Daniel Fleischer, Moshe Berchansky, Moshe Wasserblat and Peter Izsak
Preprint
Paper | Repo
CoTAR: Chain-of-Thought Attribution Reasoning with Multi-level Granularity
Moshe Berchansky, Daniel Fleischer, Moshe Wasserblat and Peter Izsak
Findings of the Association for Computational Linguistics: EMNLP 2024
Paper
2023
Optimizing Retrieval-augmented Reader Models via Token Elimination
Moshe Berchansky, Peter Izsak, Avi Caciularu, Ido Dagan and Moshe Wasserblat
The 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023)
Paper | Repo
2022
Transformer Language Models without Positional Encodings Still Learn Positional Information
Adi Haviv, Ori Ram, Ofir Press, Peter Izsak and Omer Levy
Findings of the Association for Computational Linguistics: EMNLP 2022
Paper
2021
How to Train BERT with an Academic Budget
Peter Izsak, Moshe Berchansky and Omer Levy
The 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021)
Paper | Slides/Video | Repo
2020
Exploring the Boundaries of Low-Resource BERT Distillation
Moshe Wasserblat, Oren Pereg and Peter Izsak
Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language Processing, 2020
Paper
2019
Training Compact Models for Low Resource Entity Tagging using Pre-trained Language Models
Peter Izsak, Shira Guskin and Moshe Wasserblat
Fifth Workshop on Energy Efficient Machine Learning and Cognitive Computing - NeurIPS Edition (EMC2-NIPS 2019), pp. 44-47
Paper | Repo
Q8BERT: Quantized 8Bit BERT
Ofir Zafrir, Guy Boudoukh, Peter Izsak and Moshe Wasserblat
Fifth Workshop on Energy Efficient Machine Learning and Cognitive Computing - NeurIPS Edition (EMC2-NIPS 2019), pp. 36-39
Paper | Repo
2018
Term Set Expansion based NLP Architect by Intel AI Lab
Jonathan Mamou, Oren Pereg, Moshe Wasserblat, Alon Eirew, Yael Green, Shira Guskin, Peter Izsak and Daniel Korat
The 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP 2018)
Paper | Repo
SetExpander: End-to-end Term Set Expansion Based on Multi-Context Term Embeddings
Jonathan Mamou, Oren Pereg, Moshe Wasserblat, Ido Dagan, Yoav Goldberg, Alon Eirew, Yael Green, Shira Guskin, Peter Izsak and Daniel Korat
Proceedings of the 27th International Conference on Computational Linguistics: System Demonstrations (COLING 2018)
Paper | Repo
2014
The search duel: a response to a strong ranker
Peter Izsak, Fiana Raiber, Oren Kurland and Moshe Tennenholtz
Proceedings of the 37th International ACM SIGIR Conference on Research & Development in Information Retrieval (SIGIR 2014)
Paper
2013
Leveraging memory mirroring for transparent memory scale-out with zero-downtime failover of remote hosts
R. Tell, Peter Izsak, A. Shribman, Steve Walsh and B. Hudzia
2013 IEEE Symposium on Computers and Communications (ISCC)
Paper