Research at Unbabel

November 2, 2022

I am thrilled to kick off a series of posts sharing some exciting recent news. At Unbabel, we have been pioneering advancements in machine learning since 2015, when we founded Unbabel AI Research. Throughout the years, we have been pushing forward the state of the art in translation quality estimation, evaluation metrics, and automatic post-editing towards building the World's Translation Layer.

Our work has been presented at top conferences in the field (ACL, NAACL, EMNLP, ICML, NeurIPS, WMT, AMTA, EAMT). We have fuelled open-source projects such as OpenKiwi and COMET, which many researchers around the world, in both industry and academia, now rely on for translation quality estimation and evaluation. You can find some of our work at 

With that, our team is putting pen to paper and providing a platform that gives other researchers and engineers a glimpse into our latest research advances at Unbabel, with the goal of inspiring others to use and build on our work and to contribute to our open-source tools.

We will launch this new R&D blog with a sequence of posts showcasing several of our recent research achievements, with topics that include:

Stay tuned!

About the Author

André Martins

André Martins is the VP of AI Research at Unbabel, an Associate Professor at IST, and a researcher at IT. He received a dual-degree PhD (2012) in Language Technologies from Carnegie Mellon University and IST. His PhD thesis received an Honorable Mention in CMU's SCS Dissertation Award and won the Portuguese IBM Scientific Prize. His research interests include natural language processing (NLP), machine learning, structured prediction, and sparse modeling, in particular the use of sparse attention mechanisms to induce interpretability in deep learning systems. He co-founded and co-organizes the Lisbon Machine Learning School (LxMLS, 2011--2019). He received a best paper award at ACL 2009 and a best system demonstration paper award at ACL 2019. He recently won an ERC Starting Grant for his DeepSPIN project (2018--2023), whose goal is to develop new deep learning models and algorithms for structured prediction in NLP applications.