Group highlights

(Full list is available on CIS website.)

It's Not Just Size That Matters: Small Language Models Are Also Few-Shot Learners

Timo Schick, Hinrich Schütze

Using much smaller models and a cloze-based reformulation of NLP tasks, the paper achieves performance comparable to GPT-3.


Placing language in an integrated understanding system: Next steps toward human-level performance in neural language models

James L. McClelland, Felix Hill, Maja Rudolph, Jason Baldridge, and Hinrich Schütze

The article takes a fresh look at contemporary NLP systems from the standpoint of cognitive neuroscience, sketching a new framework for future research.

Proceedings of the National Academy of Sciences of the United States of America


SimAlign: High Quality Word Alignments without Parallel Training Data using Static and Contextualized Embeddings

Masoud Jalili Sabet, Philipp Dufter, François Yvon and Hinrich Schütze

SimAlign is a tool that produces high-quality word alignments from static and contextualized embeddings, without requiring parallel training data.

Findings of ACL: EMNLP 2020