Proceedings of the NAACL-HLT 2012 Workshop: Will We Ever Really Replace the N-gram Model? On the Future of Language Modeling for HLT

Bhuvana Ramabhadran, Sanjeev Khudanpur, Ebru Arisoy (Editors)


Anthology ID: W12-27
Month: June
Year: 2012
Address: Montréal, Canada
Venue: NAACL
SIG:
Publisher: Association for Computational Linguistics
URL: https://aclanthology.org/W12-27
DOI:
PDF: https://aclanthology.org/W12-27.pdf


Measuring the Influence of Long Range Dependencies with Neural Network Language Models
Hai Son Le | Alexandre Allauzen | François Yvon

Large, Pruned or Continuous Space Language Models on a GPU for Statistical Machine Translation
Holger Schwenk | Anthony Rousseau | Mohammed Attik

Deep Neural Network Language Models
Ebru Arisoy | Tara N. Sainath | Brian Kingsbury | Bhuvana Ramabhadran

A Challenge Set for Advancing Language Modeling
Geoffrey Zweig | Chris J.C. Burges

Unsupervised Vocabulary Adaptation for Morph-based Language Models
André Mansikkaniemi | Mikko Kurimo

Large-scale discriminative language model reranking for voice-search
Preethi Jyothi | Leif Johnson | Ciprian Chelba | Brian Strope

Revisiting the Case for Explicit Syntactic Information in Language Models
Ariya Rastrow | Sanjeev Khudanpur | Mark Dredze