Neural Decipherment via Minimum-Cost Flow: From Ugaritic to Linear B

Jiaming Luo, Yuan Cao, Regina Barzilay


Abstract
In this paper we propose a novel neural approach to the automatic decipherment of lost languages. To compensate for the lack of a strong supervision signal, our model design is informed by patterns in language change documented in historical linguistics. The model uses an expressive sequence-to-sequence architecture to capture character-level correspondences between cognates. To train the model effectively in an unsupervised manner, we formalize the training procedure as a minimum-cost flow problem. When applied to the decipherment of Ugaritic, we achieve a 5% absolute improvement over state-of-the-art results. We also report the first automatic results in deciphering Linear B, a syllabic language related to ancient Greek, where our model correctly translates 67.3% of cognates.
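The minimum-cost flow formulation mentioned in the abstract can be illustrated with a small sketch: lost-language words supply one unit of flow each, candidate known-language cognates drain it, and edge costs stand in for (scaled) model scores, so the global flow solution picks a cheapest overall cognate assignment. All names and cost values below are hypothetical, and the integer costs are placeholders for the paper's neural correspondence scores; this only demonstrates the flow construction, not the authors' full training loop.

```python
# Illustrative min-cost-flow cognate matching (assumed toy data, not the
# paper's actual model scores). Requires networkx.
import networkx as nx

lost = ["x1", "x2"]    # hypothetical lost-language words
known = ["y1", "y2"]   # hypothetical known-language words
# Lower cost = better character-level correspondence (made-up integers).
cost = {("x1", "y1"): 1, ("x1", "y2"): 5,
        ("x2", "y1"): 4, ("x2", "y2"): 2}

G = nx.DiGraph()
G.add_node("s", demand=-len(lost))  # source supplies one unit per lost word
G.add_node("t", demand=len(lost))   # sink absorbs all flow
for x in lost:
    G.add_edge("s", x, capacity=1, weight=0)
for y in known:
    G.add_edge(y, "t", capacity=1, weight=0)
for (x, y), c in cost.items():
    G.add_edge(x, y, capacity=1, weight=c)

flow = nx.min_cost_flow(G)  # dict of dicts: flow[u][v] = units on edge (u, v)
matches = [(x, y) for x in lost for y in known if flow[x].get(y, 0) == 1]
print(matches)  # → [('x1', 'y1'), ('x2', 'y2')]
```

Because the flow is solved globally, a greedy per-word choice (e.g. x2 grabbing y1 first) cannot block a cheaper overall assignment, which is the property that makes the formulation useful as a training signal.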
Anthology ID:
P19-1303
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3146–3155
URL:
https://aclanthology.org/P19-1303
DOI:
10.18653/v1/P19-1303
Cite (ACL):
Jiaming Luo, Yuan Cao, and Regina Barzilay. 2019. Neural Decipherment via Minimum-Cost Flow: From Ugaritic to Linear B. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3146–3155, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Neural Decipherment via Minimum-Cost Flow: From Ugaritic to Linear B (Luo et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1303.pdf
Code:
j-luo93/NeuroDecipher