Probabilistic Feature Grammars

Joshua Goodman


Abstract
We present a new formalism, probabilistic feature grammar (PFG). PFGs combine most of the best properties of several other formalisms, including those of Collins, Magerman, and Charniak, and in experiments have comparable or better performance. PFGs generate features one at a time, probabilistically, conditioning the probabilities of each feature on other features in a local context. Because the conditioning is local, efficient polynomial time parsing algorithms exist for computing inside, outside, and Viterbi parses. PFGs can produce probabilities of strings, making them potentially useful for language modeling. Precision and recall results are comparable to the state of the art with words, and the best reported without words.
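As a rough illustration of the feature-at-a-time generation described in the abstract (a sketch, not the paper's exact model or notation): each feature of a node is generated in turn, conditioned only on a small local context. Here f_1, ..., f_n stand for the features of a node and C_i for whatever local context the model conditions on; both are illustrative placeholders.

\[
P(f_1, f_2, \ldots, f_n \mid \text{context}) \;\approx\; \prod_{i=1}^{n} P\bigl(f_i \mid f_1, \ldots, f_{i-1}, C_i\bigr)
\]

Because each factor depends only on a bounded local context, inside, outside, and Viterbi probabilities can be computed with standard polynomial-time dynamic programming, as the abstract notes.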
Anthology ID:
1997.iwpt-1.13
Volume:
Proceedings of the Fifth International Workshop on Parsing Technologies
Month:
September 17-20
Year:
1997
Address:
Boston/Cambridge, Massachusetts, USA
Editors:
Anton Nijholt, Robert C. Berwick, Harry C. Bunt, Bob Carpenter, Eva Hajicova, Mark Johnson, Aravind Joshi, Ronald Kaplan, Martin Kay, Bernard Lang, Alon Lavie, Makoto Nagao, Mark Steedman, Masaru Tomita, K. Vijay-Shanker, David Weir, Kent Wittenburg, Mats Wiren
Venue:
IWPT
SIG:
SIGPARSE
Publisher:
Association for Computational Linguistics
Pages:
89–100
URL:
https://aclanthology.org/1997.iwpt-1.13
Cite (ACL):
Joshua Goodman. 1997. Probabilistic Feature Grammars. In Proceedings of the Fifth International Workshop on Parsing Technologies, pages 89–100, Boston/Cambridge, Massachusetts, USA. Association for Computational Linguistics.
Cite (Informal):
Probabilistic Feature Grammars (Goodman, IWPT 1997)
PDF:
https://aclanthology.org/1997.iwpt-1.13.pdf