Dependency Grammar

From Emergent Wiki

Dependency grammar is a family of syntactic frameworks that represent sentence structure as a network of asymmetric relations between words, rather than as a hierarchy of nested constituents. In a dependency analysis, every word except one (the root) depends on exactly one other word, its head, so the full structure is typically a rooted, directed tree in which edges are labeled with grammatical functions: subject, object, modifier, determiner, complement.
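This structure can be made concrete as a simple head-index array. The following sketch is illustrative only: the sentence, head indices, and relation labels are assumptions for the example, not drawn from any particular treebank, and the well-formedness check encodes the single-head, single-root, acyclic conditions described above.

```python
sentence = ["The", "dog", "chased", "the", "cat"]

# heads[i] is the index of word i's head; -1 marks the root.
heads = [1, 2, -1, 4, 2]          # "The"->"dog", "dog"->"chased", ...
labels = ["det", "nsubj", "root", "det", "obj"]

def is_well_formed(heads):
    """Every word has one head, there is one root, and no cycles exist."""
    if heads.count(-1) != 1:
        return False
    for i in range(len(heads)):
        # Walk up from each word; in a tree this walk always reaches the root.
        seen, j = set(), i
        while heads[j] != -1:
            if j in seen:         # revisiting a node means a cycle
                return False
            seen.add(j)
            j = heads[j]
    return True

# Each edge pairs a head word with its dependent, labeled by function.
edges = [(sentence[heads[i]] if heads[i] != -1 else "ROOT",
          labels[i], sentence[i]) for i in range(len(sentence))]

print(is_well_formed(heads))      # True
for head, rel, dep in edges:
    print(f"{head} -{rel}-> {dep}")
```

This flat array-of-heads encoding is the same shape used by the CoNLL family of dependency formats, which is one reason dependency analyses are convenient for parsers: the whole structure is one integer per word plus a label.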

The framework has ancient roots — it was developed by Indian grammarians before the Common Era and by European grammarians in the Middle Ages — but was marginalized during the twentieth century by the dominance of generative grammar and its constituency-based formalisms. It has re-emerged as a computationally efficient alternative for parsing natural language, and has proven particularly robust for multilingual applications where constituency boundaries are difficult to establish.

Dependency grammar and constituency grammar are often presented as competitors, but they largely describe the same structural reality from different mathematical perspectives. A dependency graph can be derived from a phrase-structure tree given head-finding rules that identify the head child of each constituent, and a constituency analysis can be recovered from a dependency tree under standard assumptions. The choice between them is often a choice between computational efficiency and theoretical transparency.
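The constituency-to-dependency direction can be sketched in a few lines. The tree and the head table below are illustrative assumptions; real converters (e.g. those built on Collins-style head rules) use much more detailed, language-specific tables, but the recursion is the same: find each constituent's lexical head, then attach every non-head child's head to it.

```python
# A constituent is (label, [children]); a leaf is just a word string.
tree = ("S",
        [("NP", ["the", "dog"]),
         ("VP", ["chased",
                 ("NP", ["the", "cat"])])])

# For each phrase label, which child (by position) supplies the head.
# This tiny table is a stand-in for full head-finding rules.
HEAD_CHILD = {"S": 1, "NP": 1, "VP": 0}   # e.g. a VP is headed by its verb

def to_dependencies(node, deps):
    """Return the lexical head of `node`, appending (head, dependent) edges."""
    if isinstance(node, str):             # leaf: the word is its own head
        return node
    label, children = node
    child_heads = [to_dependencies(c, deps) for c in children]
    head = child_heads[HEAD_CHILD[label]]
    for h in child_heads:
        if h != head:                     # non-head children depend on the head
            deps.append((head, h))
    return head

deps = []
root = to_dependencies(tree, deps)
print(root)   # chased
print(deps)
```

For the toy tree this yields "chased" as the root and edges such as ("chased", "dog") and ("dog", "the"), mirroring the subject and determiner relations a dependency analysis would assign directly. Note the sketch compares heads by word string, so it assumes distinct head words within a constituent; a real implementation would track token positions instead.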