In linguistics, syntax is the set of rules, principles, and processes that govern the structure of sentences in a given language, usually including word order. The term syntax is also used to refer to the study of such principles and processes. The goal of many syntacticians is to discover the syntactic rules common to all languages.
1. Sequencing of subject, verb, and object
One basic description of a language's syntax is the sequence in which the subject (S), verb (V), and object (O) usually appear in sentences. Over 85% of languages usually place the subject first, either in the sequence SVO or the sequence SOV. The other possible sequences are VSO, VOS, OVS, and OSV, the last three of which are rare. In most generative theories of syntax, these surface differences arise from a more complex clausal phrase structure, and each order may be compatible with multiple derivations.
2. Early history
Works on grammar were written long before modern syntax came about. The Aṣṭādhyāyī of Pāṇini (c. 4th century BC), from Ancient India, is often cited as an example of a premodern work that approaches the sophistication of a modern syntactic theory. In the West, the school of thought that came to be known as "traditional grammar" began with the work of Dionysius Thrax.
For centuries, a framework known as grammaire générale, first expounded in 1660 by Antoine Arnauld in a book of the same title, dominated work in syntax. Its basic premise was the assumption that language is a direct reflection of thought processes, and that there is therefore a single, most natural way to express a thought.
The Port-Royal grammar modeled the study of syntax upon that of logic. Indeed, large parts of the Port-Royal Logic were copied or adapted from the Grammaire générale. Syntactic categories were identified with logical ones, and all sentences were analyzed in terms of "subject – copula – predicate". Initially, this view was adopted even by the early comparative linguists, such as Franz Bopp.
However, in the 19th century, with the development of historical-comparative linguistics, linguists began to realize the sheer diversity of human language and to question fundamental assumptions about the relationship between language and logic. It became apparent that there was no such thing as the most natural way to express a thought, and therefore logic could no longer be relied upon as a basis for studying the structure of language.
The central role of syntax within theoretical linguistics became clear only in the 20th century, which could reasonably be called the "century of syntactic theory" as far as linguistics is concerned. For a detailed and critical survey of the history of syntax in the last two centuries, see the monumental work by Giorgio Graffi (2001).
3. Theories
There are a number of theoretical approaches to the discipline of syntax. One school of thought, founded in the works of Derek Bickerton, sees syntax as a branch of biology, since it conceives of syntax as the study of linguistic knowledge as embodied in the human mind. Other linguists (e.g., Gerald Gazdar) take a more Platonistic view, since they regard syntax as the study of an abstract formal system. Yet others (e.g., Joseph Greenberg) consider syntax a taxonomical device to reach broad generalizations across languages.
3.1. Dependency grammar
Dependency grammar is an approach to sentence structure where syntactic units are arranged according to the dependency relation, as opposed to the constituency relation of phrase structure grammars. Dependencies are directed links between words. The finite verb is seen as the root of all clause structure and all the other words in the clause are either directly or indirectly dependent on this root. Some prominent dependency-based theories of syntax are:
- Functional generative description
- Recursive categorical syntax, or Algebraic syntax
- Meaning–text theory
- Word grammar
- Operator grammar
Lucien Tesnière (1893–1954) is widely seen as the father of modern dependency-based theories of syntax and grammar. He argued vehemently against the binary division of the clause into subject and predicate that is associated with the grammars of his day (S → NP VP) and which remains at the core of most phrase structure grammars. In the place of this division, he positioned the verb as the root of all clause structure.
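The dependency relation described above can be made concrete in a short sketch. This is an illustrative toy, not a parser: the sentence, its dependency labels, and the attachment decisions are hand-written assumptions chosen only to show the verb acting as the root on which every other word directly or indirectly depends.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Word:
    """One word in a dependency analysis."""
    form: str
    relation: str = ""                    # label of the link to the head
    head: Optional["Word"] = None         # None marks the root (the finite verb)
    dependents: list = field(default_factory=list)

def attach(head: Word, dependent: Word, relation: str) -> None:
    """Create a directed dependency link from head to dependent."""
    dependent.head = head
    dependent.relation = relation
    head.dependents.append(dependent)

# Hand-built analysis of "The cat chased a mouse": the verb is the root.
chased = Word("chased")
cat, the = Word("cat"), Word("the")
mouse, a = Word("mouse"), Word("a")
attach(chased, cat, "subject")
attach(chased, mouse, "object")
attach(cat, the, "determiner")
attach(mouse, a, "determiner")

def descendants(word: Word):
    """Every word in the clause depends, directly or indirectly, on the root."""
    for d in word.dependents:
        yield d
        yield from descendants(d)

print(sorted(w.form for w in descendants(chased)))
# ['a', 'cat', 'mouse', 'the']
```

Note that, unlike a phrase structure tree, there is no VP node here: the object `mouse` hangs directly off the verb, exactly as in Tesnière-style analyses.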
3.2. Categorial grammar
Categorial grammar is an approach that attributes the syntactic structure not to rules of grammar, but to the properties of the syntactic categories themselves. For example, rather than asserting that sentences are constructed by a rule that combines a noun phrase (NP) and a verb phrase (VP) (e.g., the phrase structure rule S → NP VP), in categorial grammar such principles are embedded in the category of the head word itself. So the syntactic category for an intransitive verb is a complex formula representing the fact that the verb acts as a function requiring an NP as an input and producing a sentence-level structure as an output. This complex category is notated as (NP\S) instead of V. (NP\S) is read as "a category that searches to the left (indicated by \) for an NP (the element on the left) and outputs a sentence (the element on the right)." The category of a transitive verb is defined as an element that requires two NPs (its subject and its direct object) to form a sentence. This is notated as (NP/(NP\S)), which means "a category that searches to the right (indicated by /) for an NP (the object) and generates a function (equivalent to the VP) which is (NP\S), which in turn represents a function that searches to the left for an NP and produces a sentence."
Tree-adjoining grammar is a categorial grammar that adds in partial tree structures to the categories.
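The function-like behavior of categories can be sketched in a few lines of code. This toy implements only the two application rules of the simplest categorial grammars: forward application (X/Y plus a Y on its right gives X) and backward application (a Y on the left plus Y\X gives X). For clarity the code writes the transitive verb in the common result-before-argument convention, (NP\S)/NP; the article's own notation renders it slightly differently, and everything else (the categories, the example words) is an invented illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Slash:
    """Complex category X/Y: looks rightward for Y, yields X."""
    result: object
    argument: object
    def __str__(self):
        return f"({self.result}/{self.argument})"

@dataclass(frozen=True)
class Backslash:
    """Complex category Y\\X: looks leftward for Y, yields X."""
    argument: object
    result: object
    def __str__(self):
        return f"({self.argument}\\{self.result})"

def combine(left, right):
    """Forward or backward application; None if the categories don't combine."""
    if isinstance(left, Slash) and left.argument == right:
        return left.result          # X/Y  Y   =>  X
    if isinstance(right, Backslash) and right.argument == left:
        return right.result         # Y  Y\X   =>  X
    return None

# An intransitive verb is NP\S: it consumes an NP on its left to make S.
sleeps = Backslash("NP", "S")
print(combine("NP", sleeps))                 # S

# A transitive verb first consumes its object NP, yielding NP\S (the "VP"),
# which then consumes the subject NP to yield S.
chases = Slash(Backslash("NP", "S"), "NP")   # (NP\S)/NP
vp = combine(chases, "NP")
print(combine("NP", vp))                     # S
```

The point of the sketch is that no phrase structure rule S → NP VP appears anywhere: sentencehood falls out of the verb's own category.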
3.3. Stochastic/probabilistic grammars and network theories
Theoretical approaches to syntax that are based upon probability theory are known as stochastic grammars. One common implementation of such an approach makes use of a neural network or connectionism.
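A minimal illustration of the probabilistic idea is a probabilistic context-free grammar (PCFG), in which each rule carries a probability and a derivation's probability is the product of the probabilities of the rules it uses. The grammar and the numbers below are invented for illustration; the only constraint respected is that each left-hand side's rule probabilities sum to 1.

```python
# Toy PCFG: each nonterminal maps to a list of (expansion, probability) pairs.
pcfg = {
    "S":  [(("NP", "VP"), 1.0)],
    "NP": [(("the", "cat"), 0.6), (("the", "dog"), 0.4)],
    "VP": [(("sleeps",), 0.7), (("barks",), 0.3)],
}

def derivation_probability(rules):
    """Multiply the probabilities of the rules used in a derivation."""
    p = 1.0
    for lhs, rhs in rules:
        p *= dict(pcfg[lhs])[rhs]   # look up this rule's probability
    return p

# Derive "the cat sleeps": S -> NP VP, NP -> the cat, VP -> sleeps.
p = derivation_probability([
    ("S", ("NP", "VP")),
    ("NP", ("the", "cat")),
    ("VP", ("sleeps",)),
])
print(round(p, 2))  # 0.42
```

Connectionist implementations replace the explicit rule table with weights learned by a network, but the underlying commitment is the same: grammaticality is graded and estimated from usage data rather than categorical.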
3.4. Functional grammars
Functionalist models of grammar study the form–function interaction by performing a structural and a functional analysis.
- Prague linguistic circle
- Systemic functional grammar
- Role and reference grammar (RRG)
- Functional discourse grammar (Dik)
3.5. Generative grammar
The hypothesis of generative grammar is that language is a biological structure. The difference between structural–functional and generative models is that, in generative grammar, the object is placed into the verb phrase. Generative grammar is meant to be used to describe all human language and to predict whether any given utterance in a hypothetical language would sound correct to a speaker of that language, as opposed to constructions which no human language would use. This approach to language was pioneered by Noam Chomsky. Most generative theories (although not all of them) assume that syntax is based upon the constituent structure of sentences. Generative grammars are among the theories that focus primarily on the form of a sentence, rather than its communicative function.
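The constituent-structure assumption can be illustrated with a toy phrase structure grammar. The grammar below is invented for illustration; its one structurally relevant feature is the generative analysis just described: the object NP is generated inside VP (VP → V NP), not as a direct daughter of S.

```python
import itertools

# Toy phrase structure grammar: the object NP sits inside the VP.
grammar = {
    "S":  [["NP", "VP"]],
    "VP": [["V", "NP"], ["V"]],      # transitive and intransitive expansions
    "NP": [["the cat"], ["the mouse"]],
    "V":  [["chased"], ["slept"]],
}

def generate(symbol):
    """Yield every string the toy grammar derives from `symbol`."""
    if symbol not in grammar:        # terminal: yield the word itself
        yield symbol
        return
    for expansion in grammar[symbol]:
        for parts in itertools.product(*(generate(s) for s in expansion)):
            yield " ".join(parts)

sentences = set(generate("S"))
print("the cat chased the mouse" in sentences)  # True
print(len(sentences))                            # 12
```

Since the grammar is finite and non-recursive, it generates exactly 12 sentences; a real generative grammar uses recursion in its rules to generate an unbounded set.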
Among the many generative theories of linguistics, the Chomskyan theories are:
- Transformational grammar (TG): the original theory of generative syntax, laid out by Chomsky in Syntactic Structures in 1957
- Government and binding theory (GB): a revised theory in the tradition of TG, developed mainly by Chomsky in the 1970s and 1980s
- Minimalist program (MP): a reworking of the theory out of the GB framework, published by Chomsky in 1995
Other theories that find their origin in the generative paradigm are:
- Harmonic grammar (HG): similar to the Optimality Theory of syntax
- Head-driven phrase structure grammar (HPSG)
- Generative semantics (superseded by semantic syntax)
- Lexical functional grammar (LFG)
- Generalized phrase structure grammar (GPSG; now largely out of date)
- Arc pair grammar
- Relational grammar (RG; now largely out of date)
3.6. Cognitive and usage-based grammars
The Cognitive Linguistics framework stems from generative grammar, but adheres to evolutionary rather than Chomskyan linguistics. Cognitive models often recognise the generative assumption that the object belongs to the verb phrase. Cognitive frameworks include:
- Construction grammar (CxG)
- Emergent grammar
- Cognitive grammar