Syntax is the study of the structure of sentences in language. It explores how the parts of a sentence relate to each other, determining their hierarchical dependencies so that ungrammaticality can be explained systematically.
The single words of a sentence can be grouped into larger units, called constituents or phrases. Each syntactic phrase is dominated by a head, which projects its categorial characteristics onto the entire phrase. For example, a phrase like "in the mood" is headed by the element "in", which belongs to the lexical category preposition. In English, prepositions like "in" require a prepositional object (a so-called complement in syntactic terms) to their right in order to build up a prepositional phrase. Here, the prepositional object is provided by the noun phrase "the mood", which in turn splits up into smaller units: the determiner "the" and the noun "mood". Applying this strategy successively, the hierarchical relations within a sentence can be described so that the dependencies between the single words become apparent.
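The head-projection idea above can be made concrete with a small sketch. The following Python code (a hypothetical representation for illustration, not a standard linguistic library format) builds the phrase "in the mood" as a tree in which each phrase carries the category projected by its head, and prints it in the familiar labelled-bracket notation:

```python
# Minimal sketch: a constituent is a dict with a category, a head word,
# and a list of child constituents. The function and field names here
# are illustrative assumptions, not an established API.

def word(category, form):
    """A single word: a leaf constituent with no children."""
    return {"cat": category, "head": form, "children": []}

def phrase(category, head, children):
    """A phrase projects the category of its head onto the whole unit."""
    return {"cat": category, "head": head, "children": children}

# "the mood": the noun "mood" heads the noun phrase (NP).
the = word("Det", "the")
mood = word("N", "mood")
np = phrase("NP", mood["head"], [the, mood])

# "in the mood": the preposition "in" selects the NP as its complement
# to the right and projects its category, yielding a PP.
p_in = word("P", "in")
pp = phrase("PP", p_in["head"], [p_in, np])

def bracketed(node):
    """Render a constituent in labelled-bracket notation."""
    if not node["children"]:
        return f'[{node["cat"]} {node["head"]}]'
    inner = " ".join(bracketed(c) for c in node["children"])
    return f'[{node["cat"]} {inner}]'

print(bracketed(pp))  # [PP [P in] [NP [Det the] [N mood]]]
```

The printed bracketing mirrors the hierarchy described in the text: the determiner and noun combine into an NP, which in turn serves as the complement of the preposition inside the PP.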
The goal of syntax goes beyond the description of such empirical structural regularities: it aims to provide a systematic account of what a speaker/hearer (unconsciously) knows about his or her language. This knowledge is called "grammar". All human beings, whether native speakers of English, Swahili, or a sign language, share some internal (or mental) knowledge on which both the development of language in childhood (its ontogenesis) and its use by adults are based. This knowledge can be characterized as an innate universal grammar. In the ideal case, the theory of syntax defines exactly those structural representations that can be processed in some way by the human brain. This implies that syntactic representations must obey the principles of learnability and processability, principles derived from insights into language acquisition as well as from fundamental concepts in psychology. Crucial differences between specific languages provide evidence as to what factors play a central role in language in general. The insertion of so-called "dummy do" in English questions and negated sentences is a pertinent example: it shows that grammatical agreement features must be expressed obligatorily and are therefore realized through a dummy element in certain cases.
Linguistic research on syntax varies as to what types of information and structures are assumed to underlie the derivation of a syntactic string. For example, Government and Binding Theory (Chomsky 1981; Haegeman 1991) makes use of recursive syntactic rules that interact with several modules of grammar such as X-bar theory, theta theory, and case theory. By means of these rules an infinite number of well-formed sentences can be generated. A quite different view of language is advocated by Construction Grammar (cf. Goldberg 1995). Here, language is seen as a repertoire of fixed complex patterns, or constructions, that integrate form and meaning in conventionalized and often non-compositional ways. Construction Grammar can thus account for the fact that some constructions are used more frequently than others. In Functional Linguistics, the communicative and interactional functions that language serves are considered central. According to this approach, the ecological settings in which linguistic utterances can occur profoundly determine their grammatical structure. In this sense, a central directive of structural approaches to language, namely the exclusive exploration of linguistic competence to the neglect of performance, is overridden.
More recent approaches to syntax, and to language in general, also question the central importance of the competence-performance dichotomy. They explore the cognitive processes involved in the production and comprehension of an utterance at the actual moment of speech. The empirical methods used in psycholinguistics range from neurobiological on-line techniques such as EEG or fMRI (functional magnetic resonance imaging) and reaction-time studies to more off-line methods such as questionnaire and corpus studies. Common to all these approaches is the basic question of whether syntactic structure building is mentally autonomous and modular, i.e. not influenced by non-syntactic (e.g. semantic or conceptual) information from other cognitive domains. Another aspect concerns the flow of information: Is an utterance divided into smaller chunks, which are processed step by step by the different cognitive levels of grammar (serial processing)? Or must we assume a more parallel mode of processing, in which certain chunks are processed on one level while others are being processed on another?
Although research on language is a wide field in which quite divergent methods are employed, there is at present a consensus that the structural aspects of language constitute a phenomenon to be treated in an interdisciplinary fashion from a cognitive vantage point. From this perspective, syntax, and linguistics in general, is considered a branch of cognitive science (cf. Stillings et al. 1987).
References
Chomsky, N. (1981) Lectures on Government and Binding. Dordrecht: Foris.
Frazier, L. (1987) Theories of sentence processing. In: Garfield, J. L. (ed.) Modularity in Knowledge Representation and Natural-Language Understanding. Cambridge (MA): MIT Press.
Friederici, A. (1997) Neurophysiological aspects of language processing. In: Clinical Neuroscience 4, 64-72.
Goldberg, A. (1995) Constructions: A Construction Grammar Approach to Argument Structure. Chicago: University of Chicago Press.
Haegeman, L. (1991) Introduction to Government and Binding Theory. Oxford (UK), Cambridge (MA): Blackwell.
Stillings, N., M. Feinstein, J. Garfield, E. Rissland, D. Rosenbaum, S. Weisler & L. Baker-Ward (eds.) (1987) Cognitive Science: An Introduction. Cambridge (MA): MIT Press.