I. A brief introduction to Chomsky

Avram Noam Chomsky is an American linguist, philosopher,[4][5] cognitive scientist, historian, and activist. He is an Institute Professor and Professor (Emeritus) in the Department of Linguistics & Philosophy at MIT, where he has worked for over 50 years. Chomsky has been described as the "father of modern linguistics" and a major figure of analytic philosophy. His work has influenced fields such as computer science, mathematics, and psychology. Chomsky is credited as the creator or co-creator of the Chomsky hierarchy, the theory of universal grammar, and the Chomsky–Schützenberger theorem. Ideologically identifying with anarcho-syndicalism and libertarian socialism, Chomsky is known for his critiques of U.S. foreign policy[12] and contemporary capitalism,[13] and he has been described as a prominent cultural figure.[14] His media criticism has included Manufacturing Consent: The Political Economy of the Mass Media (1988), co-written with Edward S. Herman, an analysis articulating the propaganda model theory for examining the media. According to the Arts and Humanities Citation Index in 1992, Chomsky was cited as a source more often than any other living scholar from 1980 to 1992, and was the eighth most cited source overall. Chomsky is the author of over 100 books.

II. Syntactic Structures, published in 1957

Syntactic Structures is a seminal book in linguistics by American linguist Noam Chomsky, first published in 1957. It laid the foundation of Chomsky's idea of transformational grammar. It contains the famous sentence "Colorless green ideas sleep furiously", which Chomsky offered as an example of a sentence that is completely grammatical, yet completely nonsensical. In Syntactic Structures, Chomsky tries to construct a "formalized theory of linguistic structure" and places emphasis on "rigorous formulations" and "precisely constructed models".

Justification of grammars

Chomsky writes that his "fundamental concern" is "the problem of justification of grammars".
He defines "a grammar of the language L" as "essentially a theory of L", as well as "a device that generates all of the grammatical sequences of L and none of the ungrammatical ones". Discussing the goals of linguistic theory, he draws parallels to theories in the physical sciences: he compares a finite corpus of utterances of a particular language to "observations", and grammatical rules to "laws" stated in terms of "hypothetical constructs" such as phonemes, phrases, etc.[15] According to Chomsky, the criteria for the "justification of grammars" are "external conditions of adequacy", the "condition of generality", and "simplicity". To choose the best grammar for a given corpus of a given language, Chomsky shows his preference for the "evaluation procedure" (which chooses the best possible grammar for a language against the aforementioned criteria) over the "discovery procedure" (a procedure employed in structural linguistics that is supposed to produce the correct grammar of a language automatically from a corpus) and the "decision procedure" (a procedure that is supposed to choose the best grammar for a language automatically from a set of competing grammars).[16]

Grammaticality

According to Chomsky, "the fundamental aim in the linguistic analysis of a language L is to separate the grammatical sequences which are the sentences of L from the ungrammatical sequences which are not sentences of L and to study the structure of the grammatical sequences."[17] By "grammatical" Chomsky means "acceptable to a native speaker".[17] Analyzing the basis of grammaticality further, Chomsky identifies three criteria that do not determine whether a sentence is grammatical: its inclusion in a corpus, its being meaningful, and its being statistically probable.
To illustrate his point, Chomsky presents the nonsensical sentence "Colorless green ideas sleep furiously"[6] and notes that even though the sentence is grammatical, it was not included in any corpus known at the time and is neither meaningful nor statistically probable. Chomsky concludes that "grammar is autonomous and independent of meaning, and that probabilistic models give no particular insight into some of the basic problems of syntactic structure."[18]

Grammar models

Assuming that a set of "grammatical" sentences of a language has been given, Chomsky then tries to figure out what sort of device or model gives an adequate account of this set of utterances. To this end, he first discusses finite state grammar, a communication-theoretic model based on a conception of language as a Markov process. He then discusses phrase structure grammar, a model based on immediate constituent analysis. He shows that both models are inadequate for the purpose of linguistic description and, as a solution, proposes his own formal theory of syntax, transformational generative grammar (TGG), "a more powerful model combining phrase structure and grammatical transformations that might remedy these inadequacies."[15] A transformational grammar has a "natural tripartite arrangement": phrase structure rules, transformational rules, and morphophonemic rules.[19] The phrase structure rules are used for the expansion of grammatical categories and for substitutions; they yield a string of morphemes. A transformational rule "operates on a given string...with a given constituent structure and converts it into a new string with a new derived constituent structure."[20] It "may rearrange strings or may add or delete morphemes."[21] Transformational rules are of two kinds: obligatory and optional. Obligatory transformations applied to the "terminal strings" of the grammar produce the "kernel of the language",[19] which consists of simple, active, declarative, affirmative sentences.
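The tripartite arrangement just described can be sketched in miniature. The rules below are an illustrative toy, not Chomsky's actual rule set: phrase structure rules derive a terminal string of morphemes, an optional passive transformation rearranges it, and morphophonemic rules spell the morphemes out as word forms.

```python
# Toy Syntactic Structures pipeline (illustrative rules, not Chomsky's own).

# 1. Phrase structure rules, applied in order; each rewrites one symbol.
#    NP1/NP2 subscripts just keep the two noun phrases distinct in this demo.
PS_RULES = [
    ("S",   ["NP1", "VP"]),
    ("VP",  ["V", "NP2"]),
    ("NP1", ["the", "man"]),
    ("NP2", ["the", "book"]),
    ("V",   ["take", "PAST"]),
]

def derive(start="S"):
    """Expand the start symbol into a terminal string of morphemes."""
    string = [start]
    for lhs, rhs in PS_RULES:
        i = string.index(lhs)
        string = string[:i] + rhs + string[i + 1:]
    return string

def passive(s):
    """2. Optional passive transformation (simplified):
       NP1 V PAST NP2  ->  NP2 be PAST V EN by NP1."""
    np1, v, tense, np2 = s[:2], s[2], s[3], s[4:]
    return np2 + ["be", tense, v, "EN", "by"] + np1

# 3. Morphophonemic rules: morpheme sequences -> word forms.
MORPHOPHONEMIC = {
    ("take", "PAST"): ["took"],
    ("be", "PAST"):   ["was"],
    ("take", "EN"):   ["taken"],
}

def spell_out(s):
    """Replace morpheme pairs with their word forms, left to right."""
    out, i = [], 0
    while i < len(s):
        if tuple(s[i:i + 2]) in MORPHOPHONEMIC:
            out += MORPHOPHONEMIC[tuple(s[i:i + 2])]
            i += 2
        else:
            out.append(s[i])
            i += 1
    return " ".join(out)

kernel = derive()
print(spell_out(kernel))            # the man took the book
print(spell_out(passive(kernel)))   # the book was taken by the man
```

The kernel sentence comes out of the phrase structure and obligatory components alone; applying the optional transformation before spell-out yields the corresponding passive.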
To produce passive, negative, interrogative, or complex sentences, one or more optional transformational rules must be applied in a particular order to the kernel sentences. At the final stage of the grammar, morphophonemic rules convert a string of words into a string of phonemes.

In Syntactic Structures, Chomsky invented the term "generative" and used it in a particular technical sense. When he says a finite set of rules "generates" the potentially infinite set of sentences of a particular human language, he means that the rules provide an explicit, structural description of those sentences.[22]

III. Aspects of the Theory of Syntax

Aspects of the Theory of Syntax is a book written by American linguist Noam Chomsky, first published in August 1965. It is known in linguistic circles simply as Aspects. Chomsky wrote Aspects to address the various deficiencies found in transformational generative grammar (TGG), a new kind of syntactic theory that he had introduced in the 1950s with the publication of his first book, Syntactic Structures. In Aspects, Chomsky presented a deeper, more extensive reformulation of TGG.

The goal of linguistic theory

In Aspects, Chomsky lays down the abstract, idealized context in which a linguistic theorist is supposed to perform his research: "Linguistic theory is concerned primarily with an ideal speaker-listener, in a completely homogeneous speech-community, who knows its language perfectly and is unaffected by such grammatically irrelevant conditions as memory limitations, distractions, shifts of attention and interest, and errors (random or characteristic) in applying his knowledge of the language in actual performance." He makes a "fundamental distinction between competence (the speaker-hearer's knowledge of his language) and performance (the actual use of language in concrete situations)." A "grammar of a language" is "a description of the ideal speaker-hearer's intrinsic competence", and this "underlying competence" is a "system of generative processes."
An "adequate grammar" should capture the basic regularities and the productive nature of a language.[6]

The structure of grammar

Chomsky summarizes his proposed structure of a grammar as follows: "A grammar contains a syntactic component, a semantic component and a phonological component...The syntactic component consists of a base and a transformational component. The base, in turn, consists of a categorial subcomponent and a lexicon. The base generates deep structures. A deep structure enters the semantic component and receives a semantic interpretation; it is mapped by transformational rules into a surface structure, which is then given a phonetic interpretation by the rules of the phonological component."[7] The addition of a semantic component to the grammar was the most important conceptual change since Syntactic Structures. Chomsky mentions that the semantic component is essentially the same as the one described in Katz and Postal (1964). Among the more technical innovations are the use of recursive phrase structure rules and the introduction of syntactic features in lexical entries to address the issue of subcategorization.

Syntactic features

In Chapter 2 of Aspects, Chomsky discusses the problem of subcategorization of lexical categories and how this information should be captured in a generalized manner in the grammar. He deems that rewrite rules are not the appropriate device for this purpose. As a solution, he borrows the idea of features from phonology. A lexical category such as noun or verb is represented by a symbol such as N or V. A set of "subcategorization rules" then analyzes these symbols into "complex symbols", each complex symbol being a set of specified "syntactic features": grammatical properties with binary values. The syntactic feature is one of the most important technical innovations of the Aspects model.
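As a sketch, the complex-symbol idea can be rendered as sets of binary features on lexical entries. The feature names below are illustrative, not Chomsky's exact inventory; the animate-object restriction loosely echoes the Aspects contrast between "sincerity may frighten the boy" and the deviant "the boy may frighten sincerity".

```python
# Lexical entries as complex symbols: sets of binary syntactic features
# (illustrative feature names, not Chomsky's exact inventory).
LEXICON = {
    "sincerity": {"N": True, "Count": False, "Animate": False},
    "boy":       {"N": True, "Count": True,  "Animate": True},
    "frighten":  {"V": True, "Trans": True,  "AnimateObject": True},
}

def may_be_object_of(noun, verb):
    """Check a selectional restriction: a verb marked +AnimateObject
    only accepts a +Animate noun as its object."""
    n, v = LEXICON[noun], LEXICON[verb]
    if v.get("AnimateObject") and not n.get("Animate"):
        return False
    return True

print(may_be_object_of("boy", "frighten"))        # True
print(may_be_object_of("sincerity", "frighten"))  # False
```

Stating such restrictions as feature checks, rather than multiplying rewrite rules for every subclass of noun and verb, is the generalization the chapter is after.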
Most contemporary grammatical theories have preserved it.

IV. Definition of transformational grammar (TGG)

Transformational grammar, or transformational-generative grammar (TGG), is a generative grammar, especially of a natural language, that has been developed in the Chomskyan tradition of phrase structure grammars (as opposed to dependency grammars). Additionally, transformational grammar is the tradition that gives rise to specific transformational grammars. Much current research in transformational grammar is inspired by Chomsky's Minimalist Program.[1]

V. Definition of universal grammar (UG)

It is a popular misconception that Chomsky proved that language is entirely innate and discovered a "universal grammar" (UG). In fact, Chomsky simply observed that while a human baby and a kitten are both capable of inductive reasoning, if they are exposed to exactly the same linguistic data, the human child will always acquire the ability to understand and produce language, while the kitten will never acquire either ability. Chomsky labeled whatever relevant capacity the human has that the cat lacks the "language acquisition device" (LAD), and suggested that one of the tasks of linguistics should be to figure out what the LAD is and what constraints it puts on the range of possible human languages. The universal features that would result from these constraints are often termed "universal grammar" or UG.[34]

VI. Definition of language acquisition

Language acquisition is the process by which humans acquire the capacity to perceive and comprehend language, as well as to produce and use words to communicate. The capacity to use language successfully requires one to acquire a range of tools, including syntax, phonetics, and an extensive vocabulary. This language may be vocalized, as in speech, or manual, as in sign. The human language capacity is represented in the brain.
Even though the human language capacity is finite, one can say and understand an infinite number of things. Evidence suggests that every individual has three recursive mechanisms that allow sentences to be extended indefinitely: relativization, complementation, and coordination.[1]

Language acquisition usually refers to first language acquisition, which studies infants' acquisition of their native language. This is distinguished from second language acquisition, which deals with the acquisition (in both children and adults) of additional languages.

The capacity to acquire and use language is a key aspect that distinguishes humans from other beings. Although it is difficult to pin down what aspects of language are uniquely human, there are a few design features that can be found in all known forms of human language but are missing from forms of animal communication.[2] For example, many animals are able to communicate with each other by signaling to the things around them, but this kind of communication lacks the arbitrariness of human vocabularies (in that there is nothing about the sound of the word "dog" that would hint at its meaning). Other forms of animal communication may utilize arbitrary sounds, but are unable to combine those sounds in different ways to create completely novel messages that are automatically understood by another. Hockett called this design feature of human language "productivity". It is crucial to the understanding of human language acquisition that we are not limited to a finite set of words, but rather must be able to understand and utilize a complex system that allows for an infinite number of possible messages.
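The recursive mechanisms mentioned above can be illustrated with a toy generator. Complementation alone, applied repeatedly by a single finite rule, yields grammatical sentences of unbounded length; the names and kernel sentence here are invented for the example.

```python
# A single recursive rule (complementation: S -> NP "thinks that" S)
# applied repeatedly produces sentences of any length.
def complement_sentence(depth, kernel="it rained"):
    """Embed a kernel sentence under 'X thinks that ...' depth times."""
    sentence = kernel
    speakers = ["Ann", "Ben", "Cara"]
    for i in range(depth):
        sentence = f"{speakers[i % 3]} thinks that {sentence}"
    return sentence

print(complement_sentence(0))  # it rained
print(complement_sentence(2))  # Ben thinks that Ann thinks that it rained
```

One finite rule, infinitely many distinct outputs: this is the sense in which a finite capacity supports an infinite range of messages.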
So, while many forms of animal communication exist, they differ from human languages in that they have a limited range of non-syntactically structured vocabulary tokens that lack cross-cultural variation between groups.[3]

A major question in understanding language acquisition is how these capacities are picked up by infants from what appears to be very little input. Input in the linguistic context is defined as "all words, contexts, and other forms of language to which a learner is exposed, relative to acquired proficiency in first or second languages".[4] It is difficult to believe, considering the hugely complex nature of human languages and the relatively limited cognitive abilities of an infant, that infants are able to acquire most aspects of language without being explicitly taught. Within a few years of birth, children understand the grammatical rules of their native language without being explicitly taught in the way one learns grammar in school.[5]

A range of theories of language acquisition have been proposed to explain this apparent problem. These theories include innatism and psychological nativism, in which a child is born prepared in some manner with these capacities, as opposed to other theories in which language is simply learned, as one learns to ride a bike. The conflict between the traits humans are born with and those that are a product of one's environment is often referred to as the "nature versus nurture" debate. As is the case with many other human abilities and characteristics, it appears that there are some qualities of language acquisition for which the human brain is automatically wired (a "nature" component) and some that are shaped by the particular language environment in which a person is raised.