
Name: Wulan Puspitasari
NIM: 26.10.6.1.018

What is Morphology? by Mark Aronoff and Kirsten Fudeman

1 Thinking about Morphology and Morphological Analysis
1.1 What is Morphology?
1.2 Morphemes
1.3 Morphology in Action
1.3.1 Novel words and word play
1.3.2 Abstract morphological facts
1.4 Background and Beliefs

What is Morphology?

The term morphology is generally attributed to the German poet, novelist, playwright, and philosopher Johann Wolfgang von Goethe (1749-1832), who coined it early in the nineteenth century in a biological context. Its etymology is Greek: morph- means 'shape, form', and morphology is the study of form or forms. In biology morphology refers to the study of the form and structure of organisms, and in geology it refers to the study of the configuration and evolution of land forms. In linguistics morphology refers to the mental system involved in word formation or to the branch of linguistics that deals with words, their internal structure, and how they are formed.

Morphemes

A major way in which morphologists investigate words, their internal structure, and how they are formed is through the identification and study of morphemes, often defined as the smallest linguistic pieces with a grammatical function. This definition is not meant to include all morphemes, but it is the usual one and a good starting point. A morpheme may consist of a word, such as hand, or a meaningful piece of a word, such as the -ed of looked, that cannot be divided into smaller meaningful parts. Another way in which morphemes have been defined is as a pairing between sound and meaning. We have purposely chosen not to use this definition. Some morphemes have no concrete form or no continuous form, as we will see, and some do not have meanings in the conventional sense of the term.

You may also run across the term morph. The term morph is sometimes used to refer specifically to the phonological realization of a morpheme. For example, the English past tense morpheme that we spell -ed has various morphs. It is realized as [t] after the voiceless [p] of jump (cf. jumped), as [d] after the voiced [l] of repel (cf. repelled), and as [əd] after the voiceless [t] of root or the voiced [d] of wed (cf. rooted and wedded). We can also call these morphs allomorphs or variants. The appearance of one morph over another in this case is determined by voicing and the place of articulation of the final consonant of the verb stem.

Now consider the word reconsideration. We can break it into three morphemes: re-, consider, and -ation. Consider is called the stem. A stem is a base morpheme to which another morphological piece is attached. The stem can be simple, made up of only one part, or complex, itself made up of more than one piece. Here it is best to consider consider a simple stem. Although it consists historically of more than one part, most present-day speakers would treat it as an unanalyzable form. We could also call consider the root.
A root is like a stem in constituting the core of the word to which other pieces attach, but the term refers only to morphologically simple units. For example, disagree is the stem of disagreement, because it is the base to which -ment attaches, but agree is the root. Taking disagree now, agree is both the stem to which dis- attaches and the root of the entire word. Returning now to reconsideration, re- and -ation are both affixes, which means that they are attached to the stem. Affixes like re- that go before the stem are prefixes, and those like -ation that go after are suffixes.

Some readers may wonder why we have not broken -ation down further into two pieces, -ate and -ion, which function independently elsewhere. In this particular word they do not do so (cf. *reconsiderate), and hence we treat -ation as a single morpheme. It is important to take very seriously the idea that the grammatical function of a morpheme, which may include its meaning, must be constant. Consider the English words lovely and quickly. They both end with the suffix -ly. But is it the same in both words? No: when we add -ly to the adjective quick, we create an adverb that describes how fast someone does something. But when we add -ly to the noun love, we create an adjective. What on the surface appears to be a single morpheme turns out to be two. One attaches to adjectives and creates adverbs; the other attaches to nouns and creates adjectives.

There are two other sorts of affixes that you will encounter, infixes and circumfixes. Both are classic challenges to the notion of morpheme. Infixes are segmental strings that do not attach to the front or back of a word, but rather somewhere in the middle. The Tagalog infix -um- is illustrated below (McCarthy and Prince 1993: 101-5; French 1988). It creates an agent from a verb stem and appears before the first vowel of the word:

(1)  Root        -um-
     /sulat/     /s-um-ulat/     'one who wrote'
     /gradwet/   /gr-um-adwet/   'one who graduated'

The existence of infixes challenges the traditional notion of a morpheme as an indivisible unit. We want to call the stem sulat 'write' a morpheme, and yet the infix -um- breaks it up. Yet this seems to be a property of -um- rather than one of sulat. Our definition of morphemes as the smallest linguistic pieces with a grammatical function survives this challenge. Circumfixes are affixes that come in two parts: one attaches to the front of the word, and the other to the back. Circumfixes are controversial because it is possible to analyze them as consisting of a prefix and a suffix that apply to a stem simultaneously. One example is Indonesian ke- ... -an. It applies to the stem besar 'big' to form the noun ke-besar-an, meaning 'bigness, greatness' (MacDonald 1976: 63; Beard 1998: 62). Like infixes, circumfixes challenge the traditional notion of morpheme (but not the definition used here) because they involve discontinuity. We will not go any more deeply here into classical problems with morphemes, but the reader who would like to know more might consult Anderson (1992: 51-6).
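The Tagalog pattern in (1) can be sketched as a small function: insert -um- immediately before the first vowel of the stem. This is only an illustrative toy (real Tagalog phonology is richer than a five-vowel check), but it captures the placement rule described above.

```python
def infix_um(stem):
    """Insert the Tagalog agent-forming infix -um- before the first vowel."""
    vowels = "aeiou"
    for i, ch in enumerate(stem):
        if ch in vowels:
            return stem[:i] + "um" + stem[i:]
    return stem  # no vowel found: return the stem unchanged

print(infix_um("sulat"))    # s-um-ulat -> "sumulat"
print(infix_um("gradwet"))  # gr-um-adwet -> "grumadwet"
```

Note that the result is a single phonological word: the "morpheme" -um- has no fixed edge position, only a rule for where it surfaces.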

1.3 Morphology in Action

We would like to explore the idea of morphology more deeply by examining some data. These are examples of morphology in action: morphological facts of everyday life.

1.3.1 Novel words and word play

If you had been walking down the street in Ithaca, New York, a few years ago, you might have looked up and seen a sign for the music store Rebop, a name that owes its inspiration to the jazz term rebop. Rebop was originally one of the many nonsense expressions that jazz musicians threw into their vocal improvisations, starting in the early 1920s. In the 1940s, rebop became interchangeable with bebop, a term of similar origin, as the term for the rhythmically and harmonically eccentric music played by young black musicians. By the 1950s the name of this musical style was quite firmly established as simply bop. Today, the original use of rebop is known only to cognoscenti, so most people who pass by the store are likely to interpret the word as composed of the word bop and the prefix re-, which means approximately 'again'. This prefix can attach only to verbs, so we must interpret bop as a verb here. Rebop must therefore mean 'bop again', if it means anything at all. And this music store, appropriately, specialized in selling used CDs. There's something going on here with English morphology. Of course, rebop is not a perfectly well-formed English word. The verb bop means something like 'bounce', but the prefix re- normally attaches only to a verb whose meaning denotes an accomplishment. The verb rebop therefore makes little sense. But names of stores and products are designed to catch the consumer's attention, not necessarily to make sense, and this one does so by exploiting people's knowledge of English in a fairly complex way and breaking the rules so as to attract attention, as verbal art often does.

Consider now the following phrases, taken from a Toni Braxton song: "Unbreak my heart, uncry these tears." We have never seen anyone unbreak something, and you certainly can't uncry tears, but every English speaker can understand these words. We all know what it means to unbreak somebody's heart or to wish that one's heart were unbroken. If we asked somebody to unbreak my heart, we would be asking them to reverse the process of having our heart broken.

Phonology is the study of sounds and speech patterns in language. The root "phone" in phonology relates to sounds and originates from the Greek word phonema, which means sound. Phonology seeks to discern the sounds made in all human languages. The identification of universal and non-universal qualities of sounds is a crucial component of phonology, as all languages use syllables and forms of vowels and consonants. Syllables are involved in the timing of spoken language, since speaking each word takes a portion of time; syllables are units of measurement in language. Vowels allow air to escape from the mouth and nose unblocked, while consonants create more covering of the vocal tract by the tongue. The audible friction of a consonant is made by the air that cannot escape as the mouth utters the consonant. Phonemes are units of sound in a language that convey meaning. For example, changing a phoneme in a word will change its meaning, such as changing the "a" in "mad" to an "o" to produce "mod". A phoneme change can also produce no meaning at all, creating non-existent words, such as changing the "m" in "mad" or "mod" to a "j" to produce "jad" or "jod". Phonemes differ from morphemes and graphemes: a morpheme is the main unit of grammar, while a grapheme is the main unit of written language.

Ensuring proper pronunciation is a practical application of phonology. For example, phonology uses symbols to differentiate the sounds of particular vowels. Vowels are classified as "front", "central", or "back" depending on the positioning of the jaw and tongue when the vowel sounds are made. Phonology also notes lip position, such as whether the lips are spread or rounded, and whether the vowel sound is long or short. The symbol for the vowel sound in words such as "chilly" or "tin" is /i/, a front, short vowel spoken with the tongue in high position and spread lips. Contrastingly, the symbol for the vowel sound in words such as "moon" or "blue" is /u:/, a back, long vowel spoken with the tongue still in high position, but with rounded lips.
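The vowel classification just described can be laid out as a small feature table. This is a toy illustration covering only the two phonemes discussed above; the feature names are informal stand-ins for the articulatory dimensions mentioned in the text.

```python
# Articulatory features of the two vowels discussed above:
# frontness/backness, length, and lip rounding.
VOWEL_FEATURES = {
    "i":  {"position": "front", "length": "short", "lips": "spread"},   # "chilly", "tin"
    "u:": {"position": "back",  "length": "long",  "lips": "rounded"},  # "moon", "blue"
}

def describe(symbol):
    """Render a phoneme's feature bundle as a readable description."""
    f = VOWEL_FEATURES[symbol]
    return f"/{symbol}/ is a {f['position']}, {f['length']} vowel with {f['lips']} lips"

print(describe("i"))
print(describe("u:"))
```

Treating phonemes as bundles of features like this is what lets phonologists state rules ("rounded vowels pattern together") over natural classes rather than over individual sounds.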

Semantics (from Greek sēmantiká, neuter plural of sēmantikós)[1][2] is the study of meaning. It focuses on the relation between signifiers, such as words, phrases, signs, and symbols, and what they stand for, their denotata. Linguistic semantics is the study of the meaning that humans use to express themselves through language. Other forms of semantics include the semantics of programming languages, formal logics, and semiotics. The word "semantics" itself denotes a range of ideas, from the popular to the highly technical. It is often used in ordinary language to denote a problem of understanding that comes down to word selection or connotation. This problem of understanding has been the subject of many formal inquiries, over a long period of time, most notably in the field of formal semantics. In linguistics, it is the study of the interpretation of signs or symbols as used by agents or communities within particular circumstances and contexts.[3] Within this view, sounds, facial expressions, body language, and proxemics have semantic (meaningful) content, and each has several branches of study. In written language, such things as paragraph structure and punctuation have semantic content; in other forms of language, there is other semantic content.[3] The formal study of semantics intersects with many other fields of inquiry, including lexicology, syntax, pragmatics, etymology, and others, although semantics is a well-defined field in its own right, often with synthetic properties.[4] In philosophy of language, semantics and reference are related fields. Further related fields include philology, communication, and semiotics. The formal study of semantics is therefore complex.
Semantics contrasts with syntax, the study of the combinatorics of units of a language (without reference to their meaning), and pragmatics, the study of the relationships between the symbols of a language, their meaning, and the users of the language.[5] In international scientific vocabulary semantics is also called semasiology.

In linguistics, semantics is the subfield that is devoted to the study of meaning, as inherent at the levels of words, phrases, sentences, and larger units of discourse (referred to as texts). The basic area of study is the meaning of signs, and the study of relations between different linguistic units: homonymy, synonymy, antonymy, polysemy, paronyms, hypernymy, hyponymy, meronymy, metonymy, holonymy, linguistic compounds. A key concern is how meaning attaches to larger chunks of text, possibly as a result of the composition from smaller units of meaning. Traditionally, semantics has included the study of sense and denotative reference, truth conditions, argument structure, thematic roles, discourse analysis, and the linkage of all of these to syntax.

Montague grammar


In the late 1960s, Richard Montague proposed a system for defining semantic entries in the lexicon in terms of the lambda calculus. In these terms, the syntactic parse of the sentence John ate every bagel would consist of a subject (John) and a predicate (ate every bagel); Montague showed that the meaning of the sentence as a whole could be decomposed into the meanings of its parts and relatively few rules of combination. The logical predicate thus obtained would be elaborated further, e.g. using truth theory models, which ultimately relate meanings to a set of Tarskian universals, which may lie outside the logic. The notion of such meaning atoms or primitives is basic to the language of thought hypothesis from the 1970s. Despite its elegance, Montague grammar was limited by the context-dependent variability in word sense, and led to several attempts at incorporating context, such as:

- situation semantics (1980s): truth-values are incomplete, and get assigned based on context
- generative lexicon (1990s): categories (types) are incomplete, and get assigned based on context
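Montague's compositional idea can be sketched with ordinary Python functions standing in for lambda terms. The tiny "model" below (its entities and the ATE relation) is invented purely for illustration; the point is that the sentence's truth value falls out of composing the meanings of its parts.

```python
# A hypothetical toy model.
ENTITIES = {"john", "bagel1", "bagel2"}
BAGELS = {"bagel1", "bagel2"}
ATE = {("john", "bagel1"), ("john", "bagel2")}  # (eater, eaten) pairs

bagel = lambda x: x in BAGELS                 # common noun: a predicate over entities
ate   = lambda y: (lambda x: (x, y) in ATE)   # transitive verb, curried
# "every": takes two predicates P and Q; true iff every P is also a Q.
every = lambda P: (lambda Q: all(Q(x) for x in ENTITIES if P(x)))

# "John ate every bagel" composes as: every(bagel)(λy. ate(y)(john))
sentence = every(bagel)(lambda y: ate(y)("john"))
print(sentence)  # True in this model
```

Removing ("john", "bagel2") from ATE would make the same composed expression evaluate to False, which is exactly the model-relativity Montague's framework is after.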

Dynamic turn in semantics


In Chomskyan linguistics there was no mechanism for the learning of semantic relations, and the nativist view considered all semantic notions as inborn. Thus, even novel concepts were proposed to have been dormant in some sense. This view was also thought unable to address many issues, such as metaphor or associative meanings, semantic change (where meanings within a linguistic community change over time), and qualia or subjective experience. Another issue not addressed by the nativist model was how perceptual cues are combined in thought, e.g. in mental rotation.[6] This view of semantics, as an innate finite meaning inherent in a lexical unit that can be composed to generate meanings for larger chunks of discourse, is now being fiercely debated in the emerging domain of cognitive linguistics[7] and also in the non-Fodorian camp in philosophy of language.[8] The challenge is motivated by:

- factors internal to language, such as the problem of resolving indexicals or anaphora (e.g. "this x", "him", "last week"). In these situations "context" serves as the input, but the interpreted utterance also modifies the context, so it is also the output. Thus, the interpretation is necessarily dynamic and the meaning of sentences is viewed as context change potentials instead of propositions.
- factors external to language, i.e. language is not a set of labels stuck on things, but "a toolbox, the importance of whose elements lie in the way they function rather than their attachments to things."[8] This view reflects the position of the later Wittgenstein and his famous game example, and is related to the positions of Quine, Davidson, and others.
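The "context change potential" idea can be sketched concretely: instead of a sentence denoting a proposition, it denotes a function from contexts to contexts. The representation below (a dict of referents and facts, and the two hard-coded utterances) is entirely hypothetical, a minimal sketch of the input/output role of context described above.

```python
def interpret(utterance, context):
    """Return the discourse context updated by the utterance's contribution."""
    new_context = dict(context)
    if utterance == "A man walked in.":
        # An indefinite introduces a new discourse referent into the context.
        new_context["referents"] = context.get("referents", []) + ["man1"]
    elif utterance == "He sat down.":
        # A pronoun consumes the context: it needs an accessible referent.
        assert new_context.get("referents"), "no antecedent available for 'he'"
        new_context["facts"] = context.get("facts", []) + [
            ("sat", new_context["referents"][-1])
        ]
    return new_context

ctx = {}
ctx = interpret("A man walked in.", ctx)  # context is the output: gains a referent
ctx = interpret("He sat down.", ctx)      # context is the input: resolves the pronoun
print(ctx["facts"])  # [('sat', 'man1')]
```

Interpreting "He sat down." against the empty context raises an error, mirroring the fact that the pronoun is uninterpretable without prior discourse.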

A concrete example of the latter phenomenon is semantic underspecification: meanings are not complete without some elements of context. To take the example of a single word, "red", its meaning in a phrase such as "red book" is similar to many other usages, and can be viewed as compositional.[9] However, the colours implied in phrases such as "red wine" (very dark), "red hair" (coppery), "red soil", or "red skin" are very different. Indeed, these colours by themselves would not be called "red" by native speakers. These instances are contrastive, so "red wine" is so called only in comparison with the other kind of wine (which also is not "white" for the same reasons). This view goes back to de Saussure:
Each of a set of synonyms like redouter ('to dread'), craindre ('to fear'), avoir peur ('to be afraid') has its particular value only because they stand in contrast with one another. No word has a value that can be identified independently of what else is in its vicinity.[10]

and may go back to earlier Indian views on language, especially the Nyaya view of words as indicators and not carriers of meaning.[11] An attempt to defend a system based on propositional meaning for semantic underspecification can be found in the Generative Lexicon model of James Pustejovsky, who extends contextual operations (based on type shifting) into the lexicon. Thus meanings are generated on the fly based on finite context.

Prototype theory


Another set of concepts related to fuzziness in semantics is based on prototypes. The work of Eleanor Rosch in the 1970s led to the view that natural categories are not characterizable in terms of necessary and sufficient conditions, but are graded (fuzzy at their boundaries) and inconsistent as to the status of their constituent members. Systems of categories are not objectively "out there" in the world but are rooted in people's experience. These categories evolve as learned concepts of the world; meaning is not an objective truth but a subjective construct, learned from experience, and language arises out of the "grounding of our conceptual systems in shared embodiment and bodily experience".[12] A corollary of this is that the conceptual categories (i.e. the lexicon) will not be identical for different cultures, or indeed, for every individual in the same culture. This leads to another debate (see the Sapir-Whorf hypothesis or Eskimo words for snow).
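The graded-membership idea can be made concrete with a toy sketch: a prototype is a bundle of typical features, and category membership is a similarity score rather than a yes/no test. The feature sets and the simple overlap measure below are invented for illustration.

```python
# Hypothetical prototype for the category BIRD: typical, not necessary, features.
BIRD_PROTOTYPE = {"flies", "has_feathers", "sings", "small"}

def typicality(features):
    """Fraction of prototype features an instance shares (0.0 to 1.0)."""
    return len(features & BIRD_PROTOTYPE) / len(BIRD_PROTOTYPE)

robin   = {"flies", "has_feathers", "sings", "small"}
penguin = {"has_feathers", "swims"}

print(typicality(robin))    # 1.0  -> a highly typical bird
print(typicality(penguin))  # 0.25 -> a marginal member, yet still a bird
```

No single feature is necessary (penguins do not fly) and none is sufficient, which is exactly what a definition in terms of necessary and sufficient conditions cannot express.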

Theories in semantics


Model-theoretic semantics
Main article: formal semantics (linguistics)

Model-theoretic semantics originates from Montague's work (see above). It is a highly formalized theory of natural language semantics in which expressions are assigned denotations (meanings) such as individuals, truth values, or functions from one of these to another. The truth of a sentence, and more interestingly, its logical relation to other sentences, is then evaluated relative to a model.
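The model-relativity of truth can be sketched in a few lines: a predicate's denotation is a set of individuals, and the same sentence comes out true in one model and false in another. Both models below are invented for illustration.

```python
# Two hypothetical models assigning different denotations to "white".
model_a = {"white": {"snow", "milk"}}
model_b = {"white": {"milk"}}

def true_in(model, predicate, individual):
    """'predicate(individual)' is true iff the individual belongs to the
    predicate's denotation in the given model."""
    return individual in model[predicate]

print(true_in(model_a, "white", "snow"))  # True
print(true_in(model_b, "white", "snow"))  # False: truth is model-relative
```

Logical relations between sentences are then checked across all models, e.g. entailment holds when every model verifying the premise also verifies the conclusion.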
Formal (or truth-conditional) semantics
Main article: truth-conditional semantics

Pioneered by the philosopher Donald Davidson, this is another formalized theory, which aims to associate each natural language sentence with a meta-language description of the conditions under which it is true, for example: 'Snow is white' is true if and only if snow is white. The challenge is to arrive at the truth conditions for any sentence from fixed meanings assigned to the individual words and fixed rules for how to combine them. In practice, truth-conditional semantics is similar to model-theoretic semantics; conceptually, however, they differ in that truth-conditional semantics seeks to connect language with statements about the real world (in the form of meta-language statements), rather than with abstract models.

Lexical and conceptual semantics
Main article: conceptual semantics

This theory is an effort to explain properties of argument structure. The assumption behind this theory is that syntactic properties of phrases reflect the meanings of the words that head them.[13] With this theory, linguists can better deal with the fact that subtle differences in word meaning correlate with other differences in the syntactic structure that the word appears in.[13] The way this is gone about is by looking at the internal structure of words.[14] These small parts that make up the internal structure of words are referred to as semantic primitives.[14]
Lexical semantics
Main article: lexical semantics

A linguistic theory that investigates word meaning. This theory understands that the meaning of a word is fully reflected by its context; the meaning of a word is constituted by its contextual relations.[15] Therefore, a distinction between degrees of participation as well as modes of participation is made.[15] To accomplish this distinction, any part of a sentence that bears a meaning and combines with the meanings of other constituents is labeled a semantic constituent. Semantic constituents that cannot be broken down into more elementary constituents are labeled minimal semantic constituents.[15]
Computational semantics
Main article: computational semantics

Computational semantics is focused on the processing of linguistic meaning. To do this, concrete algorithms and architectures are described. Within this framework the algorithms and architectures are also analyzed in terms of decidability, time/space complexity, the data structures they require, and communication protocols.[16]

Computer science


In computer science, the term semantics refers to the meaning of languages, as opposed to their form (syntax). Additionally, the term semantic is applied to certain types of data structures specifically designed and used for representing information content.

Programming languages


Main article: semantics of programming languages

The semantics of programming languages and other languages is an important issue and area of study in computer science. Like the syntax of a language, its semantics can be defined exactly. For instance, the following statements use different syntaxes, but cause the same instructions to be executed:
x += y (C, Java, Perl, Python, Ruby, PHP, etc.)

x := x + y              (Pascal)
ADD x, y                (Intel 8086 Assembly Language)
LET X = X + Y           (early BASIC)
x = x + y               (most BASIC dialects, Fortran)
ADD Y TO X GIVING X     (COBOL)
(incf x y)              (Common Lisp)

Generally these operations would all perform an arithmetical addition of 'y' to 'x' and store the result in a variable called 'x'. Various ways have been developed to describe the semantics of programming languages formally, building on mathematical logic:[17]

- Operational semantics: The meaning of a construct is specified by the computation it induces when it is executed on a machine. In particular, it is of interest how the effect of a computation is produced.
- Denotational semantics: Meanings are modelled by mathematical objects that represent the effect of executing the constructs. Thus only the effect is of interest, not how it is obtained.
- Axiomatic semantics: Specific properties of the effect of executing the constructs are expressed as assertions. Thus there may be aspects of the executions that are ignored.
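The contrast between these styles can be sketched for the `x += y` statement shown earlier. In the denotational view, the statement's meaning is a mathematical object: a function from program states to program states. The state representation and helper name below are invented for illustration.

```python
# Denotational sketch: the meaning of "x += y" is a state-transformer function.
def denote_add_assign(x, y):
    """Return the denotation of 'x += y' as a function on states (dicts)."""
    return lambda state: {**state, x: state[x] + state[y]}

meaning = denote_add_assign("x", "y")
print(meaning({"x": 1, "y": 2}))  # {'x': 3, 'y': 2}

# Axiomatic flavour: instead of computing the new state, assert a
# Hoare-style postcondition about the effect of executing the construct.
before = {"x": 1, "y": 2}
after = meaning(before)
assert after["x"] == before["x"] + before["y"]
assert after["y"] == before["y"]  # y is unchanged
```

An operational account would instead describe the machine steps (load x, load y, add, store), i.e. how the effect is produced rather than what it denotes.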

Semantic models


Terms such as "semantic network" and "semantic data model" are used to describe particular types of data models characterized by the use of directed graphs in which the vertices denote concepts or entities in the world, and the arcs denote relationships between them. The Semantic Web refers to the extension of the World Wide Web through the embedding of additional semantic metadata, using semantic data modelling techniques such as RDF and OWL.
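A semantic network of the kind just described can be sketched as a set of RDF-style (subject, relation, object) triples, with a query over the resulting directed graph. The vocabulary is invented for illustration; real RDF would use IRIs and a library rather than plain tuples.

```python
# Directed graph as (subject, relation, object) triples:
# vertices are concepts, arcs are relationships between them.
triples = {
    ("canary", "kind_of", "bird"),
    ("bird",   "kind_of", "animal"),
    ("wing",   "part_of", "bird"),
}

def related(subject, relation):
    """Follow arcs labelled `relation` out of `subject`."""
    return {o for (s, r, o) in triples if s == subject and r == relation}

print(related("canary", "kind_of"))  # {'bird'}
print(related("wing", "part_of"))    # {'bird'}
```

Transitive queries (a canary is a kind of animal) then amount to following "kind_of" arcs repeatedly, which is the kind of inference semantic data models are designed to support.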

Psychology
In psychology, semantic memory is memory for meaning; in other words, the aspect of memory that preserves only the gist, the general significance, of remembered experience, while episodic memory is memory for ephemeral details: the individual features, or the unique particulars, of experience. Word meanings are measured by the company they keep, i.e. the relationships among words themselves in a semantic network. Memories may be transferred intergenerationally or isolated in a single generation due to a cultural disruption. Different generations may have different experiences at similar points in their own time-lines. This may then create a vertically heterogeneous semantic net for certain words in an otherwise homogeneous culture.[18] In a network created by people analyzing their understanding of the word (such as WordNet) the links and decomposition structures of the network are few in number and kind, and include "part of", "kind of", and similar links. In automated ontologies the links are computed vectors without explicit meaning. Various automated technologies are being developed to compute the meaning of words: latent semantic indexing and support vector machines, as well as natural language processing, neural networks, and predicate calculus techniques.
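The "computed vectors" mentioned above can be sketched with the distributional idea that words are measured by the company they keep: represent each word by co-occurrence counts and compare words with cosine similarity. The words, contexts, and counts below are invented toy data, not output of any real indexing system.

```python
import math

# Hypothetical co-occurrence counts with three context words.
vectors = {
    "coffee": [5, 1, 0],  # counts next to: drink, hot, meow
    "tea":    [4, 2, 0],
    "cat":    [0, 0, 6],
}

def cosine(a, b):
    """Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(round(cosine(vectors["coffee"], vectors["tea"]), 3))  # high: similar company
print(cosine(vectors["coffee"], vectors["cat"]))            # 0.0: no shared contexts
```

Latent semantic indexing adds a dimensionality-reduction step (an SVD over such a matrix) so that words never seen together can still come out similar.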
