Nida, Eugene A. Componential Analysis of Meaning: An Introduction to Semantic Structures.
Nida, Eugene A. "Semantic Components in Translation Theory," in Perren, G. E. (ed.).
Jackson, Words and their Meaning.

Key words: components, analysis, meaning, semantics

Eugene Nida: Componential Analysis of Meaning



His linguistic theory moves towards the fields of semantics and pragmatics, which leads him to develop systems for the analysis of meaning. The British translation theorist Peter Newmark, influenced by the work of Nida, feels that the difference between the source language and the target language will always be a major problem, making total equivalence virtually impossible (Munday, p.).

Introduction: the historical background

He also works in the area of correspondence, a linguistic field dedicated to examining similarities and differences between two language systems.

Mona Baker and Bassnett both acknowledge its importance while, at the same time, placing it in the context of cultural and other factors. The emphasis of the structural approach to translation changes towards the end of the 1950s and early 1960s with the work of Vinay and Darbelnet, and Catford, and the concept of the translation shift, which examines the linguistic changes that take place in the translation between the ST and TT (Munday, p.).

Two other important features arise from the work of Vinay and Darbelnet. Option is an important element in translation because it allows for the possible subjective interpretation of the text, especially literary texts (Munday, pp.). His systematic linguistic approach to translation considers the relationship between textual equivalence and formal correspondence. Catford also considers the law of probability in translation, a feature that may be linked to the scientific interest in machine translation at the time.

Some thirty years after Vinay and Darbelnet proposed the direct and oblique strategies for translation, Kitty van Leuven-Zwart developed a more complex theory, using different terminology, based on their work. Her idea is that the final translation is the end result of numerous shifts away from the ST, and that the cumulative effect of minor changes will alter the end product.

She suggested two models for translation shifts: 1) comparative, where a comparison of the shifts within a sense unit or transeme (phrase, clause, sentence) between ST and TT is made. She proposes a model of shift based on micro-level semantic transfer. The 1970s and 1980s see a move away from the structural side of the linguistic approach as functional or communicative consideration is given to the text. Katharina Reiss continues to work on equivalence, but on the textual level rather than on the word or sentence level.

She proposes a translation strategy for different text types, and says that there are four main textual functions: 1) informative, designed for the relaying of fact. The TT of this type should be totally representative of the ST, avoiding omissions and providing explanations if required.

The TT should therefore produce the same impact on its reader as the ST produces on its reader. This means, for example, that things like the source text type may be altered if it is deemed inappropriate for the target culture.

She sees translation as an action that involves a series of players, each of whom performs a specific role in the process. The language used to label the players very much resembles Western economic jargon (initiator, commissioner, ST producer, TT producer, TT user, TT receiver), adding a dimension to the theory of translation that has as yet rarely been mentioned (Munday, pp.). Cultural issues in a sociolinguistic context therefore need to be considered.

Skopos is important because it means that the same ST can be translated in different ways depending on the purpose and the guidelines provided by the commissioner of the translation. Vermeer and Reiss co-authored Grundlegung einer allgemeine Translationstheorie (Groundwork for a General Theory of Translation), based primarily on skopos, which tries to create a general theory of translation for all texts.

As a result, criticism has been levelled at skopos on the grounds that it applies only to non-literary work (Munday, p.). I tend to disagree with this last point, because I see skopos as a means of reflecting the ability of the translator. She places emphasis on the ST, as she proposes a ST analysis that can help the translator decide which methods to employ.

Some of the features for review are subject matter, content, presupposition, composition, illustrations, italics, and sentence structure (Munday, p.). In Translation as a Purposeful Activity, her theory is developed further as she acknowledges the importance of skopos.

The information provided by the commissioner allows the translator to rank issues of concern in order before deciding on inclusions, omissions, elaborations, and whether the translation should have ST or TT priority.

One of the proponents of this approach was Michael Halliday, Head of the Linguistics Department of Sydney University, who bases his work on Systemic Functional Grammar: the relationship between the language used by the author of a text and the social and cultural setting.

Halliday says that the text type influences the register of the language: the word choice and syntax. He also says that the register can be divided into three variables: 1) field, the subject of the text; 2) tenor, the author of the text and the intended reader; and 3) mode, the form of the text; all of which are important on the semantic level.
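Halliday's three register variables can be sketched as a simple data structure. This is a minimal sketch only; the `Register` and `register_shifts` names and the example values are hypothetical illustrations of how a translator might compare ST and TT register profiles, not part of Halliday's own apparatus:

```python
from dataclasses import dataclass

@dataclass
class Register:
    """Halliday's three register variables for a text."""
    field: str  # the subject matter of the text
    tenor: str  # the author and the intended reader
    mode: str   # the form of the text

# Hypothetical profile of a source text: a printed instruction manual.
st_register = Register(
    field="appliance operation",
    tenor="manufacturer addressing a lay user",
    mode="written instructions",
)

def register_shifts(st: Register, tt: Register) -> list[str]:
    """Report which register variables differ between ST and TT."""
    return [name for name in ("field", "tenor", "mode")
            if getattr(st, name) != getattr(tt, name)]
```

A translation that turns the written manual into a spoken tutorial, for instance, would show a shift in mode while field and tenor stay constant.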

She creates a model for translation which compares variables between ST and TT before deciding whether to employ an overt or a covert translation (Stockinger, p.). An overt translation means that the target audience is well aware that what they are reading is a translation that is perhaps fixed in a foreign time and context.

This is just one of the techniques used to reveal the overt nature of the text. Such is the case with the guide leaflets distributed to visitors at Chenonceau Castle in the Loire Valley, which seem to have been created individually for an English audience and a French audience (and possibly German, Spanish, Italian and Japanese audiences), so much so that it is almost impossible to tell which is the ST and which is the TT. She examines textual structure and function and how word forms may vary between languages, such as the substitution of the imperative for the infinitive in instruction manuals between English and French.

Gender issues are raised as she discusses ways in which ambiguous gender situations can be overcome, such as adjectival agreement in French. This raises problems in translation because TT readers may not have the same knowledge as ST readers. Possible solutions are rewording or footnotes. Basil Hatim and Ian Mason co-authored two works, Discourse and the Translator and The Translator as Communicator, in which some sociolinguistic factors are applied to translation.

They look at the ways that non-verbal meaning can be transferred, such as the change from active to passive voice, which can shift or downplay the focus of the action. Like all other theories, discourse and register analysis has received its share of criticism. It has been labelled complicated and unable to deal with literary interpretation. The linguistic approach to translation theory incorporates the following concepts: meaning, equivalence, shift, text purpose and analysis, and discourse register, which can be examined in the contexts of structural and functional linguistics, semantics, pragmatics, correspondence, sociolinguistics and stylistics.

Think also of various literary phenomena such as metaphor, narrative, and poetry. Think of the role of the Holy Spirit in enabling readers to appropriate the message of Scripture. All of these point to mystery, complexity, and ultimately uncontrollable richness. In contrast to this richness, exegesis in its technical forms faces some reductionistic temptations.

We may conveniently focus on the whole area of the nature of language. What view do we hold about the nature of language? What is the nature of meaning in language? Do we allow richness here or not?


Our assumptions about language will clearly influence our approach to word meanings, sentence meanings, exegesis, and Bible translation. If we have an impoverished view of language, we are likely to have an impoverished view of the Bible as well. For example, if we think that language is designed only to communicate literal propositions, we will probably end up minimizing the functions of metaphor and allusions.

If we think that language is designed only to talk about this world, we will be suspicious of God-talk as an allegedly improper use.

Origins of language

Our challenges increase because of some unhealthy pressures deriving from the surrounding culture.

To begin with, evolutionary modes of thinking would like to trace language back to animal cries and calls. According to this kind of thinking, just as man has gradually ascended from the slime, human language has gradually ascended from grunts. Thus modern language, like modern human anatomy, finds its original essence in providing for survival.

This mode of thinking naturally throws suspicion on all use of human language for nonmaterial goals. The most material and simplest meaning is the most basic. Talk about God obviously stretches, perhaps to the breaking point, the original functions of language.

By contrast, the Bible shows that human language from the beginning included the function of serving for communication between God and man (Gen ). Speech about God and speech from God do not represent a stretch, but a normal function of human language. For example, God is the first and principal ruler over the world.

Human beings created in the image of God become subordinate rulers. The creation of human beings according to the plan of God produces an analogical relation between God's rulership and human rule.

God is king in the supreme sense, while human kings mirror his rule on a subordinate level. To call God king is not "mere" metaphor, in the sense of being unreal. It affirms a real analogy between God and man.


It involves a normal function for human language. Moreover, it is plain from Scripture that God designed language in such a way that there can be multi-dimensional, complex, nuanced communication between God and man. God can tell stories, both fictional (parables) and nonfictional.

He can expound and reason theologically, as in Romans, and he can express the full range of human emotions, as in the Psalms. The Bible contains propositional truth, but can express it either in prose or poetry. It contains both short sayings, as in Proverbs, and multi-generational histories, as in Genesis.

The meaning of one sentence in Genesis coheres with the meanings in the whole narrative. Meaning is not reducible to pellet-sized isolated sentences that are thrown together at random. For example, Genesis says, "I will make of you a great nation." And the full meaning of "great nation" can be seen only as the promise begins to find fulfillment near the end of Genesis and into Exodus.

And what do we do with a more loaded term like "blessing"? This promise contrasts subtly with the earlier arrogant attempt at Babel, where people desired to "make a name for ourselves" (Gen ). And it resonates with the later instances of blessing that run all the way through the Old Testament and into the New. All this is fairly obvious to a reasonably skilled reader.

But we must now ask whether modern theories of meaning are adequate to capture this richness. So let us look at three technical tools that have blossomed in the twentieth century: symbolic logic, structural linguistics, and translation theory. All three contribute to understanding language, but at the same time, when clumsily used, threaten to reduce meaning to one dimension.

Symbolic logic

Reflection about logic goes all the way back to Aristotle. But formal symbolic logic blossomed in the late nineteenth and early twentieth century with the work of Gottlob Frege, Bertrand Russell, and others.

And it proves useful in uncovering logical fallacies in informal reasoning. But what of its limitations? For the most part, the use of mathematical logic requires that we begin with isolated sentences. This step already involves a reduction of the full richness of human communication as it occurs in long discourses and social interaction. It also requires that a sentence be isolated from its situational context.

It then treats the sentence almost wholly in terms of its truth value. Modern evangelicalism has rightly insisted on propositional revelation in Scripture in response to liberal and neo-orthodox reductions of revelation to religious feeling and personal encounter.

But in the process, we must beware of the reverse problem of reducing the discourse of Scripture merely to its truth value. It does have truth value. But the meaning of a whole discourse, or of one sentence within it, includes more than the fact that it is true or false. It is related in meaning to many other parts of Scripture; it asks for application in our lives; it has the power to transform our hearts; and so on.

Symbolic logic is so obviously reductive in its approach to meaning that perhaps we do not need so much to remind ourselves of its reductive character. So let us pass on to the second great area of advance, structural linguistics. Human language is so complicated and multi-dimensional that simplifications had to be made in order to get linguistics started. But it is easy along the way, in the excitement of discovery, to forget those simplifications and to make exaggerated or one-sided claims about the implications.


In considering the development of structural linguistics, I will have to make some simplifications myself, and confine myself to some high points illustrating the trends.

Ferdinand de Saussure

Many consider that structural linguistics had its origin in the lectures of Ferdinand de Saussure, which were later compiled into the book Course in General Linguistics.

Linguistics will study language (langue) as a system, instead of studying speech (parole). That is, it will study the systematic regularities common to all native speakers, rather than the particularities of every individual speech by every individual speaker.

In the light of hindsight this famous move toward focusing on the system of language decisively contributed to the delineation of linguistics as a subject distinct from textual analysis and exegesis. But the advance came with a cost.

Any reasonable approach to the meaning of a specific communication (parole) must take into account the speaker, the audience, and the circumstances, since all three affect the nuances of a particular speech or text. The meaning of a particular parole naturally depends on the particular words and their meanings. But it is not simply a mechanical product of word meanings; it includes a complex particular texture that varies with circumstance.

Saussure deliberately cut off the variations in order to study "the system." In later discussion he added context back in with the distinction between syntagmatic and associative (or paradigmatic) relations.

In many ways this reduction is quite understandable, perhaps in some sense necessary, because words are stable in relation to the surrounding speech (parole), and one must start with some simplifications if one is to get linguistics off the ground. Third, Saussure introduced a model for linguistic signs with three parts: the "sound-image" or signifier, the "concept" or signified, and the "sign" that consists of both parts together.
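The three-part model can be sketched in code. The `Sign` class and the concept labels below are hypothetical illustrations, not Saussure's own notation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Sign:
    """Saussure's linguistic sign: a signifier paired with a signified.
    The Sign object itself is the third part, the union of the two."""
    signifier: str  # the "sound-image", e.g. the word's phonetic shape
    signified: str  # the "concept" it evokes

# The pairing is conventional: different languages attach different
# signifiers to (roughly) the same signified.
tree_en = Sign(signifier="tree", signified="TREE-concept")
tree_fr = Sign(signifier="arbre", signified="TREE-concept")
```

Note that nothing in the model points to actual trees in the world: reference is absent by design, which is precisely the restriction at issue here.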

This move makes sense as a way of defining more rigorously the distinction between form and meaning. But it introduces a subtle reductionism in the thinking about meaning.

Children learning a language often learn the meanings of words through their occurrences in social situations where there is reference to a real-world object.

Words for milk and soup, cat and dog come to have meaning through the help of occurrences of milk and soup and cats and dogs in the environment. In the long run, referential functions have an indispensable role in meaning. Saussure has left out reference, and settled on "concept," which suggests a purely mental phenomenon.

This restriction is once again understandable, given his earlier decision to focus on the language system. The language system does not directly refer to objects in the world in the same way that specific speakers refer to such objects in specific speeches (parole).

But one can never understand meaning in its fullness if one leaves out reference. The omission of reference offers an open door for later reductionisms, as one can see with the case of certain forms of structuralism in which language is treated as a closed system of signs that refer only to other signs.

In the hands of certain practitioners, the "meaning" of any one particular text got reduced to the central truth that meaning is a function of system.

Saussure proposed still another reduction when he shifted from "meaning" to "value." He says, "Language is a system of interdependent terms in which the value of each term results solely from the simultaneous presence of the others." One will thereby ignore both reference and the historical accumulation of potential for allusion to earlier occurrences of the same expression. The benefits of focusing on the system of oppositions are now well known and undeniable.

But we should not conceal from ourselves that these benefits derive partly from ignoring intractable complexities in what is left out.

Leonard Bloomfield

A second milestone in the development of structural linguistics occurred with Leonard Bloomfield's publication of Language. But simplifications entered in as he focused on the concerns of linguistics. For one thing, Bloomfield used a simple stimulus-response model for understanding human behavior. Meaning is effectively reduced to the meaning of an expression that is independent of the larger context.

Together with the later work Aspects of the Theory of Syntax, [16] Chomsky's Syntactic Structures had enormous influence on the direction of linguistic research, because of its appeal to rigor and formalization, and because of the impressive conclusion that certain simple types of formal grammar were provably inadequate for the complexities of natural language.

But rigor and formalization came, as usual, with a price. Chomsky stipulated that a language was "a set (finite or infinite) of sentences, each finite in length and constructed out of a finite set of elements."

It is a vast simplification, but unfortunately Chomsky did not overtly acknowledge how much it simplifies. In the next sentence after this definition, he simply declared that "All natural languages in their spoken or written form are languages in this sense." Chomsky also introduced the significant distinction between kernel sentences and nonkernel sentences.

These sentences arise within Chomsky's formalism by the application of phrase structure rules and obligatory transformations. Nonkernel sentences include passive sentences, such as "The dog was fed by the boy," and derived expressions like "It was the boy who fed the dog." Such an analysis is tempting precisely because in many cases it approximates the truth, and captures some of the core meaning or basic meaning that we associate with a sentence.
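The kernel idea can be made concrete with a toy transformation. This is a sketch only; the `passive` function and its tiny participle lexicon are hypothetical stand-ins for Chomsky's formal transformational rules:

```python
# A toy "passive transformation": an active kernel sentence, given as a
# (subject, verb, object) triple, is rewritten as a derived passive
# sentence. The past participles come from a small hypothetical lexicon.

PARTICIPLES = {"fed": "fed", "hit": "hit", "saw": "seen"}

def passive(subject: str, verb: str, obj: str) -> str:
    """Derive a passive sentence from an active kernel."""
    return f"{obj} was {PARTICIPLES[verb]} by {subject}"

kernel = ("the boy", "fed", "the dog")
# passive(*kernel) yields "the dog was fed by the boy"
```

The point of the analysis is that the passive sentence is treated as derived: its meaning is located in the kernel, with the transformation regarded as meaning-preserving.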

But as a total account of meaning it is obviously reductionistic. Linguistics has continued to develop since the Chomskyan revolution, and Chomsky's generative grammar eventually mutated into the theory of government and binding, and then into the minimalist program. But we also see challenges from competing theories. Cognitive linguistics, with its meaning-centered approach, challenges the grammar-centered approach of generative grammar and its successors.

Bible translators confronted the task of translating into thousands of third-world tribal languages. Eugene Nida, in consultation with other pioneers in the field, developed the theory of "dynamic equivalence" or "functional equivalence," which stressed the importance of transferring meaning, not grammatical form. He spoke explicitly about many dimensions of meaning, and referred favorably to Roman Jakobson's classification of meaning into emotive, conative, referential, poetic, phatic, and metalingual dimensions.

If, as is obviously true, each person employs language on the basis of his background, and no two individuals ever have precisely the same background, then it is also obvious that no two persons ever mean exactly the same thing by the use of the same language symbols. At the same time, however, there is an amazing degree of similarity in the use of language. So in Chapter 4 he focused on what he called "linguistic meaning."

Meaning was now to be seen within this framework. Nida was aware of this, and so in the following chapter he supplemented this account with a discussion of "referential and emotive meanings."

The temptation is all the stronger because Nida himself suggested that his scheme could serve as the basis for a translation procedure: "Instead of attempting to set up transfers from one language to another by working out long series of equivalent formal structures which are presumably adequate to 'translate' from one language into another, it is both scientifically and practically more efficient (1) to reduce the source text to its structurally simplest and most semantically evident kernels, (2) to transfer the meaning from source language to receptor language on a structurally simple level, and (3) to generate the stylistically and semantically equivalent expression in the receptor language."
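The three steps can be sketched as a pipeline. Everything below is a deliberately crude illustration: the three functions are hypothetical placeholders, since real analysis, transfer, and restructuring would each require a full grammar of both languages.

```python
# A schematic sketch of the three-step procedure (analysis, transfer,
# restructuring). All three stages here are hypothetical toys.

def reduce_to_kernels(source_text: str) -> list[str]:
    """Step 1: analyse the source text into simple kernel clauses.
    Placeholder: treat ';' as a pre-marked clause boundary."""
    return [k.strip() for k in source_text.split(";")]

def transfer(kernels: list[str], glossary: dict[str, str]) -> list[str]:
    """Step 2: transfer each kernel at a structurally simple level.
    Placeholder: word-by-word substitution from a toy glossary."""
    return [" ".join(glossary.get(w, w) for w in k.split()) for k in kernels]

def restructure(kernels: list[str]) -> str:
    """Step 3: generate a receptor-language expression.
    Placeholder: join the clauses; a real step 3 would also fix word
    order (e.g. 'Dieu aime nous' should become 'Dieu nous aime')."""
    return ", ".join(kernels) + "."

glossary = {"God": "Dieu", "loves": "aime", "us": "nous"}  # toy glossary
tt = restructure(transfer(reduce_to_kernels("God loves us"), glossary))
```

Even this caricature shows where the scheme concentrates its effort: all interesting work happens at the kernel level, which is exactly the reduction under discussion.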

All languages show "remarkably similar kernel structures." In addition, the nonkernel structures do not necessarily reveal the underlying semantic relations directly. For example, the sentence "he hit the man with a stick" [41] may mean either that he used the stick as an instrument, or that the man who received the blow had a stick in hand.
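The two readings can be made explicit by representing the underlying structures as bracketings; the nested tuples below are an informal stand-in for parse trees:

```python
# "he hit the man with a stick": one surface string, two underlying
# structures, shown here as nested tuples standing in for parse trees.

instrument_reading = ("he", "hit", "the man", ("with", "a stick"))
# the phrase "with a stick" attaches to the verb: the stick is the
# instrument of the hitting

attachment_reading = ("he", "hit", ("the man", ("with", "a stick")))
# the phrase "with a stick" attaches to the noun: the man who was hit
# had a stick in hand
```

Same words, different structure; a translator must decide between the structures before rendering the sentence.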

Such ambiguous constructions often have to be translated differently depending on the underlying meaning.

In many contexts, this involves a decided change of meaning, since the expression "God's love" does not indicate the object of his love. One can uncover reductionistic forces in semantics (semantic features, semantic domains), structural grammar (kernel sentences, transformational generative grammar), the treatment of grammatical categories, discourse analysis, and the relation between word-meaning and sentence-meaning.
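The semantic-feature side of this can be illustrated with the textbook componential analysis of English kinship terms (man, woman, boy, girl), here encoded as a hypothetical `FEATURES` table:

```python
# Classic componential analysis: word meanings decomposed into binary
# semantic components. The feature set below is the standard textbook
# inventory for these four terms.

FEATURES = {
    "man":   {"human": True, "adult": True,  "male": True},
    "woman": {"human": True, "adult": True,  "male": False},
    "boy":   {"human": True, "adult": False, "male": True},
    "girl":  {"human": True, "adult": False, "male": False},
}

def contrast(word_a: str, word_b: str) -> list[str]:
    """Return the diagnostic components distinguishing two terms."""
    a, b = FEATURES[word_a], FEATURES[word_b]
    return [f for f in a if a[f] != b[f]]

# contrast('man', 'boy') isolates the single component 'adult'
```

The neatness of such tables is exactly what makes the approach attractive, and exactly where the reduction lies: the meaning of "man" is treated as exhausted by a handful of binary components.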
