Semantic primes

From Citizendium (revision by Anthony.Sebastian)
Revision as of 18:58, 27 January 2008

See also Linguistic universals

Suppose you, as a bright child, have an excellent dictionary of the English language — The New Oxford American Dictionary or The American Heritage Dictionary of the English Language, say — and you try to use it to learn the 'meaning'[1] of every word therein, both because you lack confidence that you use meaningfully the words you ordinarily speak, and because you want to learn the meanings of all the other words your elders use. In your dictionary you will find every entry-word defined in terms of other words, of course. Determined to learn the meaning (or meanings) of every word in the dictionary, you find you need to look up the definitions of the words employed in each definition. But sooner or later you discover that you cannot complete your task — learning the meaning of every word in the dictionary — because every word's description of meaning employs other entry-words whose meanings you also wish to establish firmly in your mind. Not surprisingly, you have embarked on a circular task: the dictionary contains only a closed, finite set of words that define one another.[2]

If you do not already have in your mind a set of basic words whose meanings you somehow happen to have come to know independently, without the need of words to define them — even though you came to associate that set of basic words with their meanings by listening to your elders speaking them — you will remain forever in a circular loop in your dictionary, and fail to achieve your goal.[3]
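The circularity just described can be made concrete in a few lines of code. The sketch below (the mini-dictionary and the function name are invented for illustration, not taken from Wierzbicka's work) models a dictionary as a graph of words defined by other words: a definition search bottoms out only if some seed words, playing the role of semantic primes, are already understood without definitions.

```python
# Toy sketch (hypothetical mini-dictionary): definitions form a closed
# graph of words defined by other words, so resolving a word's meaning
# terminates only if some seed words ("primes") are known in advance.

TOY_DICTIONARY = {
    "big":   ["not", "small"],
    "small": ["not", "big"],
    "not":   ["small", "big"],   # a deliberately closed, circular set
}

def can_understand(word, known, dictionary, seen=frozenset()):
    """Return True if `word` resolves, through definitions, to words
    already in `known`; False on circularity or a missing entry."""
    if word in known:
        return True
    if word in seen or word not in dictionary:
        return False
    return all(can_understand(w, known, dictionary, seen | {word})
               for w in dictionary[word])

# Knowing nothing in advance, every lookup circles back on itself:
print(can_understand("big", set(), TOY_DICTIONARY))             # False
# Seed the mind with two "primes" and the search terminates:
print(can_understand("big", {"not", "small"}, TOY_DICTIONARY))  # True
```

The `known` set stands in for the basic words a child happens to understand without definitions; without it, every path through the toy dictionary revisits a word already on the path.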

If you read linguist Anna Wierzbicka's book, Semantics: Primes and Universals (Wierzbicka, 1996), you will find an argument, grounded in biologically plausible hypotheses and experimental observations, suggesting that we all in fact possess, as part of our inherited human faculties, a basic set of innate 'concepts' — or, perhaps more precisely, a non-conscious propensity and eagerness to acquire those concepts and encode them in sound-forms (words). The words in which those concepts become encoded Wierzbicka calls semantic primes, or alternatively, semantic primitives — 'semantic' because linguists use that term in reference to the meaning of words (linguistic symbols). Words that qualify as semantic primes need no definition in terms of other words; in that sense, they remain undefinable. We know their meaning without having to define them, and they allow us to construct the definitions of other words.

Moreover, as we shall learn later, because modern human beings share the same inherited determinants that make us human (i.e., the human genome), all our natural languages, despite the diversity of language families, share the same basic set of innate concepts, or the same propensity and eagerness to encode the same set of concepts in words. Our common grandmother and grandfather many generations removed may have encoded those concepts in a specific vocabulary, and therefore had an original set of semantic primes. The dispersal of their descendants from their African homeland throughout the world enabled the evolution of many different languages, each with a unique set of sound-forms for its words. Nevertheless the same set of semantic primes remained within each language, though expressed in differing sound-forms. Thus, as Wierzbicka argues, all modern humans have the same set of semantic primes, though not the same set of sound-forms expressing them, rendering semantic primes cross-culturally universal.

In this article we will elaborate on Wierzbicka's theory (Wierzbicka, 1996; Goddard and Wierzbicka (eds.), 1994); exemplify the list of semantic primes; show how they underlie the meaning of the non-primes in our language; give some of the experimental observations that support the claim of semantic primes as universal among human languages; discuss the contributions and comments of other linguists and scholars from other disciplines; and indicate how far back we can trace the history of the idea of semantic primes by any other name.

List of semantic primes

Consider Wierzbicka’s colleague, Cliff Goddard's words:

When Wierzbicka and colleagues claim that DO, BECAUSE, and GOOD, for example, are semantic primes, the claim is that the meanings of these words are essential for explicating the meanings of numerous other words and grammatical constructions, and that they cannot themselves be explicated in a non-circular fashion. The same applies to other examples of semantic primes such as: I, YOU, SOMEONE, SOMETHING, THIS, HAPPEN, MOVE, KNOW, THINK, WANT, SAY, WHERE, WHEN, NOT, MAYBE, LIKE, KIND OF, PART OF. Notice that all these terms identify simple and intuitively intelligible meanings which are grounded in ordinary linguistic experience (Goddard, 2002).

ALL PEOPLE KNOW ALL WORDS BELOW BECAUSE THEY HEARD THE WORDS SPOKEN A LONG TIME:[4]


Table of semantic primes, adapted from Goddard, 2002.

A universal syntax of meaning

Semantic primes represent universally meaningful concepts, but to have meaningful messages, or statements, such concepts must combine in ways that themselves convey meaning. Such meaningful combinations, in their simplest form as sentences, constitute the syntax of the language. Wierzbicka provides evidence that just as all languages use the same set of semantic primes, they also use the same, or a very similar, syntax. She states: "I am also positing certain innate and universal rules of syntax—not in the sense of some intuitively unverifiable formal syntax à la Chomsky, but in the sense of intuitively verifiable patterns determining possible combinations of primitive concepts" (Wierzbicka, 1996). She gives one example comparing the English sentence "I want to do this" with its equivalent in Russian. Although she notes certain formal differences between the two sentence structures, their semantic equivalence emerges from the "....equivalence of the primitives themselves and of the rules for their combination."

This work [of Wierzbicka and colleagues] has led to a set of highly concrete proposals about a hypothesised irreducible core of all human languages. This universal core is believed to have a fully ‘language-like’ character in the sense that it consists of a lexicon of semantic primitives together with a syntax governing how the primitives can be combined (Goddard, 1998).
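The claim that primes combine according to shared, intuitively verifiable patterns can be illustrated with a toy checker. The word classes and the single sentence pattern below are this sketch's own simplifying assumptions, not Wierzbicka's actual formalism; NSM's real combinatorial grammar is considerably richer.

```python
# Toy illustration (classes and pattern are illustrative assumptions,
# not Wierzbicka's formalism): primes grouped into rough categories,
# plus one sentence pattern of the "I want to do this" kind.

SUBSTANTIVES = {"I", "YOU", "SOMEONE", "SOMETHING"}
MENTAL_PREDICATES = {"THINK", "KNOW", "WANT", "FEEL"}
ACTIONS = {"DO", "HAPPEN", "SAY"}
DETERMINERS = {"THIS"}

def is_canonical(sentence):
    """Check a whitespace-separated sentence against one toy pattern:
    substantive + mental predicate + action + determiner."""
    words = sentence.split()
    if len(words) != 4:
        return False
    s, m, a, d = words
    return (s in SUBSTANTIVES and m in MENTAL_PREDICATES
            and a in ACTIONS and d in DETERMINERS)

print(is_canonical("I WANT DO THIS"))   # True: fits the pattern
print(is_canonical("THIS DO WANT I"))   # False: same primes, wrong pattern
```

The point of the second example is Wierzbicka's: meaning resides not in the primes alone but in the permitted patterns of combination, which are posited to be equivalent across languages even where surface word order differs.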

It might strike many as not particularly surprising that all humans today possess a common language core of semantic primes and a more or less universal syntax, inasmuch as modern science teaches that all humans today descended from a common speech-enabled male and female Homo sapiens ancestor living in Africa before the exodus of a founder group that dispersed throughout the world, at first into many geographically separate groups that developed into the so-called races. Linguist Johanna Nichols traces the origin of Homo sapiens language as far back as 130,000 years ago, perhaps only 65,000 years after the earliest Homo sapiens fossil finds (Adler, 2000). Philosopher G. J. Whitrow expresses it:

....despite the great diversity of existing languages and dialects, the capacity for language appears to be identical in all races. Consequently, we can conclude that man's linguistic ability existed before racial diversification occurred (Whitrow, 1988).

Natural Semantic Metalanguage (NSM)

In effect, the combination of a set of semantic primes, each representing a different basic concept, residing in minds with a propensity to acquire certain basic concepts, together with a common set of rules for combining those concepts into meaningful messages, constitutes a natural semantic prime language, or natural semantic metalanguage. In English, the natural semantic metalanguage reduces the language to a core that enables full development of the English language. A new word can be added as a shorthand substitute for a 'text' in the natural semantic metalanguage: a 'text' that can convey what English speakers mean by happy; by what a person does when he says something not true because he wants someone to think it true; and by what has happened if something made in one part has had something happen to it that makes it now two parts, or many parts. Any English word can be described (defined) with a text using a primitive lexicon of about 60 words (concepts) in the English natural semantic metalanguage. Likewise, any complex sentence in English can be paraphrased reductively into the core words and syntax of the natural semantic metalanguage. Such texts can capture the subtle distinctions English speakers make between happy, glad, joyful, ecstatic, etc.
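A minimal way to picture reductive paraphrase is as a legality check: an NSM 'text' (explication) should contain only semantic primes. The prime list below is a small subset drawn from the examples quoted earlier in this article, not the full inventory of about 60 words, and the function name is invented for illustration.

```python
# Illustrative sketch: check whether an explication uses only words
# from a prime lexicon. PRIMES is an abridged subset of the examples
# quoted in this article, not the full ~60-word NSM inventory.

PRIMES = {
    "i", "you", "someone", "something", "this", "do", "happen", "move",
    "know", "think", "want", "say", "because", "good", "bad", "not",
    "true", "more", "very", "feel",
}

def uses_only_primes(explication, primes):
    """Return the set of words in `explication` that fall outside the
    prime lexicon; an empty set means the text is 'legal' NSM."""
    words = explication.lower().replace(",", " ").split()
    return {w for w in words if w not in primes}

# The article's gloss of lying, rendered (approximately) in bare primes:
lie = "say something not true because want someone think this true"
print(uses_only_primes(lie, PRIMES))         # set(): every word is a prime
print(uses_only_primes("ecstatic", PRIMES))  # {'ecstatic'}: needs paraphrase
```

A word flagged by the checker, such as ecstatic, is exactly a word awaiting its own reductive paraphrase into the core lexicon.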


Notes

  1. We leave the definition of the 'meaning' of a word for the time being, assuming the reader has an intuitive understanding of a word's 'meaning'. Shortly we will see the fundamental requirement for a word to have 'meaning', as argued by Wierzbicka and colleagues.
  2. To further exemplify the circularity issue, consider that with a child on the brink of speech, you could not teach her how to recognize and pronounce every word in the dictionary and then expect her to go from there to learn every word's definition just by using the dictionary.
  3. We can now state the fundamental requirement for a word to have 'meaning', according to Wierzbicka and colleagues: one must already have in one's mind a set of basic words (semantic primes) whose meanings (concepts) one somehow happens to come to know, or develop, independently, without the need of words to define them. It would seem of little difficulty for a child just learning to speak to come to know the meaning of the semantic prime MORE, when the child repeatedly hears the sound of MORE associated with receiving what we call 'more'.
  4. A sentence using only semantic primes.

References cited

  • Goddard C. (1998) Bad arguments against semantic primitives. Theoretical Linguistics 24:129-156. [Goddard: "....this paper is heterogenous in nature and polemical in purpose...."]
  • Goddard C. (2002) The search for the shared semantic core of all languages. In: Cliff Goddard and Anna Wierzbicka (eds.) Meaning and Universal Grammar - Theory and Empirical Findings. Volume I. Amsterdam: John Benjamins. pp. 5-40.
  • Whitrow GJ. (1988) Time in History: The evolution of our general awareness of time and temporal perspective. Oxford: Oxford University Press. ISBN 0-19-215361-7. p. 11.
  • Wierzbicka A. (1996) Semantics: Primes and Universals. Oxford: Oxford University Press.