When Words Stray from Their Roots
Imagine the following scenario: you're at a dinner party, or some other such social gathering, and are explaining to a small child that December, the twelfth month of the year, is when such holidays as Christmas and Hanukkah occur, when suddenly you are interrupted by a fellow who informs you that December is not, in point of fact, the twelfth month of the year.
Since you have the spirit of politeness, you raise an eyebrow or two and ask for clarification. Your interrupter obliges and begins to explain his reasoning: “See, the name December is based on the Latin word decem, which means ‘ten,’ and does not mean ‘twelve,’ so it actually is the tenth month of the year...”
Most of us, it seems safe to assume, would at this point begin to slowly back away from the crazy man with the rigid adherence to etymological fidelity, making sure as we do so that he does not have any sharp object in his hands, and if he does, making additionally sure that the aforementioned small child is positioned between us and said crazy man.
And yet, had the scenario been slightly different—if one were perhaps explaining to a small child the proper method for decimating the tray of cupcakes, and that very fellow had approached and explained that decimate, because it comes from a word meaning “ten” in Latin (that same decem), was most properly used to refer to removing one tenth of a thing, such as soldiers in a regiment, as a form of military punishment—in that scenario, many of us seem inclined to nod our heads and say “hmmm, you’ve got a point there.”
Why is this?
One possibility is that a word such as decimate occupies a singular position in the English language, describing something that no other word can describe, and because of this singularity of meaning some people are determined to try to prevent the word from shifting in meaning. It would be pleasant, were this indeed the motivation behind people who correct the putatively incorrect use of words such as decimate, if only because the alternative explanation is that we are awash with linguistic pedants who are trying to impose a series of capricious and illogical rules on our language use.
Have you ever said anything along the lines of “kids these days keep using the word awesome to describe things that are not, in point of fact, deserving of awe, and boy, does it burn me up”? There is a good chance that you have, since this is a very common complaint. Now, have you ever used the word awful to describe something that is not, in point of fact, full of awe? There is a good chance that you have, since this is a very common way to use this word. Why is it considered improper to use awesome to refer to such things as brunch, yet awful is, for most people, an acceptable descriptor for the meal that restaurants use to get rid of all their leftover food that is about to go bad?
There have been a number of people who have inveighed against this loose sense of awful over the years, but their ranks are thinning, and most of us seem to not mind its use very much. If you have taken these conflicting positions about awesome and awful, you needn’t feel bad about it (and you probably don’t); one of the only things that is as resolutely illogical as the English language is the way that most of us feel it should be used.
Ambrose Bierce, in his 1909 guide to usage, Write It Right, stated that dilapidated “cannot properly be used of any but a stone building,” on the grounds that “the word is from the Latin lapis, a stone.” Bierce’s admonition notwithstanding, we seem to feel comfortable referring to buildings made of wood and other substances as dilapidated. We are likewise entirely comfortable in referring to something splendid as fabulous, even though the original sense of the word was “like the contents of fables in being marvelous, incredible, absurd, extreme, exaggerated, or approaching the impossible,” and the word clearly comes from the Latin root of fabula (“conversation, narrative, tale, play, fable”).
Some defenders of semantic purity have taken the position that aggravate should not be used in the sense of “to rouse to displeasure or anger,” since an earlier meaning of the word was “to make worse.” However, if we are to insist on a word retaining its original meaning we would be stuck with only using aggravate to mean “to make heavy, or weigh down,” since this was the sense that the word first had in English (it comes from the Latin word aggravātus, the past participle of aggravāre, “to weigh down, burden, oppress, make worse”).
Some of these shifts, such as fabulous, make a certain kind of sense, and one can easily see how a word might logically move from meaning “resembling a fable” to “that’s really great.” Yet there are a number of other words in our language that have unmoored themselves in seemingly inexplicable fashion from their roots (such as talented, which can be traced back to a Latin plural for units of weight or money, talenta).
Regarding the fact that December has a root coming from the word for “ten,” yet occupies a position that does not quite match this … well, it’s hardly the only month that doesn’t match its roots. September, the ninth month of the year, comes from the Latin word for “seven” (septem), October comes from octo (“eight”), and November may be traced back to the Latin word for “nine” (novem). In the calendar used by the ancient Romans (which had ten months), each of these months did match the number indicated in its name. Around 700 BCE January and February were added on, initially placed at the end of the year. When these two months were later bumped to the front of the calendar, the final four months found themselves at odds with their roots.
When we consider the ways that people have complained about the expanded senses of such words as literally, dilapidated, awful, aggravate, and talented (the poet Samuel Taylor Coleridge hated this word), three things seem clear:
1) Words will often stray from their roots
2) People will complain about this
3) The English language will somehow survive