This is today’s “word of the day” from Merriam-Webster’s daily email: shibboleth. Obviously–or maybe not–it is a noun, and here are the definitions:
1. catchword, slogan
*2. a widely held belief or truism
3. a custom or usage regarded as distinctive of a particular group
When I read the first one, I thought it was a cool word. Then I read the second one, and I laughed to myself. I said, “HA! What exactly is a WIDELY held belief these days? What is truth to us??”

I am a Christian, so my life is governed by what the Bible says is truth–which, by the way, I happen to believe is THE truth. But that’s me and all my Christian brothers and sisters. What does America as a whole believe? I mean, what are some beliefs that we as a country can unanimously agree on? What is a truism amongst Americans? Now, I know there are basic things that we all believe, like you shouldn’t kill anybody (which comes from the Bible, by the way :-)), we shouldn’t harm children, and all that type of stuff that everyone in the universe believes anyway. But what are some of the deeper shibboleths in American society?

Just in case you’re missing my tone, I’m not trying to be combative. I truly am curious as to what people believe and what you think truth is. Please leave your comments!