Salubrious comes from the Latin word salūbr(is), meaning “promoting health,” which is related to the Latin word salūs, “health.”
Salūs also gives us the Spanish word salud, “health,” which is used to say “Cheers!” (“To your health!”) when toasting drinks or in place of “Bless you!” when someone sneezes.
Use of salud in English dates back to at least the 1930s.
EXAMPLES OF SALUBRIOUS
The company’s commitment to providing a salubrious workplace includes regular ergonomic assessments and wellness programs for employees.
The fresh sea breeze and clean ocean air make this coastal town an incredibly salubrious place to live.
Pernicious comes from the Latin word perniciōsus, meaning “ruinous.”
Perniciōsus combines per-, “through,” and -nici-, a form of nex, “death, murder,” that has the stem nec-.
A few terms that share this stem are necromancy, “a method of divination through alleged communication with the dead,” and necropsy, also known as an autopsy, “the examination of a body after death.”
EXAMPLES OF PERNICIOUS
The pernicious effects of smoking can lead to severe health problems, including lung cancer and heart disease.
The invasive species had a pernicious impact on the fragile ecosystem, wiping out native plants and disrupting the natural balance.
This sense of hallucinate, used in the context of AI and added to our dictionary this year, has profound ramifications for the future of language and life. In 2023, we saw a significant increase in dictionary lookups for the word, along with increased use in digital media.
MORE ABOUT HALLUCINATE
Hallucinate was first recorded in 1595–1605 and comes from the Latin word ālūcinārī, a variant of a verb meaning “to wander mentally.”
Use of hallucinate in the context of computer science dates back to a 1971 research paper on training computers to accurately read and output handwriting.
Hallucinate, in the featured sense, began to appear in the context of machine learning and AI by the 1990s.
EXAMPLES OF HALLUCINATE
Researchers found that the chatbot hallucinates more frequently than previously thought, with as much as 10% of its output being inaccurate.
The virtual assistant began to hallucinate, providing users with helpful tips mixed with unreliable information.