Top Winter Quotes

Winter Definition

(n.) The season of the year in which the sun shines most obliquely upon any region; the coldest season of the year.

(n.) The period of decay, old age, death, or the like.

(v. i.) To pass the winter; to hibernate; as, to winter in Florida.

(v. t.) To keep, feed, or manage during the winter; as, to winter young cattle on straw.