(The Federalist) It’s hard to survey the state of our country and not conclude that something is very wrong in America. I don’t just mean with our economy or the border or rampant crime in our cities, but with our basic grasp on reality itself.
Our cultural and political elite now insist that men can become women, and vice versa, and that even children can consent to what they euphemistically call “gender-affirming care.” In a perfect inversion of reason and common sense, some Democratic lawmakers now want laws on the books forcing parents to affirm their child’s “gender identity” on pain of having the child taken from them by the state for abuse.
Abortion, which was once reluctantly defended only on the grounds that it should be “safe, legal, and rare,” is now championed as a positive good, even at later stages of pregnancy. Abortion advocates now insist that the only difference between an unborn child with rights and one without them is whether the mother desires to carry the pregnancy to term.