Post-Truthism Is Destroying America From Within

The American zeitgeist has dramatically shifted in recent years.

Today, accepting lies has become far easier than fighting for truth.

It isn't just cancel culture.

It is now possible, and increasingly likely, to be jailed simply for dealing in objective facts.