When you mention fascism, many people immediately think of Nazi Germany. Very few… dare I say, “Americans”… are aware that fascism was somewhat popular in the United States around the same time as the rise of the National Socialist Party in Germany. This is an important issue to understand, because the general consensus […]
Fascism in America
While everyone was distracted, arguing over the loss of Trayvon Martin, President Obama was selling our country to the devil: big government. The collusion of business and government has finally been codified in the United States, and fascism is officially upon us; not full-fledged fascism, but a confusing mess of political and business interests that […]