Jamie Druhan
1 min read · Jan 25, 2023


You keep writing about Democracy as if it has ever been anything other than a hope in America. America has never had a true Democratic Republic. There have been moments when it seemed we were heading in the right direction, but Capitalism got hold of those as well.

America began as a Confederacy of white supremacist states loosely held together by the Articles of Confederation. It became an Oligarchy of the white upper crust who controlled the government and the states, a hold they solidified with a Civil War. That war freed foundational Black Americans from chattel slavery but did not give them true freedom or safety.

There were plenty of mini-wars, with bloodshed, to force the passage of laws that gave women rights, took children out of factories, made workplaces safer, broke up monopolies, regulated asylums, helped the poor, and so on. The right of women to control their own bodies, LGBTQ rights, and racial equality, among others, are still being fought for. Elections are manipulated and stolen.

What Democracy? How can you destabilize something that is not there? The real fight is to gain it in the first place.
