One of the biggest myths foisted on the American people is that the U.S. is a democracy. The claim is as specious as the claim that the U.S. economy is a “free market economy”. The U.S. never was a democracy, and there’s no evidence to suggest that it ever will be. The best that can be said about the U.S. system of government is that it’s a republic; a more accurate term, however, would be a plutocracy bordering on a kleptocracy, one on track to fascism if the American people don’t wise up. Here’s a brief history of democracy in America:
The part of North America that was colonized by the British in the 1600s and later became the United States began as a settler-colonial, white Christian ethno-state steeped in Protestant religion. Among the first settlements were Jamestown, Virginia, and the Massachusetts Bay Colony, both of which fused civil and religious authority; the Massachusetts Bay Colony in particular functioned as a Puritan theocracy. The only thing “democratic” about the early American colonies is that white Christian men who owned property were allowed to vote and participate in the political process.

The Founding Fathers (the framers of the U.S. Constitution) didn’t even include an affirmative right to vote in the Constitution, leaving it to the individual states to decide who could vote. It wasn’t until 1920, with the ratification of the 19th Amendment, that women, other than Native American women, were granted the right to vote, and not until 1924, when the Snyder Act granted U.S. citizenship to Native Americans, could Native American women vote, at least on paper; several states continued to bar Native Americans from the polls for decades afterward. Asian Americans were not fully enfranchised until 1952, when the McCarran-Walter Act removed the racial bar to naturalization. Black women were nominally granted the right to vote in 1920 as well, but in reality the individual states did everything they could to prevent the Black vote through poll taxes, literacy tests, outright violence, and the like.

To this day, the individual U.S. states continue their centuries-old methods of voter suppression and intimidation in order to prevent non-whites from voting and otherwise participating in the democratic process.