
April 16, 2021

Did the Nazis Win The War?

I've said it many times: if an alien visited Earth in 1939, then came back now, they'd be convinced the Nazis had won the war. Especially right now, as we have global authoritarianism, a lack of individual freedom, big corporations that are above the law, and a compliant media feeding the populace the official 'truth'. All are being subjected to medical experimentation. Facts are no longer facts: everything is now 'checked', and anything that conflicts with the mainstream narrative is labelled false, or mostly false. We have sunk into the abyss.
