Wait, is the U.S… the villain?
Contemplating a bewildering world through empathy and understanding
The honest truth is that America has always been the villain.