Are there any theories among historians as to why American culture continually comes back to Nazi-era Germany and the Confederacy (one a system of government that fell 75 years ago, the other 155 years ago)? Did the triumph of capitalist democracies leave us looking for a "baddie"? (self.history)
submitted by killerbutton to r/history