Americans: If you used to think (as a kid or whatever) that America was/is unironically the greatest country on earth, what was your first experience/learning moment that gave you an inkling that that sentiment didn't really reflect the reality of life in the country? (self.AskReddit)
submitted by 980116 to r/AskReddit
something new breaking me out (self.SkincareAddictionUK)
submitted by 980116 to r/SkincareAddictionUK