My sister is taking an American culture something-or-other class at the local community college. It's basically an online class built around essays and papers. Her latest topic made me curious about your opinions, because I've heard both sides. You don't have to write a paper, but which way do you lean, and why?
I'd love to hear your opinion whether or not you're American or have ever lived here. In fact, I'd especially enjoy the opinions of non-Americans on this topic.
Here is what her teacher gave them: "... should just forget the rest of the world. When we try to help, they take
advantage of us. When we try to do what we have to in order to defend
ourselves, we are viewed as evil. Well, if they don't appreciate us, fine. If
they don't like us, fine. We should just stay at home, keep our money here, and
if the rest of the world wants to go to heck in a hand basket, fine by
me!" Either agree or disagree with this sentiment and defend your answer.