Real quick, cuz I'm on vacation:
I just saw Avatar, and I find it really strange (no better word for it) how masterful Americans have gotten at producing and consuming obviously anti-American (anti-imperial) narratives. The movie would make anyone able to follow its simple plot want to leave the theater and say, "Death to America," or at least to the America that has existed ever since it wiped out the Native Americans, took New Mexico from Mexico, went into Vietnam and Iraq, and so on. I mean, if we all so sincerely feel that Empire is a bad thing, that working against the interests of Natives is bad, why are we still doing it?
How can we watch a movie casting us as the bad guys, and like it? What a strange development in moral history.