I have noticed from websites and international opinion that Americans, and America itself, are looked at like we're all fat-ass arrogant little *****s. The rest of the world envies America for its greatness, I am assuming. Why is a country that supports independence and freedom on such a level looked down upon? Is it our foreign policy? Are we not giving enough money and rations to the starving nations? Did we not create somewhat of a "freedom" in Afghanistan by destroying the Taliban? Did we not send in the troops to Somalia when the people were being starved?

Why is it that when 9-11 first went down, all these European nations claimed they had our backs? Germany gifted our government with a few tanks painted red, insinuating their support in a military action. But as soon as Bush went over there to call them to arms, the Germans didn't want anything to do with it. Granted, we don't really want Germany having an army anyway... (sarcastic, has been playing Medal Of Honor), but I am tired of people saying something and doing the opposite.
Is this the type of behavior that has led America to its reputation? Or is it that America is simply the most powerful country in the world, and the world has become envious of her power? Or is it because we support Israel, and the world simply hates the Jews, or what?

I would be most interested in Negative's response... you seem to hate Americanism more than anyone here...