Restoring the USA's image in the world is a common liberal talking point. They have determined that America, with its unilateral approach in Iraq, has damaged our image. False. The image has been damaged for some time now. First, it was a coalition of the willing, not the USA alone. Second, now that the Europeans have chimed into the Iraq debate saying America was wrong, liberals see it as the whole world hating America. Before 9/11, extremists plotted to kill Americans. Why? We are the richest nation on Earth; do you think others hate us for that? Many nations are appalled at the pornographic industry we export to them; could that be a part of it? Our support of Israel? We give women equal rights and a voice, while many women in the world can't even leave their house alone, go out without a veil, or attend school. My belief, and I am always correct, is that liberals have found a hammer to hit conservatives with because the Europeans dislike our capitalism, and we are against socialism.--[[User:Jpatt|jp]] 18:20, 6 May 2007 (EDT)