This post by phil2003 ( http://www.vbulletin.com/forum/showp...90&postcount=6 ) really upset me.
I am not trying to flame people or anything, but do you realize that it's not just in America that kids bring guns to school? It happens in ENGLAND and FRANCE and plenty of other countries in the world. I notice people on here (and in fact everywhere) saying horrible things about America, and how we are this horrible group of people. While I understand we can be a little stuck up and rude at times, it's not like every other country is full of little angels either.
Why is America ALWAYS the target for things like this? It especially comes from people who live in England. I don't go around saying "Damn Canadians" or "Stupid English" or anything like that, and in fact, I've rarely heard anyone bad-mouth another country, yet America seems to get it the worst. I never make rude comments about other countries, their people, or their societies, yet it seems okay for others to say things like this about America.
This country is NOT as bad as people think it is. The media tends to blow up whatever happens here (and in any country, for that matter).
My question is, WHY AMERICA? I could name a LIST of things that annoy me when people act like their country is perfect, but I don't, because I'm not rude like that.
I realize this topic will probably get locked, but I just wanted to know what people's thoughts were on this.