flashgordon2!
Ever notice that the majority of people in a given country say they belong to whatever religion is dominant in that country? Most people born in a Christian country will become Christian. Those born in an East Asian country will say they are Buddhist or Taoist. Those born in an Arab country will say they are Muslim.
People wonder why Nazism and racism come around. Few people ever ask where those ideas go when they become 'unpopular.' Watching the original "War of the Worlds," I couldn't help noticing how at the beginning of the movie everybody seems so nice to one another, and then, when L.A. is about to get wiped out by the aliens, all of a sudden people are portrayed as CRAZY!
So, which is it? Does anybody have the balls to sort out logically what's what around here? Are there any scientists around here? Or did everybody go to school just to get their wicks wet and have a roof over their head?