- #1
Kerrie
Staff Emeritus
Gold Member
It seems to me that Hollywood stars greatly shape America's image and influence in the world. I finally figured out why Americans look like a bunch of superficial idiots to the rest of the world: it is the image these stars project that the entire world sees. They are super rich, super flashy, and live among the elite, perhaps without a true understanding of reality in this world. It saddens me to see this, because I don't believe Hollywood stardom truly represents the American people, yet because of these stars' worldwide fame, this is how we look.