Is Hollywood racist? Am I irrational?
The media just seems completely racist to me. It's sickening. There are so many examples, but I'm going to focus on just two.
Hollywood hates interracial couples. Will Smith, the actor who proved a black man could carry a blockbuster, is a case in point. The two movies that do show Smith in a relationship with a white woman, he produced himself. Is that really what it takes? The audience loves his movies. The audience has no problem with a black lead or an interracial couple. But Hollywood assumes the audience hates this stuff, so it doesn't want to show it.
Whitewashing. This is a common one, and it seems like the only company that's ever reverse-whitewashed is Marvel. I'll just focus on one movie, The Hunger Games. The movie that was supposedly so progressive, proving men would watch a movie with a female lead, as long as she's white. The Hunger Games had a low budget, so the directors didn't merely pick a starlet; they held auditions for the main character, Katniss, a character described in the book as having straight dark hair and olive skin. When the auditions were advertised, the studio wrote that it was only looking for Caucasian actresses. This goes beyond simply picking the white actress over the olive-skinned one. They might as well have written "Coloreds need not apply."
Part of the reason this pisses me off so much is that people in the entertainment industry always call themselves liberal and progressive. They paint themselves as these wonderful, understanding people. They're not Bible thumpers or trailer trash; they're the good ones. It's not true at all. They're just a bunch of racists with too much power.
Poll results:
They're racist | 25
They're just playing it safe | 19
There's no racism | 13
Other | 2