Is it normal to think that the media is extremely liberal?
The media in general, and especially Hollywood, has always come across to me as very liberal, to the point of kissing up to the Obamas, especially the First Lady.
Is it normal that I think this based on what I've observed?
And, no, I DON'T watch Fox News, so I haven't been "corrupted" by it or whatever. It's just that I've noticed the media tends to report on one political party more than the other.