Why does the liberal media.....
Continuously publish reports about violence against women in Middle Eastern countries? Why is this even relevant to the everyday life of the average American? Aside from the media being liberal, what other agenda is there in constantly reporting it? Frankly, I don't care about women's "rights" in general, and I certainly don't care about what happens in tribal countries like Pakistan.