I study and work in a field of science that is predominantly male and far removed from any sort of liberal art, and it hasn't affected me in the least. If a place decides to fire you, or not to hire you, for those reasons, is it really somewhere you'd want to be working anyway?
I have never heard of any of those things you mentioned. Have you seen any of this happening at your work/hobbies, or only in online news articles?
What's with people constantly blaming everything on feminism?