Is it normal to believe that a person's career is based solely on politics?
Like when I watch the news, I always hear them say "President Biden is creating blah blah new jobs" and all that. I'm just asking, is it normal for people to believe that their job or career is based on, or totally decided by, Democrat/Republican politics or the President? I would think that it's how much education and experience you have that gets you a job, not "the President created this job for you"?