Everything in politics seems backward
Does it seem to anyone else that in US politics, Republicans claim to stand for smaller government, staying out of your life, and lower taxes, yet during periods of Republican control I felt an increased government presence in my life, saw outrageous increases in war spending, and watched our civil rights shrink under the Patriot Act? Is it normal to feel this way?