Is it normal to think the US is owned by a drug cartel?
Is it normal to think the USA is owned by a drug cartel, that all drugs are perfectly legal across the US, that all cops are automatically dirty, that a cop will never get busted for doing drugs, and that the USA is the only country in the world that has or ever had issues with drugs? Is this a normal set of ideas for people to believe?