Is it normal to stay drug-free because of doctors?
Back in the '80s, doctors were people I could trust; now they seem scary, corrupt, and the ultimate no-fun. Isn't that extreme?! Come on! They overload me with drugs and then wonder why I refuse to take my medicine. Drug medicine is a lie accepted by popular opinion; I prefer healing naturally or by sorcery. The whole drug industry has no respect for the patient, it's bullshit!

I think drugs are not healthy; you just have to tell yourself you don't need them. That's why I usually don't take pills, I heal myself with a Pepsi or Coca-Cola. Do you think these drugs (pills and injections) are good? It's Christian medicine, and Christianity is only good for suicide. Am I the only one who distrusts doctors?

I don't do drugs; drugs can't be healthy. I'd rather smoke to relieve stress than take calming medicine, which is pseudo-science. Believing a doctor just because they're a doctor is the appeal to authority: believing something because an 'expert' said so. Is this normal?