When did pharmaceutical companies become so powerful and their medicines so popular, and why? Is it the same in the UK or not?
I know the USA has long had a culture of people "needing" medicines for this and that — mostly, I think, for conditions that could be managed by staying in good physical shape through exercise and a healthy diet. But pharmaceutical companies have convinced people in the USA that they "need" medicine. When did it become this way, and how is it similar or different in the UK and Europe? And why is there a difference?
- Foofa, Lv 7, 2 months ago
Via a series of Congressional actions that over the years removed a lot of commercial impediments to Big Pharma becoming as big as it is. At least three of these were supported by the new US President back when he was in Congress. So I don't think the US is going to see any brakes put on pharma any time soon. But the difference between the UK and the US is that in Britain the taxpayers pay for drugs via the taxes that support the NHS, so the profit motive isn't really there.
- ?, Lv 7, 2 months ago
In the USA, medicine is big business — a profit-driven business. The welfare of the patient is secondary to profit. Your doctor is not a healer but a salesman for pharmaceuticals and procedures. The longer they can keep you sick, the more money they make.