I was watching one of my favorite shows last night....Frontline, Sick Around the World. Like the movie Sicko, it investigated healthcare systems around the world and compared them to the US. I think it was summed up best this way...."We have to take the profit out of healthcare." Since healthcare became a major business in the US, healthcare quality, availability, and affordability have suffered dearly. The other countries' officials and doctors were amazed that Americans go bankrupt over healthcare costs...very few, if any, other countries have such an issue.
Another shockingly true statement made: "Healthcare is a human right." Isn't taking care of our society physically and mentally a right of each citizen? Don't the government and businesses want everyone to be healthy and happy? Won't we be a more productive and safe society that way?
I had to laugh when I heard about the many countries that offer free mental health services including spa treatments, Eastern medical practices, and even paid vacation time. We don't even give basic care to everyone...I'd go for free dental and vision care for all.
In our family, it was a requirement that one of us have a job with full health benefits. So we have 'comprehensive' health insurance with dental and vision, and I can go to the doctor when I need to. But I have a $20 co-pay, and my prescriptions cost between $15 and $60. These co-pays have quadrupled in the past 8 years. I would not hesitate to bring my child to the ER, and I don't skip my annual physical. But I do wait to see the doctor for anything minor, and I ask for the minimum prescriptions (I take regular medications). And we have insurance. What about those people who don't have anything? Can they afford a $100 doctor visit to check out their kid's ears?
I just want to live in a society that I feel cares about me and everyone equally. A universal healthcare system is essential to our society's survival.