How do we start changing the health care crisis in this country?
What would it be like if we paid a health care tax instead of paying for health insurance? The thought came to me because I'm thinking about writing a book: what if the government paid the doctors, the nurses, and all of the hospital and doctor's office employees? I know, I know, that is a scary, dramatic thing to propose. But I don't hear anything else coming from our big bad politicians. They aren't worried about not having benefits; I'm sure they are completely covered, courtesy of the American population. Meanwhile, there are so many of us little guys, me included, suffering with no coverage at all.

What can we do to improve our situations? Would things improve if the government took over the health care industry? I think if they did, then we would all have at least basic coverage. Maybe people with more money could pay for better coverage, or something like that. If I'm not mistaken, Canada already has health care for all of its citizens, and we could look to them for guidance.

Or am I completely insane? Please, someone let me know what you think and feel about this. I'm extremely interested. Thanks!