posted
So what do people here think we should do with our current health care system in the US?
I work in health care and have a personal connection to this topic. But before I share my thoughts, I'd like to hear a few from everyone else.
What do you think put us in this mess? Is there a mess at all? Should we change the system, and if so, how? Does the Constitution guarantee a right to health care? Should the government (state or federal) provide health care? What about medications? What about insurance? Should drug companies be more regulated? (BTW, pharmaceuticals are already among the most heavily regulated industries in the country; can you say FDA?)
I know this is a lot to chew on. Just trying to start a civil discussion. Let's bounce some ideas around.
posted
This thread almost certainly belongs in the "Books, Film, and American Culture" section of this site. I believe you can ask Kathy or Kristine to move it; they've done that in the past.