Do people have a right to good health?
I think a lot of people's political standpoints come down to the belief that humans have a right to good health. IMO, many people are Democrats because it is the only party pushing for mandatory health care for all people.
To get a better idea of where the rest of you are coming from, I'd like to know if you think that people have a "right" to health care. I'd also like to know why you feel that way.
BTW, while thinking about this, I'm reminded of that guy, I forget his name, but everyone who has been to college knows who I'm talking about. Anyway, he was against government aid and thought that each class was on its own. I wonder if that is ultimately why people fight for healthcare: maybe the stuff about "for the good of humanity" is bullshit, and they are really fighting for it because they want to force someone else to help ensure their own survival and the survival of their family/community. I may be full of shit though, it was just a random thought.
Anyway, do you think people have a right to have others ensure their good health? Why, or why not?