I don’t feel up to writing a dissertation, so I will keep it short.
Apparently, the majority of Americans feel that healthcare should be free. I don’t believe in socialist concepts. I feel that eliminating the cost of healthcare devalues medical care providers. I believe that healthcare is not a “right”; it is a luxury. Life itself is a luxury.
Obamacare increased health insurance premiums, made it mandatory to have insurance, and made it illegal for insurance providers to deny coverage to individuals with preexisting conditions.
Now, our new president is undoing the changes his predecessor implemented. Opinions are pouring out of every media outlet. Obamacare is still the law of the land, and Trump-care still has to pass through the Senate.
I don’t care what the changes are. Few things in life are free, so why should healthcare be one of them? Insurance companies are businesses. They have to earn income too, or the business will go under. Would you like the government coming into your house and telling you how to go about your day?
You are asking insurance companies to take risks and shell out more money because you feel like you should pay less. I believe it is inherently wrong to force these corporations to cater to your wants.
I’m not against valuing life; what I am against is the requirement that I fork over my earnings because of most individuals’ inability to make good choices.
I’m going to die someday. There is no way to prevent it, only prolong the inevitable. It is a luxury to be able to see a doctor. If I can’t afford healthcare, then I don’t have it. It sucks, and the possibility that I lose my life is real. I still don’t think that the nation as a whole should coddle anyone.
Well, that’s all I got. Feel free to ask questions and I will expand on my opinion.