I work in hospital administration, so my views are probably biased, but I just don't see the purpose of health insurance companies. Patients are expected to pay a premium, then meet a deductible, and then, depending on their plan, they may get everything else covered or still be responsible for a portion of the bill. Meanwhile, I can say with 100% certainty that hospitals are not walking away with huge profit margins. At my hospital alone, we write off millions in uncompensated care every month.
I just can't wrap my head around why a health insurance company, which provides no actual care to the patient, is making big profits in this current system. Then they turn around and fight the hospitals on every single charge to pay as little as possible. That is what is driving up costs for the actual consumers. If hospitals want to stay open, they have no choice but to raise prices, and the patients are the ones who have to bear it.
I consider myself a Conservative, but working in hospital administration makes it difficult not to wonder what would happen if private for-profit insurance companies were eliminated. I work in a state that did not expand Medicaid, and that is just another burden on the hospital's plate. We can't turn away someone who shows up to our ER, regardless of their ability to pay. Even if we could, I don't think it would be right to do so.