Are employers responsible for health insurance at all?
I think employers should be responsible for a lot of things: paying living wages, making safe products, creating safe work environments, and not scamming the government out of trillions in bailout money.
But I don’t think health care should be their responsibility. Period. Tying insurance to employment makes a business less competitive, it increases unemployment and underemployment (hello, part-time workers who are part-time only because the cost of benefits is too high), and it makes it easier to “justify” discriminating against people for health reasons.
Just imagine the kind of country we could have if health care were universal and public. How many small businesses could be started by people with great ideas who are stuck in their current jobs because they can’t afford to lose their health insurance? How many more people could a business hire if its labor costs dropped because it no longer carried the burden of health and welfare benefits?
Medicare for All is good for business.