I could be completely wrong here, but I always thought of it as a cultural thing. Americans have long been strongly anti-socialist and have equated anything even remotely socialist with evil (likely due to the Cold War and their poor relationship with countries like Russia). Because of this, something like a free healthcare system can feel like a ‘slippery slope’ toward becoming the very kind of state they deem bad, so they want to avoid it at all costs. I do feel like times are changing, though, and more Americans are realizing that they can achieve things like free healthcare without becoming a socialist state.
In my opinion, as an American and former healthcare worker, it has more to do with our government and its inability to handle just about anything. Healthcare is also political. The Affordable Care Act reinforced that for a lot of people: it did give coverage to some, but at the same time it drove premiums for everyone else insanely high, as in from a few hundred dollars to thousands. How is that affordable? The insurance industry here is a joke. Those companies will do anything they can to avoid paying out, and so many of them are in the pockets of politicians. It’s a shit show, quite honestly. If the government didn’t stick its nose into everything, there would be more competition and lower prices. And it has all become so ridiculously complicated. But that’s just my opinion.