Healthcare will never be free in the US because everyone involved needs to make money; that's what the whole system runs on. The medical field is one of the biggest industries in the world, and in the US it operates almost like a cartel, taking in trillions of dollars every year. There's a reason so many people want to become doctors: it pays extremely well. Doctors make around $200k a year, and some specialties earn close to $500k. At the end of the day, healthcare is just a business. Pretty hypocritical for a profession whose whole job is supposedly "helping people," right? If they truly wanted to help us, they wouldn't charge exorbitant prices and leave so many people buried in medical debt. Unfortunately, there will always be people who profit off the sick; that's just how the world works. They don't even see us as human beings, just as cash cows: another way to make even more money.