Walmart is in the news for jumping on Obama's bandwagon. They are embracing "President Obama's call for requiring all large employers to offer health insurance to their workers". I am getting too cynical in my old age, because I really am suspicious of this. The first thing I wonder is what angle Walmart is playing here and what its ulterior motive is. The USA Today article states that Walmart is trying to improve its image regarding worker treatment, so I guess that is a good thing. They also want to block any proposal that would have employers pay the Medicaid costs of new hires. Maybe that is the real issue right there. Or maybe requiring large employers to pay for their workers' health insurance (which I agree with) would drive a bunch of companies out of business. That would leave fewer companies standing. Now who would that benefit? Hmmm, I wonder.