When President Obama signed the Affordable Care Act into law in 2010, one key component of the landmark healthcare law was the "individual mandate." The individual mandate required almost all Americans to have health insura…