Health insurance mandate

A health insurance mandate is a legal requirement that either employers provide (an employer mandate) or individuals obtain (an individual mandate) private health insurance, instead of, or in addition to, a national health insurance plan.