by elkanah1 » Sun Feb 26, 2012 12:18 pm
Workers' compensation is insurance that the employer (the business) buys from an insurance company. It pays for medical bills and provides other benefits when a worker is injured on the job. It is required by law in every U.S. state except Texas. In exchange for those benefits, the worker generally cannot file a lawsuit against the employer for an on-the-job injury.
For more information, click this link:
http://www.nolo.com/legal-encyclopedia/your-right-to-workers-comp-benefits-faq-29093.html