Getting injured at work can spark fears about keeping your job and your ability to perform it. That fear shouldn't stop you from seeing a doctor as soon as possible.
Nearly 3 million workers were injured on the job in the United States in 2018. Unfortunately, unsafe working conditions are far from isolated incidents, and that is exactly why employers carry workers' compensation insurance.
Should you see a doctor? If you have recently suffered a work-related injury or accident, the answer is almost certainly yes.