Most Americans go their entire working lives without suffering an on-the-job injury or being diagnosed with a work-related illness. Thanks to an ever-increasing focus on better safety equipment and on educating workers to avoid illness and injury, workplaces are far safer today than they were even 20 years ago.
Unfortunately, despite the best efforts of employers and employees, injuries and work-related illnesses still occur. When they do, they must be reported. The catch is determining whether the injury or illness is truly work-related.