Do I need safety boots signs in my workplace?
You may need safety boots signs in your workplace if there is a significant risk of foot injuries. The Health and Safety Executive (HSE) states that employers must provide safety signs where a significant risk remains that cannot be avoided or controlled by other means, such as safe systems of work or engineering controls.