To lead the way in improving health and health care for all, nurses must have a healthy work environment that is safe, empowering, and satisfying. Just as health care workers have a duty of care to their patients, employers have a fundamental duty of care to their employees: the duty to create a healthy work environment for them.