What it really comes down to is your perspective on work. Do you live to work, or do you work to live? When your job becomes 90% of your life, you have to accept the consequences of that choice: what happens there on a daily basis will affect your mental and physical health, because you live to work. If work is just a means of income that supports what really matters (friends, family, hobbies, etc.), then what happens there has far less impact on your life. This doesn't mean your job shouldn't be enjoyable or have a positive culture. It means that if you work somewhere without work-life balance, you should leave and find a place that has it. The reason is simple: you do not want a job that you bring home. It really is up to you to define the relationship between yourself and your employer. You have more control than you think.