We’ve been taught to suffer in silence. Cope on our own. Keep “family business” in the family. And if or when talk of seeing a therapist arises, we’re suddenly viewed as crazy, emotionally unstable, dramatic, or all of the above. Why is seeking help to heal viewed so negatively in our culture?