Fat is a part of the body that we need to survive, yet as a culture we have changed the meaning of the word because of weight bias and the stigma of obesity in society, Ramos Sales said.
The word has become associated with poor health, a lack of discipline, or an unwillingness to take care of oneself, perceptions that she said aren't necessarily based on evidence.