Do yall really think Cali is any better?
If so, why? How?
California is 250% better than any of the states that I listed. Better than every state in the U.S.? I don't know, but if it's not the best, it's probably close.
Why? Because of the diversity we have here. I'm not denying that racism exists here, but to compare it to that of the Midwest or the South is...well, you can't compare it.
I've lived in Cali pretty much my whole life and have never heard the word "Nigger". Never even heard anyone talk badly towards anyone of color. In Texas, Ohio, Wisconsin and Alaska, I heard that shit on the first day out there.
Going to these other states, it's a whole different vibe out there. It's almost as if you can feel the hatred in the air. White people (from what I've seen) REALLY HATE black people out there. And vice versa...the black people REALLY HATE the white people. You see the double takes by white people...you see the evil eyes.
Now I'm not saying it doesn't exist out here, but I have yet to see it. And from that standpoint, it makes Cali a better place. Sure, one racist is too many, but to live here my whole life and not see it, that's a pretty good track record.
The diversity we have here has made Cali a better place. People of all races grow up together, go to school together, work together, etc. It's made people tolerant of each other.
In some of these other states, you can go to a city and not find one person of color...not one. That's crazy. It's no surprise that people act the way they do over there.
Again, I'm not saying it doesn't exist here, but you can't compare it to these other states...