I recently moved to California. Before I moved, people asked me "why are you moving there, it's so bad?". Now that I'm here, I understand it even less. The state is beautiful. There is so much to do.
I know the cost of living is high, and people think the gun control laws are ridiculous (I actually think they're reasonable, for the most part). There's a guy I work with here who says "the policies are dumb" but can't give me a solid answer on what's so bad about them.
So, what is it that California does (policy-wise) that people hate so much?
California gets trotted out in the conservative media sphere as “liberalism run wild”, a place where being what they consider to be a “real American” is illegal but crime is subsidized by the state, where everything is expensive and dangerous, and homeless people have gay sex in the street. There’s an entire industry focused on filtering for the most extremely awful news they can find in a state of almost 40 million people, packaging that news as though it’s the typical experience everyone there goes through, and then blasting that news into the brains of Americans 24/7. That image, carefully crafted to be as extremely negative as possible, is the only experience most people have with California.
The "liberalism run wild" concept is kinda what I'm curious about. Like, what things specifically? I know California protects abortion and has stronger gun control laws. But is that really it? There's gotta be more actual examples.
A lot of social programs, better employee pay and benefits, legal weed. Conservatives are just jealous that their shithole backwater hick towns will never change so they point at the scary liberal boogeyman that is “Commiefornia” in some vain hope they will get noticed.
FuckTheSouth.com