After decades of city redevelopment and downtowns coming back, I think we're seeing a new vilification of big cities, specifically urban environments, thanks to right-wing media constantly characterizing cities as "dem-run urban cesspools/hellholes/armpits etc."
And their audience, people who have never set foot here, now think they know more about our cities than we do. It's bizarre and annoying.
I have 'friends' and some relatives on social media who have never been to CA yet think they know all there is to know about LA or SF. I'm like bitch please, STFU.
__________________
"Two roads diverged in a wood, and I—I took the one less traveled by, And that has made all the difference."-Robert Frost