What great truths about cities have you learned from movies and TV?
Considering the neverending sluice of shit in the news these days, let's chat about something a tad more lighthearted. Namely, what irrefutable truths have movies and TV shows taught you about these great cities we all love?
For instance, did you know that every third resident of Portland is a were-possum or some such? Had I never watched Grimm this knowledge would have been hidden from me.
Then there's the fact that Atlanta is the only place on God's gray earth where Black people live lives of sufficient fullness to allow them to function as main characters. Everywhere else -- everywhere else -- Blacks just can't thrive enough to get past the sidekick or ensemble team member stage. It's very sad, really. Oh, occasionally you'll get a Black person who tries to forge a main character path somewhere else, such as when Jada Pinkett Smith played a doctor in a medical drama set in Richmond, but it never takes.
And Miami! Who knew that all the crimes committed in Miami are committed by rich white people with sufficient engineering and welding experience to commit their crimes with remote-controlled robotic guns attached to the front of luxury cars? Thank goodness CSI: Miami was there to inform me!
And let's not even talk about how nice it is to live in New York, be in your early twenties, and have the time and the money to spend days at a time sitting around in coffee shops. Friends reruns taught me about that.
More will be added as they occur to me, but in the meantime, what universal truths have you learned from your studious media consumption?
__________________
"To sustain the life of a large, modern city in this cloying, clinging heat is an amazing achievement. It is no wonder that the white men and women in Greenville walk with a slow, dragging pride, as if they had taken up a challenge and intended to defy it without end." -- Rebecca West for The New Yorker, 1947