We've known for the longest time that Hollywood was full of liberal nut jobs and that they may not exactly be the best examples to follow.
Given that the latest rounds of some insanely serious sexual harassment allegations are coming to light across the board, I think it's time we all say, out loud, that Hollywood has no right to lecture America ever again on the morals and values we strive for.
It's not just Weinstein anymore, folks... more and more of the Hollywood elite are getting called out. But perhaps the bigger picture here is that it's become VERY CLEAR that Hollywood was covering for him and essentially enabling his behavior.
It’s pretty hard to believe that only a few of these entertainment figures knew about Weinstein’s atrocious behavior, especially with the dozens of women coming forward to share their horror stories about the producer.
Yet, it only became a problem when it reached the pages of The New York Times last week. In the past, Weinstein could expect newspaper editors, stars like Matt Damon and other powerful folks to go to bat for him to protect his reputation.
Now that protection is all gone and Hollywood has disowned him.
Hollywood enabled Weinstein, celebrated him and laughed about his harassment for years. It’s likely that his predatory behavior is normal for the business, as Corey Feldman and Elijah Wood have alleged.
This likelihood undermines entertainment elites’ greatest aspiration: lecturing the rest of the country on moral and political issues.
Many of the stars now claiming they knew nothing about the producer’s abuse, like Ben Affleck, have used their public statements to condemn Trump and conservative proposals. They always argue from the position that celebrities somehow know right and wrong better than the average American.
Harvey Weinstein himself made this very argument back in 2009. “Hollywood has the best moral compass because it has compassion,” he told the Los Angeles Times — while on a PR campaign to get convicted child rapist Roman Polanski released from a Swiss jail cell.
Nothing says morally superior like helping out your fellow sexual predator.
Culture is meant to reflect the values and dreams of the people who consume it. When Americans watch a movie, they expect their beliefs to be reflected in what they see. TV shows and films have an incredible influence in shaping the public consciousness and how Americans view current issues.
The decline in religion and community institutions leaves the makers of popular culture in the driver’s seat for determining what is good and what is bad in our society. Pop culture is now more important than church sermons.