The myth of the woke corporation

Conservatives who criticize Hollywood as liberal may take the industry’s approach to Georgia’s abortion ban as further proof that the movie and television businesses are hopelessly biased. But to whatever extent the entertainment industry actually is liberal, its decisions about where to take stands are driven as much by the desire to retain vocal young employees and to avoid the sort of entanglements that make for bad headlines as by any sincere ideological conviction. In this environment, conservative criticism actually serves as a kind of subsidy to Hollywood among liberal observers: it’s easy to believe that a business that spends so much time angering the sanctimonious must be doing something right, and doing it for the right reasons. That perception allows executives to collect credit for waving in the direction of sacrifice while pursuing profit and market access in other areas of their business.

If Hollywood wants to monetize its perceived liberalism and make arguments for its social significance by pointing to the values it exports overseas, it ought to offer more than vague promises, throwaway efforts at representation, and late-breaking expressions of horror about abusive workplaces. The corporations that dominate the entertainment industry are in the business of entrancing, not saving, us. We’ll have to do that ourselves.
