The Home of Steven Barnes
Author, Teacher, Screenwriter


Sunday, February 28, 2010

Show Me the Women -- In Hollywood

Hollywood is no more racist or sexist than other aspects of human society. Ageist? Perhaps--it is in the business of image. Reflecting American society? Sure. But is there a country where women's films dominate? I doubt it. You might be dealing with the fact that film thrives on the unique image, and that usually means something splashy and new rather than the internal world of emotions--which is what most people mean when they say "chick flick." Hollywood will make any movie it believes will make money, and track records are everything. Saying "it's Hollywood" or "it's men" or "it's white people" or "it's America" avoids the hardest thing of all--looking at ourselves in the mirror and asking why tribalism causes the pain it does. When women studio heads make male-oriented movies, they're just doing what they believe will make money. Blaming men, or America, or white people for all of this is as blind as blaming older men for being with younger women--and not blaming the younger women as well.



www.realherosjourney.com
Read the Article at HuffingtonPost

1 comment:

Lester Spence said...

This is an empirical question that has two components:

First there's the comparison to other industries within the country. Second there's the comparison to similar industries in other countries.

I believe that Hollywood is a very distinct industry, and that there are aspects of its distinctness that DO make it differentially racist/sexist. I also believe that it is likely LESS sexist than film industries in other countries, while MORE sexist than other industries in this country. The question, though, is one of metrics. How do we measure this? The presence of female producers/directors? The pay of female actors? The presence of female gaffers/writers/etc.?

Institutions matter. Sexism and racism are not simply the product of individual attitudes, nor are they the simple product of genetics.