To all concerned
A couple of months ago, I watched a TV special which indicated that smoking
cigars is the latest fad among Hollywood actresses. It seems reasonable to
me that this fad would find its way onto the screen. I also think Peter's
comments are very true: it is meant to show women as captains of industry
in films, or at least to afford women equal status with those captains.
So in answer to your question, Chris, I would say no, I don't think you're
paranoid; I think the increase is very real.
Another real behavioral change I have noticed in a variety of TV shows and
movies is the assertiveness of females and submissiveness of males in
sexual roles. There is nothing wrong with women being sexually assertive;
it's just that in 95% of the shows I have seen lately it is always the
woman being assertive and the man being submissive. This seems a bit
unbalanced to me. In fact, it seems that only a male playing the "bad guy"
in any picture can be assertive with women anymore.
When you consider the impact that movies and television have on our youth,
I think it's a good idea that stereotypical belief systems are dispelled,
but I wonder if total role reversal is a good idea. Surely not all sexually
assertive males are bad guys. I realize that for the longest time men were
the only sex allowed to be assertive in movies, and some might say it's
payback time. However, life isn't about getting even; it's about being
fair. The only way to have true equality between men and women is to
portray these roles as equal, with neither one dominating the other.
I think Hollywood is trying to make some sort of statement by reversing the
traditional male/female gender roles in movies and television. Is Hollywood
making the statement that all good men are submissive? That all women need
to be more assertive? Anybody care to comment on this issue?
Jon
----
To signoff SCREEN-L, e-mail [log in to unmask] and put SIGNOFF SCREEN-L
in the message. Problems? Contact [log in to unmask]