SCREEN-L Archives

February 2005, Week 4

SCREEN-L@LISTSERV.UA.EDU

Date: Sun, 27 Feb 2005 00:16:05 +0000
From: Leo Enticknap <[log in to unmask]>
Sender: Film and TV Studies Discussion List <[log in to unmask]>
James Monaco writes:

>1. Is it really possible that frame-for-frame transfers of 24 fps were the 
>norm in 50 Hertz countries at one time?

If you mean - did they run the film transport at 24fps and sod the flicker? 
- I don't know but I certainly hope not!  The resulting interlace flicker 
is horrific, and is the inevitable result of many amateurs' attempts to 
transfer their home movies by the simple expedient of camcordering them off 
a wall.
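For what it's worth, the usual answer in 50Hz countries has been the 4% speed-up: run the 24fps film at 25fps so each film frame maps cleanly onto one video frame (two fields). A quick sketch of the arithmetic, assuming the standard 24-to-25fps transfer and a hypothetical 100-minute feature:

```python
import math

FILM_FPS = 24.0
PAL_FPS = 25.0

# The PAL "4% speed-up": 25/24, roughly 4.2% fast.
speedup = PAL_FPS / FILM_FPS

# Unless corrected, the soundtrack rises in pitch by the same ratio.
pitch_semitones = 12 * math.log2(speedup)

# A hypothetical 100-minute feature loses about four minutes of runtime.
runtime_pal = 100.0 / speedup

print(round(speedup, 4))          # 1.0417
print(round(pitch_semitones, 2))  # +0.71 of a semitone
print(round(runtime_pal, 1))      # 96.0 minutes
```

The point being that a uniform speed change is far less visible than the interlace flicker of an uncorrected 24fps-on-50Hz transfer.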

>4. Do PAL cathode ray tubes still flicker? Or has the refresh rate been 
>increased?

In the late '90s Philips launched a massive ad campaign for a new line of 
TV sets which they claimed were a lot brighter and sharper than any of 
their predecessors.  After reading a techie magazine or two, I found
that their main innovation was a 100Hz scanning rate.  I don't know if this
doubled scanning rate is the norm for most or all CRT tellies sold now.  To 
a certain extent this is academic, because if the principle behind Moore's 
Law applies to the growth in market share of TFT displays, they'll consign 
CRTs to the Science Museum within 3-5 years or so.  As a thin film 
transistor can retain its luminance output more or less indefinitely, you 
can in effect have whatever scanning rate you like on one.  By the same 
token interlacing has been technically unnecessary since the invention of 
Chromatron/Trinitron tubes in the late '60s - broadcasters could have 
changed to progressive scanning decades ago - but both the PAL and NTSC 
standards predate this development, still encode interlacing, and so the 
broadcast signal is still interlaced.
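To make the interlacing point concrete, here's a minimal sketch (plain Python, a frame modelled as a list of rows) of how a progressive frame is split into the two fields an interlaced signal actually transmits, and woven back together:

```python
def split_fields(frame):
    """Split a progressive frame (list of rows) into its two fields."""
    top = frame[0::2]     # even-numbered rows: the "top" field
    bottom = frame[1::2]  # odd-numbered rows: the "bottom" field
    return top, bottom

def weave(top, bottom):
    """Re-interleave two fields back into a progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

frame = [f"row{i}" for i in range(6)]
top, bottom = split_fields(frame)
assert weave(top, bottom) == frame  # weaving recovers the original frame
```

The weave is only lossless when both fields come from the same instant, as they do with telecined film; with video-originated material the fields are 1/50th of a second apart, which is why deinterlacing it is harder.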

As for whether or not PAL CRTs flicker, they never have noticeably done so 
to my eyes.  There again, I started watching PAL TV sets when I was a babe 
in arms and have been doing so continuously for the subsequent 31 years - 
so I'm probably just used to them.  However, whenever I see an NTSC set in
the US I'm always struck by how blurred or 'fuzzy' the picture looks - I
really miss those extra 100 lines of vertical resolution, and keep
wanting to tweak the focus knob.  Then I remember that it's a TV set and it 
hasn't got one!  But if I lived in North America and watched NTSC every day 
I'd probably get used to it.  Interestingly I really can't see the 
difference between an NTSC and a PAL DVD when watching them on my TFT 
computer monitor.  The DVD playback software automatically takes control of 
the graphics card driver and sets the scanning rate to 50 or 60 as 
appropriate.  Maybe something very clever happens in the way the software 
transcodes the 720 x 576 pixels (PAL) or 720 x 480 (NTSC) into the XGA 
output to the monitor, but I honestly can't see any difference at all in 
flicker or definition between a PAL or NTSC disc.  On a set-top player 
connected to a TV set, I certainly can.
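At a guess, what the player software is doing is resampling both rasters to the same progressive desktop display, at which point the standards' differences mostly wash out. The scaling ratios involved, assuming an XGA (1024 x 768) monitor and the standard DVD rasters, look like this:

```python
# Scaling DVD rasters to a progressive XGA desktop (1024 x 768).
# Both PAL and NTSC get deinterlaced and resampled to the same output,
# which plausibly explains why they look alike on a computer monitor.

XGA = (1024, 768)
sources = {"PAL DVD": (720, 576), "NTSC DVD": (720, 480)}

for name, (w, h) in sources.items():
    sx = XGA[0] / w  # horizontal scale factor
    sy = XGA[1] / h  # vertical scale factor
    print(f"{name}: x{sx:.2f} horizontal, x{sy:.2f} vertical")
```

Either way, the scaler is interpolating every pixel, so neither disc reaches the monitor at its native raster.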

I've converted 35mm projectors in venues running the odd one-off silent to 
run at 18 or 20fps with a two-blade shutter and to me the flicker looks 
migraine-inducing: but when I apologise for the inevitable flicker to the 
theatre manager or musicians, they almost always say that they can't see 
anything wrong (and therefore won't let me spend the extra £100 or so on a 
three-blade shutter assembly).
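The arithmetic behind the three-blade shutter is simple enough: each blade interrupts the light once per frame, and the flicker fusion threshold is usually put somewhere around 48Hz. A sketch:

```python
def flicker_hz(fps, blades):
    """Light interruptions per second for a given frame rate and shutter."""
    return fps * blades

# 24fps sound speed with the usual two-blade shutter: 48Hz, just tolerable.
assert flicker_hz(24, 2) == 48

# A silent at 18fps through the same two-blade shutter: only 36Hz,
# well below the fusion threshold - hence the migraine.
assert flicker_hz(18, 2) == 36

# The three-blade assembly brings an 18fps silent back up to 54Hz.
assert flicker_hz(18, 3) == 54
```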

>As Leo points out, 24 fps (doubled by the shutter to 48 fps) is the least 
>you can get away with--and we've been stuck with it for 75 years. 
>(Showscan never took off, although IMAX HD might still.) I think this is 
>the most salient example of what we might call the Lincoln Rule of 
>Perception: you only have to fool most of the people most of the time. A 
>more recent example is CD audio standards (significantly inferior to the 
>vinyl it replaced). There has been a lot of interesting work done in 
>perceptual psychology in the last 20 years, but it seems most of it has 
>been devoted to extending the Lincoln Rule.

The tradeoff between perceived image/sound quality and what investment will 
deliver the most effective bang for your buck is behind this, at a 
guess.  If you play an RCA Dynagroove LP from the '60s through a turntable 
costing four figures, the resulting sound will blow seven bells out of a CD 
of the same recording played through a £50 no-name player bought from 
Asda.  But if you play the LP on a cheap and nasty turntable the CD will 
win hands down.  Unlike that of a Dynagroove record, the frequency range of 
a CD is fixed: but 99.9% of consumers have neither the equipment nor the 
hearing for that to be a problem.  By the same token 99.9% of moviegoers 
won't see the more fluent movement and (assuming the projector is designed 
and set up right) consistent light output you'd get by increasing the film 
speed significantly.  Forget Showscan - Todd-AO even had to back off from 
30fps because theatres and studios weren't prepared to absorb the increased 
cost of stock and equipment conversion.  For today's blockbusters, where 
you might make as many as 3,000 prints for the US market alone (1,000 for 
the UK is not uncommon, though a good proportion are usually imported), 
that cost wouldn't be insignificant.  Perversely, this is one area 
where digital origination and distribution might help, if the extra 
bandwidth needed for a higher frame rate proves to have a significantly 
lower equivalent cost than with film.
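On the CD point, the fixed frequency ceiling follows directly from the sampling theorem: the highest frequency a digital system can represent is half its sample rate, and CD audio's 44.1kHz rate is fixed by the Red Book standard. The arithmetic:

```python
CD_SAMPLE_RATE = 44_100  # Hz, fixed by the Red Book CD audio standard

# Sampling theorem: the Nyquist frequency is half the sample rate,
# so CD audio is hard-limited at about 22kHz regardless of playback kit.
nyquist = CD_SAMPLE_RATE / 2
print(nyquist)  # 22050.0
```

That ceiling sits just above the nominal limit of human hearing, which is precisely the "fool most of the people most of the time" calculation.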

Leo 

----
Online resources for film/TV studies may be found at ScreenSite
http://www.ScreenSite.org
