I posted my query about my digital captures not being 1.33:1 on a Matrox
(the manufacturer of my video card) bulletin board and received a very
interesting response.  It gets a bit too techie for me, but it seems
there's a difference in the SHAPE of the pixels on TV screens and on
computer monitors!

That's news to me!

---- Matrox BB posting ----

Eytan, posted June 07, 2000 06:50 PM
Is there a particular technical reason for this aspect ratio? Something
about the analog-to-digital conversion, perhaps? (320x240)
It has nothing to do with the ADC; it's just TV resolution. For now, toss
out VGA (640x480: square pixels, a 4:3 frame).

Take 2 fields and their vertical blanking intervals (EQ -> Burst -> EQ ->
RSVD) and you end up with 44 blanking lines (525 - 44 = active video).

So the max vertical resolution is 481.5 lines.
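That vertical arithmetic can be sketched in a few lines of Python. The 525-line and 44-line figures are from the posting above; the variable names and the rounding note are mine:

```python
# NTSC vertical resolution, using the figures quoted in the posting.
TOTAL_SCAN_LINES = 525   # scan lines per interlaced frame (two fields)
BLANKING_LINES = 44      # EQ, burst, EQ, reserved, across both fields

active_lines = TOTAL_SCAN_LINES - BLANKING_LINES   # 481
per_field = active_lines / 2                       # 240.5

print(active_lines, per_field)
```

Note that this gives 481, while the posting quotes 481.5; the half-line discrepancy comes down to how the half-lines at the field boundaries are counted.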

Horizontally, if you count only black-and-white (luma) detail you can
achieve a max of 280 pixels; if you include color, figure a max of 360
visible pixels.

So we can say that at max overscan NTSC is 360x481.5 interlaced, or
360x241 per field, and we end up with roughly a 1.5:1 ratio.
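A short sketch of why that implies rectangular pixels. The 360x241 grid and the 4:3 screen shape are from the posting; the pixel-aspect calculation is my own gloss:

```python
# A 360x241 sample grid per field, filling a 4:3 NTSC screen
# (figures from the posting above).
grid_ratio = 360 / 241      # ~1.49, roughly the 1.5:1 the posting cites
screen_ratio = 4 / 3        # ~1.33, the shape of the display itself

# The grid is proportionally wider than the screen it fills, so each
# sample must be narrower than it is tall: a rectangular pixel.
pixel_aspect = screen_ratio / grid_ratio   # width/height of one pixel

print(round(grid_ratio, 2), round(pixel_aspect, 2))
```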

So you can see that on a TV you have rectangular pixels instead of nice
square ones.

So you see where the 704x480 and 352x240 resolutions came from: they
appear normal on a TV set, and they also have their roots in the MPEG-1
standard.

As a note, there are 2 sets of defined resolutions for PC video: one uses
square pixels, the other rectangular pixels, and you can see how their
aspect ratios differ.

Square Pixel NTSC
UnderScan 512x384
OverScan 704x480

Rec. Pixels NTSC
UnderScan 640x400
OverScan 724x482
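One way to see the difference is to compute the width:height ratio of each mode listed above (the mode names and numbers are from the posting; the script is just arithmetic):

```python
# Frame ratio of each NTSC mode listed in the posting.
modes = {
    "square-pixel underscan": (512, 384),
    "square-pixel overscan":  (704, 480),
    "rect-pixel underscan":   (640, 400),
    "rect-pixel overscan":    (724, 482),
}

for name, (w, h) in modes.items():
    print(f"{name}: {w}x{h} -> {w / h:.2f}:1")
```

Only the square-pixel underscan mode works out to exactly 4:3 (1.33:1); the rectangular-pixel modes land near 1.5:1 or 1.6:1, which is why captures at those sizes look stretched on a square-pixel computer monitor.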

So you see, it's not the hardware; it's that box we call a TV that
defines the limitations and resolutions.

Jeremy Butler
Telecommunication & Film/University of Alabama/Tuscaloosa

Online resources for film/TV studies may be found at ScreenSite