Question:
Why is the fps of movie/video divided by 1.001?
enrager
2012-02-04 22:41:47 UTC
I was talking to a friend today who was telling me his camera can record videos at 24 or 30 fps.
I've been looking around for a while, and I know that the real numbers are more like "23.9..." and 29.9... While searching on Google I came across the conversion formula 24*1000/1001 or 30*1000/1001, but I cannot find out why the standards were set this way.

Does it have anything to do with the way films were shot back in the day, with a hand crank turned at a constant speed?
Three answers:
DSV
2012-02-04 22:52:13 UTC
Drop-frame timecode dates to a compromise adopted when color NTSC video was invented. The NTSC re-designers wanted to retain compatibility with existing monochrome TVs. Technically, the 3.58 MHz (actually 315/88 MHz = 3.57954545... MHz) color subcarrier was locked at 455/2 times the line-scan frequency, so that the interference pattern it creates on screen inverts phase and cancels out visually from line to line and frame to frame. Rather than adjusting the audio or chroma subcarriers, they adjusted everything else, including the frame rate, which was set to 30/1.001 Hz.
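A quick way to sanity-check those numbers is exact rational arithmetic. The short Python sketch below (my own illustration, using only the figures quoted above) confirms that a 315/88 MHz subcarrier locked at 455/2 times the line rate, with 525 scan lines per frame, lands exactly on 30/1.001 frames per second:

    from fractions import Fraction

    subcarrier = Fraction(315_000_000, 88)      # color subcarrier: 315/88 MHz
    line_rate  = subcarrier * Fraction(2, 455)  # subcarrier = 455/2 x line rate
    frame_rate = line_rate / 525                # 525 scan lines per frame

    print(float(subcarrier))  # 3579545.4545... Hz, the "3.58 MHz" subcarrier
    print(float(line_rate))   # 15734.2657... Hz (B&W NTSC used exactly 15750 Hz)
    print(frame_rate)         # 30000/1001
    print(frame_rate == 30 / Fraction(1001, 1000))  # True: exactly 30/1.001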



This meant that an "hour of timecode" at a nominal frame rate of 29.97 frame/s was longer than an hour of wall-clock time by 3.6 seconds, leading to an error of almost a minute and a half over a day, because the timecode was calculated in a manner that assumed the frame rate was exactly 30 frame/s.
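That drift is easy to verify with a couple of lines of arithmetic (a small Python illustration of the numbers above, nothing more):

    # One "hour" of timecode counts 30 * 60 * 60 = 108000 frame numbers,
    # but at the true rate of 30/1.001 fps those frames take longer to play.
    frames_per_hour = 30 * 60 * 60              # 108000 frames counted
    elapsed = frames_per_hour * 1.001 / 30      # ~3603.6 s of wall-clock time

    print(elapsed - 3600)         # ~3.6  -> timecode lags 3.6 s per hour
    print((elapsed - 3600) * 24)  # ~86.4 -> almost a minute and a half per day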



Drop-frame timecode therefore skips 2 frame numbers every minute, except every tenth minute (no actual frames are discarded, only labels), achieving a counting rate of 30 × 0.999 = 29.97 frame/s. The tiny residual error is the difference between 0.999 and 1/1.001 = 0.999000999000999...
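To make the counting rule concrete, here is a minimal Python sketch of the standard drop-frame conversion (the function name is mine; the rule is exactly the one just described — frame numbers 00 and 01 are skipped at the start of every minute, except each tenth minute):

    def frames_to_dropframe_tc(frame_number):
        # 29.97 drop-frame: skip labels 00 and 01 at the start of every
        # minute except every tenth minute; no actual frames are discarded.
        drop = 2
        per_min = 30 * 60 - drop            # 1798 labels in a "dropped" minute
        per_10min = 9 * per_min + 30 * 60   # 17982 frames per ten minutes

        d, m = divmod(frame_number, per_10min)
        frame_number += 9 * drop * d        # labels skipped in whole 10-min blocks
        if m >= drop:
            frame_number += drop * ((m - drop) // per_min)

        ff = frame_number % 30
        ss = (frame_number // 30) % 60
        mm = (frame_number // 1800) % 60
        hh = frame_number // 108000
        return "%02d:%02d:%02d;%02d" % (hh, mm, ss, ff)  # ';' marks drop-frame

    print(frames_to_dropframe_tc(1799))   # 00:00:59;29
    print(frames_to_dropframe_tc(1800))   # 00:01:00;02  (00 and 01 skipped)
    print(frames_to_dropframe_tc(17982))  # 00:10:00;00  (tenth minute: no skip)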



ADDITIONAL DETAILS:

23.976 is simply a television-friendly version of the 24 fps traditionally used in film. While most people think television is broadcast at 30 frames per second, it's actually 29.97 (or 59.94 interlaced fields per second). Black and white TV was 30.00/60.00i, but the NTSC colour standard changed the time base slightly to help squeeze in the colour information. That has plagued NTSC ever since.
Dennis C
2012-02-05 12:00:01 UTC
Hi "enrager":



My fellow Contributor from Down Under, "DSV", danced around the subject but never really explained what you asked about. This may be because it is the design of NTSC color television broadcasting that creates the need for the 1000/1001 factor, and Australia uses the PAL color television system, which doesn't necessitate the slight shift.



It's a matter of what's called "beat frequencies" in electronics and physics. Television uses a number of "sub-carriers" that ride along with the main radio carrier wave that's transmitted. In black & white TV, there was just the sound sub-carrier, added 4.5 MHz above the main visual carrier wave. But when the NTSC method added a chroma (color info) sub-carrier at 3.579545 MHz above the visual carrier, certain conditions could cause intermodulation "beats" (which show up as interference dots on the video screen) unless either the frame frequency (30 Hz) was lowered or the audio sub-carrier frequency (4.5 MHz) was raised. Since changing the audio sub-carrier would have made older B&W TVs incompatible, the frame rate was lowered slightly (by a factor of 1000/1001), which was within the tolerance (and "Vertical Hold" knob range!) of TV set synchronization and kept all the other mathematical relationships such that visual interference was eliminated.
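You can follow that chain of reasoning numerically. Keeping the 4.5 MHz sound offset fixed, the NTSC designers redefined the line rate as exactly 4.5 MHz / 286, putting the sound carrier on the 286th harmonic of the line frequency; everything else falls out of that one choice, as this little Python sketch (my own illustration) shows:

    from fractions import Fraction

    sound_offset = Fraction(4_500_000)           # sound sub-carrier: 4.5 MHz, unchanged
    line_rate    = sound_offset / 286            # redefined line rate = 4.5 MHz / 286
    field_rate   = line_rate / Fraction(525, 2)  # 262.5 lines per interlaced field
    frame_rate   = field_rate / 2                # two fields per frame

    print(float(line_rate))   # 15734.2657... Hz (vs 15750 Hz in B&W)
    print(float(field_rate))  # 59.9400599... fields/s (vs 60)
    print(frame_rate)         # 30000/1001 -> exactly 30 fps divided by 1.001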



PAL and SECAM TV systems use 25fps (due to 50Hz electrical systems) and different color encoding methods, and never required this frame-rate adjustment.



Since you have already browsed Wikipedia, a better article explaining this is the "Color Encoding" section of the "NTSC" article: http://en.wikipedia.org/wiki/NTSC#Color_encoding



For a visual explanation of TV carriers & sub-carriers, the "Transmission modulation scheme" section of the same article has an excellent diagram: http://en.wikipedia.org/wiki/NTSC#Transmission_modulation_scheme



And to cover the "24fps" versus "23.976fps" issue, it's merely a logical extension of the 1000/1001 math applied to the telecine 3:2 pulldown ratio required in NTSC broadcasts of films. See "Framerate conversion" in the same article I referenced: http://en.wikipedia.org/wiki/NTSC#Framerate_conversion
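The pulldown arithmetic works out tidily: every 4 film frames are spread across 10 interlaced fields (2+3+2+3), so the field rate must be exactly 2.5 times the film rate, and slowing the film by 1000/1001 locks it to NTSC's 59.94 fields per second. A few lines of Python illustrate the cadence (the frame labels are just for illustration):

    from fractions import Fraction
    from itertools import cycle

    film_fps  = Fraction(24_000, 1001)  # 24 fps x 1000/1001 = 23.976...
    field_fps = Fraction(60_000, 1001)  # NTSC: 59.94... fields per second

    print(field_fps / film_fps)         # 5/2 -> 10 fields per 4 film frames

    # 3:2 (strictly 2:3) pulldown: alternate 2 fields, then 3 fields.
    for frame, fields in zip("ABCD", cycle([2, 3])):
        print(frame, "->", fields, "fields")
    # A -> 2, B -> 3, C -> 2, D -> 3  (10 fields per 4 frames)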



hope this helps,

--Dennis C.



===========

UPDATE EDIT:

===========

Both my Aussie friends "TECH" & "DSV" have put the cart before the horse. NTSC came before timecode. SMPTE drop-frame was designed AFTER, and because of, 29.97fps NTSC. Timecode has nothing to do with the "why" of it. --DC

TECH
2012-02-05 15:47:04 UTC
Both Dennis and DSV are generally right. It has to do with the SMPTE timecode: the frame rate is nominally consistent at 30fps, but frame numbers are dropped each minute (except every tenth minute), scaling the overall counted rate to 29.97. It is a different code to the Manchester system, and the trans-coding is different to PAL. It is basically set on an analog system, and although modern electronics allow better transcoding and essentially a 30fps linear transition, the rate has been maintained to the old parameters. For analog television the rate of film material is slowed to around 23.976 based on the analog receiver equipment; however, digital rates and digital transmission stay at the recorder's format.


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.