Sunday, January 15, 2017

Classic Systems - The True Framerate

Classic color NTSC uses a frame rate of 59.94 Hz. However, classic video game consoles and home computers never adhered strictly to the NTSC standard. Here are the exact frame rates as I have been able to find:

NES & SNES : 60.0988
GB, GBC & GBA : 59.7275
SGB : 61.1679
SGB2 : 60.0988

Apple II, Atari 2600, ColecoVision, IBM CGA, PCjr., Tandy 1000, EGA @ 200, MSX, SMS & Genesis : 59.9227
Commodore 64 : 59.826

Hercules Graphics : 50.050048
IBM VGA : 70.086303
IBM VGA 640x480 : 59.940475

Gamecube & Wii : 60.00222p/59.88814i

How do you calculate the exact frame rate? You take the rate at which the device outputs pixels each second (its pixel clock) and divide it by the number of pixels per line and the number of lines per frame.

Let's start with True NTSC. The colorburst frequency is 3,579,545 Hz. That represents the number of times the color can change in a given second. True NTSC outputs 262.5 lines per field, alternating between odd and even fields. The half line tells the TV tube whether to draw an odd field or an even field. Each line of an NTSC display has 227.5 color clocks. This value was chosen to minimize interference with the luminance and audio signals. The frame rate is 3,579,545/227.5/262.5 = 59.94 Hz. The line rate is 59.94 * 262.5, or 15,734.26 Hz.
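If you want to verify the math yourself, here is a minimal Python sketch using only the figures above (the constant names are mine, just for illustration):

    # True NTSC, using the figures from this post
    COLORBURST = 3_579_545.0    # Hz, the color subcarrier frequency
    CLOCKS_PER_LINE = 227.5     # color clocks per scanline
    LINES_PER_FIELD = 262.5     # the half line triggers interlacing

    print(COLORBURST / CLOCKS_PER_LINE)                     # line rate: ~15,734.26 Hz
    print(COLORBURST / CLOCKS_PER_LINE / LINES_PER_FIELD)   # field rate: ~59.94 Hz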

For consoles, the calculation is almost identical, but the figures are not. First, we need to know the total number of lines the device outputs every frame. Classic consoles and computer adapters typically output 262 or 263 lines to avoid telling the TV to interlace. No classic NTSC device used that many lines as part of its active video; typically only 192, 200, 224 or 240 lines were used. The Atari 2600 is unique in that it does not have modes with a set number of lines. It is up to the programmer to tell its graphics chip, the TIA, when to send a vertical sync signal to get the TV beam back up the screen. Although 262 lines was the standard, games more often than not output some other number of lines. Some games output so many lines that some modern TVs get confused and think they are receiving a PAL signal! Here is a list of games and the lines they output:

http://www.digitpress.com/library/techdocs/vcs_scanlines.htm

Next we get to the trickier part: we need to figure out how many color clocks and how many pixels are being sent for each line. Typically, consoles use 228 color clocks per line. Usually the console sends pixels at some multiple of the color clock rate. The Atari 2600 sends one pixel per clock, but only 160 of those pixels are active. The Apple IIe and IBM CGA can send one, two or four pixels per clock. The Sega and Nintendo 8-bit consoles usually send 1.5 pixels per clock; their graphics chips run at 5,369,317.5 Hz, 1.5 times the colorburst frequency.

So the Sega 8-bit console is outputting 342 pixels over 228 color clocks per line. This gives us the following formula: 5,369,317.5/342/262 = 59.9227 Hz

The Apple II high resolution and IBM CGA 320x200 modes output 456 pixels per line at 7,159,090 Hz. The calculation remains the same. In the Apple IIe/IIc double high resolution mode or the IBM CGA 640x200 mode, the rate and the number of pixels are both doubled again, but the refresh rate remains the same.
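The same division works for every device in this post, so it is worth wrapping in a small helper. This is just an illustrative Python sketch (the frame_rate name is mine), applied to the Sega and Apple/CGA figures:

    def frame_rate(pixel_clock_hz, pixels_per_line, lines_per_frame):
        """Frame rate = pixel clock / pixels per line / lines per frame."""
        return pixel_clock_hz / (pixels_per_line * lines_per_frame)

    print(frame_rate(5_369_317.5, 342, 262))   # SMS/Genesis: ~59.9227 Hz
    print(frame_rate(7_159_090.0, 456, 262))   # Apple II hi-res / CGA 320x200: ~59.9227 Hz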

Let us consider the C64. The NTSC C64 outputs 8 pixels per cycle and uses 65 cycles per line; in other words, it outputs 520 pixels per line. It does this at 8,181,817 Hz. It displays 263 lines per frame, which gives a frame rate of 59.826 Hz.
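Plugging the C64 figures into the same illustrative frame_rate helper from above:

    print(frame_rate(8_181_817.0, 65 * 8, 263))   # C64 NTSC: ~59.826 Hz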

The NES and the SNES act similarly to the SMS and Genesis, but with two important differences. First, while they output 1.5 pixels per color clock, they output 341 pixels per line, not 342. 341/1.5 = 227.33 color clocks. As a result, there is a staircasing effect every three scanlines for colored objects, because each line is offset a third of a pixel from the previous line. Black and white pixels are unaffected, since they don't care about NTSC color. Second, the number of pixels output changes by 1 on the last scanline of every frame: on odd frames the last scanline has 341 pixels, and on even frames it has 340 pixels. Nintendo did this to reduce composite color artifacts, but on non-CRTs it makes for a very gritty display.

Because of the 1-pixel difference, here we calculate from the total number of pixels per frame, averaged over two frames. That is (341 x 261) + 340.5 = 89,341.5. The formula becomes 5,369,317.5/89,341.5 = 60.0988 Hz.
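In code, the averaging looks like this (same Python sketch style, using only the numbers above):

    # NES/SNES: 261 scanlines of 341 pixels, plus a final scanline that
    # alternates between 341 and 340 pixels (340.5 on average).
    pixels_per_frame = (341 * 261) + 340.5   # 89,341.5 pixels per frame on average
    print(5_369_317.5 / pixels_per_frame)    # ~60.0988 Hz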

The Game Boy doesn't care about NTSC color, so its calculation is done differently. The Game Boy runs at 4,194,304 Hz, or cycles per second. It draws a frame in 70,224 clock cycles (not machine cycles!). 4,194,304 / 70,224 = 59.7275 Hz. The Game Boy Color runs twice as fast and draws a frame in twice as many cycles. Ditto for the Game Boy Advance, except that it runs four times as fast as the Game Boy and takes four times the cycles to draw a frame. The Super Game Boy uses the SNES master clock (21,477,273 Hz) divided by 5 to run at 4,295,454 Hz. Because the Super Game Boy still takes 70,224 cycles to draw a frame, it draws 61.1679 frames per second. The Super Game Boy 2 does not use the SNES as a clock source; it has its own 4,194,304 Hz Game Boy clock in the cartridge, but the SNES still outputs its video at the SNES frame rate, not an LCD Game Boy's.
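And the Game Boy family in the same sketch style; note how the Super Game Boy's faster clock over the same 70,224 cycles yields the higher rate, while the GBC and GBA scale clock and cycles equally, leaving the rate unchanged:

    GB_CLOCK = 4_194_304.0   # Hz; GBC/GBA scale both clock and frame cycles
    FRAME_CYCLES = 70_224    # clock cycles per frame

    print(GB_CLOCK / FRAME_CYCLES)            # ~59.7275 Hz (GB, GBC, GBA)
    print(21_477_273 / 5 / FRAME_CYCLES)      # ~61.1679 Hz (Super Game Boy)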

5 comments:

Anonymous said...

It's interesting that devices which replicate old systems such as RetroUSB AVS have different frame rates to avoid issues with modern HD screens: https://www.youtube.com/watch?v=uJkBw7rYr2o

blasterbeam said...

I have a question, maybe you know something about it. If some of the game consoles you mentioned that can be hooked up to an NTSC TV have slightly off framerates, how is it possible that there are no visible frame drops or double frames?

Great Hierophant said...

SD CRTs were far more tolerant of non-standard and out-of-spec signals than anything else. This is because they were intended to receive analog broadcast signals, which could deform quite a bit on the way from a TV tower to your antenna. When it comes to the number of lines to draw, they do what they are told. If the signal is wildly out of spec, then you will encounter rolling. Older CRTs had a vertical hold dial that could fix most of these issues; on many TVs it can essentially dial the vertical refresh down to PAL's 50 fps.

blasterbeam said...

So you are saying the CRT TVs are syncing to the video signal? I thought they were somehow hard-wired to the AC power frequency, e.g. 60 Hz. Thus my question.

Scali said...

Yes, there are special sync pulses in a PAL or NTSC signal.
There is an hsync pulse at every scanline, and a vsync pulse for every field.
(There are a few other pulses/signals in there, e.g. to (re)calibrate the black level, and the colorburst as mentioned, which is a 'synchronization' for the color decoding circuitry.)

I think they mainly chose sync rates that were virtually identical to the AC power frequency to minimize interference.