I've been looking at video cards lately. Many references stress the importance of frame rate and use it as a measure of video performance. For this and other reasons, I've been trying to find a way to measure the frame rate on my PC. Among others, I found three programs that can measure frame rate:
#1 Tunnel at http://active-hardware.com/english/benchmarks/benchmarks.htm
#2 FRAPS at http://www.fraps.com/
#3 CBench at http://www.sysopt.com/cbench.html
These are the only three I could find that let me vary the resolution. I got the following results:

Program  Resolution  Frames/s
Tunnel   640x480x8        3.1
FRAPS    640x480x8        5.6
CBench   640x480x8        8.9

This shows the frames per second (FPS) varying from 3.1 to 8.9 at 640x480x8; the same programs measured from 10.3 to 28.7 at 320x200x8. This appears to show inconsistency in the measurement from program to program. Then I started to play with the numbers.
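None of these tools document exactly how they take their measurements, but a frame-rate benchmark typically just counts frames rendered over a wall-clock interval. Here is a minimal sketch in Python; the `render_frame` callback and the five-second window are my own placeholder assumptions, not anything taken from the tools above:

```python
import time

def measure_fps(render_frame, duration=5.0):
    """Render frames in a tight loop for `duration` seconds and
    report frames completed per second of wall-clock time."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        render_frame()  # one frame of the benchmark's test scene
        frames += 1
    elapsed = time.perf_counter() - start
    return frames / elapsed
```

The number this reports depends heavily on what `render_frame` actually draws, which is one obvious way different benchmarks can disagree on the same hardware.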
One thing I did was to round the results. Then I took the ratio of the FPS at 320x200x8 to the FPS at 640x480x8. Here's what I found:

Program  640x480x8  320x200x8  Ratio
Tunnel        3         10     3.333
FRAPS         6         20     3.333
CBench        9         30     3.333

Not only is the ratio the same for all three programs, but look at the pattern from program to program: Tunnel, FRAPS, and CBench come out to 3, 6, and 9 at 640x480x8 and to 10, 20, and 30 at 320x200x8. The first group increases by 3s and the second by 10s.
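The arithmetic above can be checked in a few lines (the values are the rounded figures from the table):

```python
# Rounded FPS readings per program: (320x200x8, 640x480x8).
readings = {"Tunnel": (10, 3), "FRAPS": (20, 6), "CBench": (30, 9)}

for program, (low_res_fps, high_res_fps) in readings.items():
    ratio = low_res_fps / high_res_fps
    print(f"{program}: {ratio:.3f}")  # 3.333 in every case
```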
All of these tests were done on the same card installed in the same system. The card is a WG-1000VL/4: a 1995 32-bit VLB board (ID E5YWG1000VL4) built around the Cirrus Logic CL-GD5428-80QC-A controller, with 1 MB of RAM (8 x TMS44C256DJ-70 chips), 24-bit color, and VESA 1.02 support. The system is a 486 built around a SiS471B motherboard with an Intel DX2-66 CPU.
My question: what could be causing the simple relationship observed between the measured values? In other words, why does each program give a different measurement for the same parameter, and what is the basis for that difference?
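One way to probe the question is to model each program's frame time as a fixed per-frame overhead plus a per-pixel cost, and see whether the two measured resolutions are consistent with that. This is only an exploratory sketch using the Tunnel figures from above; the two-parameter model is my own assumption, not anything documented by the benchmarks:

```python
# Model: frame_time = overhead + cost_per_pixel * pixels.
# Two resolutions give two equations in the two unknowns.
pixels_hi = 640 * 480  # 307200 pixels per frame
pixels_lo = 320 * 200  #  64000 pixels per frame

fps_hi, fps_lo = 3.1, 10.3           # Tunnel's measured FPS
t_hi, t_lo = 1 / fps_hi, 1 / fps_lo  # frame times in seconds

cost_per_pixel = (t_hi - t_lo) / (pixels_hi - pixels_lo)
overhead = t_lo - cost_per_pixel * pixels_lo

print(f"per-pixel cost: {cost_per_pixel:.3e} s")
print(f"fixed overhead: {overhead * 1000:.1f} ms per frame")
```

If each program renders a different test scene, both parameters would differ from program to program, and the reported FPS figures would scale accordingly even on identical hardware.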