
Thread: Why are Athlons so popular?

  1. #16
    Ultimate Member richard_cocks's Avatar
    Join Date
    Mar 2002
    Posts
    1,199
    3Dmark overrides vsync and AA/AF settings anyway.

  2. #17
    Member
    Join Date
    Feb 2004
    Location
    Weird NJ
    Posts
    285
    Wow, thanks guys. A lot more info than I had expected. I'll be sure to keep this in mind when I run 3DMark '03 and Aquamark. Also, do you think my computer would benefit from a memory OC, or possibly a second Kingston 3-3-3 stick?
    Lord AnthraX

  3. #18
    Ultimate Member
    Join Date
    Aug 2002
    Posts
    3,922
    Originally posted by zybch
    Stick the refresh rate up to 75 for a few days, then go back to 60. I doubt you'll be wanting to stay back there for too long.
    Maybe he has an LCD, on which it is much more difficult to tell the difference (if any). Even on an LCD, higher refresh rate is better, especially in high-velocity situations like gaming where the screen can rapidly change state.

    On CRTs you'd have to be blind not to notice the difference. 60Hz is PAINFUL for me...85 is reasonable, 100 ideal.

    Many of my goggle-eyed friends can't tell. Blind they truly are.

  4. #19
    Ultimate Member
    Join Date
    Aug 2002
    Posts
    3,922
    Originally posted by Lord AnthraX
    Wow, thanks guys. A lot more info than I had expected. I'll be sure to keep this in mind when I run 3DMark '03 and Aquamark. Also, do you think my computer would benefit from a memory OC, or possibly a second Kingston 3-3-3 stick?
    Lord AnthraX
    Get a second stick of RAM. That'll allow for dual-channel operation, which will double the bandwidth available to the processor...important for P4s.

    For now, you can run the memory at a higher clock than the FSB to get more performance. Or you can just try to see the limits of your 2.6C...most top out at 3.2-3.4GHz.
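
    To put rough numbers on the dual-channel point, here's a back-of-the-envelope sketch in Python. The DDR400 sticks and 800 MT/s effective FSB are assumptions based on a stock 2.6C setup, not measurements:

    [CODE]
    # Peak memory bandwidth, assuming DDR400 (PC3200) modules and a
    # P4C's 800 MT/s effective FSB -- illustrative figures only.
    BUS_WIDTH_BYTES = 64 // 8  # each DDR channel is 64 bits wide

    def ddr_bandwidth_gbs(mt_per_s, channels):
        """Peak bandwidth in GB/s at a given transfer rate."""
        return mt_per_s * 1e6 * BUS_WIDTH_BYTES * channels / 1e9

    print(ddr_bandwidth_gbs(400, 1))  # 3.2 GB/s -- one DDR400 stick
    print(ddr_bandwidth_gbs(400, 2))  # 6.4 GB/s -- two sticks, dual channel
    print(ddr_bandwidth_gbs(800, 1))  # 6.4 GB/s -- the FSB; dual channel matches it
    [/CODE]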

  5. #20
    Banned zybch's Avatar
    Join Date
    Jun 2002
    Posts
    2,292
    Originally posted by causticVapor
    Maybe he has an LCD, on which it is much more difficult to tell the difference (if any). Even on an LCD, higher refresh rate is better, especially in high-velocity situations like gaming where the screen can rapidly change state.

    On CRTs you'd have to be blind not to notice the difference. 60Hz is PAINFUL for me...85 is reasonable, 100 ideal.

    Many of my goggle-eyed friends can't tell. Blind they truly are.
    Some people are more sensitive to flicker than others. I can't stand being in a room lit by fluorescent tubes, and I can easily pick out individual frames in a movie at the cinema; most of my friends can't. However, even they hate 60Hz and run at 75. I use 100Hz and everything is rock solid.

  6. #21
    Ultimate Member richard_cocks's Avatar
    Join Date
    Mar 2002
    Posts
    1,199
    What annoys me at cinemas is all the motion blur. You don't get it on PCs, and you don't get that much of it on TV, but it's all over cinema.

  7. #22
    Banned zybch's Avatar
    Join Date
    Jun 2002
    Posts
    2,292
    Movies are only 24 frames per second, which is at the bottom end of what's required to trick our brains into seeing motion instead of just a rapid series of still images.
    At 24fps you'd think anyone would get fatigued eyes; however, each frame is projected twice (a, a, b, b, c, c, d, d) instead of just once (a, b, c, d), so the screen is actually flashing 48 times a second even though you only see 24 discrete frames. It's still jerkier than a true 48fps image would be.
    There were some experiments a few years back with film projected at 60fps. Apparently it looked fantastic, but because 60fps is close to the human eye's 'native' refresh rate, people's brains were convinced that what was on the screen was real, and a lot of motion sickness, dizziness and queasiness was experienced.
    Because the average movie screen (well, perhaps not the tiny multiplex screens) is quite large, it became difficult to keep a frame of reference between the real world and the movie, which exacerbated the problem, especially in movies with lots of action and camera movement, so the experiment was largely discarded.
    Digital projection, when it finally becomes mainstream, might encourage people to try something other than the 24fps we've had for the last hundred or so years.
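
    A quick Python sketch of that double-shuttering (the frame labels are just illustrative):

    [CODE]
    # Cinema double-shuttering: 24 discrete frames/s, each flashed twice,
    # so the screen lights up 48 times a second.
    def double_shutter(frames, flashes_per_frame=2):
        """Expand a list of frames into the sequence of shutter flashes."""
        return [f for f in frames for _ in range(flashes_per_frame)]

    print(double_shutter(["a", "b", "c", "d"]))
    # ['a', 'a', 'b', 'b', 'c', 'c', 'd', 'd'] -- 48 flashes/s, 24 unique frames
    [/CODE]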

    Thus ends the lesson.

  8. #23
    Ultimate Member Vampiel's Avatar
    Join Date
    Sep 1999
    Location
    Dark side of the house
    Posts
    2,760
    Don't forget NTSC runs at about 30fps. The US (and I think Japan) use this standard.

  9. #24
    Banned zybch's Avatar
    Join Date
    Jun 2002
    Posts
    2,292
    NTSC = National Television System Committee, but I prefer 'Never Twice the Same Color' because it's very susceptible to distortion.
    NTSC runs at 29.97fps; 30 was just not good enough.
    This is a television standard rather than a celluloid one, and it works using interlaced fields that each contain only half a frame of information.
    TVs also have, and benefit from, phosphor glow (LCD panels instead get its ugly cousin, ghosting). This makes the image stability of a TV seem okay even though it's only running at around 60Hz.

    The PAL standard used in Europe and Australia is higher resolution, gives much better color purity and isn't anywhere near as prone to distortion as NTSC, but it only runs at 25fps (50Hz). Some newer TVs run at 100Hz and repeat every field twice, just like a cinema projector; they cost more but are much nicer to watch, especially on 68cm and larger screens.

    Interlacing, for those who don't know, means the TV first displays lines 1, 3, 5, 7..., then the next field is lines 2, 4, 6, 8..., and so on. Our eyes are tricked into seeing a complete picture even though each pass contains only half a frame's worth of image. Sometimes, with rapid up/down movement, you'll see a double image, like the woman's right hand in the picture.
    NTSC is much more affected by interlacing, as the pic below shows, but you can get line doublers that help fix the image (mainly used with DVD players).
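
    A toy sketch of how a frame splits into two fields (Python; the 10-line 'frame' is made up), plus where NTSC's oddball rate comes from:

    [CODE]
    # Interlacing: the odd field carries lines 1, 3, 5, ... and the even
    # field lines 2, 4, 6, ...; each field is half a frame's worth of image.
    frame = [f"line {n}" for n in range(1, 11)]  # toy 10-line frame

    odd_field  = frame[0::2]  # lines 1, 3, 5, 7, 9  (drawn first)
    even_field = frame[1::2]  # lines 2, 4, 6, 8, 10 (drawn next)

    # NTSC's frame rate is exactly 30000/1001:
    print(30000 / 1001)  # 29.97002997...
    [/CODE]
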
    Attached Images

  10. #25
    Banned zybch's Avatar
    Join Date
    Jun 2002
    Posts
    2,292
    Because TVs work at 30 (we'll call it that anyway) and 25 frames per second (NTSC/PAL), there needs to be a way to convert a movie's 24fps to those other frame rates.
    With PAL it's easy: the film just runs 4% faster, which isn't noticeable.
    NTSC uses a method called 3:2 pulldown, which is a really strange idea but seems to work. Each film frame is alternately held on screen for 3 fields and then 2 fields, so 24 frames stretch out to 60 fields per second.
    With medium-speed tracking shots you can make out some judder, which is caused by that 3-2-3-2 cadence, as the sketch below shows.
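
    Here's a little Python sketch of that cadence (the frame numbers are illustrative), along with PAL's 4% trick:

    [CODE]
    # 3:2 pulldown: film frames are alternately held for 3 fields and
    # 2 fields, stretching 24 frames into 60 interlaced fields per second.
    def pulldown_32(frames):
        fields = []
        for i, f in enumerate(frames):
            fields.extend([f] * (3 if i % 2 == 0 else 2))  # 3-2-3-2 cadence
        return fields

    print(len(pulldown_32(range(24))))  # 60 fields -> NTSC's ~30 frames/s

    # PAL just runs the film faster instead:
    print(f"{25 / 24 - 1:.1%}")  # 4.2% speedup (and a slightly higher pitch)
    [/CODE]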

    Why, any sane person would ask, was NTSC ever adopted if it has so many problems?
    Simple: it came first.
    The NTSC color standard was finalized in 1953, and it had to stay compatible with the millions of black-and-white sets already in American homes.
    PAL was a German-conceived standard, but it wasn't developed (by Telefunken) until the early 1960s, largely to fix NTSC's color-distortion problems. By then America was already locked in.

  11. #26
    Senior Member
    Join Date
    Jul 2003
    Location
    Ontario, Canada
    Posts
    955
    Originally posted by Lord AnthraX
    So in your own opinion, did I screw myself over when I bought my system? I had the choice between the P4 and the A64.

    Abit IC7-MAX3 (free)
    P4 2.6C (generic heatsink) $175
    512MB Kingston CL3 (non-HyperX, this I goofed on) $82
    9800 Pro $200
    Windows XP Pro

    You think I should have gone for an A64 system instead, or would this be a worthy gaming system with a few aftermarket mods (like a new HS for OC'ing and such)?

    P.S. I always thought AMDs ran a helluva lot hotter than the Intels. Is this fact or fiction?
    I don't think you've screwed yourself; in some cases you're probably ahead with the P4C 2.6 over a comparable Athlon XP (3200+). The price is currently about the same and the P4 leads in the benchmarks more often than not. Had you gone with the Athlon 64 you would have a better-performing system, but it would have cost you more; the Athlon 64 3200+ alone currently goes for $282. You also got the Abit board for free... that would have an influence on my decision.

  12. #27
    Member
    Join Date
    Feb 2004
    Location
    Weird NJ
    Posts
    285
    Yeah, the Abit board was a birthday gift and the rest I had to buy with my own money (it was that or a car). Also, I have a 17" CRT monitor I got from the garbage; it flickers at anything above 60Hz, so unless there's a way to stop the flickering while putting the Hz up, I'll leave it at 60. Painkiller and such seem to run pretty well.
    Lord AnthraX

    P.S. Would it be advisable to mix 3-3-3 with CL2 or CL2.5 memory?

  13. #28
    Banned zybch's Avatar
    Join Date
    Jun 2002
    Posts
    2,292
    RAM will always run at the speed of the slowest module. So although you won't get any performance increase if you stick in faster RAM, it does mean that once you get rid of the slow stick, the remaining RAM will be able to run faster.
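
    To illustrate with made-up timing tuples in Python; the board runs each timing at the loosest value across sticks:

    [CODE]
    # Slowest-stick rule: mixed modules run at the loosest (highest)
    # value of each timing. Tuples are (CL, tRCD, tRP) -- hypothetical.
    stick_a = (3, 3, 3)    # the existing Kingston 3-3-3
    stick_b = (2.5, 3, 3)  # a faster CL2.5 stick added alongside

    effective = tuple(max(a, b) for a, b in zip(stick_a, stick_b))
    print(effective)  # (3, 3, 3) -- the CL2.5 stick ends up running as CL3
    [/CODE]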

  14. #29
    Member
    Join Date
    Feb 2004
    Location
    Weird NJ
    Posts
    285
    Ah, correct you are. I totally forgot about that, and ironically I was just having an argument about it with someone. Thanks for the clarification. Also, do you think it would be advisable to get a 1GB set of HyperX (DDR400) or the Corsair equivalent? I really need to get higher-end memory for this computer; I've got this amazing board but I'm feeding it **** (so to speak).
    Lord AnthraX

  15. #30
    Ultimate Member deadkenny's Avatar
    Join Date
    Jul 2001
    Location
    Toronto, Canada
    Posts
    1,123
    I agree with most of the others that you didn't 'screw yourself over' with the system you purchased. You might have gotten a better performance-per-dollar ratio with 32-bit AMD, or better overall performance with 64-bit AMD, but your system is fine.

    Regarding the original question about AMD, I'll just mention a couple of fine points that possibly weren't explicitly mentioned. First, older AMD processors tend to drop in price more quickly than Intel's. So while the most recent offerings from each might lead to one conclusion, the older processors a few notches down the charts clearly favour AMD. Secondly, the current iteration of Intel's 'bargain' processor, the Celeron, is pathetic; the Duron, on the other hand, is dirt cheap and actually a decent performer. Finally, until relatively recently the CPU multipliers on AMD processors were partially unlocked without having to mod the processor, which gives you more OC'ing options. In fact the unlocked XP 2500+ Bartons were all the rage with OC'ers when they first came out. Unfortunately AMD started locking the multipliers, but there's still the mobile Barton, which is completely unlocked.
