
Thread: DUAL CORE PENTIUM!!!

  1. #61
    Ultimate Member porsch1909's Avatar
    Join Date
    Feb 2004
    Posts
    2,121
As stated, the Emergency Edition was Intel's response to the FX-51; so as not to be so embarrassed, they just threw some extra cache on the Xeon and marked the price up. It's still not as good as the FX line BY FAR, unless for the reason stated above.
"By far" is an exaggeration; the FX-53 beat the EE, but not convincingly, okay. Sorry I haven't got any links; this darn summer job is draining me away to nothingness.

    Just think if Intel was the only plausible choice like they used to be, CPU prices would be sky high.
I'll just reword and cut that now:

    Just think if AMD was the only plausible choice CPU prices would be sky high.

Any sensible company would do that; it makes good business sense. Is it ethical? No, but when you are a CEO you would probably understand: if your company needs to make huge profits and you deliver, then you're a good CEO.

Personally, I think the FX series and the EE are totally overpriced and not worth the cost at all. And can I remind everybody who says the EE isn't a hardcore gamer's CPU that the first person with an Aquamark score of over 100,000 used an EE? So it is a very good gaming CPU, and to most humans the difference in FPS from changing the CPU is unrecognizable, IMO; for gaming, the graphics card makes a much bigger difference. I'm happy with my 2600 XP and my P4 2.8C: I can play any game I want on both, and I can type up a report, compress video, and do other stuff I just can't think of right now. Whoa, I think that as I type on this computer the text comes up faster than on my Intel, ooooooo.....

  2. #62
    Ultimate Member Vampiel's Avatar
    Join Date
    Sep 1999
    Location
    Dark side of the house
    Posts
    2,760
    Originally posted by porsch1909
"By far" is an exaggeration; the FX-53 beat the EE, but not convincingly, okay.
I agree with the rest of your post except for this comment. "By far" is an understatement: the P4EE is left in the dust for gaming rigs. In 8 gaming benchmarks, the P4EE beats the FX in only 1, and is behind by a wide margin on average.

    http://www.tomshardware.com/cpu/2004...16.html#opengl

Also, the price of the 3.4GHz EE is almost $200 more than the FX-53. Now tell me how much sense that makes?
    http://www.zipzoomfly.com/jsp/Produc...ductCode=80668
Intel® Pentium® 4 Processor 3.4GHz Extreme Edition, 800MHz FSB, Socket 478 Retail *** Free 2nd Day ***
    Our Price: $990.00


    http://www.zipzoomfly.com/jsp/Produc...ctCode=80716-R
    AMD Athlon 64 FX-53 Processor Socket 940 Retail ***Free 2nd Day***
    w/Fan and Heatsink, 3 Years Manufacturer Warranty
    Our Price: $808.00
    Last edited by Vampiel; 07-16-2004 at 05:24 PM.
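Putting rough numbers on that price gap (a quick sketch in Python; the dollar figures are just the two listings quoted above):

```python
# Street prices from the two listings above (July 2004)
p4ee_price = 990.00   # Intel Pentium 4 3.4GHz Extreme Edition
fx53_price = 808.00   # AMD Athlon 64 FX-53

premium = p4ee_price - fx53_price
premium_pct = premium / fx53_price * 100

print(f"EE premium: ${premium:.2f} ({premium_pct:.1f}% over the FX-53)")
# → EE premium: $182.00 (22.5% over the FX-53)
```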

  3. #63
    Ultimate Member porsch1909's Avatar
    Join Date
    Feb 2004
    Posts
    2,121
Well, yeah, it beats it, but I wouldn't say 10 FPS here or there is much; again, it's not noticeable. And I'm glad people are finally agreeing with me!!! Sort of. 5 FPS in Aquamark and 40-odd 3DMarks isn't that much. I've just scanned the link; I'll look at it in detail when I'm off on Sunday, then comment more fully.

  4. #64
    Ultimate Member Vampiel's Avatar
    Join Date
    Sep 1999
    Location
    Dark side of the house
    Posts
    2,760
I'd take 10 frames and $200 any day. BTW, anything 10+ FPS is considered a lot in a game.

  5. #65
    Ultimate Member porsch1909's Avatar
    Join Date
    Feb 2004
    Posts
    2,121
Yeah, but when the frame rates are that high it's unnoticeable, I would say. But you're probably a lot more experienced in these matters than me.

  6. #66
    Ultimate Member Vampiel's Avatar
    Join Date
    Sep 1999
    Location
    Dark side of the house
    Posts
    2,760
For the games they tested on the 2 high-end CPUs, you wouldn't notice it. I guarantee you would notice it in Far Cry, Doom 3, Half-Life 2 and the games coming out recently.

Anything above 60-70 FPS is very hard to notice. Your bank account would notice it, though.
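To put that in concrete terms: what the eye actually experiences is frame time, so the same +10 FPS is worth far less at high frame rates. A small illustration (the FPS pairs here are made-up examples, not benchmark data):

```python
def frame_time_ms(fps):
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

# The same +10 FPS gap at low vs high frame rates
for low, high in [(30, 40), (90, 100)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low}->{high} FPS: each frame arrives {saved:.2f} ms sooner")
# → 30->40 FPS: each frame arrives 8.33 ms sooner
# → 90->100 FPS: each frame arrives 1.11 ms sooner
```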

  7. #67
    Ultimate Member
    Join Date
    Sep 2001
    Posts
    18,631
I'd agree that 10 FPS isn't enough of a margin to start getting one's Y-fronts in a turmoil, and certainly not one worth paying a few hundred quid extra for.

However, when you consider that this particular 10 FPS is coming for $200 less, you would have to be silly, or head over heels in love with Intel, not to see that as a convincing reason to buy AMD on this one.

That said, I'd argue that anyone willing to spend $800 or $900 on a CPU deserves to be given a light brush across the back of the cranium with a cricket bat.

EDIT: Vamp's points above about forthcoming games and high-end CPUs are very valid. Graphics card technology has now reached a point where many cards (even the R9800 Pro/XT) are being bottlenecked by lack of grunt from the CPU; until CPU and VPU technology reaches an equilibrium again, the more performance you can get out of the CPU, the better.

    --Jakk
    Last edited by Bigjakkstaffa; 07-16-2004 at 05:46 PM.

  8. #68
    Ultimate Member Vampiel's Avatar
    Join Date
    Sep 1999
    Location
    Dark side of the house
    Posts
    2,760
For those of you who say "well, anything over 30 FPS the human eye cannot notice": that's incorrect. At 30 FPS the human eye BEGINS to perceive motion.

I challenge you to buy a strobe light and set it to 30 flashes per second, look straight into it (well, not really straight into it, unless you seek to go blind), then set it to 50 per second and come back with your results.

  9. #69
    Ultimate Member Someone Stupid's Avatar
    Join Date
    Oct 2002
    Posts
    3,133
porsch: Intel was the one with the highest prices for the longest time, and as Vamp pointed out, they still are. Well, unless you want a Celeron.

  10. #70
    Member
    Join Date
    Feb 2001
    Location
    jersey
    Posts
    177

    Re: one last thought..

    Originally posted by dmanlyr
    I forgot.. more random thoughts..

if you rewrite the code, why not take the time and catch up to where we should be at..

    8086/88 - 1982 or so, 16 bit (8 or 16 bit bus)

    80286 - 1988 or so

    80386 - 1990 or so, 32 bit (16 or 32 bit bus)

    Alpha xx164 - 1994 or so, 64 bit (mips also had a 64 bit)

    why not 128 bit in say 1999, and we should be on the verge of 256 bit chips right now..

    But no.. everyone goes on a mhz chase... wrong IMO.. slow everything down but move more data.. less heat and faster speed.. if the code is written for it!

And, in the Alpha chip's defence, it was the FIRST 64-bit Windows (NT4 Workstation, a DESKTOP OS) chip, NOT AMD's current offering.. how AMD can lie like that baffles me.. and I only use AMD chips; still, it galls me every time I see it advertised!!

    Just more thoughts..
Think of shifting from 8 to 16 to 32 bits etc. as shifting gears in a car: you don't get the same out of all of them, and the higher the gear, the longer you can stay in it before needing to shift. Plus, many people argue that we STILL don't really need 64 bits; all I've seen people raise is that memory addressing is an issue (with 32 bits only 4GB of RAM can be addressed; with 64 bits it's MUCH higher).
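The 4GB ceiling mentioned above falls straight out of pointer width; a minimal sketch:

```python
# Maximum bytes addressable with an n-bit pointer is 2**n.
def addressable_gib(bits):
    """Address-space size in GiB for a pointer of the given width."""
    return 2**bits // 2**30

print(addressable_gib(32))  # → 4  (the 4GB limit of 32-bit addressing)
print(addressable_gib(64))  # → 17179869184  (about 17 billion GiB, i.e. 16 EiB)
```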

First off, he's right about Alpha. When Intel had their neato P1 166 or 200MHz and the P2 was getting ready, Digital had a 500MHz Alpha chip and was getting ready to release a 600MHz variant. Intel now owns those patents because Digital merged with, or was bought by, Compaq, and Compaq, trying to be all kissie-kissie, gave the technology to Intel, who has a "not invented here, not used here" methodology (the opposite of Microsoft, which is "not invented here but better than what we've got? Steal it! Use it! Then make it even better!"). Which is foolish, because if you recall, Digital even had some software that took 32-bit code and translated it to 64-bit, while slowly morphing it into 64-bit code; think of how that could help them right now going up against the Opterons!! I am not sure who AMD licensed its EV6/7 bus from, actually. THAT is the bus the Alpha used, which was also invented by Digital (Equipment Corp), and I have heard there were a couple of old Digital people on the design team of the K7 before it had a name.
They were some smart cookies, that's for sure.

Intel is going to switch to model numbers, much like AMD, because the MHz race has run their heat dissipation into the sky. 100+ watts from a surface how small?? Prescott is literally the hottest processor out there now; the die shrink didn't produce any noticeable heat reduction like before, because the transistors are so small they leak energy like a sieve. Dual cores? Yeah, it's coming, but they'd better pump up the L2, if only to increase the die size so the heatsink (or watercooler) has more than just a square inch to grab heat from (I don't know the actual size, and please don't debate it with me). It's the same problem AMD had with the Tbred A processors; they added another layer of connections in the Tbred Bs to compensate... maybe Intel can/will do the same, maybe not. How AMD is doing for heat I honestly am not sure, but I know their flagship isn't as bad off as Intel's.
Argue if you like, but
just watch: Intel chips with model numbers ARE on the way.

And yeah, porsch, of course if a company has a monopoly they charge an arm and a leg...... lol, we'd all be VIA/Cyrix boys without Intel..... OK, maybe not, but....
    Last edited by dosmastr; 07-17-2004 at 01:10 PM.
    System now:
    1.8ghz northwood. (will pin mod to OC later)
    568MB DDR1 at 200mhz
    Sapphire Radeon 9500 np
    16GB 15k scsi if i can get it to work
    160GB wdc RAID 1

  11. #71
    Ultimate Member porsch1909's Avatar
    Join Date
    Feb 2004
    Posts
    2,121
Intel chips with model numbers are here!!! They are stupid, though; the AMD ones give a better idea. Intel just picked random numbers like 540 (I think it's random, anyway).

  12. #72
    Member
    Join Date
    Feb 2001
    Location
    jersey
    Posts
    177
lol, it's a higher number than AMD has, right? That's probably got something to do with it.

  13. #73
    Ultimate Member stix_kua's Avatar
    Join Date
    Dec 2002
    Location
    My spoon is too big...
    Posts
    2,884
    So who's coming out with Dual Core first....


    AMD!!!

From this document I conclude that AMD is past the design stage.
    "I'm no technical supervisor, I'm a supervising technician."
    --Homer Simpson

  14. #74
    Member
    Join Date
    Feb 2001
    Location
    jersey
    Posts
    177
The "tape out" period is critical: this is where they get out all the bugs and get the chip ready for mass production. After the product has "taped out" it's pretty much ready to go. Nvidia went through some hell during its NV3x products because it leaked that the product still hadn't taped out, despite ATI having a product a week away from a REAL launch.

    If dual core Opterons do indeed have two memory controllers, the pincount of dual core Opterons will go up significantly - it will also make them incompatible with current sockets. AMD is all about maintaining socket compatibility so it is quite possible that they could only leave half of the memory controllers enabled, in order to offer Socket-940 dual core Opterons


**** KNOCKER!!! I want dual memory controllers, both with dual channel! lol

  15. #75
    Ultimate Member porsch1909's Avatar
    Join Date
    Feb 2004
    Posts
    2,121
Is the Opteron not a server CPU, though? Not usually used by home users.
