72 Comments

  • Matthew12222 - Thursday, July 17, 2008 - link

    2MB L2 cache vs. 4MB L2 cache! And FSB 1000MHz vs. 1333MHz. This is after proving L2 makes a big difference and FSB a smaller one.
  • SiliconDoc - Thursday, February 7, 2008 - link

    I have been enjoying Anandtech now for many years, and have appreciated the articles I've learned so much from for so long. One of the first uses of that knowledge was passing along the review on the HOT 591P to a very good programmer friend, upon which he purchased the board for his at home portion of work.
    That said, I had to finally make a username so I could comment on - the bias that is so often shown against AMD here. It is often subtle, in certain wordings and in less than blatantly obvious ways, but it has bothered me for some time. I guess that's the way the cookie crumbles, everyone has a favorite, for whatever reason.
    Concerning this article, I plodded along, and then found out that something was amiss once again with the tests chosen, or the equipment chosen, resulting in a strange outcome to AMD's disadvantage. I've seen it here, it seems, 100 times. Like good representatives, the article writers pass along that they notified the manufacturers/companies, relieving some of the disdain for it I tasted. I wonder if AMD feels the same way. I doubt it. I suspect they are and have been angry about it.
    If it was the UT3 game, or the sli board, or whatever, why was the test posted as "valid" when it scientifically proved something other than card framerate limits were amiss ?
    I just can't help wondering how ragingly angry AMD reps are that view this type of thing, over and over again.
    I'm not sure why the bias is so consistent here, but it is here, and I just wish it wasn't.
    I've really never seen this site "unable" to diagnose the most obscure of matters when it comes to performance issues, but then again, if it's an AMD chip, often the two shoulders go up and the blank pout is given, then the applause for Intel is heartily enjoyed.
    If I'm not the first person who has said something like this, good.
    I don't generally read any comments, so maybe everyone has accepted the slant already and moved on.
    Nonetheless I think this site is wonderful, and will no doubt be visiting it for years to come, learning and learning and learning more and more, as much as I can to help me in my endeavors.
    For that, I have no real way to repay the good that has been done for me; I hope that in some way explains why I feel fine expressing my opinion concerning the processor wars and the handling of the same by this site.
    Thanks to all at anandtech and all the rest of the fans out there.
  • BlackOmega - Thursday, November 8, 2007 - link

    Very useful article.

    Anyway, I would suggest you guys post the difference in the minimum framerate attained by both processors...

    I'm running an Athlon 4200+ overclocked to 2.8GHz, and after some benchmarking I found out that there are certain areas in game where the frame rate drops severely. In flyby runs I get 100+ fps running at 1024x768, but in actual gameplay, places like the ShockRifle/Helmet in ShangriLa make the frame rate drop to ~40 fps.

    It would be nice if you guys could test those areas and see how the different processors affect minimum frame rate in especially heavy areas of the map.

    I'm also very interested in how cache affects AMD's processors performance.
  • Nil Einne - Saturday, October 27, 2007 - link

    Historically, AMD's A64 architecture has been a lot less cache sensitive than Intel's C2. It would have been interesting to see how A64 performance depends on cache, akin to the C2, but sadly you didn't test this.
  • Tuvokkk - Sunday, October 21, 2007 - link

    please Anand, let us know the command to run the flyby benches and the settings you used so that we can compare our results
  • Zoomer - Friday, October 19, 2007 - link

    How much disk space does the beta demo take up? I tried to run the 700MB installer and it complains that there's insufficient disk space, even though I have 20GB free in the partition containing my programs and user data, and >200GB spread out over a few other partitions.

    Or does it do a dumb check of C:\? That could be a problem; my C: partition is only 4 gigs big and contains only Windows files.
  • TSIMonster - Thursday, October 18, 2007 - link

    I'd like to see how the 2900 does with the addition of AA & AF. Typically, that is where it falls behind slightly. The architecture seems fine; it's just the power usage and lack of AA & AF support that gets me thus far.
  • poohbear - Thursday, October 18, 2007 - link

    thanks AnandTech for a great article!!!! very detailed and informative stuff. cheers again for the article and I hope you revisit it when the full game comes out. Mind you, were you using DX10 or 9, and can you do a comparison on this end too?
  • mongoosesRawesome - Thursday, October 18, 2007 - link

    I'm wondering if the large discrepancy in performance between 1MB and 4MB cache CPUs remains when you turn up the frequency. It seems that a lot of people are buying the lower-clocked Pentium dual cores and overclocking them to 3GHz speeds. Could you compare the chips in that situation to see if the cache matters as much at high frequencies as it does at low frequencies?
  • shuffle2 - Wednesday, October 17, 2007 - link

    I would love to see the game run with hardware physics enabled - with both the card installed and also with only the software installed. I currently run the beta demo on highest settings available, including hardware acceleration, and no errors are thrown up at any time. also, the game performs very, very smoothly.
  • Ryan Smith - Wednesday, October 17, 2007 - link

    See the post right above yours. ;-)
  • CRimer76 - Wednesday, October 17, 2007 - link

    Isn't this game supposed to use the Ageia crap to improve performance? Would love to see some benchies on that.
  • Ryan Smith - Wednesday, October 17, 2007 - link

    Yes, it can use the PhysX hardware to improve performance. However flybys are completely useless for testing PhysX, because there is no physics work going on. Since I cover the PPU side of things, once we have the ability to play with demos we'll have an article up.
  • MadBoris - Wednesday, October 17, 2007 - link

    Great work showing all the different gradients on vid card, cache, cpu, threading.
    The game really scales impressively on hardware.
    How did Sweeney manage to make the 2900xt shine? hehe

    I've got some small reservations about using the flybys; hopefully you guys will have a demo run for next time around. Theoretically, a demo loop should be pretty similar, mainly more CPU intensive, but I'm curious to see.

    Good work showing off the game's scaling.
  • Sunrise089 - Wednesday, October 17, 2007 - link

    Not only does the X1950XT absolutely destroy the 7900GTX (the ATI card REALLY ended up being the better long-term buy), but the HD 2900XT looks absolutely great here. If this sort of performance is indicative of the future (and it may not be - perhaps there is something that will favor nVidia when the release copy of the game arrives with better graphics) then ATI looks much better than it did even last month.

    PLEASE do a follow up with this game at launch/when the new midrange GPUs launch. It's going to be very interesting to see the price/performance between the new 8800GT, an overclocked HD 2900pro, and the new ATI midrange card (HD 2950pro?)
  • tmx220 - Wednesday, October 17, 2007 - link

    they used an X1950XTX, though it would be about the same
    they should have used the XT because it can still be found, and for a decent price
  • Darth Farter - Wednesday, October 17, 2007 - link

    Anand,
    the last 2 game reviews were nice to have 1024x768 CPU comparisons, but I think we're seeing too many pages of 1024x768, a resolution that probably only 2% of the guys planning on running these games will actually use.

    If the intention is to show CPU limitations, I'd suggest at least showing the scaling at the common 1280x1024, 1680x1050 (1600x1200) and 1920x1200 resolutions, to reflect what we're actually going to see on screen when swapping/upgrading a CPU/platform to another model/platform while planning on playing with an obviously high-end GFX card at 1920x1200.

    An overall CPU comparison at only 1024x768 left me severely disappointed, though I can understand the time constraints. This regrettably reminds me of Tom's with their beside-the-point tactics in articles.

    Just my $.02 for making Anand's a bit better.

    Ty
  • IKeelU - Wednesday, October 17, 2007 - link

    I'd really like to see 1280x720. I usually play on my 720p TV and I imagine that resolution would be CPU-bound as well, though probably less so than 1024x768.
  • Roy2001 - Wednesday, October 17, 2007 - link

    I am more interested in seeing Phenom data.
  • hubajube - Wednesday, October 17, 2007 - link

    I'm officially done with Anandtech's conclusions on benchmarks. Why are you making these blanket "AMD is not competitive" statements? A 9% difference in fps performance and now AMD is shit?! Not to mention that whopping 9% difference only works out to roughly 5 whole frames per second! ROFLMAO!!!!!

    Brad: Hey Johnny! Did you see the new UT3 review from Anandtech? They showed the Intel CPU's KILLING AMD by five frames per second!

    Johnny: Holy shit Brad!!!!! AMD is going out of business with piss poor performance like that. I'll NEVER buy AMD again.
  • kmmatney - Wednesday, October 17, 2007 - link

    The benchmarks show that AMD cpu's are not performing as well as they should here. This will hopefully be fixed in the future.

    You sound like someone who has an AMD processor and is bitter...
  • clairvoyant129 - Wednesday, October 17, 2007 - link

    These are the same people who said there is a big difference using Netburst CPUs and K8s. Right, if a Netburst CPU coupled with a 7800GTX got 60 FPS when a K8 got 90 FPS, it was a huge difference to them but now it doesn't seem like it.
  • hubajube - Wednesday, October 17, 2007 - link

    quote:

    The benchmarks show that AMD cpu's are not performing as well as they should here.
    And how should they be performing in your opinion? 100 fps is not good enough for you? How about 500 fps? Is that better?

    quote:

    You sound like someone who has an AMD processor and is bitter...
    I'm definitely not bitter, just realistic. The difference between 90 and 180 fps is totally irrelevant. An Intel E2140 gets over 90fps. Hell, a Sempron with a decent video card could play this game extremely well.

    Benchmarks are great in that you can use them to judge how your system will perform with a game, but they're not the be-all, end-all of performance, nor is a CPU that does 100 fps a pile of shit because it doesn't do 105 fps.
  • JarredWalton - Wednesday, October 17, 2007 - link

    The point is that at 1920x1200 we're at a completely GPU-limited resolution (as shown by the fact that the difference between E6550 and X6850 is only 1%). AMD still runs 9% slower, so it seems that architecture, cache, etc. means that even at GPU limited resolutions AMD is still slower than we would expect. Is it unplayable? No, but we're looking at the top-end AMD CPU (6400+) and in CPU-limited scenarios it's still 10% slower than an E6550.

    It seems to me that we're in a similar situation to what we saw at the end of the NetBurst era: higher clock speeds really aren't bringing much in the way of performance improvements. AMD needs a lot more than just CPU tweaks to close the gap, which is why we're all waiting to see how Phenom compares.
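    The reasoning in the reply above comes down to simple percentage math: if two Intel chips at very different clocks land within ~1% of each other, the test is GPU-limited, so a larger deficit on another CPU points at the CPU itself. A minimal sketch, with made-up fps numbers standing in for the article's actual results:

```python
def pct_diff(a, b):
    """Percentage by which result a leads result b."""
    return (a - b) / b * 100.0

# Hypothetical placeholder fps values, NOT AnandTech's measured data.
fps_x6850 = 60.0   # high-clock Intel part
fps_e6550 = 59.4   # low-clock Intel part
fps_6400p = 54.5   # AMD 6400+

# If the two Intel chips differ by under ~2%, the scene is GPU-limited...
gpu_limited = abs(pct_diff(fps_x6850, fps_e6550)) < 2.0
# ...so any larger AMD deficit reflects the CPU, not the graphics card.
amd_deficit = pct_diff(fps_e6550, fps_6400p)

print(f"GPU-limited: {gpu_limited}, AMD deficit: {amd_deficit:.1f}%")
```

    With these placeholder numbers the Intel pair differs by about 1% while the AMD part trails by about 9%, mirroring the shape of the argument above.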
  • clairvoyant129 - Wednesday, October 17, 2007 - link

    That 9% was at 1920x1200. The majority of PC users use a much lower resolution than that. At 1024x768, it's much, much higher.

    Think again moron.
  • KAZANI - Wednesday, October 17, 2007 - link

    And most people don't care about framerates higher than their monitor's refresh rate. Both processors were well above 100 frames at 1024x768.
  • hubajube - Wednesday, October 17, 2007 - link

    No moron, 1024x768 on an 8800GTX is NOT what "most PC users" are going to be using. The video cards that "most PC users" will be using were not tested in this benchmark. YOU need to actually THINK next time.
  • clairvoyant129 - Wednesday, October 17, 2007 - link

    Where did I say majority of PC users with an 8800GTX use 1024x768? What's your idea of testing CPUs? Benchmark them by using GPU limited resolutions? What a joke. You people never complained when Anand compared Netburst CPUs to K8s at 1024x768 or lower resolutions.

    Don't get your panties twisted AMD fanny.
  • IKeelU - Wednesday, October 17, 2007 - link

    Ummm...how do you launch the flybys used in this analysis?
  • customcoms - Wednesday, October 17, 2007 - link

    You mention that you cranked the resolution up to 1920x1200, but the charts still say 1024x768... the results look like those at 1920x1200 though, so I'm guessing it's a typo. GPU-bound CPU comparison charts here: http://anandtech.com/video/showdoc.aspx?i=3127&...
  • Super Nade - Wednesday, October 17, 2007 - link

    Hi,
    A very interesting article. I was wondering if you could pit Intel and AMD chips having the same L2 cache against each other? You could clock up the lower-specced Intel part if need be. Since the game is L2-bound, this experiment would provide a more balanced outlook in a GPU-bound or clock-for-clock situation.

    Cheers,

    S-N
    OCForums
  • johnsonx - Wednesday, October 17, 2007 - link

    If this is a beta demo (the multiplayer server browser does scream BETA! doesn't it?), does anyone know if there will be an updated 'final' demo? Further, will there be any Onslaught maps in said demo?
  • gramboh - Wednesday, October 17, 2007 - link

    Does AA work in the demo? I've downloaded it but due to Orange Box I haven't even bothered installing it.

    I'm mainly interested to know if the 8800GTS/GTX blow away the 2900XT or not once AA is enabled at 1920x1200, so that I feel better about my purchase of an 8800GTS 640 :)
  • NullSubroutine - Wednesday, October 17, 2007 - link

    I don't think the purpose of reviews is to make you feel better about what you already bought. But hey, to each his own.
  • jamori - Wednesday, October 17, 2007 - link

    I appreciated the inclusion of some of the previous-gen cards here. Some of the recent graphics card reviews completely left them out, which essentially leaves me guessing as to how much improvement I might see by upgrading my graphics card.

    30 fps at 1920x1200 on my 7900GTX isn't great, but it's still at the lower threshold of playable, which is nice to know.
  • p30n - Wednesday, October 17, 2007 - link

    What does this show us? At least for UT3, quad (vs. dual) is rather a waste. If it shows so small an increase when it's not GPU limited (i.e. the low-res 1024 tests), one wonders what will happen at higher res where the GPU is actually pressured.

    And even more so, when you're already crossing 100fps does it even matter?

    Now if quad helped us go from, say, 20 fps to 30 fps I would say hell yeah, it's actually making a difference, but as is, 155 fps to 186 fps... omg, now I have to get a quad!

    Of course there were no tests at high res to compare the core effect to.
  • RamarC - Wednesday, October 17, 2007 - link

    You have to remember that these were flybys with no activity. UT3 is making little use of the extra cores now, but I'll bet that a quad core will flex its muscle once you add players, vehicles, explosions, and all the mayhem that goes along with a good UT match.
  • jebo - Wednesday, October 17, 2007 - link

    I thought the CPU tests were interesting, but largely useless! It reminds me of the UT2K3 flyby vs. botmatch days. Why didn't Derek and Anand use FRAPS to run a few botmatches? Sure, the results aren't 100% repeatable, but if you run a couple of botmatches on each setup and average the fps you'd get a fairly accurate result.
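    For what it's worth, the averaging idea suggested here is straightforward to sketch; the fps samples below are hypothetical, and the list-of-samples shape is an assumption rather than FRAPS's actual log format:

```python
def average_fps(runs):
    """Mean fps across several benchmark runs (each a list of fps samples)."""
    samples = [fps for run in runs for fps in run]
    return sum(samples) / len(samples)

# Two hypothetical botmatch runs on the same setup; individual runs
# aren't repeatable, but pooling their samples smooths the noise.
run_a = [88, 92, 95, 90]
run_b = [85, 91, 97, 94]

print(f"average: {average_fps([run_a, run_b]):.1f} fps")  # 91.5 fps
```

    Averaging more runs tightens the estimate further, at the cost of benchmarking time.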

  • Anand Lal Shimpi - Wednesday, October 17, 2007 - link

    I actually played with this, the frame rates weren't actually much different at all. You also have to keep in mind that in real world usage there won't be any bots, just other human opponents.

    Take care,
    Anand
  • DerekWilson - Wednesday, October 17, 2007 - link

    just to underline this, running through a level with fraps yielded very similar framerates to the flyby tests.

    but the problem here is that even our run throughs were not very taxing. We can't add nine thousand bots to a game and test anything, as they will cause problems with the run through, adding variability and removing the possibility of repeatability.

    we don't have a dozen people we can start up a multiplayer game with and choreograph a scene that we can run every time we need another number.

    the real solution to the issue is the demo record and playback functionality. until we get that, flyby testing is the best we can offer and just isn't any better than a fraps run.
  • p30n - Wednesday, October 17, 2007 - link

    very good point.
  • retrospooty - Wednesday, October 17, 2007 - link

    "What does this show us? At least for UT3 quad (vs dual) is rather a waste."

    ya, that's pretty much what they said in the article. They tested it so the results could be known.
  • thompsjt1 - Wednesday, October 17, 2007 - link

    "THERE ARE NO HIGH RESOLUTION TEXTURES" in the UT3 beta demo. They didn't include them for download-size reasons, and I am sure they WILL include them in the real official demo. I think once you are running these high resolution textures with settings maximized, we will see a bigger difference in the Nvidia vs. AMD numbers.
  • pnyffeler - Wednesday, October 17, 2007 - link

    The article spent a lot of time on the effect of the size of the cache on an Intel processor, but what about AMD? Does the size of the cache matter, or is this yet another example of Intel's Northbridge system being trumped by AMD's advantage of having the memory controller on the CPU?

    I have no misconceptions that AMD has a chance of topping an Intel here. I'm just curious to see how much better Nehalem will be.

    P.S. Thumbs down on the CPU comparison. You said in the setup you were going to test an X2 4200, but it never made the charts. And what about an 8600 GT? I'm going to be running this game at 640x480, aren't I....
  • NullSubroutine - Wednesday, October 17, 2007 - link

    I believe they did not really test AMD's CPUs right now while awaiting the arrival of Phenom, which is less than a month away.

    Looks as though the 200-250 range (RV670 bins) are going to kick some bahooty given their higher core speeds, especially at the 1280x1024, 1600x1200, and 1680x1050 resolutions.
  • Chaser - Wednesday, October 17, 2007 - link

    quote:

    but the real surprise is how competitive AMD is with the Radeon HD 2900 XT.


    I knew with mature drivers this card would rock. It only took a short amount of time. Good job ATI, and Anandtech for demonstrating this.
  • aka1nas - Wednesday, October 17, 2007 - link

    That's mainly because there is no AA applied, since the UT3 engine doesn't support it in DX9 mode. AA has been the R600's stumbling block.
  • NullSubroutine - Wednesday, October 17, 2007 - link

    Multi-Sampling seems to run fine, it is Super Sampling that seems to be broken.
  • shabby - Wednesday, October 17, 2007 - link

    The beta demo looks nothing like Epic wanted us to think; these pics are from back in July.
    http://ve3d.ign.com/images/fullsize/143/PC/Unreal-...
  • swaaye - Wednesday, October 17, 2007 - link

    It looks nothing like that because that is an ultra-supersampled bullshot, just like every single other game gets these days. In reality, UT3 looks as good as Gears of War right now, and will look a lot better once we get the high quality assets with the full game.
  • retrospooty - Wednesday, October 17, 2007 - link

    The demo is not full detail; it's a playable test with lower image settings than the final game. Wait for the final before we see what it's "supposed" to look like.
  • shabby - Wednesday, October 17, 2007 - link

    If it's really going to look like those pics then this performance review is useless; fps will drop by half. I'm hoping you're right, but I'm betting Epic simply polished up those pics to wow everyone.
  • imaheadcase - Wednesday, October 17, 2007 - link

    Yah, I downloaded the demo. I was really disappointed. It gets boring really fast, like Quake Wars does. Nothing really innovative in terms of graphics, fun factor, or "replayability" over the long term.

    I think TF2 ruined all these games. lol
  • Pjotr - Wednesday, October 17, 2007 - link

    quote:

    Despite being a mostly GPU-bound scenario, Intel still managed a 9% performance advantage over AMD at 3.0GHz.


    This text does not match the graphs posted, where the max gain was 4.4% for ShangriLa, the other two tests showed close to 0% difference. Analysis mistake, graph error, typo or what?
  • Frallan - Wednesday, October 17, 2007 - link

    Well, since many of us today (yes, even I bought one) have started to use laptops, and there are ones out now where it is possible to play, a view of how an 8600 GT with 512MB DDR2 and 256MB DDR3 does would be nice, from 1680x1050 downwards. There are tons of these in laptops now, so..

    Pretty please :)
  • Lifted - Wednesday, October 17, 2007 - link

    Seconded.
  • Blacklash - Wednesday, October 17, 2007 - link

    Just grab ATiTool from Techpowerup! and reduce the clocks of the XT.

    Interesting article, and as someone said, you have a chart with text referring to AMD above it and the actual CPU numbers labelled as Intel. The last four charts on the clock-for-clock page say AMD yet list Intel CPUs.
  • dvinnen - Wednesday, October 17, 2007 - link

    Last time I checked, ATiTool doesn't support the HD 2900XT. The only one that I know of is a buggy piece of software from AMD to change clock speeds.
  • NullSubroutine - Wednesday, October 17, 2007 - link

    ATiTool works with the 2900; I use it for my overclocking/fan control.

    ATI Tray Tools does not work with the 2900. They are two different programs: ATiTool is mainly for the things I use it for, while Tray Tools is a replacement for Catalyst Control Center.
  • Bjoern77 - Wednesday, October 17, 2007 - link

    why bench the 2900 Pro? is there a 2900 Pro which can't be clocked back to XT level? :)


    I'm glad I got a 2900 Pro for myself; regarding these benches I get 95% of the performance of an 8800GTX for 50% of its price.
  • decalpha - Wednesday, October 17, 2007 - link

    Why not compare CPUs with similar cache sizes, since the Athlon 64 X2 6000 has 2MB cache whereas the Core 2 Duo E6850 has 4MB, and cache size does seem to matter?
  • drebo - Wednesday, October 17, 2007 - link

    I think it's even more relevant to point out that clock-for-clock comparisons have been worthless for a very long time, and only seem to have come back on this site now that Intel has a more efficient pipeline.
  • PrinceGaz - Wednesday, October 17, 2007 - link

    The X2 6000+ actually has 2x 1MB cache, which in most cases is worse than 2MB shared, so the cache situation is even worse for AMD in the comparison that was performed.
  • drebo - Thursday, October 18, 2007 - link

    Well, cache size in general is less important for AMD processors, as the path from CPU to RAM is much, much quicker. It would be interesting (and very, very difficult to gauge) what the difference would be. This is most likely why they left AMD off of the cache comparison charts. It's impossible, due to far too dissimilar architectures, to isolate ONLY the memory subsystems, which is what a cache comparison would be attempting to do.

    Cache misses on an Intel architecture are far more expensive than on AMD's architecture. But, without otherwise identical chips, there's simply no way to make a comparison.
  • bloc - Wednesday, October 17, 2007 - link

    I think if you compared the 8600 GTS and X2600 XT, the perf would be pretty close, with the X2600 XT being $50 cheaper.

    The architecture is there. Some games, like CoD4, haven't taken advantage of it yet.
  • ImmortalZ - Wednesday, October 17, 2007 - link

    The second set of graphs on page 3 seem to be all confused. Mixed up title text?

    Also, regarding the ATI midrange part, surely you guys have heard about the 2900PRO?
  • JarredWalton - Wednesday, October 17, 2007 - link

    P3 graphs fixed. I'd imagine trying to get a 2900 Pro for testing is proving more difficult than anticipated. I know looking online that the few places I've seen that list them are out of stock.
  • ImmortalZ - Wednesday, October 17, 2007 - link

    Well, it's easy to test a 2900PRO. Underclock a 2900XT to 600MHz core and 1600MHz memory and test away! :D (there are 512MB GDDR3 and 1GB GDDR4 versions, so...). Just change the price from $389.99 to $249.99 for the 512MB and $319.99 for the 1GB.

    Of course, I'd personally wait for the 2950s to show up - single slot coolers are teh win :P
  • Bremen7000 - Wednesday, October 17, 2007 - link

    What about the page 6 graphs? Am I missing something or are they lacking something?
  • RobberBaron - Wednesday, October 17, 2007 - link

    Second that. The second set of charts on page 6 is Intel CPUs only. A little confusing.
  • Anand Lal Shimpi - Wednesday, October 17, 2007 - link

    Woooooooops, fixed :)
  • Olaf van der Spek - Wednesday, October 17, 2007 - link

    quote:

    Over a 66.5% increase in clock frequency, overall performance goes up less than 28%.

    So your comment about overclocking a 1MB cache CPU 20% to get the performance of a 2MB cache CPU isn't valid.
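    The arithmetic behind this point can be checked quickly, assuming (simplistically) that performance scales linearly with clock within the measured range:

```python
# Figures quoted in the comment above: a 66.5% clock increase yielded
# less than a 28% performance gain.
clock_gain = 0.665
perf_gain = 0.28

# Fraction of each percent of clock increase actually realized as fps.
scaling = perf_gain / clock_gain          # ~0.42

# Under that (assumed linear) scaling, a 20% overclock of the 1MB part
# would be worth only ~8%, well short of closing a larger cache gap.
est_gain_20pct_oc = 0.20 * scaling

print(f"scaling factor: {scaling:.2f}")
print(f"estimated gain from a 20% OC: {est_gain_20pct_oc:.1%}")
```

    The linear-scaling assumption is generous here, since scaling typically worsens as clocks rise, which only strengthens the comment's conclusion.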
