32 Comments
Zak - Tuesday, June 3, 2008 - link
I'm usually against AnandTech straying away from the core hardware reviews they became famous for in the first place, but this is the best, most thorough, most in-depth game review I have ever read! Very well done and most enjoyable reading. Thanks :)

Zak
mustardman - Tuesday, June 3, 2008 - link
I'm curious why AnandTech recommended Vista without comparing its performance against Windows XP. They didn't even have a test box running XP, or did I miss it?

From my experience, and that of friends, Vista is still behind XP in gaming performance - in some cases, far behind.
Am I missing something?
JarredWalton - Tuesday, June 3, 2008 - link
With modern DX10 GPUs, Vista is required to even get DX10 support. Having looked at DX9 Assassin's Creed, I can't say the difference is all that striking, but the DX10 mode did seem to run faster. (I could test if there's desire, but it will have to wait as I'm traveling this week and don't have access to the test systems used in this article.)

Personally, while Vista had some issues out of the gate, drivers and performance are now much better. XP may still be faster in some situations, but if you're running a DX10 GPU I can't see any reason to stick with XP. In fact, there are plenty of aspects of Vista that I actually prefer in general use.
Since this was primarily a game review, and I already spent 3x as much time benchmarking as I actually did beating the game, I just wanted to get it wrapped up. Adding in DX9 Vista vs. DX9 XP would have required another 20-30 hours of benchmarking, and I didn't think the "payoff" was worthwhile.
Justin Case - Monday, June 2, 2008 - link
[quote]Unlike Oblivion, however, all of the activity you see is merely a façade. The reality is that all the people are in scripted loops, endlessly repeating their activities.[/quote]

...which is exactly what Oblivion NPCs do (compounded by the fact that they all have the same handful of voices, that all voices use exactly the same sentences, and that some characters change voice completely depending on which scripted line they're repeating).
If anything, Oblivion's world feels even more artificial than Morrowind. One thing is the AI we were promised for Oblivion while the game was in development, another is what actually shipped. Most of the behaviors shown in the "preview videos" simply aren't in the game at all.
Even with the (many, and very good) 3rd party mods out there, Oblivion NPCs feel like robots.
JarredWalton - Tuesday, June 3, 2008 - link
Oblivion NPCs can actually leave town, they sleep at night, and they wander around a much larger area. Yes, they feel scripted, but compared to the AC NPCs they are geniuses. The people in AC walk in tight loops - imagine someone walking a path of about 500-1000 feet endlessly, with no interruptions for food, bed, etc. I'm not saying Oblivion is the best game ever, but it comes a lot closer to making you feel like it's a "real" world than Assassin's Creed.

But I still enjoyed the game overall.
erwendigo - Monday, June 2, 2008 - link
This is very old news: the game's DX10.1 support eliminated one render pass, BUT at a cost - an inferior quality image.

The rendered image isn't equal to the DX10 version, so Ubisoft dropped DX10.1 support in the 1.02 patch.

A very simple story - nothing about conspiracy theories or phantoms.

AnandTech guys, if you believe in these phantoms, then do a review with the 1.01 patch (it still exists, guys - download and test the f***ing patch); otherwise your credibility will diminish thanks to this conspiracy theory.
JarredWalton - Tuesday, June 3, 2008 - link
I tested with version 1.00 and 1.02 on NVIDIA and ATI hardware. I provided images of 1.00 and 1.02 on both sets of hardware. The differences in image quality that I see are at best extremely trivial, and yet 1.02 in 4xAA runs about 25% slower on ATI hardware than 1.00.

What is version 1.01 supposed to show me exactly? They released 1.01, pulled it, and then released 1.02. Seems like they felt there were some problems with 1.01, so testing with it makes no sense.
erwendigo - Tuesday, June 3, 2008 - link
Well, you wrote several pages about the suspicious reasons behind the dropped DX10.1 support.

If you sow the seeds of doubt, then you should have done a test to back it up.

The story of this dropped support has an official version (graphical bugs), and on many forums users reported these bugs with the 1.01 patch (and DX10.1). The other version is the conspiracy theory, but that version has no proof.

Is that the truth? I don't know; I can't test it on my computer. But if you publish the conspiracy theory and test the performance and quality of the 1.00 and 1.02 versions, why don't you do the same with the 1.01 patch?

This is not about performance; it's about backing up your version of the story. With the test, your words earn respect; without it, they turn into bad rumors.
JarredWalton - Tuesday, June 3, 2008 - link
I still don't get what you're after. Version 1.00 has DirectX 10.1 support; version 1.02 does not. Exactly what is version 1.01 supposed to add to that mix? Faulty DX10.1? Removed DX10.1 with graphical errors? I don't even know where to find it (if it exists), so please provide a link.

The only official word from Ubisoft is that DX10.1 "removed a rendering pass, which is costly." That statement doesn't even make sense, however, as what they really should have said is DX10.1 allowed them to remove a rendering pass, which was beneficial. Now, if it was beneficial, why would they then get rid of this support!?
As an example of what you're saying, Vista SP1 brings together a bunch of updates in one package and offers better performance in several areas relative to the initial release of the OS. So imagine we test networking performance with the launch version of Vista, and then we test it with SP1 installed, and we conclude that indeed somewhere along the way network performance improved. Then you waltz in and suggest that our findings are meaningless because we didn't test Vista without SP1 but with all the other standard updates applied. What exactly would that show? That SP1 was a conglomerate of previous updates? We already know that.
So again, what exactly is version 1.01 supposed to show? Version 1.02 appears to correct the errors that were seen with version 1.00. Unless version 1.01 removed DX10.1 and offered equivalent performance to 1.00 or kept DX10.1 and offered equivalent performance to 1.02, there's no reason to test it.
Maybe the issue is the version numbers we're talking about. I'm calling version 1.0.0.1 of the game - what the DVD shipped with - version 1.00. The patched version of the game is 1.0.2.1, so I call that 1.02. Here's what the 1.02 patch officially corrects:
------------------
* Fixed a rare crash while riding the horse in Kingdom
* Fixed a corruption of Altair’s robe on certain graphics hardware
* Cursor is now centered when accessing the Map
* Fixed a few problems with Alt-Tab
* Fixed a graphical bug in the final fight
* Fixed a few graphical problems with dead bodies
* Fixed pixellation with post-FX enabled on certain graphics hardware
* Fixed a small bug in the DNA Menu that would cause the image to disappear if the arrow was clicked rapidly
* Fixed some graphical corruption in Present Room with low Level Of Detail
* Character input is now canceled if the controller is unplugged while moving
* Added support for x64 versions of Windows
* Fixed broken post-effects on DirectX 10.1 enabled cards
------------------
I've heard more about rendering errors on NVIDIA hardware with v1.00 than I have of ATI hardware having problems. I showed a (rare) rendering error in the images that happens with ATI and 4xAA, but all you have to do is lock onto a target or enter Eagle Vision to get rid of the error (and I never saw it come back until I restarted the game).
Bottom line is I have PROOF that v1.00 and v1.02 differ in performance, specifically in the area of anti-aliasing on ATI 3000 hardware. If a version 1.01 patch ever existed, it doesn't matter in this comparison. The conspiracy "theory" part is why Ubisoft removed DX10.1 support. If you're naive enough to think NVIDIA had nothing to do with that, I wish you best of luck in your life. That NVIDIA and Ubisoft didn't even respond to our email on the subject speaks volumes - if you can't say anything that won't make you look even worse, you just ignore the problem and go on your merry way.
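As a side note on the arithmetic behind a claim like "about 25% slower": the frame rates below are hypothetical placeholders rather than figures from the article; the sketch only shows how such a relative difference falls out of two benchmark averages.

------------------
def percent_change(old_fps: float, new_fps: float) -> float:
    """Relative change in average frame rate going from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100.0

# Hypothetical numbers purely for illustration - NOT results from the article.
ati_4xaa_v100 = 40.0  # ATI HD 3000 series, 4xAA, game version 1.00 (DX10.1 path)
ati_4xaa_v102 = 30.0  # same hardware and settings, version 1.02 (DX10.1 removed)

delta = percent_change(ati_4xaa_v100, ati_4xaa_v102)
print(f"1.02 vs 1.00 with 4xAA: {delta:+.1f}%")  # -25.0% -> "about 25% slower"
------------------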
erwendigo - Tuesday, June 3, 2008 - link
Well, you asked me for some links, so here are some of them.

The first and foremost:
http://www.rage3d.com/articles/assassinscreed%2Dad...
In this one, Rage3D (a pro-ATI website) analyzes the reason for the dropped DX10.1 support, with a comparison of images from the different rendering modes.

In this article the Rage3D people found several graphical bugs in the DX10.1 path. They described them as minor bugs, BUT I don't think the lack of some effects in DX10.1 counts as a minor bug.

DX10.1 with AA enabled lacks the dust effect, and the HDR rendering is different from the DX10 version.

Rage3D thinks this shows a DX10 bug in the HDR rendering, because Ubisoft declared that the HDR rendering in the DX9 and DX10 paths is identical, yet their testing showed that the DX10 and DX9 HDR rendering differ. That could be true, but it's strange that the DX10 HDR rendering path would be buggy in the release version of the game, and in the 1.01 patch too.

It's more logical that the DX10 HDR was correct and that the difference in the DX10.1 HDR reflects a different, buggy render path (remember the missing render pass?).

The performance speedup with the 1.01 patch (in the Rage3D tests) looks like your test results. So the removal of DX10.1 support in the 1.02 patch doesn't affect performance. Yes, in DX10.1 the AA looks better than in the other paths, but with that version you lose the dust effect and get different (buggy or not?) HDR rendering. Good reasons for dropping the support, I think.
The consequences of your rumors about a sinister reason for the dropped support:
http://forums.vr-zone.com/showthread.php?t=283935
Some people believe this version because you defend it in your review, but you didn't test its veracity. The truth is that the DX10.1 render path had bugs, and when you wrote the review you didn't know whether the reason for the dropped support was the conspiracy theory or something else, but YOU chose one by personal preference.

That Ubisoft and NVIDIA didn't respond to your email proves nothing. At most, it shows they were rude to you.
geogaddi - Tuesday, June 10, 2008 - link
...now, what did I do with that babelfish...
ssgoten00 - Monday, June 2, 2008 - link
AC was only an average game overall. Graphics presentation was its strong suit, but gameplay was lacking and the game seemed to drag on as the player progresses. Undoubtedly the repetitiveness was the worst part of AC. It's not simply that some tasks seemed like a redo of previously accomplished tasks, but the sheer fact that tasks were repeated verbatim with the same characters and voices, only changing dialogue to create variation. Some characters players will have to kill multiple times under the guise of actually killing different characters. AC was also disappointing in that it misled gamers by presenting itself as a somewhat stealth-oriented game. Nothing could be further from the truth. In AC players will often be forced into full-on combat with multiple opponents to progress in the storyline; only in very select situations do players have the choice of using stealth as a viable option. Ironically, the last 5 or 10 minutes at the very end of the game are the most compelling. After the credits roll, players are left in the main room to explore and decrypt code and hidden messages. It's unfortunate AC's developers couldn't have spent more time on puzzles that actually pertained to gameplay. Out of a possible 10 I give Assassin's Creed a 6.0 - barely coming in at par, bordering on subpar.

Donkey2008 - Monday, June 2, 2008 - link
It does have a Thief feel, but after playing it on Xbox I found it to be more Thief meets BloodRayne meets Splinter Cell, with some of the best graphics I have seen in a while. It is sorta repetitive, but the violence cut away any boredom I had. I enjoyed it a lot.

But I guess they could have done what other Rockin high-profile companies do and make an even more repetitive game exactly like its previous versions, but with a much worse soundtrack. Throw in some terribly low-res character models and reuse the same bottom-of-the-barrel, cartoon-looking cutscenes, and they would have a perfect 10 as well.
poohbear - Monday, June 2, 2008 - link
This is the kind of game review I'd like to see, where hardware is tested with a game to show performance. I mean, you guys ARE a hardware site, and there aren't many sites that do game reviews with hardware testing shortly after release. I don't think you should do game reviews without hardware testing, because there are a ton of game review sites, but your niche shines when you do these hardware and DX analyses. Cheers and thanks for a very informative article.

DesertCat - Monday, June 2, 2008 - link
The article talks about wanting to check the performance of Assassin's Creed on a Phenom processor (and its 4 cores). I can speak to that to some degree.

I have a Phenom 9600 (2.3 GHz) on an AM2 board (Asus M2N-SLI Deluxe) with an EVGA 8800GT OC (650 MHz). I play at 1280x1024, so I play in letterbox mode. This processor is enough to run the game at acceptable frame rates, but I would tend to think that a fast dual core would do just as well (as was found with the Intel processors in the article).
With the performance-hurting TLB patch enabled, I noticed one area where frame rates truly took a nosedive: doing the "look" pan from the top of one of those towers. I didn't have an fps counter on at the time, but I'm guessing it was in the 12-16 fps range based on the chunkiness I experienced. I got similarly slow frame rates when diving from those spots into the hay (especially from the really high towers).
With the TLB patch disabled on my Phenom, those two low-fps spots were much better - I'm guessing those areas were in the low 20s of fps. The rest of the game was smooth as silk and probably above 40 fps. I have not, however, seen high utilization on any particular core when I've checked.

If I were to point to areas that really stress a system in AC, I would say the tower pan shots are the most common. (*minor spoiler ahead*) King Richard's speech from horseback about 3/4 of the way through the game is also very intensive.
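Since the slowdowns above are judged by feel rather than an on-screen counter, here is a minimal sketch of how a frame-time log from a capture tool could be boiled down to average and worst-case frame rates. The file name and one-value-per-line format are assumptions made purely for illustration.

------------------
# Summarize a frame-time log into average and "1% low" frame rates.
# Assumes a plain text file with one frame time in milliseconds per line
# (hypothetical format; adjust the parsing for whatever your capture tool writes).

def summarize_frametimes(path: str) -> None:
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]

    avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)

    # "1% low": average FPS over the slowest 1% of frames - a rough measure
    # of the chunkiness felt during the tower pans and leaps into the hay.
    worst = sorted(frame_ms, reverse=True)[: max(1, len(frame_ms) // 100)]
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)

    print(f"Average: {avg_fps:.1f} fps, 1% low: {low_1pct_fps:.1f} fps")

summarize_frametimes("frametimes_tower_pan.txt")  # hypothetical capture file
------------------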
aguilpa1 - Monday, June 2, 2008 - link
I know I've had it for a long time?

emboss - Monday, June 2, 2008 - link
The word you *are* looking for is possibly letterboxing?

PrinceGaz - Monday, June 2, 2008 - link
I was going to mention that as well. Anamorphic means the pixels making up the image are stretched either horizontally (as with wide-screen DVDs) or vertically when displayed. If the game were anamorphic, it would be like it running on a monitor at 1920x1080 but being rendered internally at some other resolution such as 1440x1080 and stretched to the displayed 1920x1080.

The correct description is what you said originally, that it allows only a 16:9 aspect-ratio view, so if I ran it on my monitor (1600x1200 native) the game itself would only use the central 1600x900 of that.
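To put numbers on the letterbox-versus-anamorphic distinction described above, here is a minimal sketch using the same example resolutions (a 16:9 letterbox on a 1600x1200 monitor, and a 1440x1080 render stretched to 1920x1080):

------------------
from fractions import Fraction

def letterbox_view(width: int, height: int, aspect=Fraction(16, 9)):
    """Largest 16:9 region that fits inside a monitor's native resolution.
    The game renders only this region; the rest is black bars."""
    view_h = int(width / aspect)
    if view_h <= height:
        return width, view_h
    return int(height * aspect), height

# Assassin's Creed on a 4:3 1600x1200 monitor: only the central 1600x900 is used.
print(letterbox_view(1600, 1200))   # (1600, 900)

# An anamorphic approach would instead render at a narrower resolution and
# stretch the pixels to fill the display, e.g. 1440x1080 scaled out to 1920x1080.
render_w, display_w = 1440, 1920
print(f"horizontal stretch factor: {display_w / render_w:.2f}x")  # 1.33x
------------------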
JarredWalton - Tuesday, June 3, 2008 - link
Sorry - I saw the original comment and thought I corrected it. Missed the other two occurrences. I wasn't thinking and just used the word after reading the thread on widescreengamingforum.com about AC. (I was hoping someone had found a way around the locked letterbox view.)

AnnihilatorX - Monday, June 2, 2008 - link
This review is great! I have never read a game review that has all the analysis, benchmarks, and gameplay video so conveniently presented.

Excellent work!
bill3 - Monday, June 2, 2008 - link
Actually it's terrible; I can't read the graphs AT ALL.

Seriously, my eyes just glazed over those terrible charts - completely unreadable. I still have no idea what I'm looking at. Is ATI supposed to be faster in this game? Why did they test with version 1.00 on ATI and 1.02 on NVIDIA? I don't know, because the graphs are totally useless.
Nihility - Monday, June 2, 2008 - link
I second that. The graphs are terrible. Maybe bar graphs would have been better?

Sometimes when you're the one making the graph, it's hard to imagine what other people see when they look at it. I suggest having another pair of eyes check the graphs for readability.
Besides that, I loved the review. Especially the performance part and the 10.1 controversy.
JarredWalton - Tuesday, June 3, 2008 - link
Charts are colored with similar colors used either for ATI vs. NVIDIA, 1.00 vs. 1.02, or dual-GPU vs. single-GPU. I could have generated four times as many graphs to show the same data, but I figure most people are capable of reading the labels on a chart and figuring out what they mean. Here's a hint: when you can't see the difference between two lines because they overlap, it's a tie.

If you want to give specific examples and recommendations on what would look better and still convey the same amount of information, I'm all ears. However, simply stating that "the graphs are terrible" does little to help. Tell me what graph specifically is terrible, and tell me why it's terrible.
As an example of why I used these graphs, page 9 has two charts showing 40 total data points. You can get a clear idea of how performance scales with single or dual GPUs at the various detail settings by looking at a single chart. Green is NVIDIA, Red is ATI. That makes a lot of sense to me. Creating ten different bar charts with four bars in each to show the same data makes it more difficult to compare Medium graphics performance to High graphics performance, and it takes up five times as much space to tell the same "story".
Page 6 is the same thing, but with green used for dual GPUs (light and dark for 1.00 and 1.02) and red for single GPUs - 24 data points in two charts instead of six charts. Having established that 1.00 doesn't perform any differently than 1.02 on NVIDIA hardware, I skipped the 1.00 NVIDIA numbers to make the charts on page 7 easier to read. Then I put in the four standard test configurations (0xAA and 4xAA, ATI and NVIDIA) on 1.02, with 1.00 4xAA ATI in blue as a reference.
Lastly, on page 8 I have two clock speeds on NVIDIA, three on ATI, with different base colors for single and dual GPUs. ATI and NVIDIA are in separate charts, and brighter colors are for a higher overclock.
There's method to my graphing madness. Are the charts immediately clear to a casual glance? No, but then that's really difficult to do while still conveying all of the information. I spent a lot of time trying to make comprehensible charts, and settled on these as the best option I could come up with. Again, if they're so bad, it must be easy to generate something clearly better - have at it, and I'll be happy to use any sensible suggestions. However, if the only complaint is that you actually have to look at the charts and think for a minute before you understand, I'm not likely to be very sympathetic. I think our readers are smart enough to digest these graphs.
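For readers trying to picture the layout being debated, below is a minimal matplotlib sketch of the color convention described above (green for NVIDIA, red for ATI, lighter shades for a second configuration). The data points are invented solely so the script runs; they are not benchmark results from the article.

------------------
import matplotlib.pyplot as plt

settings = ["Low", "Medium", "High", "Very High", "Max"]
# Hypothetical FPS values just to make the plot render - not article data.
series = {
    "NVIDIA dual-GPU": ([95, 82, 70, 58, 45], "darkgreen"),
    "NVIDIA single":   ([60, 52, 44, 36, 28], "lightgreen"),
    "ATI dual-GPU":    ([88, 75, 63, 50, 38], "darkred"),
    "ATI single":      ([55, 47, 39, 31, 24], "salmon"),
}

fig, ax = plt.subplots()
for label, (fps, color) in series.items():
    ax.plot(settings, fps, marker="o", color=color, label=label)

ax.set_xlabel("Detail setting")
ax.set_ylabel("Average FPS")
ax.set_title("Scaling across detail settings (illustrative data only)")
ax.legend()
plt.show()
------------------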
mpjesse - Monday, June 2, 2008 - link
While I appreciate the detailed review, isn't it a little irrelevant now? I mean, the game's been out for nearly 2 months and it's been reviewed everywhere. The only thing new about this review is the performance benchmarks, in which case I would have made the review solely about performance instead of gameplay.

Just my 2 cents.
ImmortalZ - Monday, June 2, 2008 - link
It's sad that the companies with money always manage to suppress innovation.

I hope this article by AT will raise some ruckus across the collective Interwebs and lead to something. But I doubt it.
ViRGE - Monday, June 2, 2008 - link
For what it's worth, another forum I read had some screenshots comparing DX10 and DX10.1. The problems the poster had managed to find involved trees; there was some kind of post-processing rendering going on with trees that wasn't occurring with DX10.1, which made them look weird.

Not fixing 10.1 may be an NVIDIA thing, but there was definitely a problem with it as-is.
tuteja1986 - Monday, June 2, 2008 - link
Well, where the hell is NVIDIA's DX10.1 support if DX10.1 actually brings some kind of performance improvement with AA? Why doesn't the GT200 series have DX10.1?

I thought PC gaming was all about being on the cutting edge of technology...

Anyways, this is not the first time Ubisoft or NVIDIA has done this.
wyemarn - Monday, June 2, 2008 - link
Maybe because NVIDIA GPUs can't do AA through shaders, so there's no use supporting DX10.1. ATI GPUs have 320 stream processors they can put to use for shaders and such. NVIDIA cards have fewer SPs but more ROPs and TMUs, which translates to more brute power if games don't lean heavily on shaders. Technology-wise I think ATI is ahead, but NVIDIA GPUs have game developer support and more raw horsepower, so performance-wise NVIDIA is ahead, and I think this trend will continue with the GTX 200 series. I chose G92 over RV670 because the raw performance is much better, even though on paper the HD 3800 series looks great.
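As a rough back-of-the-envelope illustration of why raw stream-processor counts don't map directly onto shader throughput, here is a small sketch. The ALU counts, clocks, and flops-per-clock figures are the commonly quoted public specs for these parts, supplied here as approximations; real-game performance also depends on ROPs, TMUs, memory bandwidth, and drivers, which this ignores.

------------------
# Peak programmable-shader throughput: GFLOPS = ALUs * flops per ALU per clock * GHz.
# Figures below are commonly quoted public specs, treated as approximations.

def peak_gflops(alus: int, flops_per_clock: int, clock_ghz: float) -> float:
    return alus * flops_per_clock * clock_ghz

# RV670 (Radeon HD 3870): 320 SPs at the 775 MHz core clock, MAD = 2 flops/clock.
rv670 = peak_gflops(320, 2, 0.775)

# G92 (GeForce 8800 GT): 112 SPs at a 1500 MHz shader clock, MAD+MUL = 3 flops/clock.
g92 = peak_gflops(112, 3, 1.5)

print(f"RV670 ~{rv670:.0f} GFLOPS, G92 ~{g92:.0f} GFLOPS")
# Despite 320 vs 112 ALUs, the peaks land in the same ballpark (~496 vs ~504),
# which is why "on paper" SP counts alone say little about real performance.
------------------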
SteelSix - Monday, June 2, 2008 - link

Worthy of a thread in Video. I just started one.

Gannon - Monday, June 2, 2008 - link
The original Halo had performance issues, but they weren't alarming; Halo was actually not too bad a port compared to many other console-to-PC disasters, and Halo 1 got better as hardware advanced. Halo 2, on the other hand, is just all-around atrocious. Halo 2 was simply not a very well made game, period, despite the addition of cutscenes, etc. Halo 1 had a much better feel and better vehicle design IMHO; I hated how the Warthog looked in Halo 2 - it annoyed me to no end.

Griswold - Monday, June 2, 2008 - link
That's no excuse. Halo sucked performance- and gameplay-wise compared to the PC-first titles of the time - and that is what matters. In essence, the game is bad when you're used to playing that genre on the PC. The same is true for Gears of War, but that port is lackluster in many more ways.

I fell for console-to-PC ports twice. Never again.
bill3 - Monday, June 2, 2008 - link
An even worse shooter is Resistance on PS3.