The only conclusion that can be taken from this article is that PowerDVD uses a very poor h.264 decoder. You got obsessed with comparing different bits of hardware and ignored the real weak link in the chain - the software.
Pure software decoding of 1080-res h.264 can be done even on a PentiumD if you use a decent decoder such as CoreAVC or even just the one in ffdshow. You also ignored the fact that these different decoders definitely do differ in the quality of their output. PowerDVD's output is by far the worst to my eyes, the best being ffdshow closely followed by CoreAVC.
The article mentions that the amount of decoding offloaded to the GPU is directly tied to core clock speed (at least for NVIDIA)... If this is true, why not throw in the 6600GT for comparison? They usually come clocked at 500MHz stock, but I am currently running mine at 580MHz with no modifications or extra case cooling.
In my opinion, if you were primarily interested in Blu-Ray/HD-DVD watching on your computer or HTPC and gaming as a secondary pastime, the 6600GT would be a great inexpensive approach to supporting a less powerful CPU.
Derek, any chance we could see some benches of this GPU thrown into the mix?
Could somebody tell me what the framerate of the outgoing signal from the video card is? I know that the PlayStation 3 can only output a 60 fps signal, but some standalone players can output 24 fps.
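For context on why 24 vs. 60 matters: a player that only outputs 60Hz has to repeat 24fps film frames in a 3:2 cadence (the source of telecine judder), while a 24fps-capable output shows every frame for an equal interval. A quick sketch of the arithmetic, purely for illustration:

    # 3:2 pulldown: fitting 24 film frames into a 60 Hz output signal.
    # Alternate frames are held for 2 and 3 output intervals, so one second
    # of film (24 frames) fills exactly 60 output intervals.
    film_fps, output_hz = 24, 60
    cadence = [2, 3] * (film_fps // 2)   # 12 repetitions of (2, 3)
    assert sum(cadence) == output_hz
    print(f"{film_fps} fps on a {output_hz} Hz signal uses a {cadence[0]}:{cadence[1]} cadence")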
From the 50,000 foot view, it seems just about right, or "fair" in the eyes of a new consumer. HD-DVD and BluRay just came out. It requires a new set-top player for those discs. If you built a new computer TODAY, the parts are readily available to handle the processing needed for decoding. One cannot always expect their older PC to work with today's needs - yes, even a PC only a year old. All in all, it sounds about right.
I fall into the same category as most of the other posters. My PC can't do it. Build a new one (which I will do soon), and it will. Why all the complaining? I'm sure most of us need to get a new HDCP video card anyway.
I can play a High Profile 1080p(25) AVC video on my X2 4600+ at maybe 40-70% CPU (70% being a peak; I think it averaged 50-60%) with CoreAVC...
Now the ONLY difference is my clip was sans audio and 13Mbit (I was simulating the bitrate you'd get if you were to try to squeeze The Matrix onto a single-layer HD DVD disc). I doubt 18Mbit adds TOO much more computation...
How come the results of these tests are so different from this PC Perspective review (http://www.pcper.com/article.php?aid=328&type=...)? I realize they tested HD-DVD and this review is for Blu-ray, but H.264 is H.264. Of note is that nVidia provided them an E6300 and a 7600GT to do the review with, and it worked great (per the reviewer). Also very interesting is how hardware acceleration dropped CPU usage from 100% down to 50% in their review on the worst-case H.264 disc, but only reduced CPU usage by ~20% with a 7600GT in this review.
HD-DVD movies even using H.264 are not as stressful. H.264 decode requirements depend on the bitrate at which video is encoded. Higher bitrates will be more stressful. Blu-ray disks have the potential for much higher bitrate movies because they currently support up to 50GB (high bitrate movies also require more space).
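To put rough numbers on that space/bitrate relationship (back-of-the-envelope only; it assumes a 2-hour feature, decimal gigabytes, and ignores audio and extras):

    # How much video a 50 GB Blu-ray budget allows, and how much space an
    # 18 Mbps AVC stream (the figure quoted elsewhere in this thread for
    # X-Men 3) actually uses.
    runtime_s = 2 * 60 * 60                    # assumed 2-hour movie
    max_avg_mbps = 50e9 * 8 / runtime_s / 1e6  # ~55 Mbps average budget
    gb_at_18mbps = 18e6 * runtime_s / 8 / 1e9  # ~16 GB of video data
    print(f"50 GB over 2 h -> ~{max_avg_mbps:.0f} Mbps average possible")
    print(f"18 Mbps over 2 h -> ~{gb_at_18mbps:.0f} GB used")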
Maybe the bitrate of their disc is not as high as the bitrate of that part of X-Men III.
I would not say it's completely inadequate. According to the AnandTech review, the E6300 with the 8800 GTX could remain under 100% CPU utilisation even at the highest bitrate point (the 8800 GTX and the 7600 GT had the same worst-case CPU utilisation in the tests).
Also, there's this post on AVSforum (http://www.avsforum.com/avs-vb/showthread.php?p=91...). The poster had no problems playing back Xmen-3 with a "P4 3.2Ghz HT system and a Radeon X1950Pro". Clearly a 3.2GHz HT P4 isn't nearly as powerful as any of those C2D processors, nor was the X1950 Pro as powerful as the various nVidia cards.
Perhaps, but nVidia intentionally sent them an H.264 torture-test disc that's not available in the US. That also doesn't explain why the 7600GT nearly cut the CPU usage in half in one review but only helped by 20% in the other.
Also, nVidia says an E6300 or X2 4200+ with a 7600GT is adequate for the most demanding H.264 titles. That sure doesn't agree with the conclusion of this AnandTech piece, which says you need an 8800 GTX card to use an E6300.
Will you start using more updated/modern CPU tests for H.264 encoding? Currently you use QuickTime, right? That doesn't use many of H.264's advanced features.
Have you considered using x264 (an open-source H.264 encoder that produces the best-quality encodes of any publicly available H.264 encoder) with a standard set of encoding parameters?
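In case it helps anyone wanting a repeatable benchmark: x264 is driven entirely from the command line, so a fixed set of flags can be published alongside the results. The options below are only a sketch (flag names and defaults vary between x264 builds), not a claim about what AnandTech should standardize on:

    # First pass of a two-pass 18 Mbps encode with some of the costlier
    # H.264 features enabled (CABAC is on by default in x264).
    x264 --pass 1 --bitrate 18000 --ref 3 --bframes 3 --8x8dct \
         --me umh --subme 6 --trellis 1 \
         -o movie_pass1.264 movie.y4m
    # ...then repeat the same command with --pass 2 for the final output.

Publishing the source clip and the exact command line would make the encoding test easy for readers to reproduce.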
I'm a little bit skeptical about those test results, because my home computer (in the subject line) played the Deja Vu clip (trailer 1, 1080p, downloaded from Apple's website) with CPU usage of 40-60% on the current version of the NVIDIA drivers. With older drivers (I don't know the exact version; I installed them over a year ago) it averaged between 50 and 70%.
For a decoder I used PowerDVD 7 (trial install), and even though CyberLink's webpage says the H.264 codec doesn't work in the trial version, I had no problems with it. GSpot reported CyberLink's H.264 codec as the default rendering path. For fullscreen playback I used BSPlayer; strangely, Windows Media Player didn't want to play the trailer even though all the other players had no problem finding the installed codecs.
TIP: with BSPlayer you can see the dropped frame count.
This is what we have found as well, and it's also why looking at BD and HDDVD performance is more important than the downloaded clips we've looked at in the past.
Have you guys tried this configuration out? I have 2 Geforce 8800 GTXs in SLI, and using either the 97.02 or 97.44 driver and a 30" Dell monitor with a resolution capability of 2560x1600, I found I cannot play Bluray content at anything higher than a desktop resolution of 1280x800 (exactly half that resolution because of the way the dual-DVI bandwidth is setup). This means I cannot even experience full 1080p!
Try anything higher than that and CyberLink BD complains and says to set your resolution to less than 1920x1080. This sucks. I hope there is a fix on the way.
"The Dell 3007WFP and Hewlett Packard LP3065 30" LCD monitors require a graphics card with a dual-link DVI port to drive the ultra high native resolution of 2560x1600 which these monitors support. With the current family of NVIDIA Geforce 8 & 7 series HDCP capable GPU's, playback of HDCP content is limited to single-link DVI connection only. HDCP is disabled over a dual-link DVI connection. The highest resolution the Dell 30" 3007WFP supports in single-link DVI mode is 1280x800 and therefore this is the highest resolution which HDCP playback is supported in single-link DVI mode on current Geforce 8 &7 series HDCP capable GPU's. On other 3rd party displays with a native resolutions of 1920x1200 and below, the graphics card interfaces with the monitor over a single-link DVI connection. In this case, playback of content protected Blu-Ray and HD-DVD movies is possible on HDCP capable Geforce 8& 7 series GPU's."
Someone needs to tip NVIDIA and the other graphics card manufacturers that this is unacceptable. If I shell out $4,000 ($2,000 monitor, $1,400 for two 8800 GTXs in SLI, and $600 Blu-ray drive) IT SHOULD WORK.
Agreed, but don't blame NVIDIA -- blame the MPAA... HDCP was designed around single-link DVI and HDMI connections and wasn't made to work with dual-link in the first place. I wouldn't be surprised if the problem NVIDIA is having has absolutely nothing to do with their hardware's capability.
In addition, Dell's design is flawed -- they only support resolutions above 12x8 with dual-link DVI. It may have taken a little extra hardware, but there is no reason that they should not support up to at least 1920x1080 over a single link.
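The single-link point is easy to check with pixel-clock arithmetic: single-link DVI tops out at a 165MHz pixel clock, and 1080p60 with standard blanking needs roughly 148.5MHz, so it fits comfortably. A rough calculation (the total timings are approximate):

    # Pixel clock = total horizontal pixels x total vertical lines x refresh rate.
    SINGLE_LINK_LIMIT_MHZ = 165
    clk_1080p60 = 2200 * 1125 * 60 / 1e6      # ~148.5 MHz, standard 1080p60 timing
    clk_2560x1600 = 2720 * 1646 * 60 / 1e6    # ~268 MHz, needs dual-link
    print(clk_1080p60 <= SINGLE_LINK_LIMIT_MHZ)    # True  -> single link is enough
    print(clk_2560x1600 <= SINGLE_LINK_LIMIT_MHZ)  # False -> dual link required

So limiting single-link HDCP playback to 1280x800 looks like a display design decision rather than a DVI bandwidth limit.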
Wasn't blaming anyone in particular (although I'm always happy to bash the MPAA) just noting how stupid the situation is. Supporting a max of 12x8 over single link is inexcusable as far as I'm concerned.
That is ridiculous! That's the problem with tech: you can't take anything for granted these days. Things that seem obvious and sensible often turn out not to be as they seem. What a joke!
Any chance of doing a quick test on quad-core to see how many threads PowerDVD can generate, unless you know already? At the very least, from what you've said, it can evenly distribute the load across two threads, which is good.
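One way to check this yourself without waiting for a follow-up is to sample the player process during playback and watch its thread count and the per-core load. A minimal sketch using the psutil Python package (the process name is an assumption; check Task Manager for the actual executable):

    import psutil

    # Find the running player process (name is a guess).
    player = next(p for p in psutil.process_iter(['name'])
                  if p.info['name'] and 'PowerDVD' in p.info['name'])

    for _ in range(10):                                   # ~10 one-second samples
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        print(f"threads={player.num_threads():3d}  per-core %={per_core}")

The thread count shows how many threads the player has created, and the per-core percentages show whether the load is actually spread evenly across cores.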
This isn't a GPU review that tests a new game that we know will be GPU limited. This is a review of a technology that relies on the CPU. Furthermore, this is a tech that obviously pushes CPUs to their limit, so the legions of people without Core 2 Duo based CPUs would probably love to know whether or not their hardware is up to the task of decoding these files. I know any AMD product is slower than the top Conroes, but since the hardware GPU acceleration obviously doesn't directly correspond to GPU performance, is it possible that AMD chips may decode Blu-ray at acceptable speeds? I don't know, but it would have been nice to learn that from this review.
I agree completely... I have an X2 3800+ clocked at 2500MHz that is not about to get retired for a Core 2 Duo.
Why are there no AMD numbers? Considering the chip was far and away the fastest available for several years, you would think they would include CPU numbers for AMD, considering most of us with fast AMD chips only need a new GPU for current games/video. I've been waiting AGES for this type of review to decide which video card to upgrade to; AnandTech finally runs it, and I still can't be sure. I'm left to assume that my X2 @ 2.5GHz is approximately equivalent to an E6400.
If you only wanted to focus on the GPUs, then why test different CPUs? If you wanted to find out info about GPUs, why not look into the incredibly inconsistent performance, centered on the low correlation between GPU performance in games versus movie acceleration? Finally, why not CHANGE the focus of the review when it became apparent that which GPU one owned was far less important than which CPU you were using?
Was it that hard to throw in a single X2 product rather than leave the article incomplete?
With all due respect, if that is the case then why did you even use different CPUs? You should have kept that variable the same throughout the article by sticking with the X6800. Instead, what I read seemed to be 50% about GPUs and 50% about Core 2 Duos.
I'd really appreciate an update with AMD numbers, even if you only give 1 that would at least give me a reference point. Thanks.
So is the output via HDCP DVI really not able to run at high resolutions? I heard a few rumors that when HDCP was enabled, the output resolution of these cards was less than 1080p. Is there any truth to this?
" Intel Core 2 Duo E6600 can play X-Men: The Last Stand without dropping frames. "
"Thus we chose the Core 2 Duo X6800 for our tests."
"we haven't found a feature in PowerDVD or another utility that will allow us to count dropped frames"
"The second test we ran explores different CPUs performance with X-Men 3 decoding. We used NVIDIA's 8800 GTX and ATI's X1950 XTX in order to determine a best and worse case scenario for each processor. The following data isn't based on average CPU utilization, but on maximum CPU utilization. This will give us an indication of whether or not any frames have been dropped. If CPU utilization never hits 100%, we should always have smooth video. The analog to max CPU utilization in game testing is minimum framerate: both tell us the worst case scenario"
"All video cards that have an HDMI connection on them should support HDCP, but the story is different with DVI. Only recently have manufacturers started including the encryption keys required for HDCP. Licensing these keys costs hardware makers money, and the inclusion of HDCP functionality hasn't been seen as a good investment until recently (as Blu-ray and HDDVD players are finally available for the PC). While NVIDIA and ATI are both saying that most (if not all) of the cards available based on products released within the last few months will include the required hardware support, the final decision is still in the hands of the graphics card maker. "
"It is important to make it clear that HDCP graphics cards are only required to watch protected HD content over a digital connection. Until movie studios decide to enable the ICT (Image Constraint Token), HD movies will be watchable at full resolution over an analog connection. While analog video will work for many current users, it won't be a long term solution."
This here:
"Fortunately, it seems that studios are making the sacrifices they need to make in order to bring a better experience to the end user."<-lol
..so tell me: what kind of processor and video graphics processor does a Blu-ray 'box' have?
Somehow I don't buy the analog as a premise to the test results... yet still you are probably correct in its detail.
Seems the person on the computer made all the sacrifices here. Those are thousand-dollar rigs! With a Blu-ray player in there?
I would be happy with 30 gigs of MPEG2 in this matter. Maybe use a fresnel lens or something for the display, or put an opaque projector up in some way.
Desktop Resolution: 1920x1080 - 32-bit @ 60Hz
Doesn't your lab have enough money for a high-resolution HDTV with DVI connectors?
I'm not understanding all the fat in the fire here. Studios.
__________________
Saw your last comment there, Mr. Wilson. Noticed that some .avis have a higher data rate than others. Why should I pay (in dollars and technology) for that kind of pony?
Thanks for the reply. Good to see you are on top of things. Yeah, I simply didn't include the first portion of the sentence in quoting you there.
The article has so many baselines in it. For now I guess that the X-Men BR disc is the trait holder which set the performance criteria. Venturing to say, then, that all BR discs are not created alike - that is, other BR discs will not have the same characteristics as would have necessitated an X6800 CPU.
That is something of a rig that costs 2 grand, including the BR player itself. Now, a BR shelf box does not have an X6800 CPU, nor NVIDIA or ATI high-end graphics chips (speaking comparatively). And a monitor capable of doing the resolution will certainly do so using only that set of components.
So I think that perhaps it would be question enough to say that BR 'penalizes' a computer using it as an accessory.
I don't know if this is true. Still, if it were necessary to have BR accessories as you had listed them, BR would have had to have them listed in the BR patent itself. BR is a fully capable technology without the computer.
So frankly the penalty here must be the DRM involved, since BR does that on-the-fly encryption. I'll just speculate.
Look at the power consumption there! True, the Sony notebook I showed does only 1080i (check me if I'm wrong). But the graphics there will run on a freakin' battery! AND the notebook (with Sony licensing power, no doubt) can be hooked up to the family television setup - maybe in HD.
Let's face it though: I don't see the reason to be so timid about running comparisons against HDTV sets, or a Dell 30" monitor, or something such as this. Run the comparisons over HDMI out to them from these bottle-cork technologies (the notebook I showed).
In the same light, if an HDTV can display 1080p from an OTA signal, you've got to suppose that the 'bus' carrying it to the set may or may not have anything in common with what the computer is doing. True, you may laugh, since there may be no such thing as HD OTA 1080p - I don't know if there is.
Yet HDMI, HDCP, etc. - that's a really fancy chance that I either spend my performance-per-dollar where it does me some good, or waste away considering which technology can be participated in to no avail.
If the whole industry is waiting on a freaking HDMI cable, WTF is wrong with you people?
Once I get a light on the players you'll hear some more. Why the timidity to put the architecture on the block for testing!!! Computers and more. And I don't give a fart's rot for what the RIAA or MPAA says. They don't have that life to lead.
Tujan, I don't mean to appear rude but I find your posts very hard to understand and from one of the posts above I'm not the only one. Am I right in thinking english isn't your first language? If so perhaps you could make your posts a bit simpler and more concise. Again I don't mean to cause offence as I think you may have some interesting points to make I just can't quite understand them.
There are several topics covered in each paragraph. Put a [ ] at the end of each paragraph.
The article said a lot more than just conduction to the BR. For the most part I am darn tee'd off, because you just don't take and burn 700 watts to run a BR. Now, I don't know what the specs are for the on-the-shelf BR players, but as I explained, they do not have any components of the computer, and they sure don't take that much power to run. A screen simply does its stuff and that is it. The screen should be capable of doing exactly what the BR disc was mentioned to do in HD WITHOUT A GRAPHICS CARD!!!
Now I don't know what the parameters for this discretion are. I know that Walmart has a $4 HDMI cable, and that HDCP graphics cards do not use HDMI.
Your last post consisted of differences between running quad core with the BR. Well, you can do dual-server quad-core with the same graphics, but if the connection and testing cannot be done with HD screens, there is not much to be said about them.
Especially the detail in the power utilization with the CPUs. So where's the discrepancy?
I am happy with HDMI and battery power on a notebook attached to an HDTV screen. Just how happy remains to be assessed - by both the graphics card vendors and the authors of these not-so-confounding computer articles.
P.S. If I could have edited out the last paragraph from the previous post I would have done so. It does not lack substance, though, since there must be a penalty to run BR on the computer. BR does not need this type of technology to conduct a standard session - as could be seen if the testing were done. Then we could reason out the problem.
BTW, I cannot figure out exactly why the tested BR player was not listed in the test setup - the brand name and type, and whether it was using a SATA connection (which it probably was). It should not be long before anyone will be able to conduct data transfer tests between different computer-based players.
I don't know if I can wait or not, since we are still dealing with specific media, criteria specific to it, and the performance of them. So without that, a computer-capable BR player should be the least of considerations.
Plextor's got a Blu-ray player, I think. The specs of the drive should have been announced.
See, I just don't get the confoundedness of the HDCP and the kissy feel-good about the light Hollywood puts out. For me, I have other considerations beyond them for the space and the conduction of the technology.
Has anybody done a comparison of the bandwidth difference between HDMI and DVI? I've only seen a lowly X1600 (Sapphire ATI) with HDMI, although behind the scenes there are others. I had no idea HD-DVD or Blu-ray were such system-eating bandwidth hogs.
Either way, it's a knuckle sandwich for the studios.
Hope everybody runs out and grabs all the DVDs that can fit in a shopping cart.
So I guess the hi-def resolutions are a studio spec too?
So what is my question here? This is a Centrino Core Duo notebook, with graphics enough to run using only battery power.
As well, the notebook has a Blu-ray drive which can be written to, AND it can play Blu-ray titles.
Is this mostly in the licensing? How can it be, when the processor and graphics cards used are such absolute 'top notch' parts for the desktop, and the notebook puts the works of them to shame?
Blu-ray and HDMI on battery power.
This was one of AnandTech's ads, incidentally - hi AnandTech (ad-click), hi Sony.
When CoreAVC supports HD-DVD and Blu-ray H.264, I'd be very interested in seeing this article updated with the comparison.
Regarding the article itself, I thought it wasn't up to normal AnandTech standards: skimping on the H.264 details that make it better, and giving the reason as "but these are a little beyond the scope of this article." What is AnandTech coming to? That's like saying "we're going to compare graphics cards with DirectX 9 capabilities, but explaining what DirectX is, is a little beyond the scope of this article."
Also, not comparing AMD CPUs? What's up with that?
And I find it odd that you didn't comment on the strangeness that nVidia has better acceleration across the board than the ATI cards, especially as the ATI cards have better shader throughput - so they are most likely hampered by software rather than hardware. So this: "ATI hardware is very consistent, but just doesn't improve performance as much as NVIDIA hardware" only paints an incorrect picture.
I would give this article a 2 out of 5: 1 for at least covering the basics (H.264 is a better codec than MPEG-2) and 1 for showing that ATI needs to improve its decoder, even though you don't point it out.
I had a question about why you chose the Golden Gate Bridge scene to stress test the decoding capabilities of the various setups.
You said that you chose that point in the movie because it had the highest bitrate (41Mbps), indicating a more complex scene.
To me, though, that would indicate LESS encoding done by H.264, and consequently LESS decoding work needed for playback of that particular scene.
I justify that by thinking that with a very complex scene the codec cannot compress the stream as much, because it would introduce too many artifacts, so the compression rate is dropped and the data rate increased to compensate for that particular section in time.
Is my reasoning correct? If not, can someone explain to me why?
I don't think choice of scene should change the graphs in terms of relative performance between setups, but it would affect absolute numbers - an easy way to check whether my thinking is wrong or not is to see if there are more dropped frames in the Golden Gate scene on the software-decoded E6600 vs. other less busy scenes.
we tried to explain this a little bit, so I'm sorry if we didn't get it across well enough.
I'm not an expert on H.264 by any means, but I can talk about other types of decoding as they relate to still images.
The issue isn't really less compression -- when using H.264, we are always using H.264 complexity to encode the bitstream. We don't fall back to just saving raw pixel data if a scene is overly complex -- we encode more detailed information about the scene.
For instance, with still images, run length encoding can be performed with huge compression especially in images with large blocks of identical colors (like logos or images on a solid background color). Basically, the idea is to list a color and then a number of pixels that use that color. For an image that is a single solid color, you could list the color and then the number of pixels in the image. This is a very small file with little processing requirement that represents a full image. If, on the other hand, we have a checker board pattern with every other pixel being a different color, we have to list the color of every pixel, BUT we also have to process every color to make sure of how many consecutive pixels it represents (even if it only represents one). Thus, we end up doing more processing than we would on a smaller (lower "bitrate") file.
This example is very fabricated as sophisticated run length encoding can handle more complex patterns, but it serves to illustrate the point: when using a specific type of encoding, higher bitrates can (and usually do) mean more complexity and processing.
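Here is that run-length example in miniature, just to make the "more runs means more work" point concrete (toy code, not how H.264 actually works):

    def rle_encode(pixels):
        # Collapse consecutive identical values into [value, count] runs.
        runs = []
        for p in pixels:
            if runs and runs[-1][0] == p:
                runs[-1][1] += 1
            else:
                runs.append([p, 1])
        return runs

    def rle_decode(runs):
        # The decoder does one step per run, so busier input = more work.
        out = []
        for value, count in runs:
            out.extend([value] * count)
        return out

    solid = ['white'] * 16
    checkers = ['white', 'black'] * 8
    print(len(rle_encode(solid)))     # 1 run   -- tiny "bitrate", little work
    print(len(rle_encode(checkers)))  # 16 runs -- bigger "bitrate", more work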
As we mentioned, using no encoding requires zero processing. MPEG-2 can compress the data to lower the bitrate while increasing computational complexity. But higher bitrate MPEG-2 means more data to process per frame -- which means more CPU overhead for higher bitrates under MPEG-2. The same is true with H.264 -- bitrates are generally lower than MPEG-2 and require more processing power, but as H.264 encoded movies use more bitrate (more data per frame), more processing is required.
I hope this helps.
Also, to clarify -- the spot in the video that reaches 41Mbps corresponds to the highest CPU utilization (we can see this on the perfmon timeline).
The PS3 is able to play Blu-Ray back at 50% over normal speed without dropping frames. That gives an idea of how much power these consoles are capable of.
Some interesting tidbits from a translation of an article interviewing PS3 developers.
-H.264 decoding itself was not very difficult for Cell with moderate optimization, and they could play a movie in realtime on the first try, unlike the very difficult SACD optimization. However, because they began development without knowing the final Blu-ray standard, they set the goal very high: decoding 2 full-HD H.264 streams at 40Mbps simultaneously. Besides, the clock speed of the devkit was lower than the final product, which made development difficult. The current decoder can decode full-HD H.264 with 3 SPEs.
-An SCE developer recommends trying 1.5x fast-forward playback in the PS3 BD player to see the power of Cell. When it's connected to a display via 1080/60p, it becomes very smooth, as Cell has enough margin for video decoding. In 1.5x fast-forward playback it decodes all frames, then fits them into 60fps with sped-up audio.
I don't believe you can specify what clock ATI uses when decoding video -- I think this is handled internally. It may be that the hardware that helps accelerate MPEG-2 the most is tied to clock, while the majority of what benefits H.264 is not. We'll have to dig further to really know.
It was the same thing when MPEG-2 came out. Heck, even in the old days of 386s, PCs were too slow to decode MPEG-1 VCDs, to the point that we had separate MPEG-1 decoder cards. Remember when DVD came out, there was a big push for GPU-accelerated hardware iDCT. Today, most CPUs are powerful enough to decode MPEG-2 on their own. The same thing is happening again with MPEG-4. By the time 4-core/8-core CPUs become mainstream, we won't be hearing about the need for GPU acceleration as much anymore. And by that time, there will probably be the next next-gen HD format that is too demanding for the CPUs of that time. Cycle and repeat.
MPEG-4 contains many advanced features not currently in use. We first saw MPEG-4 part 2 in the form of DivX, but MPEG-4 part 10 takes quite a bit more work. Some of the profiles and levels of H.264/AVC will be too much for quad core CPUs to handle. These may not be adopted by studios for use on physical media, but the codec itself is very forward looking.
But in the end, you are correct -- the entire MPEG-4 spec will be a simple matter in a handful of years.
This is the case with everything though. Even if something will one day pose no trouble to computers, we can't ignore current performance. Studios must balance current performance with the flexibility to support the type of image quality they will want near the end of the life cycle of the BD and HDDVD formats.
I always look forward to this kind of thing, and it's why I test hardware -- I want to know what my PC can currently do with what is out there.
I suppose the "news" is that we've got something everyone wouldn't mind having that very few will be able to use for the time being.
It's good news that MPEG-2 won't become the standard for BD. Until today, I figured all movies were in MPEG-2, and if that became standard and won the format war, we would be stuck with what could arguably be a worse picture than HDDVD using VC-1.
How do you know what movies are 50gb and or h264? Does it usually say on the box or does the player tell you?
In our experience with Blu-ray, the format is listed on the box. HDDVDs have been a little more cryptic, and we have had to ask for help determining the format.
For our X-Men BD, the back of the case stated AVC @18 Mbps.
I don't think disk size has been listed on the case, and we've had to ask for this info from industry sources.
It's logical that it would be worse, but most users are using these processors and they really want to know if their rigs can handle it...
It's not about AMD only; there are plenty of Pentium 4s and Pentium Ds in these rigs, and even the Athlon XP still rocks in some...
What about a core scaling test? I mean:
1- Single Core
2- Single Core with Hyper Threading
3- Two Cores
4- Two Cores with Hyper Threading
5- Four Cores
It will be hard to do this scaling, as they are not all from one architecture (1 to 4 are NetBurst - Pentium 4, Pentium D, Pentium EE - while the last is the Core architecture); one way to approximate part of it on a single machine is sketched below.
I don't know why Anand these days does not care about AMD; I just hope they don't think that everybody in the world has a Core 2...
I'm no fan of AMD, but the benefit of this kind of article is to see how much power you need to handle these scenarios, and I guess the majority of people today still have older CPUs.
These tests should, in my opinion, cover a wider range of CPUs - Pentium 4 (with and without HT), Pentium D, Athlon 64, Athlon 64 X2, even the Quad FX platform. This would help readers know whether their systems can handle these things or not.
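For what it's worth, the core-count part of that matrix can be approximated on one quad-core machine by pinning the player to a subset of logical CPUs and re-running the same clip. A sketch with psutil (the process name is an assumption, affinity control works on Windows/Linux, and this obviously cannot emulate turning Hyper-Threading on or off):

    import psutil

    player = next(p for p in psutil.process_iter(['name'])
                  if p.info['name'] and 'PowerDVD' in p.info['name'])

    for cores in ([0], [0, 1], [0, 1, 2, 3]):      # 1, 2, then 4 logical CPUs
        player.cpu_affinity(cores)                  # restrict the process
        print(f"pinned to {len(cores)} core(s):", player.cpu_affinity())
        # ...play the stress clip here and record max CPU% / dropped frames...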
Great! What's the file footprint advantage of H.264? 1/4? 1/6? 1/10 compared to MPEG-2? And if so, can't you store H.264 on an ol' DVD? I've read that HD DVD/BD have way more space to offer than the movie alone needs; for that reason HD DVD/BD will include games, extra endings, rating/film options, trailers...
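A quick size check on the "can't you store H.264 on an ol' DVD" idea (capacity math only; it assumes a 2-hour runtime and says nothing about players actually accepting such a disc):

    # Average bitrate an 8.5 GB dual-layer DVD allows for a 2-hour movie,
    # versus the ~18 Mbps average / ~41 Mbps peak cited for the X-Men 3 Blu-ray.
    runtime_s = 2 * 60 * 60
    dvd9_avg_mbps = 8.5e9 * 8 / runtime_s / 1e6
    print(f"DVD-9 over 2 h -> ~{dvd9_avg_mbps:.1f} Mbps average")   # ~9.4 Mbps

So HD H.264 would fit on a DVD only at far lower bitrates than the Blu-ray titles tested here use.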
Yeah, that is curious. Besides, if you're serious about HD movies, all the highest picture-quality films currently are encoded using VC-1. Sure, H.264 has the potential to be the best, but it hasn't been demonstrated yet. VC-1 also takes less grunt to decode, so the article could cater to many more users than just X6800 owners...
I was under the impression that for HD decoding with advanced video codecs, memory bandwidth was actually fairly important. However, I can't seem to find a link to support this (or the opposite).
In the second trial, the 8800 GTX posts 10.9% higher CPU utilization than the 8800 GTS (the top performing card in this trial). Is there any reason for this? The post itself makes no mention of this anomaly.
The maximum CPU utilization is a little less consistent than average CPU utilization. Such is one of the issues with using max CPU... these numbers are more for reference -- average should be used to determine the general performance of the hardware.
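For readers who want to reproduce the average/max methodology at home, the same numbers can be logged with any sampler. A minimal sketch (Python with psutil, sampling once a second while the clip plays):

    import psutil

    # Sample total CPU utilization for ~2 minutes during playback.
    samples = [psutil.cpu_percent(interval=1) for _ in range(120)]
    avg, peak = sum(samples) / len(samples), max(samples)
    print(f"average = {avg:.1f}%   max = {peak:.1f}%")
    # If the peak never reaches 100%, playback should not be dropping frames.

On Windows, perfmon or typeperf can capture the same counter without installing anything extra.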
charleski - Tuesday, December 19, 2006 - link
The only conclusion that can be taken from this article is that PowerDVD uses a very poor h.264 decoder. You got obsessed with comparing different bits of hardware and ignored the real weak link in the chain - the software.Pure software decoding of 1080-res h.264 can be done even on a PentiumD if you use a decent decoder such as CoreAVC or even just the one in ffdshow. You also ignored the fact that these different decoders definitely do differ in the quality of their output. PowerDVD's output is by far the worst to my eyes, the best being ffdshow closely followed by CoreAVC.
tronsr71 - Friday, December 15, 2006 - link
The article mentions that the amount of decoding offloaded by the GPU is directly tied into core clock speed (at least for Nvidia)... If this is true, why not throw in the 6600GT for comparison?? They usually come clocked at 500 mhz stock, but I am currently running mine at 580 with no modifications or extra case cooling.In my opinion, if you were primarily interested in Blu-Ray/HD-DVD watching on your computer or HTPC and gaming as a secondary pastime, the 6600GT would be a great inexpensive approach to supporting a less powerful CPU.
Derek, any chance we could see some benches of this GPU thrown into the mix?
balazs203 - Friday, December 15, 2006 - link
Could somebody tell me what the framerate is of the outgoing signal from the video card? I know that the Playstation 3 can only output a 60 fps signal, but some standalone players can output 24 fps.
valnar - Wednesday, December 13, 2006 - link
From the 50,000 foot view, it seems just about right, or "fair" in the eyes of a new consumer. HD-DVD and BluRay just came out. It requires a new set-top player for those discs. If you built a new computer TODAY, the parts are readily available to handle the processing needed for decoding. One cannot always expect their older PC to work with today's needs - yes, even a PC only a year old. All in all, it sounds about right.I fall into the category as most of the other posters. My PC can't do it. Build a new one (which I will do soon), and it will. Why all the complaining? I'm sure most of us need to get a new HDCP video card anyway.
plonk420 - Tuesday, December 12, 2006 - link
i can play a High Profile 1080p(25) AVC video on my X2-4600 at maybe 40-70 CPU max (70% being a peak, i think it averaged 50-60%) with CoreAVC...now the ONLY difference is my clip was sans audio and 13mbit (i was simulating a movie at a bitrate if you were to try to squeeze The Matrix onto a single layer HD DVD disc). i doubt 18mbit adds TOO much more computation...
plonk420 - Wednesday, December 13, 2006 - link
http://www.megaupload.com/?d=CLEBUGGH - give that a try ... high profile 1080p AVC, with all CPU-sapping options on except for B-[frame-]pyramid.
it DOES have CAVLC (IIRC), 3 B-frames, 3 Refs, 8x8 / 4x4 Transform
Spoelie - Friday, April 20, 2007 - link
CABAC is better and more cpu-sapping than CAVLC
Stereodude - Tuesday, December 12, 2006 - link
How come the results of this tests are so different from this PC Perspective review (http://www.pcper.com/article.php?aid=328&type=...)? I realize they tested HD-DVD, and this review is for Blu-Ray, but H.264 is H.264. Of note is that nVidia provided an E6300 and 7600GT to them to do the review with and it worked great (per the reviewer). Also very interesting is how the hardware acceleration dropped CPU usage from 100% down to 50% in their review on the worst case H.264 disc, but only reduced CPU usage by ~20% with a 7600GT in this review.
Lastly, why is nVidia recommending an E6300 (http://download.nvidia.com/downloads/pvzone/Checkl...) for H.264 blu-ray and HD-DVD playback with a 7600GT if it's completely inadequate as this review shows?
DerekWilson - Thursday, December 14, 2006 - link
HD-DVD movies even using H.264 are not as stressful. H.264 decode requirements depend on the bitrate at which video is encoded. Higher bitrates will be more stressful. Blu-ray disks have the potential for much higher bitrate movies because they currently support up to 50GB (high bitrate movies also require more space).
balazs203 - Wednesday, December 13, 2006 - link
Maybe the bitrate of their disk is not as high as the bitrate of that part of XMEN III.I would not say it completely inadequate. According to the Anandtech review the E6300 with the 8800GTX could remain under 100% CPU utilisation even under the highest bitrate point (the 8800GTX and the 7600GT had the same worst case CPU utilisation in the tests).
Stereodude - Wednesday, December 13, 2006 - link
Also, there's this post on AVSforum (http://www.avsforum.com/avs-vb/showthread.php?p=91...). The poster had no problems playing back Xmen-3 with a "P4 3.2Ghz HT system and a Radeon X1950Pro". Clearly a 3.2gHz HT P4 isn't nearly as powerful as any of those C2D processor nor was the X1950Pro as the various nVidia cards.
Stereodude - Wednesday, December 13, 2006 - link
Perhaps, but nVidia intentionally sent them a H.264 torture test disc that's not available in the US. That also doesn't explain why the 7600GT nearly cut the CPU usage in half for one review, but only helped 20% in the other.Also, nVidia says an E6330 or X2 4200+ with a 7600GT is adequate for the most demanding H.264 titles. That sure doesn't agree with the conclusion of this Anandtech piece, which says you need a 8800GTX card to use a E6300.
balazs203 - Wednesday, December 13, 2006 - link
In the PC Perspective article they say:"In our testing the H.264 bit rates were higher than the VC-1 rates, in the high 18-19 Mbps up to 22 Mbps in some cases."
That is about half the maximum bitrate of the Anadtech tested disc.
Stereodude - Wednesday, December 13, 2006 - link
Since when does bitrate = difficulty to decode?
DerekWilson - Thursday, December 14, 2006 - link
bitrate does equal difficulty to decode because it equals more to do per frame.
frogge - Tuesday, December 12, 2006 - link
64 bit OS vs 32 bit...
puffpio - Tuesday, December 12, 2006 - link
Will you start using more updated/modern encoding CPU tests for H.264 encoding? Currently you use Quicktime right? That doesn't use many of H264's advanced features.Have you considered using x264 (an open source encoder of H264 that generates the best quality encodes of publicly available H264 encoders) using a standard set of encoding parameters?
Nothing taxes a CPU better than video encoding :)
rain128 - Tuesday, December 12, 2006 - link
Im little bit sceptic about those test results. Becuse my Home computer on the subject line played Dejavu clip (downloaded from Apple website trailer 1 - 1080p) with CPU usage 40..60% and with current version of NVIDIA drivers. Wiht older drivers (dont know excact version, installed those over a year ago) average farame rate was between 50...70%.For a decoder used PowerDVD 7, installed trial and even when cyberlinks webpage says that H.264 codec doesnt work with trial version i had now problems with it. Gspot reported for a default rendering path Cyberlinks H.264 codec. For fulscreen capability used BSPlayer, strange was that Windows mediaplayer didnt want to play that trial eventhough all other players had no problem finding installed codecs.
TIP: with BSPlayer you can see droped frame rate count.
Renoir - Tuesday, December 12, 2006 - link
The h.264 clips on the apple website tend to have lower bit rates than those found on blu-ray discs so that explains your cpu usage.
DerekWilson - Tuesday, December 12, 2006 - link
this is what we have found as well, and is also why looking at BD and HDDVD performance is more important than when we've looked at downloaded clips in the past
Renoir - Tuesday, December 12, 2006 - link
Which is exactly the reason why I've been waiting so long for an article like this!
redpriest_ - Tuesday, December 12, 2006 - link
Have you guys tried this configuration out? I have 2 Geforce 8800 GTXs in SLI, and using either the 97.02 or 97.44 driver and a 30" Dell monitor with a resolution capability of 2560x1600, I found I cannot play Bluray content at anything higher than a desktop resolution of 1280x800 (exactly half that resolution because of the way the dual-DVI bandwidth is setup). This means I cannot even experience full 1080p!Try anything higher than that and Cyberlink BD complains and says, set your resolution to less than 1980x1080. This sucks. I hope there is a fix on the way.
redpriest_ - Tuesday, December 12, 2006 - link
I should add I found this on nvidia's website."The Dell 3007WFP and Hewlett Packard LP3065 30" LCD monitors require a graphics card with a dual-link DVI port to drive the ultra high native resolution of 2560x1600 which these monitors support. With the current family of NVIDIA Geforce 8 & 7 series HDCP capable GPU's, playback of HDCP content is limited to single-link DVI connection only. HDCP is disabled over a dual-link DVI connection. The highest resolution the Dell 30" 3007WFP supports in single-link DVI mode is 1280x800 and therefore this is the highest resolution which HDCP playback is supported in single-link DVI mode on current Geforce 8 &7 series HDCP capable GPU's. On other 3rd party displays with a native resolutions of 1920x1200 and below, the graphics card interfaces with the monitor over a single-link DVI connection. In this case, playback of content protected Blu-Ray and HD-DVD movies is possible on HDCP capable Geforce 8& 7 series GPU's."
Someone needs to tip nvidia and other graphics card manufacturers that this is unacceptable. If I shell out $4000 ($2000 monitor, $1400 for 2 8800GTXsli, and $600 blueray drive) IT SHOULD WORK.
DerekWilson - Tuesday, December 12, 2006 - link
agreed, but don't blame NVIDIA -- blame the MPAA ... HDCP was designed around single link dvi and hdmi connections and wasn't made to work with dual link in the first place. I wouldn't be suprised if the problem NVIDIA is having has absolutely nothing to do with their hardware's capability.in addition, dell's design is flawed -- they only support resolutions above 12x8 with dual link dvi. it may have taken a little extra hardware, but there is no reason that they should not support up to at least 1920x1080 over a single link.
ssiu - Wednesday, December 13, 2006 - link
I would blame the 30" monitors -- they should at least support 1080p in single-link DVI mode, just like the way 24" monitors do.
Renoir - Tuesday, December 12, 2006 - link
Wasn't blaming anyone in particular (although I'm always happy to bash the MPAA) just noting how stupid the situation is. Supporting a max of 12x8 over single link is inexcusable as far as I'm concerned.
DerekWilson - Thursday, December 14, 2006 - link
then the problem you have is specifically with Dell.
Renoir - Tuesday, December 12, 2006 - link
That is ridiculous! That's the problem with tech you can't take anything for granted these days. Things that seem obvious and sensible often turn out to be not as they seem. What a joke!
poisondeathray - Monday, December 11, 2006 - link
sorry if this has been answered already...is powerDVD multithreaded? is your CPU utilization balanced across both cores? what effect does a quadcore chip have on CPU utilization
thx in advance
Renoir - Tuesday, December 12, 2006 - link
poisondeathray you read my mind! I have exactly the same question.
DerekWilson - Tuesday, December 12, 2006 - link
CPU utilization would be 50% if a single core was maxed on perfmon --PowerDVD is multithreaded and 100% utilization represents both cores being pegged.
Renoir - Tuesday, December 12, 2006 - link
Any chance of doing a quick test on quad-core to see how many threads powerdvd can generate unless you know already? At the very least it can from what you've said evenly distribute the load across 2 threads which is good.
DerekWilson - Thursday, December 14, 2006 - link
We are looking into this as well, thanks for the feedback.
mi1stormilst - Monday, December 11, 2006 - link
Who the crap cares, stupid movies are freaking dumb gosh! I mean who the crap watches movies on their computers anyway...freaking dorks. LOL!
Sunrise089 - Monday, December 11, 2006 - link
This isn't a GPU review that tests a new game that we know will be GPU limited. This is a review of a technology that relies on the CPU. Furthermore, this is a tech that obviously pushes CPUs to their limit, so the legions of people without Core2Duo based CPUs would probably love to know whether or not their hardware is up to the task of decoding these files. I know any AMD product is slower than the top Conroes, but since the hardware GPU acceleration obviously doesn't directly coorespond to GPU performance, is it possible that AMD chips may decode Blue Ray at acceptable speeds? I don't know, but it would have been nice to learn that from this review.
abhaxus - Monday, December 11, 2006 - link
I agree completely... I have an X2 3800+ clocked at 2500mhz that is not about to get retired for a Core 2 Duo.Why are there no AMD numbers? Considering the chip was far and away the fastest available for several years, you would think that they would include the CPU numbers for AMD considering most of us with fast AMD chips only require a new GPU for current games/video. I've been waiting AGES for this type of review to decide what video card to upgrade to, and anandtech finally runs it, and I still can't be sure. I'm left to assume that my X2 @ 2.5ghz is approximately equivalent to an E6400.
DerekWilson - Tuesday, December 12, 2006 - link
Our purpose with this article was to focus on graphics hardware performance specifically.
Sunrise089 - Tuesday, December 12, 2006 - link
Frankly Derrick, that's absurd.If you only wanted to focus on the GPUs, then why test different CPUs? If you wanted to find out info about GPUs, why not look into the incredibly inconsistant performance, centerned around the low corelation between GPU performance in games versus movie acceleration? Finally, why not CHANGE the focus of the review when it became apparent that which GPU one owned was far less important then what CPU you were using?
Was it that hard to throw in a single X2 product rather than leave the article incomplete?
smitty3268 - Tuesday, December 12, 2006 - link
With all due respect, if that is the case then why did you even use different cpus? You should have kept that variable the same throughout the article by sticking with the 6800. Instead, what I read seemed to be 50% about GPU's and 50% about Core 2 Duo's.I'd really appreciate an update with AMD numbers, even if you only give 1 that would at least give me a reference point. Thanks.
DerekWilson - Thursday, December 14, 2006 - link
We will be looking at CPU performance in other articles.The information on CPU used was to justify our choice of CPU in order to best demonstrate the impact of GPU acceleration.
Chucko - Monday, December 11, 2006 - link
So is the output resolution via HDCP DVI really not able to output at high res? I heard a few rumors about when HDCP was enabled the output resolution of these cards was less than 1080p. Is there any true to this?
harijan - Monday, December 11, 2006 - link
I guess that no integrated graphic solutions will be able to decode them without dropping frames?
Tujan - Monday, December 11, 2006 - link
" Intel Core 2 Duo E6600 can play X-Men: The Last Stand without dropping frames. ""Thus we chose the Core 2 Duo X6800 for our tests."
"we haven't found a feature in PowerDVD or another utility that will allow us to count dropped frames"
"The second test we ran explores different CPUs performance with X-Men 3 decoding. We used NVIDIA's 8800 GTX and ATI's X1950 XTX in order to determine a best and worse case scenario for each processor. The following data isn't based on average CPU utilization, but on maximum CPU utilization. This will give us an indication of whether or not any frames have been dropped. If CPU utilization never hits 100%, we should always have smooth video. The analog to max CPU utilization in game testing is minimum framerate: both tell us the worst case scenario"
"All video cards that have an HDMI connection on them should support HDCP, but the story is different with DVI. Only recently have manufacturers started including the encryption keys required for HDCP. Licensing these keys costs hardware makers money, and the inclusion of HDCP functionality hasn't been seen as a good investment until recently (as Blu-ray and HDDVD players are finally available for the PC). While NVIDIA and ATI are both saying that most (if not all) of the cards available based on products released within the last few months will include the required hardware support, the final decision is still in the hands of the graphics card maker. "
"It is important to make it clear that HDCP graphics cards are only required to watch protected HD content over a digital connection. Until movie studios decide to enable the ICT (Image Constraint Token), HD movies will be watchable at full resolution over an analog connection. While analog video will work for many current users, it won't be a long term solution."
This here:
"Fortunately, it seems that studios are making the sacrifices they need to make in order to bring a better experience to the end user."<-lol
..so tell me. What kind of processor and video graphics processor does a blue-ray 'box have ?
Somehow I dont buy the analog,as a premise to the test results...yet still you are probably correct in its detail.
Seems the person on the computer made all the sacrifices here.Those are thousand dollar rigs.! 'With a Blue-ray player there ?
I would be happy with 30 gigs of MPEG2 in this matter.Maybe use a fresnel lense or something for the Display.Or put a opaque projector up in some way.
Desktop Resolution:1920x1080 - 32-bit @ 60Hz
Doesn't your lab got enough money for a High resolution HDTV wDVI connectors ?
Im not understanding all the fat in the fire here.Studios.
__________________
Saw your last comment there Mr.Wilson. Noticed that some .avis have higher data rate than others. Why should 'I pay (in dollars,and technology) for that kind of pony ?
DerekWilson - Tuesday, December 12, 2006 - link
You misquoted me -- **Not even an** Intel Core 2 Duo E6600 can play X-Men: The Last Stand without dropping frames.
Tujan - Thursday, December 14, 2006 - link
Thanks for reply.Good to see you are on top of things. Yeah I simply didn't include the first portion of the sentence in quoting you there.The article has so many baselines in it. For now I guess that Xmen BR disk is the trait holder of wich did a performance consideration criteria then. Venturing to say then,that all BR are not created alike. That is other BR disks will not have the same characteristics as 'would have necesitated a 6800 CPU.
That is something of a rig that costs 2 grand.To include as well the BR player itself. Now a BR shelf box,it does not have a X6800 CPU,nor Nvidia,or ATI high-end graphics chips.(speaking comparately). And a Monitor capable of doing the resolution will certainly do so using only that set of components.
So I think that perhaps it would be question enough to say that BR 'penalizes'a computer using it as an accessory.
I dont know if this is true. Still if it was necesary to have BR accesories as you had listed them,BR would have had to have had them listed in the BR patent itself.BR is a fully capable technology w/o the computer.
So frankly the penalty here 'must be the DRM involved.Since BR does that on-the-fly incryption.'I'll just speculate.
Look at the power consumption there ! True the Sony Notebook I showed does only 1080i(check it if Im wrong). But the graphics there will run on a freekin battery! AND the notebook (with Sony liscencing power no doubt)can be hooked up to the family television setup - maybe in HD.
Lets face it though.I dont see the reason so timid to conduct comparisions to HDTV sets.? A Dell 30" monitor ,or something such as this ? Run the comparisons with the HDMI out to them from these bottle cork (the notebook I showed) technologies.
As in the same light,if an HDTV can display 1080P with an OTA,you've got to suppose that the 'bus'in wich is being conductive to it may or may have nothing or something in common to what the computer is doing.True you may laugh as there is nothing such as an HD OTA 1080P,and I dont know if there is.
Yet HDMI,HDCP etc thats a really fancy chance I either spend performance per dollar where It does me some good,or waste away to consider whom and of what technology can be participated to no avail.
If the whole industry is waiting on a freaking HDMI cable.WTF is wrong with you people.
I get a light on the players you'll hear some more.Why the timidity to put the archetecture on the block for testing !!!.Computers and more. And I dont give a farts rott for what RIAA,or MPAA sais. They dont have that life to lead.
Renoir - Thursday, December 14, 2006 - link
Tujan, I don't mean to appear rude but I find your posts very hard to understand and from one of the posts above I'm not the only one. Am I right in thinking english isn't your first language? If so perhaps you could make your posts a bit simpler and more concise. Again I don't mean to cause offence as I think you may have some interesting points to make I just can't quite understand them.Tujan - Thursday, December 14, 2006 - link
There is several topics covered in each paragraph.Put a [ ] at end of each paragraph.The article said a lot more than just conduction to the BR. For the most part I am darn tee'd off because you just dont take and burn 700 watts to run a BR. Now I dont know what the specs.are for the on-the-shelf BR players. But as I explained they do not have any components of the computer,and they sure dont take up that much power to run using them. A screen simply does its stuff and that is it.The screen should be capable of doing exactly what the BR disk was mentioned to be in HD WITHOUT A GRAPHICS CARD !!!.
Now I dont know what the parameters for this descretion is. I know that Walmart has a 4$ HDMI cable. And that HDCP Graphics cards do not use HDMI.
Your last post consisted of differences between running Quad Core with the BR. Well you can do Dual-server Quad-Core with the same graphics but if the connection,and testing cannot be done with HD screens,there is not much to have been said about them.
Especially the detail in the power utilization with the CPUs.So where's the descreprency ?
I am happy with an HDMI and Battery power on a notebook attached to a HDTV screen. Just how happy remains to be accessed. By both the graphics card vendors,and the authors of these not so confounding computer articles.
P.S.If I could have edited out the last paragraph from the last post I would have done so.It does not lack its substance though,since there must be a penalty to run BR on the computer.BR does not need this type of technology to conduct a standard session- as could be seen if the testing would be done.Then we could reason the problem.
Tujan - Thursday, December 14, 2006 - link
BTW. I cannot figure out exactly the reason the tested BR player was not listed in the test setup.The brand name,and type. And wether it was using a Sata connection (wich it probably was). It should not be long before anyone should be able to conduct data transfer tests between different computer based players.I dont know if I can wait or not. Since still we are dealing with certain media.And criteria specific to it. As well as the performance of them. So without that,a computer capable BR player should be the least of considerations.
Plextors got a blue ray player. Yeah think so. The specs. of the drive should have been announced.
See I just dont get the confoundedness of the HDCP and kissy feel good about the light hollywood puts out.For me,i have other considerations beyond them for the space,and the conduction of the technology.
happy holidays.
johnsonx - Monday, December 11, 2006 - link
I had a lot less trouble following the article than I did that post. What?Tujan - Monday, December 11, 2006 - link
Has anybody done a comparison of the bandwidth difference between HDMI,and DVI ? Ive only seen a lowly 1600 (Saphire-ATI) with HDMI. Although behind the scenes there are others. Had no idea of HD-DVD,or Blue-Ray were such 'system eating bandwidth hogs.Either way.Its a knuckle sandwich for the studios.
Hope everybody runs out an grabs all the DVDs that can fit in a shopping cart.
So I guess the 'hi-def resolutions is a 'studio spec. too ?
Tujan - Monday, December 11, 2006 - link
So heres a Sony notebook. It probably uses less than 40 or 50 watts. Has an HDMI connector on it. And runs on a battery. No less.
http://www.learningcenter.sony.us/assets/itpd/note...
So what is my question here. This is a Centrino Core Duo for a notebook. With graphics enough to run using only battery power .
As well the notebook has a Blue-Ray drive wich can be written to.AND watch blue-ray titles.
Is this mostly in the liscencing ? How can it be when the processor used,and graphics cards used are such absolute 'top notch'for the desktop. And the notebook puts the works of them to shame.
Blue-ray,and HDMI on battery power.
This was one of AnandTechs Adds.Incodently - Hi Anandtech(Add-Click),HI Sony.
cmdrdredd - Monday, December 11, 2006 - link
I too wonder how a laptop can play blue-ray fine but a $400+ video card with a CPU probably 2x+ more powerful and more memory...can't.fanbanlo - Monday, December 11, 2006 - link
most efficient software decoder! Maybe we don't need Core 2 Duo after all! http://www.coreavc.com/
DerekWilson - Monday, December 11, 2006 - link
my understanding is that coreavc doesn't work in conjunction with HDDVD/BD -- that it doesn't support AACS.totalcommand - Monday, December 11, 2006 - link
BluRay support will be added to CoreAVC soon.KashGarinn - Tuesday, December 12, 2006 - link
When CoreAVC supports HD-DVD and Blu-ray H.264, I'd be very interested in seeing this article updated with the comparison.
Regarding the article itself, I thought it wasn't up to normal AnandTech standards: skimping on the H.264 details that make it better, and giving the reason as "but these are a little beyond the scope of this article." What is AnandTech coming to? That's like saying "we're going to compare graphics cards with DirectX 9 capabilities, but explaining what DirectX is, is a little beyond the scope of this article."
Also, not comparing AMD CPUs? What's up with that?
And I find it odd that you didn't comment on the strangeness that NVIDIA has better acceleration across the board than the ATI cards, especially as the ATI cards have better shader throughput, so they are most likely hampered by software rather than hardware. So the statement "ATI hardware is very consistent, but just doesn't improve performance as much as NVIDIA hardware" only paints an incorrect picture.
I would give this article a 2 out of 5: 1 for at least covering the basics (H.264 is a better codec than MPEG-2) and 1 for showing that ATI needs to improve its decoder, even though you don't point it out.
K.
ninjit - Monday, December 11, 2006 - link
I had a question about why you chose the Golden Gate Bridge scene to stress-test the decoding capabilities of the various setups. You said that you chose that point in the movie because it had the highest bitrate (41Mbps), indicating a more complex scene.
To me, though, that would indicate LESS compression being done by H.264, and consequently LESS decoding work needed for playback of that particular scene.
I justify that by reasoning that with a very complex scene the codec cannot compress the stream as much without introducing too many artifacts, so the compression ratio is dropped and the data rate increased to compensate for that particular section in time.
Is my reasoning correct? If not, can someone explain to me why?
I don't think choice of scene should change the graphs in terms of relative performance between setups, but it would affect absolute numbers - an easy way to check whether my thinking is wrong or not is to see if there are more dropped frames in the Golden Gate scene on the software-decoded E6600 vs. other less busy scenes.
DerekWilson - Monday, December 11, 2006 - link
We tried to explain this a little bit, so I'm sorry if we didn't get it across well enough. I'm not an expert on H.264 by any means, but I can talk about other types of decoding as they relate to still images.
The issue isn't really less compression -- when using H.264, we are always using H.264 complexity to encode the bitstream. We don't fall back to just saving raw pixel data if a scene is overly complex -- we encode more detailed information about the scene.
For instance, with still images, run-length encoding can achieve huge compression, especially in images with large blocks of identical colors (like logos or images on a solid background color). Basically, the idea is to list a color and then the number of pixels that use that color. For an image that is a single solid color, you could list the color and then the number of pixels in the image. This is a very small file with little processing requirement that represents a full image. If, on the other hand, we have a checkerboard pattern with every other pixel being a different color, we have to list the color of every pixel, BUT we also have to process every color to make sure of how many consecutive pixels it represents (even if it only represents one). Thus, we end up doing more processing than we would on a smaller (lower "bitrate") file.
This example is very contrived, as sophisticated run-length encoding can handle more complex patterns, but it serves to illustrate the point: when using a specific type of encoding, higher bitrates can (and usually do) mean more complexity and more processing.
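To make that concrete, here is a tiny, made-up sketch of that naive run-length scheme (an illustration in Python, not anything a real decoder uses):

```python
# Naive run-length encoding, as described above (illustrative only).
def rle_encode(pixels):
    """Collapse consecutive identical pixels into [color, count] runs."""
    runs = []
    for color in pixels:
        if runs and runs[-1][0] == color:
            runs[-1][1] += 1            # extend the current run
        else:
            runs.append([color, 1])     # start a new run
    return runs

solid = ["red"] * 16                    # one big block of a single color
checker = ["black", "white"] * 8        # worst case: no two neighbors match

print(len(rle_encode(solid)))           # 1  run  -> tiny "file", little work
print(len(rle_encode(checker)))         # 16 runs -> more data AND more work
```

The checkerboard produces both the bigger "file" and the most work in the loop, which is the same relationship between bitrate and decode effort described here.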
As we mentioned, using no encoding requires zero processing. MPEG-2 can compress the data to lower the bitrate while increasing computational complexity. But higher-bitrate MPEG-2 means more data to process per frame -- which means more CPU overhead for higher bitrates under MPEG-2. The same is true with H.264 -- bitrates are generally lower than with MPEG-2 and require more processing power, but as H.264-encoded movies use more bitrate (more data per frame), more processing is required.
I hope this helps.
Also, to clarify -- the spot in the video that reaches 41Mbps corresponds to the highest CPU utilization (we can see this on the perfmon timeline).
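For anyone who wants to build a similar timeline at home, here is a rough sketch (not the harness used for the article; it assumes the third-party psutil package): sample overall CPU load once per second while the movie plays, then look at both the average and the peak.

```python
# Sample system-wide CPU utilization once per second during playback
# and report the average and the peak (rough sketch, assumes psutil).
import psutil

samples = []
for second in range(120):                       # watch a two-minute stretch
    load = psutil.cpu_percent(interval=1.0)     # blocks ~1s, returns percent
    samples.append(load)
    print(f"{second:4d}s  {load:5.1f}%")

print(f"average: {sum(samples) / len(samples):.1f}%")
print(f"max:     {max(samples):.1f}%")
```

The peak comes from a single one-second sample, which is part of why max numbers bounce around more than averages.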
ninjit - Monday, December 11, 2006 - link
Thanks for the explanation Derek. That was very helpful.
jeffbui - Monday, December 11, 2006 - link
The PS3 is able to play Blu-ray back at 50% over normal speed without dropping frames. That gives an idea of how much power these consoles are capable of. Some interesting tidbits from a translation of an article interviewing PS3 developers:
- H.264 decoding itself was not very difficult for Cell with moderate optimization, and they could play a movie in real time on the first try, unlike the very difficult SACD optimization. However, because they began development without knowing the final Blu-ray standard, they set the goal very high: decoding two full-HD H.264 streams at 40Mbps simultaneously. Besides, the clock speed of the devkit was lower than the final product, which made development difficult. The current decoder can decode full-HD H.264 with 3 SPEs.
- An SCE developer recommends trying 1.5x fast-forward playback in the PS3 BD player to see the power of Cell. When it's connected to a display via 1080/60p, playback stays very smooth, as Cell has enough margin for video decoding. In 1.5x fast-forward playback it decodes all frames, then fits them into 60fps with sped-up audio.
DerekWilson - Monday, December 11, 2006 - link
Cool -- we'll have to investigate this.
liquidaim - Monday, December 11, 2006 - link
Did you use the 3D clocks for the ATI cards or the normal 2D clocks? Just wondering if that was taken into account for the MPEG-2 tests previously and not here, which might be why the ATI cards didn't perform as well.
Not a fanboy, just asking for clarification.
DerekWilson - Monday, December 11, 2006 - link
I don't believe you can specify what clock ATI uses when decoding video -- I think this is handled internally. It may be that the hardware that helps accelerate MPEG-2 the most is tied to clock speed, while the majority of what benefits H.264 is not. We'll have to dig further to really know.
pata2001 - Monday, December 11, 2006 - link
It was the same thing when MPEG-2 came out. Heck, even in the old days of 386s, PCs were too slow to decode MPEG-1 VCDs, to the point that we had separate MPEG-1 decoder cards. Remember when DVD came out, there was a big push for GPU-accelerated hardware iDCT. Today, most CPUs are powerful enough to decode MPEG-2 on their own. The same thing is happening again with MPEG-4. By the time 4-core/8-core CPUs become mainstream, we won't be hearing about the need for GPU acceleration as much anymore. And by that time, there will probably be the next next-gen HD format that is too demanding for the CPUs of that time. Cycle and repeat.
DerekWilson - Monday, December 11, 2006 - link
MPEG-4 contains many advanced features not currently in use. We first saw MPEG-4 Part 2 in the form of DivX, but MPEG-4 Part 10 takes quite a bit more work. Some of the profiles and levels of H.264/AVC will be too much for quad-core CPUs to handle. These may not be adopted by studios for use on physical media, but the codec itself is very forward-looking. But in the end, you are correct -- the entire MPEG-4 spec will be a simple matter in a handful of years.
This is the case with everything though. Even if something will one day pose no trouble to computers, we can't ignore current performance. Studios must balance current performance with the flexibility to support the type of image quality they will want near the end of the life cycle of the BD and HDDVD formats.
I always look forward to this kind of thing, and it's why I test hardware -- I want to know what my PC can currently do with what is out there.
I suppose the "news" is that we've got something everyone wouldn't mind having that very few will be able to use for the time being.
Staples - Monday, December 11, 2006 - link
It's good news that MPEG-2 won't become the standard for BD. Until today, I figured all movies were in MPEG-2, and if that became standard and won the format war, we would be stuck with what could arguably give a worse picture than HD-DVD using VC-1. How do you know which movies are 50GB and/or H.264? Does it usually say on the box, or does the player tell you?
DerekWilson - Monday, December 11, 2006 - link
In our experience with Blu-ray, the format is listed on the box. HD-DVDs have been a little more cryptic, and we have had to ask for help determining the format. For our X-Men BD, the back of the case stated AVC @ 18 Mbps.
I don't think disk size has been listed on the case, and we've had to ask for this info from industry sources.
CrystalBay - Monday, December 11, 2006 - link
Are AMD X2s unable to work efficiently in these scenarios?
DerekWilson - Monday, December 11, 2006 - link
AMD CPUs will very likely perform worse than Core 2 Duo CPUs. We are considering doing a CPU comparison.
Xajel - Monday, December 11, 2006 - link
It's logical that they'd be worse, but most users are running these processors and they really want to know if their rigs can handle it. It's not only about AMD; there are plenty of Pentium 4 and Pentium D chips in these rigs, and even the Athlon XP still rocks in some.
What about a core scaling test? I mean:
1- Single Core
2- Single Core with Hyper Threading
3- Two Cores
4- Two Cores with Hyper Threading
5- Four Cores
It will be hard to do this scaling test since they are not all from one architecture (1 to 4 would be NetBurst with the Pentium 4, Pentium D, and Pentium EE, while the last is the Core architecture), but a rough single-machine approximation is sketched below.
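One crude way to approximate part of that test on a single machine (it cannot emulate a different architecture, only fewer cores) is to pin the player process to a subset of logical CPUs. This sketch assumes the third-party psutil package, and the process name is only a placeholder:

```python
# Pin a hypothetical player process to a limited set of logical CPUs
# so playback can be retested with 1, 2, ... cores (sketch only).
import psutil

PLAYER = "PowerDVD.exe"     # placeholder name; use whatever player you test
ALLOWED = [0]               # e.g. [0] for one core, [0, 1] for two, ...

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == PLAYER:
        proc.cpu_affinity(ALLOWED)              # restrict scheduling
        print(f"pinned PID {proc.pid} to CPUs {ALLOWED}")
```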
Xajel - Monday, December 11, 2006 - link
I don't know why Anand these days does not care about AMD; I just hope they don't think that everybody in the world has a Core 2... I'm not an AMD fan, but the benefit of this kind of article is to see how much power you need to handle these scenarios, and I'd guess the majority of people today still have older CPUs.
These tests must, in my opinion, cover a wider range of CPUs: Pentium 4 (with and without HT), Pentium D, Athlon 64, Athlon 64 X2, even the Quad FX platform. This would help readers know whether their systems can handle these things or not.
michael2k - Monday, December 11, 2006 - link
I would hazard that most AMD CPUs won't be fine, given that most Intel CPUs aren't fine and the E6600 outclasses all AMD CPUs... But I was just looking at the AMD/C2D comparison from July, and the newest AMD CPUs may do fine.
mino - Monday, December 11, 2006 - link
Same here. What about Quad FX (under Vista)?
At $500, the FX-70 is on a level with the E6700...
AlexWade - Monday, December 11, 2006 - link
The 360 HD DVD add-on works on a PC; why wasn't that tested too?
DerekWilson - Monday, December 11, 2006 - link
We are going to do a follow-up using the 360 HD DVD drive (actually, I'm working on it right now).
ShizNet - Tuesday, December 12, 2006 - link
Great! What's the file footprint advantage of H.264? 1/4? 1/6? 1/10 compared to MPEG-2? And if so, can't you store H.264 on an ol' DVD? I've read HD DVD/BD has way more space to offer than the movie alone needs; for that reason HD DVD/BD will include games, extra endings, rating/film options, trailers... Great write-up as usual.
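As a rough back-of-the-envelope answer (my own numbers, not from the article, using the 18 Mbps AVC rate quoted for the X-Men disc earlier in the thread):

```python
# Approximate file size of a 2-hour movie at 18 Mbps (illustrative only).
bitrate_bits_per_s = 18_000_000
runtime_s = 2 * 60 * 60

size_gb = bitrate_bits_per_s * runtime_s / 8 / 1e9
print(f"{size_gb:.1f} GB")   # ~16.2 GB -- roughly twice a dual-layer DVD (8.5 GB)
```

So even a fairly modest H.264 bitrate for 1080p overflows a dual-layer DVD once the movie runs past a couple of hours, which is why the extra capacity of HD DVD/BD matters even before any bonus content.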
artifex - Tuesday, December 12, 2006 - link
I would love to see that article include visual comparisons with a 360 running the HD-DVD adapter. If I buy the adapter, I may be using it on both.
therealnickdanger - Monday, December 11, 2006 - link
Yeah, that is curious. Besides, if you're serious about HD movies, all the highest picture-quality films currently are encoded using VC-1. Sure, H.264 has the potential to be the best, but that hasn't been demonstrated yet. VC-1 also takes less grunt to decode, so the article could cater to many more users than just X6800 owners... just a thought.
Orbs - Monday, December 11, 2006 - link
I'd love to see that tested and compared.
Eug - Monday, December 11, 2006 - link
If an E6400 2.13 GHz is OK, is a T7400 2.16 GHz also OK? The T7400 is faster, but it has a slower memory bus.
ss284 - Monday, December 11, 2006 - link
It should be, since memory bandwidth usually isn't a big factor in decode performance.
Eug - Monday, December 11, 2006 - link
Do you have a link for that? I was under the impression that for HD decoding with advanced video codecs, memory bandwidth was actually fairly important. However, I can't seem to find a link to support this (or to support the opposite).
Aikouka - Monday, December 11, 2006 - link
In the 2nd trial, the 8800 GTX posts 10.9% higher CPU utilization than the 8800 GTS (the top-performing card in this trial). Is there any reason for this? The article itself makes no mention of this anomaly.
Orbs - Monday, December 11, 2006 - link
I noticed this too. The GTS was better than the GTX and this was not explained.
DerekWilson - Monday, December 11, 2006 - link
Maximum CPU utilization is a little less consistent than average CPU utilization. That is one of the issues with using max CPU... these numbers are more for reference -- the average should be used to judge the general performance of the hardware.
bluh264 - Saturday, December 3, 2011 - link
Just found a site about Blu-ray to H.264: http://www.bluraytoh264.com/