50 Comments

  • drayzen - Wednesday, July 11, 2007 - link

    Reading the article, it occurred to me that basically what Nvidia is trying to achieve is to simplify the process for the end user.

    I thought you could have an ordered list of driver features that would be progressively turned off based upon a target FPS rating. For most people, when all is said and done, what they want is reasonable FPS with as much eye candy as possible.

    It would basically function as a dynamic fallback system, similar to the way modern CPUs adjust their settings based upon load, only here we are talking about rendering features rather than speed.

    It would be possible to have a basic and advanced version.

    The Basic version might look like:
    1> MipMap Detail Level
    2> Anisotropic Filtering
    3> Anti-Aliasing *Possibly with: (Method 'a'/'b')
    4> Vertical Sync

    While an Advanced version would be like:
    1> MipMap Low
    2> MipMap Medium
    3> Anisotropic Filtering 2X
    4> MipMap High
    5> Anisotropic Filtering 4X
    6> Anti-Aliasing 2X Method 'a'
    7> Anisotropic Filtering 8X
    8> Anti-Aliasing 2X Method 'b'
    9> .....

    - For both Basic and Advanced, the user is able to change the order of the items in the list to keep the features they deem most important.
    - In the Basic version the driver would automatically fallback through the levels of a feature. e.g. for Anti-Aliasing: 4x('a') -> 4x('b') -> 2x('a') -> 2x('b')
    - In the Advanced settings the user is specifying the order of each individual setting so no fallback is required.

    The ability to log events would enable users to analyse their results and better tailor their settings.

    As far as notifications go, I guess you could have a tick box on each row as to whether a notification will appear for a status change on that feature.

    I'm not sure how well this would work as a system with current APIs; maybe with DX10, where there are no more optional capabilities (correct?), it would be better, as some of the changes are not high-speed dynamic changes and seem to require a restart of the video system.
    With some changes to programming I'm sure it could be achieved.
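
    A minimal sketch of how such a fallback loop might behave, assuming a hypothetical measure_fps callback and invented feature names (not a real driver interface):

```python
# Toy sketch of the ordered-fallback idea: walk a user-ranked list of
# (feature, level) steps and drop them from the bottom until measured
# FPS meets the target. measure_fps is a hypothetical callback.
fallback_order = [              # user-ordered, most important first
    ("mipmap_detail", "high"),
    ("anisotropic", 8),
    ("antialiasing", 4),
    ("vsync", True),
]

def settle(target_fps, measure_fps):
    active = list(fallback_order)
    while active and measure_fps(dict(active)) < target_fps:
        feature, level = active.pop()   # drop least-important step first
        print(f"fallback: disabled {feature}={level}")
    return dict(active)

# e.g. with a fake cost model where each enabled step costs 15 FPS:
print(settle(60, lambda s: 100 - 15 * len(s)))
```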
  • instant - Wednesday, April 4, 2007 - link

    Coolbits plus the normal display properties are all you need.

    Hopefully Nvidia will let those of us who hate their new display properties manager use the old one...

  • Gherault - Monday, March 26, 2007 - link

    Why not initialize a profile dialog box when a program tries to use DirectX/OpenGL? You could use checkboxes for every time, first time, etc., located in the driver settings, probably with first time as the default.

    I hate opening the Nvidia profiles, but if one automatically popped up and integrated the warning boxes and the list of programs known not to function with MSAA (better if integrated directly into a profile), then I would love it. Manually wading through the profiles as-is stinks, therefore I do not use them AT ALL.

    If you wanted to experiment with performance by tweaking profiles, you could set the checkbox to open the profile every time a program tried to use DirectX/OpenGL and play with the settings each time until you found what you wanted, then "lock" the profile with a checkbox or whatnot so it would not re-open automatically.

    Also... please don't force me to use the bloated interface that came about in the 90/100 series of drivers. I like the old control panel, and being able to see my GPU temp without having to load nTune (which is incompatible with my PC and causes bluescreens constantly).
  • DasFox - Monday, March 26, 2007 - link

    I don't see why this seems so complicated.

    I personally leave all Nvidia driver settings on default and use nHancer to enable game profiles. When AnandTech said:

    "Profiles don't seem like the right fit"

    I guess they haven't used nHancer, because gaming life with it can't get any easier.

  • mindless1 - Wednesday, March 21, 2007 - link

    I think I speak for a lot of users when I say I do not want the popup box solution mentioned in the article; many people are incredibly tired of popup messages. This is not a fault of nVidia; other companies have already worn out the popup box.
  • neofit26 - Tuesday, March 20, 2007 - link

    I usually change the AA setting from the driver's panel, since very few games have it in their interface. Now with nHancer it's even easier. I wish NVidia would copy that interface into their drivers.

    And, unlike what a previous poster said, nowadays more and more games do not allow AA to be used or even forced in the driver's panel. I am talking about Gothic 3, Vanguard, Silent Hunter 4, and apparently Stalker. And they don't give anything in exchange. Why would I choose a "David Thompson erotic photography" blurry bloom-like effect in a combat game instead of non-jaggy lines?

    Anyway, for me the choice is simple: if forums say that the game has no AA and it cannot be forced through the driver panel, I don't buy it.
  • AnnonymousCoward - Sunday, March 18, 2007 - link

    Obviously one of the concerns here is maintaining good support for older games: "...driver AA settings are a necessity for enabling the functionality on older games or current games..."

    Currently the 4:3 scaling option for widescreen monitors is broken for the GeForce 8 on WinXP. In the Nvidia control panel the option is called "Change flat panel scaling", and it simply doesn't work. Here's a forum post of others with the problem: http://forums.nvidia.com/lofiversion/index.php?t23...

    What is more important for image quality in games: AA settings, or not stretching 4:3 images to 16:10? Clearly the latter. Fix the basics first.

    Additionally, Nvidia's nTune program is also broken, since it forgets your settings every time you reboot.
  • chizow - Monday, March 19, 2007 - link

    The latest 100+ series beta drivers fix 1:1 scaling along with a host of other problems in XP. They're beta, but a helluva lot better than 97.92 WHQL, which is over two months old now. Pretty sure 100.65 WHQL fixed 1:1 in Vista, but not totally sure since I haven't made the move yet.
  • AnnonymousCoward - Monday, March 19, 2007 - link

    Oh, one other thing. You said 100-series drivers fixed 1:1 scaling in XP. Do they have 100-series for XP? http://www.nzone.com/object/nzone_downloads_rel70b...
  • AnnonymousCoward - Monday, March 19, 2007 - link

    Oh really, 100.65 fixes the scaling? Poster "thredge" on my link said "Was hoping the 100.65 newest driver I just got would solve it, but was again dissapointed." I haven't tried it since I assumed he was right.
  • fyord - Sunday, March 18, 2007 - link

    The issue has a larger perspective than what AnandTech and Nvidia have stated. Having game developers, GPU developers, and other parties second-guessing each other is a recipe for disaster. They simply need a standard, just like we have the 802.11 standards for wireless. To me, it seems that you would want the control to be in the operating system, since the scenario is application dependent.

    I expect more software providers will begin to use GPU capabilities; after all, that is what the Aero interface Microsoft created for Vista is all about. I can easily see applications beyond video games, such as Google Earth, Apple iTunes, Adobe Premiere, and GPS navigation software, taking advantage of it.

    Video is the next killer application, and interactivity is what will help drive it. It will enable new gaming features. How about a first-person shooter where you put down a sensor to monitor a given area of the map and it shows up as a picture-in-picture display, just like on TV? Video has been shaping what companies do. For example, just look at what Apple is doing with the iPod, and the fact that Nokia is releasing a cell phone that is also a mini camcorder. It's why the media companies place a high value on things like YouTube in the stock market.

    The PC in many ways is the centralized server for the consumer, as well as a video/gaming system, a typewriter, and a web browser. Companies talk about digital convergence, but they have not fully achieved it because they have not collaborated on software and firmware interactivity standards; they were simply not aware of the need. However, as a designer and a consumer, I applaud companies such as Nvidia and AMD that are coming to realize this potential and actively look for a solution.
  • Trisped - Monday, March 19, 2007 - link

    My thoughts exactly. Get one system where information can be pulled from the drivers as to what options are available and pass it to the game's user interface.
  • Jodiuh - Friday, March 16, 2007 - link

    If "enhance the application" should be used w/ HL2, why doesn't super sampling work unless "override the application" is selected? Or GTR2's 5 levels of AA when using an 8800GTX which has 6? Things like this force us all to either experiment, ie waste more time, or give up entirely.

    Placing the burden on devs seems like a great way to push them over to the consoles. IMO, the driver should be able to tell whether override, enhance, or app preference needs to be applied. All the user should have to do is pick a level, save it to a profile, and forget about it. Unfortunately it doesn't end w/ AA alone; we have:

    Anisotropic Filtering
    Texture and AS Quality
    Transparency Super/Multi sampling
    Gamma AA
    LOD Bias
    Vsync

    to worry about. It can become truly frustrating at times. Then combine all that w/ the piss-poor, oversized, scrolling, scattered NV/ATI control panels for more good times. Why is it so tough to have all the 3D settings viewable on one page? No scrolling, no moving cars or NV cubes, just a description of what each setting does, w/ its benefits and possible disadvantages.

    And yes, it sure would be nice to have the option of changing settings while in-game w/ a nice transparent GUI overlay that's SMALL and at the BOTTOM of the screen... kind of like a good TV menu.
  • bojaka - Saturday, March 17, 2007 - link

    Surfing via my cell, I haven't read all the posts. Sorry if I'm re-posting... WWW.NHANCER.COM - that's all you GeForce guys need!
  • Fant - Friday, March 16, 2007 - link

    I think there should just be a graphics settings API, implemented by games, that hooks into the control panel AA. Whenever you go to an in-game graphics settings screen, up pops the control panel AA that everyone is used to seeing. The game doesn't need to know anything, and everything is maintained in one place. As graphics drivers develop new features, they would show up in-game because the game would just be overlaying the actual control panel graphics settings.

    Whenever a game does this, it can register with the graphics driver so that a "profile" is automatically created for that game and the settings are remembered. The graphics driver would be remembering the settings for the profile, not the game. But I do agree that all the major settings need to be consolidated into one user panel that is easy to control.
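
    A rough sketch of this registration idea; the class and method names are invented for illustration, and no real driver exposes such an API:

```python
# Sketch of Fant's scheme: the game registers itself with the driver,
# which creates/remembers a profile and hands back its own settings UI.
class DriverSettingsService:
    def __init__(self):
        self.profiles = {}                    # game id -> saved settings

    def register(self, game_id):
        # Driver owns the profile; the game never stores settings itself.
        return self.profiles.setdefault(game_id, {"aa": 4, "af": 8})

    def show_settings_panel(self, game_id):
        # A real driver would open its control-panel UI in-game;
        # here we just return the live profile dict for editing.
        return self.profiles[game_id]

svc = DriverSettingsService()
svc.register("mygame.exe")
svc.show_settings_panel("mygame.exe")["aa"] = 8   # edits persist in driver
print(svc.profiles["mygame.exe"])                  # {'aa': 8, 'af': 8}
```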
  • Gstanfor - Friday, March 16, 2007 - link

    I would like to see an option where the user can force supersampling AA in games such as STALKER, Gothic 3, GRAW, etc. that disallow other AA methods, if the user has SLI or Crossfire. The forced AA would be totally transparent to the game, and SLI would provide the required rendering grunt.

    I cannot stand scenes/games that are not AA'd, completely ruins the experience. If I wanted junk like that I'd purchase a console...
  • KashGarinn - Friday, March 16, 2007 - link

    For some reason, I've always wondered why AA isn't just implemented as a shader in games.

    All it's really doing is giving you a more natural blend between contrasting colours: when you blow up a picture and then shrink it down to its right resolution, you're not placing the items in the picture more accurately, you're just blending the boundaries between contrasting colour shapes into a more natural blend.

    I don't have the skills myself, so I don't know if it's at all possible to let a shader get more detail of a boundary area, or just HDR the edges of things with high contrast (the pixel problem is always a problem between 3D items), but if you can do HDR for extreme light, why not with "normal" light?
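
    A toy sketch of the shader idea: a post-process pass that finds high-contrast neighbours and blends across them. It is simplified to 1-D grayscale, and the function name and threshold are invented:

```python
# Post-process edge blending: detect a strong contrast between the two
# neighbours of a pixel and average across it, leaving flat areas alone.
def edge_blend(row, threshold=0.5):
    out = list(row)
    for i in range(1, len(row) - 1):
        if abs(row[i + 1] - row[i - 1]) > threshold:   # contrast edge
            out[i] = (row[i - 1] + row[i] + row[i + 1]) / 3.0
    return out

print(edge_blend([0, 0, 0, 1, 1, 1]))  # pixels at the edge get smoothed
```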

    K.
  • MadAd - Friday, March 16, 2007 - link

    Profiles/aa
    I think the problem with profiles is that they are presented in one humongous list when few people would own more than 10% of the games listed. More likely, 3 to 10 installed games would represent most gamers (just a guess).

    So why not build profiles from a drive search and put each game on a tab with an active panel? Hide away the big list but still allow manual additions. While giving the full selection on each tab, grey out unsupported features (or the underlying AA settings described in this article) but allow forcing: ungrey all options by clicking through a NO disclaimer/warning box rather than an OK box for access to the higher options on each tab.

    Coolbits
    No matter how many years I have worked on and around PCs, I still don't like messing with the registry unless I have to, and I found Coolbits worrying because of the lack of a single official point of info. It's almost underground, and who knows if the version we happen to find on site X is a) the latest, b) compatible with my version of the NV control panel, or c) going to conflict with any other version if I find one I think is newer.
  • miahallen - Thursday, March 15, 2007 - link

    To reminisce:
    I love technology, and I read online reviews and editorials religiously (I have read Tom's Hardware Guide since it started back in 1996; I was 16 at the time). I am always interested in the new thing, and I find gaming hardware especially fascinating. But playing games has always left me wanting.

    I've always wanted to be able to play the latest titles at the highest possible graphical settings. Back in 1998 I built my first computer (for college ;) yah right). It used a Matrox 2D video card with 4MB of RAM (upgradable to 8MB), plus two Voodoo II 12MB 3D accelerator cards in SLI, plus an MPEG-2 decoder card. So I had two VGA pass-through cables hanging out of the back of my machine; it was crazy! But it was the envy of all my friends at the LAN parties, and it ran games like Quake II and Unreal at 1024x768 with amazing frame rates.

    In 2002 I purchased the very first DX9 graphics card on the market, the Radeon 9700PRO with 128MB of RAM and 8 pixel pipelines. It was incredible, I could play Far Cry at 1600x1200 with decent frame rates, and even some AA and AF.

    Two months ago, I once again took the plunge with the very first DX10 graphics card on the market, the 8800GTX with 768MB of RAM and 128 stream processors. This thing is just amazing by today's standards, and I play Oblivion at 2048x1536 with all the eye candy, and AA + AF + HDR... incredible!


    Gaming with the 8800GTX (or all past graphics cards) is incredible - until it crashes... this has always been the problem for me: I push, and push, and push, until I find the limit. This article has really opened my eyes to part of my problem. I thought AA was implemented in the hardware only, and somehow was blind to the fact that a game has to support it. Especially when HDR showed up and wouldn't work with AA on nVidia hardware, this confirmed my thought that it was only a hardware issue. But when Rainbow Six Vegas showed up and the editors were all saying it didn't support AA, I was thinking, "Duh! Just force it through the control panel." That's what I would've done, and when it didn't work, or the game crashed, I'd have spent hours trying to figure out why, eventually gotten frustrated, and moved on to another game. Thanks, AnandTech, for educating me!

    I'm very glad that nVidia is actively trying to rectify this situation, and I hope they can help. But after reading this, I would put 100% of the burden on the game developers; they need to get their collective heads out of their collective a$$es and properly implement AA (and AF and HDR) within the games. Of course many of them are doing quite well, and I hope they only continue to improve.

    Going forward, I think I will be much more inclined to use in-game settings and only go into the control panel as a last resort. My advice to nVidia would be to hide the settings within the control panel so they are only available through a hotkey combo (Shift-F10 or something), so that the average user doesn't see or care, but the enthusiast who knows what to do with these settings can still access them. And I do like profiles, so please don't ditch them. Thanks nVidia!
  • brian26 - Thursday, March 15, 2007 - link

    I mostly use the driver panel, mainly because the in-game option often doesn't even do anything. It's like the turbo button on the old 386s. heh
  • Trisped - Thursday, March 15, 2007 - link

    Why not write a specification that allows developers to create a generic hardware option system, then let the drivers decide what is and is not available in game?

    The developer would be responsible for providing the basic interface components, including button, slider, and box graphics, as well as fonts, backgrounds, and whatever else is natively visual in operation.

    The drivers would provide a list of options that can be configured for the hardware. This list would include the display name of each option and which user interface element it would use.

    The game would then compile the list and generate the control panel for the user, just like a web page takes a list of HTML instructions and blends them with pictures to generate the page we view.

    Since the page is dynamic, old games could take advantage of new hardware abilities. Also, the driver could specify which settings are not compatible with others; for example, if AA is not compatible with HDR, the driver would state this in the list. The list would then be used to generate a control panel where, if the gamer selected AA, HDR would be grayed out, and vice versa.

    This is great for gamers, as it gives them complete control, and great for developers, as it lets their games use hardware not yet available. Hardware manufacturers also win, as they don't have to wait for new games to come out before the new abilities of their cards are utilized.
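
    A rough sketch of this scheme, with an invented descriptor structure (not any real driver's format): the driver publishes option descriptors with conflicts, and the game greys out whatever clashes with the user's picks.

```python
# The driver publishes a list of option descriptors (display name,
# widget type, choices, conflicts); the game builds its UI from them.
driver_options = [
    {"id": "aa",  "name": "Anti-Aliasing", "widget": "slider",
     "choices": [0, 2, 4, 8], "conflicts": ["hdr"]},
    {"id": "hdr", "name": "HDR Lighting",  "widget": "checkbox",
     "choices": [False, True], "conflicts": ["aa"]},
]

def greyed_out(selected_ids):
    # Options the game should disable given what the user already picked
    return {c for opt in driver_options if opt["id"] in selected_ids
              for c in opt["conflicts"]}

print(greyed_out({"aa"}))   # {'hdr'} -> grey out HDR when AA is chosen
```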
  • Scrogneugneu - Thursday, March 15, 2007 - link

    That's exactly what I was thinking about :)

    A generic (programming) interface, allowing games to register themselves and specify which features they do NOT support whatsoever, which features they do support, and which features they allow even if unsupported. "Feature" here means a kind of feature, not a level of a feature (meaning the application can say it supports AA, but can't specify that it supports AA only up to 8x).

    Just make sure to include a "required computation power" system: some kind of measurement where the application can say that to perform well under X settings, it would require a fill rate about that high, and that many shaders and such, to get around X FPS on average.

    The application can then just define what it believes is a "Fluid playing experience", a "Decent playing experience", and a "Beautiful playing experience" in terms of FPS (which is bound to differ between game styles, as a shooter requires higher FPS than an RTS) and send the driver whatever the user chooses, along with the performance evaluation ratings. Mister Joe Schmoe then just has to choose "Fluid playing experience", and the application gives the raw computing power requirements as well as the target FPS average to the drivers.

    The driver then sends back the settings to adopt in order to achieve that level of gameplay. Since the driver knows what card the application will run on, it also knows the capabilities of the card. The task of the driver is then to evaluate what settings the card can support, considering the amount of computing power the application requires and the target performance (both defined by the application).


    In the end, the application developers have to implement an interface of their own on top of the graphics programming interface offered by the drivers, as well as perform several tests to pinpoint the real computational power required to run the application. Having this in hand, they blindly send it to the driver, which chooses the best settings. The driver only has to know the specific distribution of computing capacity of every card it supports, which allows it to simply generate the best settings for the application at hand. Should the developers tweak their application, they can rerun their tests to get a more accurate picture of its requirements, and the settings will adjust themselves.

    As drivers and graphics cards evolve, so does their computational power. But since the driver decides itself what settings to use (on a per-application basis), these new advances can and will be used by older games. Say we create AAx64 in a year. If a game implemented the interface right, it will ask the driver what features are supported, and at what generic level. Any card supporting AA will answer that it supports AA, and this new super card will let the application know that it supports it up to the "64" level. Therefore, the AAx64 mode will be available right away: an all-new feature, already in use everywhere.

    Application developers are happy since their products get the advantages of new technologies without them even lifting a finger. Driver developers are happy since their new technology is used very quickly, removing that annoying "it's great, but no game uses it yet" period. Average users are happy because they get way more out of their money. And as long as there is that magic "Custom" button available, enabling anyone to choose whatever settings they want, gamers will be happy too.

    The only drawback: this has to be implemented by every vendor. So NVidia and AMD would have to work together to produce the basis of that interface, which has to be XML-like (extensible, so a new option from one vendor won't change the interface all over again) and generic (the application should never have to handle special cases). After that, I promise, you can tweak the hell out of your own drivers to get the best match for your card's capabilities :)
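
    A loose sketch of the negotiation described above; the budget numbers, cost table, and function names are made up for illustration:

```python
# The game sends a raw cost estimate plus a target FPS; the driver,
# which knows the card's capacity, answers with concrete settings.
CARD_BUDGET = 400.0          # hypothetical "compute units" for this GPU

def driver_pick_settings(base_cost, target_fps):
    headroom = CARD_BUDGET / (base_cost * target_fps)  # spare capacity
    aa_levels = [(8.0, 16), (4.0, 8), (2.0, 4), (1.5, 2), (1.0, 0)]
    for cost_mult, aa in aa_levels:      # most expensive first
        if headroom >= cost_mult:
            return {"aa": aa, "target_fps": target_fps}
    return {"aa": 0, "target_fps": target_fps}   # can't even hit target

# Game says: "Decent experience" = 1.5 cost units/frame at 60 FPS
print(driver_pick_settings(1.5, 60))   # -> {'aa': 8, 'target_fps': 60}
```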
  • waika - Thursday, March 15, 2007 - link

    This would be an excellent occasion for NVIDIA to overhaul the entire driver interface. The current approach offers no less than three separate interfaces (tray menu, legacy control panel, and new interface), is just a confusing mess to the novice user, and is cumbersome, poorly organized, and of dubious utility to the more experienced and technically savvy fan of the technology.

    None of NVIDIA's current driver interfaces offers a clear, well-organized, and consistent paradigm for driver feature control, and adding new features to any of them is sure to make matters worse. It's a very sad state of affairs when many third-party driver control interfaces developed by amateurs offer better interface design that is easier for end users at all levels to understand.

    This isn't a very good venue for addressing better approaches to interface design, as that requires illustration with mock-ups and working examples; but I'd certainly welcome any opportunity to apply some of my skills and knowledge to NVIDIA products if such a venue avails itself or if they contact me through these forums.

    I have a strong affection for NVIDIA products and hope they'll be addressing the issues of driver interface design and feature control in future iterations of their driver products as this is one place NVIDIA products have not been improving.
  • CrashMaster - Thursday, March 15, 2007 - link

    One thing that could help is to have an overlay on the screen (like Fraps), at least for a few seconds after a game loads (or loads a map), that tells the user what resolution and what AA/AF mode the card is rendering.

    Another thing that would help: if the setting is being overridden by something, note that on the overlay (e.g., the game is forcing 4x AA when you have it set to 8x AA in the control panel).
  • BrightCandle - Thursday, March 15, 2007 - link

    Vista comes with a performance testing tool to determine what OS features should be enabled, but they completely missed a trick when it comes to DX10. DX10 lacks optional features, so all cards are guaranteed to support everything; why not have a more detailed performance test that sets all these settings up for you by default?

    Humour me for a minute and consider if games shipped with a few measures of their usage of CPU, hard drive, and graphics power. They could include information on the settings they offer and the corresponding number of shaders each would use to achieve the effect. A simple developer tool could capture the numbers across the entire game (via the debug version of DirectX, which all developers use anyway).

    The OS (Vista) has also captured a set of performance metrics for your hardware against these same measures, so it knows how your PC compares to the performance usage of the game. It can combine that with the known native resolution of your monitor and choose the appropriate sound mode for your speakers.

    The OS can compare its own stats against the game's, set up the graphics, sound, etc., and only return the information the game needs (such as the sound mode). This completely removes the need for graphics options in the game at all; indeed sound could go as well, as could the "game settings". All that really needs to be there in the future is for Vista to support user-set gaming profiles, so that if one game did perform badly (or you wanted to try eye candy that the PeeCee says you don't have the hardware to run, but you think you know better) you could override it. Maybe after you have installed/launched a game it shows you what settings it chose, and you could override them.

    To me it is a very telling sign of bad design that every game has to implement its own menu of audio/video settings. If you've ever tried to program DirectX games, you'll know there is a lot of boilerplate code and global variables for all this information, and it's all there because no one has tried to pull it into the OS yet. The OS is the best place for it; users shouldn't need to know about these things just to play a game.

    I think this also opens another avenue where you could ask the system, "OK, what would it take to be able to use soft shadows?" and it could come back and say you need 128 shaders running with a performance of blah. Graphics card companies publish those stats, and people could easily know what hardware change they need in order to play games the way they want with decent performance.
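
    A hedged sketch of that matching step, with invented scores and presets (loosely in the spirit of Vista's WinSAT numbers, not its actual format):

```python
# The OS holds benchmark scores for this machine; the game ships
# per-subsystem requirement numbers; the OS picks the best preset
# the hardware clears. All data here is illustrative.
machine_scores = {"cpu": 5.2, "gpu": 5.9, "disk": 4.8}

game_presets = [   # (preset name, minimum score per subsystem)
    ("Beautiful", {"cpu": 5.0, "gpu": 5.5, "disk": 4.0}),
    ("Decent",    {"cpu": 4.0, "gpu": 4.5, "disk": 3.5}),
    ("Fluid",     {"cpu": 3.0, "gpu": 3.0, "disk": 3.0}),
]

def os_choose_preset():
    for name, req in game_presets:          # ordered best-first
        if all(machine_scores[k] >= v for k, v in req.items()):
            return name
    return "Below minimum"

print(os_choose_preset())   # -> 'Beautiful' on this sample machine
```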
  • kevjay - Thursday, March 15, 2007 - link

    I agree with the guy who said SSAA is the way to go. It's pathetic that, 7 long years later, nothing has approached 3dfx's level and that people are so happy with crap like MSAA and CSAA.
    Here is exactly what they should do:
    Resurrect 3dfx's FSAA, which is 4x RGSS.
    Make a double mode for transparent textures.
    Give the option to disable color compression!
    Don't force gamma correction.
  • DerekWilson - Thursday, March 15, 2007 - link

    I used to be a huge fan of things like SRAM for system memory and SSAA all the time, but more and more I see the need to maintain a balance of acceptable quality and performance.

    OK, so I'm still for SRAM as system memory and SSAA, but I've learned not to expect these options :-)

    And really, there is a better option than SSAA -- higher resolution, smaller pixel size monitors.

    As pixel size gets smaller, aliasing artifacts also get smaller and antialiasing becomes less useful. AA is a stop gap until we can reach sufficiently small pixel sizes.

    Really, MSAA and SSAA are just forms of edge blurring. This decreases detail even where we may not want detail decreased. MSAA sticks to polygon edges, but SSAA blurs all color edges.

    4xSSAA is manageable, but going further than that is incredibly memory intensive and impractical unless the target resolution is very small (even rendering a 640x480 image with 8xSSAA would result in more pixels than a 30" 2560x1600 monitor).
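
    The arithmetic behind that parenthetical, assuming "8x" here means 8x along each axis (64 samples per pixel); SSAA factors are sometimes quoted as total samples instead:

```python
# Pixel counts for the 8xSSAA example, taking "8x" as a per-axis
# supersampling factor -- an assumption, noted in the lead-in.
target_w, target_h = 640, 480
factor = 8                      # per-axis supersampling factor
internal = (target_w * factor) * (target_h * factor)
monitor = 2560 * 1600           # 30" display
print(internal, monitor)        # 19,660,800 vs 4,096,000 pixels rendered
```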

    I'm all for options though, and it might be fun for graphics hardware makers to include full SSAA options just to play with even if performance suffers greatly.
  • mongo lloyd - Saturday, March 17, 2007 - link

    Hate to break it to you, but pixel pitch is getting LARGER.
    People are retards and buy larger monitors with the same resolution as their 2-4" smaller old monitors.

    There's still no LCD out there with a pixel pitch comparable to the dot pitch of my 2002 iiyama CRT... and the trend is actually going backwards.
  • Ver Greeneyes - Friday, March 16, 2007 - link

    While I'm in full agreement with you that SSAA would just be a stopgap solution, with LCD TVs reaching sizes of 108" with -the same resolution- as 30" ones, I don't think we should expect higher resolutions anytime soon. There just doesn't seem to be any demand for it right now.

    One thing that I think should be addressed at the operating system level is how small things get on the desktop at higher resolutions. I mean, what's stopping OS developers from using bigger fonts? It would look better and stop people complaining that high resolutions make things tiny, which should not be an argument at all.
  • Trisped - Friday, March 16, 2007 - link

    You can increase the font sizes for the desktop etc. in Windows. Just right-click on the desktop, select Properties, then the Appearance tab, then Advanced, and pick an item from the drop-down menu. You can change a lot of things in there. Other OSes probably have the same thing; you just need to find it.
  • mino - Saturday, March 17, 2007 - link

    Yeah, you can do that. Except that setting works
    a) non-proportionally
    b) only on some fonts in some locations

    The result being that it is clearly inferior to a larger dot size.

    The reason is simple: Windows was inherently built for a certain DPI, and all those "big font" options were just added later.

    Also, to do that properly you would need all desktop objects vectorised - fonts, icons, images...
  • Optimummind - Thursday, March 15, 2007 - link

    This is my ideal scenario for how AA and AF should be handled for games.

    The game developers should include options for AA and AF in their games if it's possible with the way their game engines render graphics. If not, they should inform the GPU manufacturers of these AA/AF limitations so the manufacturers can use the information to find workarounds or to inform the consumer about such limitations.

    Perhaps both nVIDIA and ATI could have an applet within their respective driver control panels dedicated to an alphabetical list of games that don't support AA/AF, with a search box.

    This way, if I can't find AA or AF settings in the game I'm playing, I can exit the game, open the driver control panel, click on the applet with the list of games not supporting AA/AF, find my game on the list, then read about the situation. I would find the information useful if it covered such things as:

    (a) The way the game is rendered by its engine is not compatible with the AA and/or AF modes supported by our GPU. This issue may be resolved if the game developer decides to patch the game engine.

    (b) The game is compatible with our GPU hardware but due to driver bugs, it's currently not working as intended. The fix is coming from us (GPU company) in the next revision.

    (c) Although the game doesn't include AA/AF options, forcing AA and AF via our driver control panel will work, as it has been tested and QA'd by us. Enjoy!!

    I think such a list, via an optional applet within the driver control panel, would be highly visible and very informative. Sure, it would take some time to compile, but once you have the list, it's simply a matter of adding newer games to it, and it can keep growing. It would also do the consumer a huge favor by taking the guesswork out of why AA/AF doesn't work and whose actions are needed before it can be made to work. This would save trips to forums where we might not even find the answer to our questions.

    As for the other concepts about game profiles and override settings, I think those two should be kept intact. The profiles are useful because I can simply launch a game from my desktop shortcut and the settings I've determined through experiment to work are applied automatically. But I think they would be more powerful with the list of games I mentioned earlier. If this "list" idea is too hard to integrate into the driver, then perhaps ATI/nVidia should have a link within the driver control panel to a website that hosts the list.
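
    A small sketch of what such a lookup might hold; the entries and statuses are illustrative, not a real compatibility database:

```python
# Driver-maintained lookup of games with known AA/AF limitations,
# keyed by executable, holding the (a)/(b)/(c) status plus a note.
compat_list = {
    "gothic3.exe": ("a", "Engine incompatible with driver AA; needs a game patch."),
    "stalker.exe": ("b", "Driver bug; fix planned for the next release."),
    "oldgame.exe": ("c", "No in-game AA option, but forcing it in the panel is tested and works."),
}

def lookup(exe):
    status, note = compat_list.get(
        exe, ("?", "Not in the list; try forcing at your own risk."))
    return f"[{status}] {note}"

print(lookup("stalker.exe"))
```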

    -optimummind
  • crimson117 - Thursday, March 15, 2007 - link

    I'm confused by having AA settings both in-game and in the driver. Which takes precedence? The last one set? Always the driver? Always the game? If I set it in the driver, then load the game, does the game override the driver?

    I'm sufficiently confused by this that I don't turn it on at all. I wish it were easier to understand.
  • Imnotrichey - Friday, March 16, 2007 - link

    I never even knew there was an AA setting outside the game; I always set it in-game. Since I could change it within the game, I guess that means my CCC is set to off.
  • DigitalFreak - Thursday, March 15, 2007 - link

    As far as I know, the driver will always override the in game settings.
  • sc0rpi0 - Thursday, March 15, 2007 - link

    Why should I have to tell the game/driver what <xxx> settings to use? I only care about the level of play (easily expressed in FPS) of the game I'm currently in. All I should have to specify (in the game) is the min and max FPS. That's it! The game/driver should then go off and figure out how to give me my acceptable level of gaming "experience." Sheesh!
  • gmallen - Thursday, March 15, 2007 - link

    I find that graphics horsepower for the games I play (i.e., Oblivion) is better spent on textures and resolution. I can't stand the poor color and lousy visual quality of LCDs, so I use CRTs (I have squirreled away a couple of good ones for the future); thus I have no need for odd resolutions. Drivers still seem easy enough to use, but the UI (Catalyst for my ATI) needs to be either tossed or completely redesigned. I cannot believe that anyone with any UI design experience came near the Catalyst project.
  • VooDooAddict - Thursday, March 15, 2007 - link

    Personally, I like the idea of out-of-game profiles to set resolution/AA/quality settings. The profiles should be managed by a separate app that can keep track of which games allow which settings.
  • Araemo - Thursday, March 15, 2007 - link

    Seriously, why won't someone push supersampling AA? If I'm not mistaken, SSAA doesn't require any fancy tricks and will work correctly with any rendering technique, as long as it renders to a 2D raster image (i.e., the framebuffer) at the end of the pipeline.

    SSAA just requires rendering at 4x the resolution (2x vertical and 2x horizontal) for 4x anti-aliasing, internally to the card, and then downsampling for output, which cleans up jaggies everywhere. With the extra power of modern video cards, I'd think ATI would be hyping this quite a bit if they could do it with less performance hit than nvidia hardware takes (or vice versa if nvidia has the hardware advantage), as a way to sell higher-end cards.
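
    A minimal sketch of that downsampling step in pure Python (a real GPU does this in hardware; 2x per axis, simple box filter):

```python
# Render internally at 2x per axis, then average each 2x2 block down
# to one output pixel. hi is a 2H x 2W grid of grayscale samples.
def downsample_2x(hi):
    h, w = len(hi) // 2, len(hi[0]) // 2
    return [[(hi[2*y][2*x] + hi[2*y][2*x+1] +
              hi[2*y+1][2*x] + hi[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

# A hard black/white diagonal at hi-res averages to a smooth gradient:
hi = [[1.0 if x >= y else 0.0 for x in range(8)] for y in range(8)]
print(downsample_2x(hi))  # edge pixels land on 0.25/0.5/0.75 values
```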

    I'd love to go back to SSAA. My Voodoo5 spoiled me with image quality back in the day, even if driver compatibility was horrible. I don't have a 24" or 30" screen, so I don't need a card capable of rendering 2560x2048... unless you're going to let me use SSAA at that res and downsample it for my 1280x1024 LCD. THAT would make me a happy, paying customer.
  • Ver Greeneyes - Friday, March 16, 2007 - link

    I second this. A friend and I are both going to buy GeForce 8800s soon, but I know they'll be too powerful for the games we play :P My old CRT monitor can do 1600x1200, but resolutions even that high still cost a lot of money on LCDs... so why not spice things up with some supersampling? On LCDs you don't generally get or need refresh rates higher than 60Hz, and games tend to feel smooth as low as 30FPS, so all the horsepower the newest cards have is just going to waste right now. (Sure, you can tweak the Oblivion .ini file 'till you're cross-eyed, but I think SSAA would give better image quality painlessly.)
  • chizow - Thursday, March 15, 2007 - link

    Such a headache over a feature that should be streamlined and integrated. That sums up the nature of PC gaming, though, really: not enough standards and guidelines, so there's no consistency.

    Not sure what the best approach would be, but from an end-user standpoint I'd like to see a dedicated GUI, similar to the 3D model in the control panel, that lets you adjust image quality settings. Only I'd like to see it reflect actual game performance and the available AA settings for any particular game.

    I don't know if NV is willing to undertake that level of support, but it would certainly make things easier for the end user. Either have pre-configured .inf-like profiles for games, or the ability to scan for and assess the games on any given machine and demo/benchmark them.

    Maybe the easiest implementation would be a timedemo of sorts. You could have the NV CP run a game in stress-test mode, where it cycles through the different AA settings while you're playing, then reports a summary of relative image quality and performance. It's not perfect, of course, but right now testing is either subjective or a huge PITA.

    I'd love to see something like setting a target FPS and then allowing the drivers to enable the highest level of AA that still meets that target. Right now the only way to really do that is to run a lot of tests (or reference reviews) and spend a lot of time changing settings, which takes time away from what you should be doing: playing and enjoying the game.
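
    A hedged sketch of that idea: benchmark each AA level and keep the highest one whose measured FPS still meets the target. run_timedemo is a hypothetical stand-in for a driver-run benchmark:

```python
# Pick the highest AA level whose benchmarked FPS meets the target.
AA_LEVELS = [0, 2, 4, 8, 16]    # ordered cheapest to most expensive

def pick_aa(target_fps, run_timedemo):
    best = 0
    for aa in AA_LEVELS:
        if run_timedemo(aa) >= target_fps:
            best = aa            # still fast enough; try higher AA
        else:
            break                # costlier levels will only be slower
    return best

# e.g. with a fake benchmark where FPS drops as AA rises:
print(pick_aa(60, lambda aa: 120 / (1 + aa / 4)))  # -> 4
```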
  • michal1980 - Thursday, March 15, 2007 - link

    Couldn't nvidia just provide profiles with some settings preset for games? If there's a game that doesn't support something, have that feature disabled.

    In BF2, which I play a lot, I set AA in-game to 4x (though I wish it could go further),

    and then set transparency AA in the driver game profile, because the game has no option for it.
  • soydios - Thursday, March 15, 2007 - link

    I really wouldn't mind having a user-customizable performance profile in the graphics driver for each game. It would give me more control over the game, which I never complain about. For instance, I sometimes use the ATi CCC to enable antialiasing in older games, but then I have to delve back into the driver to change it to "Application Preference" when I boot up BF2 or other newer games. It would be far more convenient if the driver automatically loaded that game's profile when I start the game. The profile should include 3D settings for sure, and maybe color, brightness, even overclocks.
  • poohbear - Thursday, March 15, 2007 - link

    I personally set my AA setting to "let the application decide" in the CCC for my x1900xt, and in-game I usually choose 4x or the "AA high" option. Most games allow you to specify how much AA to use, so that's fabulous; otherwise, if a game doesn't have AA at all, I force it through the CCC. It's kinda annoying, but MOST games support it, so I hardly ever have to worry about forcing it through the CCC. :)
  • nefariouscaine - Thursday, March 15, 2007 - link

    I also use modded drivers most of the time, which include the Coolbits reg hack from the start.
  • nefariouscaine - Thursday, March 15, 2007 - link

    Well, when it comes to such settings I usually play the hit-or-miss game with each app I use. 9 times outta 10 I tweak the AA settings in the drivers, as a number of games don't go up to 8x AA but the drivers do, and the drivers also have "forced" multisampling and supersampling options.

    I've not really had too many issues with this, but more so with enabling AF in the drivers; that caused me many a crash in BF2 until I figured it out. It would be great to see more developers including higher-level advanced graphics options in games as the level of hardware continues to increase. I'm a firm believer that hardware shouldn't be bottlenecked by the software it's running (I'm talking about games). There aren't too many games out that tax an 8800GTX, and I'd love to see that happen, soon...

    I say add some warnings that pop up when you enable such changes - that might help some, but it won't be perfect. I'm happy with the layout of the "classic" nvidia driver settings, but the new one gets a big thumbs down from me, as it's too clumsy to find the advanced settings.
  • munky - Thursday, March 15, 2007 - link

    Modern games not supporting AA are a minority, and I don't see a reason to disable driver-override AA settings.
  • DigitalFreak - Thursday, March 15, 2007 - link

    If it's available, I always use in-game AA settings. However, games that have this option are few and far between. Considering how poor Nvidia's driver update schedule has been over the last six months for anything other than the Vista/8800 series, I think Coolbits is the way to go.

  • VIAN - Thursday, March 15, 2007 - link

    I hate having to constantly go into the control panel to change settings, so I usually leave the control panel at the highest quality settings but leave AA, AF, and Vsync as application preference.

    I love using in-game settings to set these, and I won't go into the control panel unless I really feel the need in a game that doesn't support them. Because I also anticipate compatibility issues when forcing something the game doesn't support, I seldom go into the control panel, and I acknowledge it as a limitation of the game.
  • mostlyprudent - Thursday, March 15, 2007 - link

    I am at best a casual gamer, and it has been quite some time since I tweaked driver settings. However, I have been unhappy with the limitations of in-game settings in several games. If I had the time, I would dig deeper into driver settings to maximize the gaming experience, and I think it would be a mistake to limit users to the settings provided by the developer.

    I don't think there is a problem with having a complicated driver structure (from the user's perspective) as long as there is a relatively simple set of settings adjustments in the game. Those who want more control will have to pay the price of a steep learning curve, as long as there is a good payoff in the end.

    Basically, I don't really have a preference as to the approach the driver developer takes, as long as it doesn't eliminate the ability to tweak.
