Nvidia already cheating to make 6800 look good.....


  • Overlag
    started a topic Nvidia already cheating to make 6800 look good.....

    Nvidia already cheating to make 6800 look good.....

    Translated here

    Instead of about 23 frames per second as before (1,600 x 1,200 pixels, 32-bit, 4xAA and 8xAF), our GeForce 6800 Ultra Extreme Edition suddenly reached 40.3 frames per second.

    However, if you rename the file "FarCry.EXE" to, for example, "FartCry.EXE", the benchmark result at resolutions of 1,024 x 768, 1,280 x 1,024 and 1,600 x 1,200 pixels drops by around 10 frames per second in each case (!).
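
    The article does not say how the detection works, but purely as an illustration, here is a minimal Win32 sketch of how a driver could special-case an application by its executable name, which would explain why renaming FarCry.EXE changes the numbers. This is not Nvidia's actual code, and the helpers LooksLikeFarCry and ApplyGameProfile are hypothetical names.

    ```cpp
    // Illustrative sketch only; not Nvidia's driver code.
    #include <windows.h>
    #include <algorithm>
    #include <string>

    // Hypothetical helper: true if the current process appears to be FarCry.
    static bool LooksLikeFarCry()
    {
        char path[MAX_PATH] = {0};
        GetModuleFileNameA(nullptr, path, MAX_PATH);  // full path of the running .exe

        std::string exe(path);
        std::transform(exe.begin(), exe.end(), exe.begin(), ::tolower);

        // "farcry.exe" triggers the per-game profile; "fartcry.exe" does not.
        return exe.find("farcry.exe") != std::string::npos;
    }

    // Hypothetical hook: a driver could select a per-game "profile" here,
    // e.g. one that silently lowers filtering or shader detail for that game.
    void ApplyGameProfile()
    {
        if (LooksLikeFarCry())
        {
            // ... apply the FarCry-specific settings ...
        }
    }
    ```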

  • Overlag
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    A nice write-up on some "issues" with Nvidia's and ATI's drivers.....

    http://www.overclockers.com/articles1039/

  • asch
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    Found some articles from last year.... I wanted to point out that both ATI and Nvidia tune their video cards/drivers to perform better in benchmarks and games.

    ATI Technologies Admits Cheating the Drivers To Achieve Higher 3DMark03 Score
    Futuremark Caught NVIDIA and ATI Technologies On Cheating In 3DMark03

  • jex
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    So both Ubi and Nvidia are conning us then - they should be investigated.

  • Overlag
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    Originally posted by en4rcment
    :icon9: It seems like a false benchmark result then, right?

    If the video card can process the data at 40 FPS, but when you rename the file that SAME DATA can only be processed at 23 FPS, it looks like a huuuuge scam.

    The way I'm reading the results, the NVIDIA card is showing false FPS scores for the game FARCRY, because if the card could process the data at an FPS in the 40 range, renaming the file should make NO difference. It's still processing the exact same data.

    If this is true, this is a bigger scam than some of you realize...


    No, the point is that when the drivers detect FarCry, they turn off loads of detail that YOU have selected to have on. The same data is being used, yes, but the results are different, i.e. an ultra-blurry picture with jaggies, no shadows or fog.... much like how HL1 looks..... lol

    Oh, and yes, you can test it on ATI; whatever the name is, it's always the same result :D

    Originally posted by flux
    I also didn't get out of that report that anything was taken out of FarCry when the name change happened. If they are taking out options that you explicitly told them to put in, well, I guess that is bad, but I'm not sure I would be upset. Sure, I would want to know about the change the driver made, so I would say they should just report it in a DeviceCaps call. That way the game could detect that the feature isn't supported on your system, so you don't get the option to select it, but that is a game feature to implement as well.

    To me this is similar to the dynamic level of detail that most games implement. No one complains about this when game developers do it, so why complain when the graphics card does it for a particular game? I think card manufacturers are upfront about doing these types of optimizations; maybe they should be more detailed about what they are doing with each driver release.

    Again, I'm not saying Nvidia is being completely upfront, but I also would not limit it to just Nvidia.
    Other tests by people on English-speaking forums have noticed what is happening and explained it. Nvidia call it a *BUG* in that new beta driver release (which was released when ATI released their cards), but how does this *BUG* know to be buggy only when the program is called FarCry? The fact is that Nvidia cards as a whole SUCK in FarCry. Nvidiots say this is because FarCry is a buggy, crap game engine.............. but then why does it work fine on everything else, even GeForce 4s? Well, because the GeForce 5 and 6 are odd designs that need odd drivers.....


    Nvidia will get there in the end, and hell, I'm even remotely thinking about getting a 6800 instead of an X800 Pro or XT.... depends on the prices. :icon_wink
    Last edited by Overlag; 05-10-2004, 04:40 PM.

  • asch
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    Originally posted by jex
    I think the point Sarc is making is that the card is specifically looking for the game, meaning the card has been made to recognise Far Cry?

    It says it 'clearly' in the article. When the .exe was renamed it ran slower. So is there something more sinister behind this - are Ubi now telling the card manufacturers to make cards compatible with Ubi's games to stifle the competition?
    What I'm getting at is that if you only test one instance on one card, you can't compare it to any other card. Sure, I understand that changing the name of the executable caused a frame-rate change. Nowhere in that article did it say why. There could be a number of reasons besides looking for a filename.

    What is to say that you would not have the same problem if you used an ATI card? There hasn't been any other comparison.

  • en4rcment
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    :icon9: It seems like a false benchmark result then, right?

    If the video card can process the data at 40 FPS, but when you rename the file that SAME DATA can only be processed at 23 FPS, it looks like a huuuuge scam.

    The way I'm reading the results, the NVIDIA card is showing false FPS scores for the game FARCRY, because if the card could process the data at an FPS in the 40 range, renaming the file should make NO difference. It's still processing the exact same data.

    If this is true, this is a bigger scam than some of you realize...



  • jex
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    Originally posted by asch
    I wouldn't say that it's a fact. Where was that declared? I couldn't completely understand the translated article... but I didn't see anything that stated why the performance decreased with the renamed file. And was there a similar test with an ATI card?
    I think the point Sarc is making is that the card is specifically looking for the game, meaning the card has been made to recognise Far Cry?

    It says it 'clearly' in the article. When the .exe was renamed it ran slower. So is there something more sinister behind this - are Ubi now telling the card manufacturers to make cards compatible with Ubi's games to stifle the competition?

    lol

  • flux
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    Originally posted by Overlag
    And FarCry isn't OpenGL, it's D3D ;)
    I was just giving the best example I could. I'm more familiar with OpenGL and never meant to imply my example was what is happening.

    Originally posted by Overlag
    You're not biased towards Nvidia, you just don't know the full details. Nvidia have been doing this crap since they lost the performance lead; they've been cutting corners and image quality so they look "OK" in benchmarks. The Futuremark 3DMark 2003 scam proves this. Why do they need to optimize/cheat in a benchmark that you won't play? Because it makes the numbers look better for reviews........
    I'm aware of the "scandal" last year and agree that these types of optimizations should not be done on benchmark tools. Again, no one hides the fact that they do this. Even ATI made the same types of enhancements (just not as effectively).

    I also didn't get out of that report that anything was taken out of FarCry when the name change happened. If they are taking out options that you explicitly told them to put in, well, I guess that is bad, but I'm not sure I would be upset. Sure, I would want to know about the change the driver made, so I would say they should just report it in a DeviceCaps call (roughly the kind of check sketched at the end of this post). That way the game could detect that the feature isn't supported on your system, so you don't get the option to select it, but that is a game feature to implement as well.

    To me this is similar to the dynamic level of detail that most games implement. No one complains about this when game developers do it, so why complain when the graphics card does it for a particular game? I think card manufacturers are upfront about doing these types of optimizations; maybe they should be more detailed about what they are doing with each driver release.

    Again, I'm not saying Nvidia is being completely upfront, but I also would not limit it to just Nvidia.
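
    The DeviceCaps idea above, purely as an illustration: a game-side check using the standard Direct3D 9 caps API (FarCry is a D3D title). The function name SupportsPs20AndAniso and the particular caps tested are example choices, not anything taken from FarCry or from any driver.

    ```cpp
    // Illustrative game-side capability check via IDirect3D9::GetDeviceCaps.
    // Link against d3d9.lib.
    #include <d3d9.h>

    bool SupportsPs20AndAniso(DWORD* maxAnisotropyOut)
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d)
            return false;

        D3DCAPS9 caps;
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        {
            d3d->Release();
            return false;
        }

        // Does the hardware report pixel shader 2.0 and anisotropic min-filtering?
        const bool ok =
            caps.PixelShaderVersion >= D3DPS_VERSION(2, 0) &&
            (caps.TextureFilterCaps & D3DPTFILTERCAPS_MINFANISOTROPIC) != 0;

        if (maxAnisotropyOut)
            *maxAnisotropyOut = caps.MaxAnisotropy;  // e.g. 8 for 8xAF

        d3d->Release();
        return ok;
    }

    // If this returns false, the game can grey out the corresponding menu options
    // instead of the driver silently dropping them later.
    ```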

  • asch
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    Originally posted by =Sarc=
    I think it's the fact that it must look at the filename to use the enhancements. Why does it run faster when the file is called FarCry.exe and slower when named FartCry.exe?
    I wouldn't say that it's a fact. Where was that declared? I couldn't completely understand the translated article... but I didn't see anything that stated why the performance decreased with the renamed file. And was there a similar test with an ATI card?

  • Overlag
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    Originally posted by flux
    Why is this cheating? So they made their card work better for a game, isn't that what you would want? They have been doing it with id games ever since they made a name for themselves. It has never been any kind of a secret. Each game packs their data differently. I don't fault anyone for peering into the data and trying to make their hardware faster for specific games. Makes my playing experience that much better.

    Game companies do it from the other end. They have a favorite set of APIs that they know how to really tweak, so they use those. Most of the time that means their code runs really slick on one brand of card and not the other. Valve and id are perfect examples of this. Carmack loves OpenGL and Nvidia, Gabe loves DirectX and ATI. Do you blame them for picking a favorite? Yeah it sucks, but they are more effective when they specialize.

    I'm not bashing anyone here, I'm just curious why people get so upset when they find out Nvidia made their card faster for game XXX. It doesn't mean their cards suck for all other games. Maybe that particular game's data was packed specifically to take advantage of OpenGL extensions that they didn't fully support.

    I will tell you I have a bias towards nvidia, but I would not hold it against ATI if they tweaked their drivers for a specific game. It's what you have to do sometimes.
    Because if you WANT it to be high quality, with 4xFSAA on and, let's say, 8xAF, you don't expect them to shut off the 4xFSAA or 8xAF. But they go one step further by removing shadows and fog as well. I want the game to look how I set it up (and how it's supposed to look). And FarCry isn't OpenGL, it's D3D ;)

    You're not biased towards Nvidia, you just don't know the full details. Nvidia have been doing this crap since they lost the performance lead; they've been cutting corners and image quality so they look "OK" in benchmarks. The Futuremark 3DMark 2003 scam proves this. Why do they need to optimize/cheat in a benchmark that you won't play? Because it makes the numbers look better for reviews........

  • =Sarc=
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    I think it's the fact that it must look at the filename to use the enhancements. Why does it run faster when the file is called FarCry.exe and slower when named FartCry.exe?

  • flux
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    Why is this cheating? So they made their card work better for a game, isn't that what you would want? They have been doing it with id games ever since they made a name for themselves. It has never been any kind of a secret. Each game packs their data differently. I don't fault anyone for peering into the data and trying to make their hardware faster for specific games. Makes my playing experience that much better.

    Game companies do it from the other end. They have a favorite set of APIs that they know how to really tweak, so they use those. Most of the time that means their code runs really slick on one brand of card and not the other. Valve and id are perfect examples of this. Carmack loves OpenGL and Nvidia, Gabe loves DirectX and ATI. Do you blame them for picking a favorite? Yeah it sucks, but they are more effective when they specialize.

    I'm not bashing anyone here, I'm just curious why people get so upset when they find out Nvidia made their card faster for game XXX. It doesn't mean their cards suck for all other games. Maybe that particular game's data was packed specifically to take advantage of OpenGL extensions that they didn't fully support (see the extension check sketched below).

    I will tell you I have a bias towards nvidia, but I would not hold it against ATI if they tweaked their drivers for a specific game. It's what you have to do sometimes.
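
    On the OpenGL-extension point above, purely as an illustration: this is roughly how a renderer of that era decides whether a given extension is available before relying on it, via the standard glGetString(GL_EXTENSIONS) query. The helper name HasGLExtension and the example extension string are illustrative, not taken from any particular game.

    ```cpp
    // Illustrative only. Requires a current OpenGL context.
    // On Windows, include <windows.h> before <GL/gl.h>.
    #include <GL/gl.h>
    #include <cstring>

    // Hypothetical helper: true if the driver advertises the named extension.
    bool HasGLExtension(const char* name)
    {
        const char* extensions =
            reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        if (!extensions || !name)
            return false;

        // Whole-token match so a short name doesn't match a longer one.
        const size_t len = std::strlen(name);
        for (const char* p = extensions;
             (p = std::strstr(p, name)) != nullptr;
             p += len)
        {
            const bool startOk = (p == extensions) || (p[-1] == ' ');
            const bool endOk   = (p[len] == ' ') || (p[len] == '\0');
            if (startOk && endOk)
                return true;
        }
        return false;
    }

    // Usage: only take the fast path if the extension is really there, e.g.
    // if (HasGLExtension("GL_ARB_vertex_buffer_object")) { /* use VBO path */ }
    ```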

  • Overlag
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    Post cleaned... let's keep it that way so Dave can keep his hands off the nuke-thread button :D
    Last edited by Overlag; 05-07-2004, 08:43 PM.

  • Guest
    replied
    Re: Nvidia already cheating to make 6800 look good.....

    I'll keep my Nvidia FX Asylum 4600 256MB; I see no need to upgrade right now. Plus the card works flawlessly, and since I only paid $150 for it, why bitch..... :D
