9800GTX GPU-z screen shot

GPU-Z Validation

Here is our 9800GTX GPU-Z screenshot. As you can see, the card’s core clock is only 25MHz higher than that of the G92-based 8800GTS 512MB. But the memory clock is as high as 1100MHz, which is why we said it would use 0.8ns memory modules. Because the core clock was raised, the shader clock also went up, to 1688MHz.

The number of ROPs, the shader count and the memory interface are unchanged compared to the G92-400, the core of the 8800GTS 512MB.

GPU-Z 0.1.7 didn’t work with the 9800GTX. After W1zzard@TPU made some changes to GPU-Z, the tool could finally read all of the card’s data. Thanks, W1zzard.
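The 0.8ns inference above is simple arithmetic: a memory chip rated at t nanoseconds per cycle can be clocked at no more than 1000/t MHz. As an editorial sketch (not from the original article), in Python:

```python
def max_clock_mhz(cycle_time_ns: float) -> float:
    """A memory chip rated at t ns per cycle supports at most 1000/t MHz."""
    return 1000.0 / cycle_time_ns

# 1.0ns parts top out at 1000 MHz, short of the 1100 MHz shown by GPU-Z,
# while 0.8ns parts are rated up to 1250 MHz and cover it comfortably.
print(max_clock_mhz(1.0))
print(max_clock_mhz(0.8))
```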

26 Responses to “9800GTX GPU-z screen shot”

  1. Anon Says:

    But in comparison to an 8800 GTX, it has fewer ROPs (16 vs 24), still only DX 10.0 support, a narrower bus (256-bit vs 384-bit), less RAM (512MB vs 768MB), and less bandwidth (70.4GB/sec vs 86.4GB/sec).

    But it has higher clocks: 675MHz vs 575MHz GPU, 1100MHz vs 900MHz memory, 1688MHz vs 1350MHz shader.

    Really doesn’t seem like a great jump to me, but specs aside I wonder how it runs.
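The bandwidth figures quoted above follow directly from bus width and memory clock. As an editorial sketch (assuming DDR-type memory, i.e. two transfers per clock), the arithmetic in Python:

```python
def bandwidth_gb_s(bus_width_bits: int, mem_clock_mhz: float) -> float:
    """Peak memory bandwidth for DDR-type memory (2 transfers per clock)."""
    bytes_per_clock = bus_width_bits / 8 * 2
    return bytes_per_clock * mem_clock_mhz * 1e6 / 1e9

# 9800GTX: 256-bit bus at 1100 MHz -> 70.4 GB/s
print(bandwidth_gb_s(256, 1100))
# 8800GTX: 384-bit bus at 900 MHz -> 86.4 GB/s
print(bandwidth_gb_s(384, 900))
```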

  2. Jeff Says:

    You should compare it with the new 8800GTS 512MB, because they’re both based on G92.

  3. Hassan Says:

    It can not even the Crysis to its full potential.

  4. Hassan Says:

    Sorry, I meant to use WORDS.

  5. [ExpRev] 9800GTX GPU-z screen shot! - Overclock.net - Overclocking.net Says:

    [...] 9800GTX GPU-z screen shot! Source __________________ Please ignor my spelling and grama! FAQ:s I have a slow computer can i run [...]

  6. TC Says:

    People make fun of ATI, yet ATI uses the latest technology in their cards. And that was months ago. Nvidia has yet to add DirectX 10.1, Shader Model 4.1, GDDR4, or even use the 55nm process that ATI uses. If you ask me, ATI is much farther ahead in technology than Nvidia is.

  7. DarkElfa Says:

    I have a distinct feeling we’ve been had here, folks. Beware false prophets. Besides, when was the last time Nvidia released a next-gen card whose non-clock ratings were, in some places, 3/4 of the generation before it? The real giveaway that this is bull is the name. That isn’t the way GPU-Z presents names, and I know, because I’m looking at a comparison of that shot and my own card right now. It’s an XFX 8800GTX XXX, and the name comes up as “%NVIDIA_G80.DEV_0191.1%”, not “Nvidia GeForce 8800GTX”. This kind of lie-induced garbage is laughable.

  8. Bo_Fox Says:


    Me too: I thought this was too pathetic for a 9800GTX (only a 25MHz increase, come on)! Still only 512MB of RAM? Nvidia would be shooting themselves in the foot if they did that, because it would only sell for $300 at the very most, from day one. And wasn’t Nvidia just saying that dual-GPU cards suck and that they want to go back to bigger-core single-chip cards?

    Pathetic, pathetic, just pathetic! Perhaps Nvidia is deliberately trying to see that the 8800GTX has the same “staying power” as the 9700Pro did from ATI, which was roughly 3 years as a relatively “high-end” card. Good for me, because I have one!

    A big surprise is that the 9600GT is so close in performance to an 8800GT. The core clock definitely seems to be incorrectly reported as 600MHz when it should actually be 650MHz, because of the 27MHz crystal multiplier instead of 25MHz, with Rivatuner reporting it as such. Anyway, the 9600GT is selling for as low as $150 already, and never before has there been such a well-performing mid-range card (compared to the 5700, 6600GT, 7600GT, and 8600GT). Very interesting… but I think there’s a change going on around here, and of course it’s good news for ATI, which can do an R700 real soon, even with GDDR5 and a 512-bit memory bus if they want to.

    Maybe it’s Nvidia’s “shrewd” move to try to get ATI more “relaxed” with their upcoming R700 and then launch an all-out offensive after the R700 comes out?
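The crystal-clock argument above is a simple ratio: if a tool derives the core clock by multiplying against an assumed 25MHz crystal when the real crystal is 27MHz, the true clock is the reported one scaled by 27/25. A hedged editorial sketch of that arithmetic in Python:

```python
def actual_clock(reported_mhz: float, assumed_crystal_mhz: float,
                 real_crystal_mhz: float) -> float:
    """If a tool computes clock = multiplier * crystal but assumes the wrong
    crystal frequency, the real clock is scaled by the crystal ratio."""
    return reported_mhz * real_crystal_mhz / assumed_crystal_mhz

# 600 MHz reported under a 25 MHz assumption, real crystal 27 MHz:
# 600 * 27/25 = 648, i.e. roughly the 650 MHz figure cited above.
print(actual_clock(600, 25, 27))
```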

  9. DarkElfa Says:

    The funny thing is that it’s already been said by Nvidia that the 9800GTX will start with 768 GB of GDDR3. This kind of thing is why people have trouble finding facts on the web. No, it looks like what these guys did was take an 8800GTS and make up facts and charts about the 9800GTX. They obviously don’t have a real one and they never did. Like I said, God hasn’t been able to get his hands on this card. Did anyone notice that, other than charts and graphs, they didn’t show a single actual photo of the thing?

  10. NeoXF Says:

    @Bo_Fox, the 6600GT was the best mainstream card that nVidia ever released. Obviously you haven’t got a clue about what you are talking about.

    Yeah sure… they will launch a counter-attack after ATI gets RV770 out. Yeah, sure. Go back to playing StarCraft, dude…

  11. bigjdubb Says:

    It was stated in the article that GPU-Z didn’t work properly with the card and that it had to be modified. There is a very good chance that the naming differences could be caused by this change, and by the fact that this is most likely a pre-production sample and not a retail release card like the GTX you own.

    Either way it is a bit disappointing, but not very surprising. Nvidia has no reason right now to shove the ultimate card into the marketplace. Until there is an ATI card that truly makes a top-end Nvidia card nervous, these are the kinds of upgrades we will be seeing.

  12. John Says:

    DarkElfa – I don’t think Nvidia ever said the 9800GTX will start with 768MB of RAM… I’m assuming you meant MB instead of GB. AFAIK they haven’t released any official details about the 9800GTX. It seems in line with all the rumors that all 9800 cards (GT, GTX, GX2) will be G92 based and we won’t see new architecture until the GT200 arrives.

  13. Rappa Says:

    Agreed. I’m thinking a similar revamp kind of thing as was done to the 8800GTS 320/640, just this time with the GTX going G92.
    I’m sure, however, that there will have been some other revisions made (despite what the screenshot shows), so for people to base its performance on the 8800GTS 512 card because of its currently shown clocks is just foolish.
    However, I don’t see there being that much of an increase ‘if’ those specs are anything to go by; just enough to differentiate the product line more than the 8800 line currently is with respect to the 8800 GT/GTS/Ultra/GTX, to actually give an incentive to buy over its cheaper 9600/9800 relatives.
    But who knows anything at this stage, really.

  14. Ajai Says:

    I heard that 10.1 will be skipped and Nvidia will go directly to DX11 for any new cards, so it will be hard to see any Nvidia cards with 10.1.

  15. Ajai Says:

    Oh yeah, we’ve all seen what a 9600GT can do with such limited shaders; it will be a blast to see 128 of those new shaders at work.

  16. Richie Says:

    BS, my 8800GS 384 (overclocked to hell) could OWN that if that’s what will ship!!!

    Untrue! The 9800GTX should be a beast, at least a 20% increase over the 8800GTX.
    Unless this whole G9x family is a farce, a die shrink of the same 80nm G8X,
    and the G100 is just around the corner in June! And a HARD launch at that, top to bottom! Then it’s possible!

  17. Ironwulfen Says:

    I find it difficult to believe this is an actual 9800GTX card, as the GeForce 9 series runs on G94 GPUs… am I mistaken?

  18. 9800 GTX specs and dissapointing. Says:

    [...] sites that claim they have specs. KetZone.com GeForce 9800 GTX Card Photos & Specs Unveiled 9800GTX GPU-z screen shot – Expreview.com and 3d mark score. 9800GTX 3Dmark06 score here – Expreview.com If these rumors are true then ATI [...]

  19. rofl at the nvidia fanboys Says:

    Darkelfa: “The funny thing is that it’s already been said by Nvidia that the 9800GTX will start with 768 GB of GDDR3.”

    Yeah right… they never said such a thing. Making up stuff as you go doesn’t make you more believable, especially when you claim other stuff to be made up.
    Also concerning your other posts here, you’re the most pathetic fanboy I’ve ever seen.

  20. Undying Says:

    WoW 8800GTX will own 9800GTX haha nVIDIA is confused :D

  21. John M Says:

    “# TC Says:
    February 27th, 2008 at 2:26 am

    People make fun of ATI, yet ATI uses the latest technology in their cards. And that was months ago. Nvidia has yet to add directx 10.1, shader 4.1, DDR4, or even use the 55nm cores that ATI uses. If you ask me, ATI is much farther ahead in technology than Nvidia is.”

    It only makes me wonder why ATi aren’t much, much further ahead performance-wise. NVIDIA are keeping up with ATi, or rather the other way round. They had the 3870X2 that beat NVIDIA for all of 5 minutes; it still does in the bang-for-buck area, though. Kinda makes me wonder what ATi are doing wrong with their “advanced technology”, tbh.

  22. Matt W Says:

    I agree, John M. When you look at the specs of the ATI cards, they tend to blow away Nvidia’s cards, but when you plug one in it’s always a dud compared to Nvidia’s stuff. It makes no sense. I hear people saying it’s because Nvidia’s core GPU tech is more efficient or whatever, but I just don’t understand why ATI doesn’t work on getting their core tech up to speed with Nvidia. Then all this amazing new tech like GDDR4 and 5, and the fact that they generally have way faster clock speeds, way more stream processors, a 55nm process, etc., should equate to destroying the current Nvidia stuff. I mean, they must make so much less money than Nvidia, since they have to add so many more specs to even come close to Nvidia’s cards, and that new tech ain’t cheap.

    Also, as far as the 9800GTX goes, what a disappointment… I bought a single 9800GX2 and am happy with it, with my 790i mobo and 1800MHz DDR3 memory, pulling 19000 3DMarks in 3DMark06.

    I really think that Nvidia has held off on implementing these new technologies because they know they are ahead in the game. I guarantee they are waiting for ATI to release a card that smashes anything they’ve got; then they’d upgrade their line to GDDR4 and 5 and start upgrading the basic specs to be more in line with what ATI has now, and it would without a doubt blow anything ATI could come up with out of the water, especially with the new GX280s coming out (they are amazing, man). Still only DDR3 though! Crazy, right? Imagine if Nvidia took the GX280 and pumped it up with GDDR5 and massive amounts of stream processors and such. I think they know it would be so powerful that games would be several years from being able to push it, and they want people to have to upgrade constantly to keep up with games. It’s a scam…

  23. TC Says:

    John M says: “They had what the 3870×2 that beat NVIDIA for all of 5mins”

    Try more like at least over a month.

  24. TC Says:

    It just proves what a crappy company Nvidia is… when they refuse to adopt the latest standards.

  25. Especificaciones y fotos de la 9800GTX. - HardwareMX : HardwareMX Says:

    [...] Fuente: Expreview.   [...]

  26. Constellations in Ancient Egypt Says:

    I located your blog via Bing and I have to say: a HUGE thank you. I thought the article was incredibly enlightening, and I will come back to see what additional great information I can receive here.

Leave a Reply