GTX 480 ja oder nein?

Kin Hell

Active member
Banned
Joined
Nov 25, 2007
Posts
6,970
Country
U.K.
Region
Cornwall
We've all seen this film footage in various disguises, but with the recent release of nVidia's GTX 480, I thought it was worth posting for all you ATI fanbois. :LOL:

http://www.youtube.com/watch?v=If0Bkfnifi4&NR=1

LMFAO!

Kin

PS. Idle temps of the GTX 480 are nowhere near 70 degrees for me. :cool:
 

Nathanieltolbert

New member
Joined
Jun 12, 2008
Posts
1,340
Country
United States
Region
Olathe, Kansas
It's interesting that you point to idle temps of 70C. I never reached those temperatures at idle with the bench GTX 480 we tested here; in the case or on the bench, it idled between 45C and 50C. That wasn't the problem we had with the card. The problem was the high 90s C we hit when the card was fully loaded. Don't get me wrong, the card is very nice, but the speed versus the price, plus the heat, didn't make it a must-have card.

The GTX 470, though, runs considerably cooler, uses less power and is quite powerful, coming in solidly between the 5850 and the 5870 in most tests and exceeding the 5870 in several game tests. Not to mention that PhysX is an added bonus, and CUDA will be good for those who use it. For my personal use, I will stick to the ATI cards I have, since I'm still a poor college student. For those for whom heat and price are not an issue, I heartily recommend at least the GTX 470, since it is a very good performer.
 

Harrison

Member
Joined
Dec 1, 2007
Posts
10,153
Country
UK
Region
West Sussex
:LOL: great link Kin. (y) All so true... hee hee...

BTW, I didn't know you'd changed your haircut and grown a moustache? Or that you were so fluent in German... if I squint, I could swear you almost look like... ;)
 

Kin Hell

Active member
Banned
Joined
Nov 25, 2007
Posts
6,970
Country
U.K.
Region
Cornwall
^^^^^^

Achtung - ATI fanboy alert! :LOL:

None of it's true, Harrison. I cannot wait to get this thing under water. There's also a soft mod for a voltage increase, & boy, do they overclock.

See here for info: http://www.dangerden.com/New-Products/gtx480-update-5-overclocking.html

From 700MHz to 900MHz on the core clock alone. :drool:
An extra 6 FPS in the Heaven 2.0 benchmark is NOT to be sniffed at.

Since I went from a 3.8GHz Q6600 quad to a stock 2.8GHz i7 chip & banged a GTX 480 in there, Crysis now throws back some flooring FPS counts. I'm peaking over 105 FPS @ 1600x1200 with everything I can turn on in the eye-candy department set to MAX.

Motion blur... not needed & turned off.

Motion blur was something invented to compensate for crap hardware. This game turns on a sixpence in a nanosecond. I am absolutely floored with this rig. A big water-cooled rig write-up for the Photo Booth is imminent. :mrgreen:

Kin

GPUz.PNG


ExperTool 01.PNG


Everest01.PNG
http://www.guildserver.co.uk/KinHell/Pics/Myi7Rig/Everest01.PNG
 

Harrison

Member
Joined
Dec 1, 2007
Posts
10,153
Country
UK
Region
West Sussex
:LOL:

You know I'm price-to-performance all the way, mate. If I were of the same mentality as you and didn't care about cost, just performance, then I agree the GTX 480 does have a very slight edge over the HD5870. But from my perspective, the huge price difference isn't worth it.

Slightly off topic. What do you think of the Gigabyte mobo you are currently using? And what about the i7 overall?

I'm currently planning to purchase an i7 980X in the next couple of months, but haven't decided on a mobo yet. I'll probably stick with Asus, but I like to hear about experiences with different mobos.

And how much triple-channel RAM are you currently running? Have you noticed a performance increase over older dual-channel setups? I'm currently planning at least 24GB (6x4GB) of triple-channel when I build it, as it will mainly be used for HD video editing and post-production work. I noticed you can now get 32GB DDR3 kits, but they are way overpriced at the moment. 8GB sticks might fall in price at some point this summer though, allowing for 48GB, which would be perfect for more freedom with preview renders in After Effects CS5 64-bit.
 

Merlin

Ministry of Retr0bright and Street Judge
VIP
Joined
Nov 24, 2007
Posts
15,597
Country
UK
Region
Manchester
^^^^
Electronic willy-waving, and no mistake........:rofl3
 

Kin Hell

Active member
Banned
Joined
Nov 25, 2007
Posts
6,970
Country
U.K.
Region
Cornwall
@ Harrison

Keep an eye out for my Photo Booth i7 Water Cooled monster thread. I will be making some HUGE comparisons between 780i & i7. ;)

^^^^
Electronic willy-waving, and no mistake........:rofl3

I concur! :LOL:

Well chaps, as the saying goes, if you've got it, flaunt it. (y)

Now, where did I put my electronic pecker-discharge unit? - Debbie!? ..... :mrgreen:

Kin

PS: Look at 6GB under 32-bit XP on X58. The industry has been talking crap for years...

SysProps.PNG
 

Harrison

Member
Joined
Dec 1, 2007
Posts
10,153
Country
UK
Region
West Sussex
Don't tell me you are still running a 32bit version of XP on your new rig?
 

Kin Hell

Active member
Banned
Joined
Nov 25, 2007
Posts
6,970
Country
U.K.
Region
Cornwall
Atm, yes. :nod:

As I've said before, 64-bit is just a load of cr4p on XP & is actually unsupported by Microsoft now.

Vista... pile of tosh.
Win7, the same as above, but that opinion was formed prior to this i7 rig build.

I still have a lot of experimenting to do, but it's more important to get this i7 rig water-cooled first. Once that's done, I'll be investigating Win7 further. ;)

Seeing 3.5GB of RAM under 32-bit WinXP is pretty impressive. I expected to see less than 2GB, tbh, especially with the GTX 480 having 1.5GB on its own. 4GB of RAM on the 780i with a GTX 295's 1792MB (shared across 2 GPUs) manages a mere 2.5GB actually seen by the system. Two factors here: 6GB on X58 tri-channel, with a full 1.5GB of graphics RAM on a single GPU.
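For what it's worth, the usual back-of-the-envelope arithmetic behind "missing" RAM on a 32-bit OS can be sketched like this. The reservation sizes below are illustrative assumptions, not measurements from either board in this thread; the exact amount reserved below 4GB depends on the chipset and installed devices:

```python
GiB = 1024 ** 3
MiB = 1024 ** 2

# Total physical address space a plain 32-bit OS can use
address_space = 4 * GiB

# Illustrative device reservations below the 4 GB line (assumed values):
reservations = {
    "GPU BAR aperture": 256 * MiB,        # only a window of VRAM is mapped, not the full 1.5 GB
    "chipset / PCI / APIC MMIO": 256 * MiB,
}

# RAM overlapped by these apertures is invisible without PAE remapping
visible_ram = address_space - sum(reservations.values())
print(f"RAM visible to a 32-bit OS: {visible_ram / GiB:.2f} GiB")
```

Under these assumed reservations the figure lands at 3.50 GiB, which is roughly the 3.5GB figure above; the GTX 295 system seeing only 2.5GB would simply imply larger apertures on that board.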

Kin
 

Seaside

New member
AmiBayer
Joined
Jan 5, 2010
Posts
1,296
Country
Greece
Region
Athens
Kin, I changed 11 VGAs in 2009, all in SLI configurations.

Once I overclocked my first 260, I had 24-26 FPS at 1920x1200 in Warhead. Not bad for a 190-euro VGA, but not good enough for smooth gaming either.

I went from 260s to 275s, then to a 285 (1024MB), and later to an EVGA 285 (2048MB) 8) FTW. Crysis is a very good-looking game, but it's built for only one purpose:

To make you buy a new VGA.

My previous setup with 12GB @ 1600 (7-7-7-20), along with an Intel i7-965 Extreme and the 285s on a 30-inch TFT, running Warhead @ 2560x1600 with 16xAA and everything else on, did not allow frame rates above 59 to 60. This is s.....t.

Believe it or not, I'm telling the truth. I will not mention the 295s further because they are useless at high resolutions for games. While it has 896MB per core, games like Warhead and GTA IV (with full view distance) eat 1108MB to 1254MB of video memory, which makes the 295 slow because of microstuttering.

295s do well in benchmarks and overclocking, but in games they stink.

So with this setup and with 3x Intel SSDs (RAID 0), Crysis loading time was 20 to 22 seconds. The good point with the 285s was that they have plenty of memory, allowing all the textures to be fully loaded.

As for the 480s, it was a HUGE BANG according to NVIDIA, but in the end it was not so huge. Enthusiast users and extreme gamers, plus a big part of the overclocking community, expected more from the new VGAs. Personally, at least for now, I think it is a waste of money if you already own serious DX10 VGAs.

With 12 or 24 gigs and 2x 285s, games run just fine. No need to spend 600-650 dollars or 550 euros for the "promising" 480.

The above is just my point of view.
 

Kin Hell

Active member
Banned
Joined
Nov 25, 2007
Posts
6,970
Country
U.K.
Region
Cornwall
Seaside said:
"...running Warhead @ 2560x1600 with 16xAA and everything else on did not allow frame rates above 59 to 60... As for the 480s, it was a HUGE BANG according to NVIDIA, but in the end it was not so huge... No need to spend 600-650 dollars or 550 euros for the 'promising' 480."

Probably because you had VSYNC enabled in the game & your poor TFT can only manage 60Hz at that resolution, meaning all you will see is 60 FPS at best.


Probably the rumours that ran for so long before it appeared. SLI has never scaled well on nVidia - rather poor considering they bought out 3dfx (Voodoo) all those years ago, & we know Voodoo SLI was awesome for scaling: definitely in the 95% FPS-increase region for throwing a second card in.
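The "95% increase" figure above is just multi-GPU scaling efficiency; a quick sketch of the arithmetic, with made-up FPS numbers for illustration:

```python
def sli_scaling_efficiency(fps_one_card: float, fps_two_cards: float) -> float:
    """FPS gained from the second card, as a percentage of the
    single-card FPS (100% would be a perfect doubling)."""
    return (fps_two_cards / fps_one_card - 1.0) * 100.0

# Hypothetical example: 40 FPS on one card, 78 FPS with a second in SLI
print(f"{sli_scaling_efficiency(40, 78):.0f}% scaling")
```

By this measure, going from a hypothetical 40 FPS to 78 FPS is 95% scaling, the sort of near-doubling being claimed for the old Voodoo cards.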


However, X58 has brought SLI scaling much further forward, yet ATI scales much better in X-Fire than nVidia does in SLI. One of the reasons is the phenomenal bandwidth on the memory alone, but jfhc @ the bandwidth of the i7's L1 cache - 112GB/sec is mental. Once under water, there should be a pile more to come.

Kin
 

Seaside

New member
AmiBayer
Joined
Jan 5, 2010
Posts
1,296
Country
Greece
Region
Athens
Kin Hell said:
"Probably because you had VSYNC enabled in the game & your poor TFT can only manage 60Hz at that resolution, meaning all you will see is 60 FPS at best."

I have 3 monitors.

2x 26-inch LG W2600HP with S-IPS panels, and one 30-inch LG W3000H-BN, again with an S-IPS panel and 115% colour gamut.

All of these are 5ms panels with NO GHOSTING AND NO LAG.

Read the reviews at prad.de. They are by far the best monitors for games, even better than Dell.

My point was that the game, specifically Crysis, is either badly programmed, or runs that way on purpose to make you buy a new VGA.

For your information, in Call of Duty 4 & 5 (World at War), FPS was more than 120.
 

Justin

Active member
Joined
Sep 14, 2008
Posts
12,176
Country
United Kingdom
Region
UK
Crysis is as buggy as hell (good game) but really sloppily programmed. Give the code to an old Amiga developer to get it streamlined and the bugs taken out :)

Seriously, if you look at the game and what it can do maxed out, you realise that, coded properly, it would run well on a machine much less powerful than it seems to need.

Also, there are a lot of "special" "fixed" versions of this game floating about where the cracks aren't all they're cracked up to be, so dump the download and go and buy the retail version if you don't have a legit copy.

rofl, this is from the official Crysis website

Minimum Requirements

CPU: Intel Pentium 4 2.8 GHz (3.2 GHz for Vista), Intel Core 2.0 GHz (2.2
GHz for Vista), AMD Athlon 2800+ (3200+ for Vista) or better
RAM: 1GB (1.5GB on Windows Vista)
Video Card: NVIDIA GeForce 6800 GT, ATI Radeon 9800 Pro (Radeon X800 Pro for Vista) or better
VRAM: 256MB of Graphics Memory
Storage: 12GB
Sound Card: DirectX 9.0c Compatible
ODD: DVD-ROM
OS: Microsoft Windows XP or Vista
DirectX: DX9.0c or DX10


Recommended Requirements

CPU: Core 2 Duo/Athlon X2 or better
RAM: 1.5GB
Video Card: NVIDIA 7800 Series, ATI Radeon 1800 Series or better
VRAM: 512MB of Graphics Memory
Storage: 12GB
Sound Card: DirectX 9.0c Compatible
ODD: DVD-ROM
OS: Microsoft Windows XP or Vista
DirectX: DX9.0c or DX10
 

Seaside

New member
AmiBayer
Joined
Jan 5, 2010
Posts
1,296
Country
Greece
Region
Athens
Justin said:
"...there are a lot of 'special' 'fixed' versions of this game floating about where the cracks aren't all they're cracked up to be, so dump the download and go and buy the retail version if you don't have a legit copy."

My experience, since the game is sold here (Greece) for 10 euros, is only with the authentic disc.
 

tokyoracer

(Not actually from Tokyo).
Blogger
Joined
Jun 5, 2008
Posts
4,907
Country
United Kingdom
Region
Norfolk
Crysis is just a benchmark for your hardware. I've actually spoken to many people who said they could run Crysis, and they were lost when I asked them what Crysis was like to play. :roll:

Graphics aren't everything to me.
 

Seaside

New member
AmiBayer
Joined
Jan 5, 2010
Posts
1,296
Country
Greece
Region
Athens
@JuvUK, those specs are fake. I mean totally. I would like to see the dual-core Fermi when the new CryEngine is released with Crysis 2.

With one 8800 GTX Ultra, which was the best G80 card, at 1600x1200 with everything on and AA at 8x, Crysis just crawls. When I had that card I benched 9 to 12 FPS with one card and 21 to 23 with two cards. But then I had the 940 Phenom II (clocked to 3.6, stable), not the i7.
 

Justin

Active member
Joined
Sep 14, 2008
Posts
12,176
Country
United Kingdom
Region
UK
They're not fake; those are from the official Crysis website. They're wrong, and I think the developers should be shot for listing these specs, but they aren't fake, m8.
 