Official GTX 470/480 circle-jerk thread

The Fermi stock is pushed back to April 12-15... FFFFFFFFUUUUUUUUUUUUUUUUUUUUUUUUUUUUUU. There's one GTX 480 for sale on ebay for $680 from an international seller lulz. Almost certainly a scam.

The fan recording on HardOCP kinda reminded me of this video btw. It's from a review of 120mm fans. These (Delta) fans are famous for optimizing airflow (the left readout, in CFM - cubic feet per minute) and static pressure while completely disregarding noise. They're really deep (about twice the depth of normal PC fans) to keep static pressure high. I personally don't know who could stand them for longer than a benchmark, but there must be a market out there... Btw the right readout is voltage - the nominal voltage for PC fans is 12v. This specific fan spins at 3000 RPM at 12v, I believe. About 7v is where the madness really starts to kick in. Enjoy:
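If you want a rough feel for how voltage trades off against airflow and pressure, the standard fan affinity laws give a back-of-the-envelope estimate. This is my own sketch, not figures from the video, and it assumes RPM scales linearly with voltage, which is only approximately true for DC fans:

```python
# Fan affinity laws (approximation): airflow scales ~linearly with RPM,
# static pressure scales ~RPM^2. RPM on a DC fan is assumed ~linear in voltage.
def fan_estimate(rpm_nominal, volts, v_nominal=12.0):
    rpm = rpm_nominal * volts / v_nominal        # crude linear voltage model
    airflow_ratio = rpm / rpm_nominal            # CFM ~ RPM
    pressure_ratio = (rpm / rpm_nominal) ** 2    # static pressure ~ RPM^2
    return rpm, airflow_ratio, pressure_ratio

rpm, flow, press = fan_estimate(3000, 7.0)
print(rpm, round(flow, 2), round(press, 2))  # 1750.0 0.58 0.34
```

So even at 7v that fan would still be spinning around 1750 RPM with roughly a third of its rated static pressure - which squares with the madness starting well below 12v.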

http://www.youtube.com/watch?v=IgHKP_mBZJg
 
How do these compare to the 5xxx series ATI? I'll keep my twin Maxcore 260s for now and save the cash for a new build way down the road. These cards do look badass, although I don't see that much difference between them and a 295 other than a small bump in speed and video memory. One of these or three BFG Maxcore 216 OC2s? That would be 648 cores vs 480.
 
Testing results and comments from Chris Morley at Maingear:

[attached image: img0029lrd.jpg]


http://www.maingearforums.com/entry.php?23-So-You-Want-To-Buy-A-GeForce

http://www.maingearforums.com/entry.php?24-So-You-Want-To-Buy-A-GeForce-Part-2
 
If they RMA that board of mine, I'm just gonna get another 260, much cheaper.
 
Honestly, I don't even know why people spend so much dough on GPUs. I have an HD4870 that runs everything fine (got it from sixer on the cheap). And I never notice any difference between 4xAA and 8xAA. Same with AF and other filters. I can rarely tell the difference between "medium-high" settings and "high" settings. I'll save the $400 and deal with a tiny jaggy in the background that I will never even notice. Got better things to spend my money on. Like beer.
 
Corky, it's some people's "thing" you know. Some people spend lots on partying, some people on girls, some people on cars, TVs, speakers, vacations. Some choose to spend on computers :) .

It's like asking "why would people spend so much money to go to a $50/plate restaurant when they can go to Denny's." It's a connoisseur thing.

Edit: BTW I think even nVidia recommends that the GTX 480 not be stacked for tri-SLI like in Maingear's pictures above. The cards are just too hot, so a slot of distance between them is highly recommended for stability and longevity...

As far as how they compare to the 5xxx series, check this out:
http://www.hardwarecanucks.com/foru...iews/30297-nvidia-geforce-gtx-480-review.html

Pretty thorough review of the GTX 480 and 470.
 
Corky, it's some people's "thing" you know. Some people spend lots on partying, some people on girls, some people on cars, TVs, speakers, vacations. Some choose to spend on computers :) .

It's like asking "why would people spend so much money to go to a $50/plate restaurant when they can go to Denny's." It's a connoisseur thing.

Oh, I totally understand...I am a moderator at a major performance computing forum, and own 5 decent PCs myself. I see why people get into it for different reasons, but I still don't see how some people can truly justify a $1000-1500 graphics setup that gets 150fps on a monitor that can only display 60 of those. Meanwhile my $75 hand-me-down gets me playable frames in every game I have thrown at it.
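To put numbers on the 150fps point, here's the simple arithmetic, assuming vsync on a standard 60Hz panel (with vsync off you get tearing instead, but still no more than 60 full refreshes a second):

```python
# A 60Hz panel refreshes 60 times per second, so that's the ceiling on
# distinct frames it can show, no matter how fast the GPU renders.
def frames_shown(render_fps, refresh_hz=60):
    return min(render_fps, refresh_hz)

print(frames_shown(150))  # 60 -- the other 90 frames/s are never displayed whole
print(frames_shown(45))   # 45 -- below the refresh rate, every frame gets shown
```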

Different strokes I guess. I respect it....but don't always understand it. :p
 
LULZ this has got to be THE most ghetto packaging I've seen for a $500 video card:

[attached image: DSCF8512.jpg]


I guess this is some Japanese distributor?!
 
I never used to notice aliasing, but since I started using 4xAA, I don't just "notice" it when it's missing - I'd rather not play the game.

It's like buying a bigger monitor. You put up with a smaller one fine, but once you get used to something better, it's very hard to go back.

Also, even the GTX 480 still doesn't deliver 60fps+ in certain extreme conditions (Metro 2033 maxed out, Crysis, STALKER, etc.), never mind some lesser card.

I know when mine gets here on Tuesday, I'll probably play some Gothic III first and then probably get started on Stalker: Clear Sky. Should be much fun ;-) .
 
I just got a kick out of this article and wanted to share:

Nvidia downgrades Tesla again

About half the performance per watt promised

by Charlie Demerjian

May 5, 2010
IT LOOKS LIKE Nvidia is being its normal honest self with respect to the company's high end Tesla compute cards. Yes, the specs on them dropped again, precipitously, and that is from the already castr^h^h^h^h^h downgraded specs released last fall.
If you recall, a year ago, Nvidia was telling people that Fermi would come out in October of 2009 at 1500MHz, have 512 shaders, and only take about 175W. At its triumphant launch during not-Nvision 2009, those specs crept down a bit, finally finishing off at 1.25GHz-1.40GHz clock, 448 shaders, 1.8GHz-2.0GHz memory clock and only sipping a mere 225W. The ship date slipped from 2009 to Q1 of 2010, then Q2, and if Nvidia liked you, and you were a financial analyst covering its stock, Q3 for anything resembling real quantities.
Customers were not bothered by this change, they took it in stride. Everything was going well, just ask Nvidia. No problems. Can't make chips? Feh, the architecture is fine on paper. Less than 20 percent yields? Not a problem, just obfuscate when asked about it, and telling the truth seems to be punishable at Nvidia.
Step forward to the 'release' Tesla cards, the C2050 / C2070, as seen by the spec sheet here. Remember the spec sheet that was here, but now has a page not found for some reason? Odd. The link may be dead, but the documents are pictured here, and we have saved copies of both that document titled BD-04983-001_v01.pdf and the datasheet titled NV_DS_Tesla_C2050_C2070_Final_lowres.pdf as created on 11/11/2009 and dated "NOV09". They differ a bit from the more modern ones.
[image: NV_Fermi_castration_PDF.png]

November stats, from Nvidia's PDF
The new specs are a significant reduction, unless you are wondering about power, then they are higher. If you are an OEM, they are higher still, but let's not quibble about levels of dishonesty, I mean, if the SEC doesn't care about what Nvidia is telling the analysts and the investing public, why should mere journalists bother to hold it to its statements? It doesn't give analysts 30" monitors.
[image: C2050_C2070_April_specs.png]

The new stats, slightly reduced for quick sales
You can find all the new literature here, but the one you want is specifically the C2050 / C2070 data sheets from "APR10", here. Those 'puppies' only run at 1.15GHz, have 515 GFLOPS DP FP performance, and 1.03 TFLOPS SP FP performance. Memory is at 1.5GHz, and power consumption is now 247W. Here's an analysis of the slippage.
[image: Tesla_numbers_april.png]

Last year, November, and now.
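As a sanity check on the datasheet figures quoted above: they're consistent with Fermi's usual arithmetic rates, assuming one FMA (counted as 2 flops) per core per clock in single precision and half that rate in double precision. A quick sketch:

```python
# Tesla C2050/C2070 "APR10" specs: 448 cores at 1.15GHz.
cores = 448
clock_ghz = 1.15

sp_gflops = cores * clock_ghz * 2  # 1 FMA = 2 flops per core per clock (SP)
dp_gflops = sp_gflops / 2          # DP runs at half the SP rate on Fermi

print(round(sp_gflops, 1), round(dp_gflops, 1))  # 1030.4 515.2
```

That lines up with the 1.03 TFLOPS SP and 515 GFLOPS DP in the data sheet.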
What is there to say? The Fermi based compute cards are already a running joke, delivering only 68 percent of the promised performance, 88 percent of the cores at 77 percent of the intended clock speed, for 141 percent of the power. That turns out to be slightly less than half the promised performance per watt, the overridingly critical measure in the compute space.
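Those percentages check out if you run the arithmetic from the promised specs (512 cores at 1500MHz, 175W) against what shipped (448 cores at 1.15GHz, 247W); the performance ratio comes out to about 67 percent here versus the article's 68, which is just rounding:

```python
promised = {"cores": 512, "clock_mhz": 1500, "watts": 175}
shipped  = {"cores": 448, "clock_mhz": 1150, "watts": 247}

# Throughput scales with cores x clock; power is taken straight off the specs.
perf_ratio  = (shipped["cores"] / promised["cores"]) * \
              (shipped["clock_mhz"] / promised["clock_mhz"])
power_ratio = shipped["watts"] / promised["watts"]

print(round(perf_ratio, 2))                # 0.67 -- ~2/3 of promised throughput
print(round(power_ratio, 2))               # 1.41 -- 141% of promised power
print(round(perf_ratio / power_ratio, 2))  # 0.48 -- just under half the perf/watt
```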
In the end, Nvidia seems to have delivered about half of what it had promised, less if you consider memory speeds, but it is late, draws lots of power, runs hot and costs far more than hoped per chip. None of these problems are fixable, it is time for a new architecture.
With any luck, Nvidia will get to those favored financial analysts before they realize this. One thing for sure, it needs to get word out before some pesky journalists start raising inconvenient questions about threading, asynchronous transfer capability, and how much CPU time that takes versus what Nvidia promised. If word of that gets out to any analyst who understands the effect this will likely have on sales, things could get mighty awkward for the boys in green.

http://www.semiaccurate.com/2010/05/05/nvidia-downgrades-tesla-again/
 