
G80 and R600, what do we know?



If the 2900XT is going to cost $400, then maybe the 8800GTS will get cheaper... one can hope.

 

It's almost guaranteed that we'll see a decent price drop from nvidia; competing against a card that is 1000 NOK cheaper is not a good plan for them.

8499277[/snapback]

 

Yep, the question is whether the price cut comes on 2 May, when ATI is expected to officially launch the 2900XT, or on 14 May, when it is expected to be available in stores. I'm hoping for the former, since I'll be buying a new PC with an 8800GTS fairly soon.


I wonder whether this is an AMD show rather than an ATI show these days.

 

There's a lot of speculation about the XTX at the moment, and personally I hope it never launches; it would be better for them to base a new round of cards on 65nm technology.

 

What I'm hoping for most is that the top-end cards get a new round of competition, which in turn kicks off a race among the mid-range cards, instead of the 1950 Pro looking like the most attractive card because you can get it down to around 1100 NOK.

 

 

Because even though the 8800 has delivered raw performance, it has never been very tempting, since by today's standards you don't need it unless you also have a seriously good monitor.

 

Either DX10 has to show up with something that needs new hardware, or they have to offer new hardware that performs better per krone. Anyone who has been into hardware knows that buying for the future makes no sense, since that same hardware will cost considerably less tomorrow.

Hehe, I have 2x 7950 GX2 in SLI and I'm sitting at 20-30 FPS in Rainbow Six Vegas on medium graphics xD

8502739[/snapback]

You'd only be using one of the cores in that game, though (i.e. the same as a 7900GT).

But as mentioned, it's ported straight over from the Xbox 360, so you'll always get terrible performance. 50FPS at 1600x1200 with an 8800GTX isn't exactly great either.

You really don't need a high-end monitor to buy an 8800GTX.

I have a 19" LCD (1280x1024), and there are several games where I don't even get 100FPS (full AA+AF).

8502723[/snapback]

 

Hehe, well, it was getting late last night and maybe I put it a bit too strongly, yes. But with a monitor that refreshes 60 times per second, it hardly helps that the graphics card can produce that same update 100 times.

 

And I know I was throwing stones in a glass house there, since I've spent a fair bit of money on such components myself.

 

And not least, here on the forum we're talking about enthusiasts, so the needs are somewhat different from those of the average user.

Hehe, well, it was getting late last night and maybe I put it a bit too strongly, yes. But with a monitor that refreshes 60 times per second, it hardly helps that the graphics card can produce that same update 100 times.

You do still get a faster update, but at the cost of screen tearing (several frame updates per screen refresh). And with vsync on, a faster card reduces the risk of the rendering dipping below 60fps, which would halve the frame rate to 30fps.
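
(A small aside on the 60Hz point, with a toy model. With vsync and plain double buffering, a frame that misses a refresh deadline waits for the next one, so the displayed rate snaps down to 60/n fps. This is an illustrative Python sketch of that quantization, not anything from the thread; real drivers with triple buffering behave differently.)

import math

# Toy model of vsync + double buffering on a 60Hz panel: each frame is
# shown for a whole number of refresh intervals, so the displayed rate
# snaps to 60/n fps (60, 30, 20, 15, ...). Illustrative numbers only.
REFRESH_HZ = 60

def displayed_fps(render_fps):
    """Effective frame rate when every frame must wait for a refresh tick."""
    refresh_interval = 1.0 / REFRESH_HZ
    intervals = math.ceil((1.0 / render_fps) / refresh_interval)
    return REFRESH_HZ / intervals

for fps in (100, 60, 59, 45, 31, 30):
    print(f"card renders {fps:>3} fps -> screen shows {displayed_fps(fps):.0f} fps")

(So a card that renders 59fps displays only 30fps, which is exactly why the "dips below 60" case hurts so much.)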


Mistapi, can you shed some light on this?

 

Yes, the R600XT can co-issue 5 vectors/scalars, and in some cases R600XT is over 2x faster than 8800 GTX.

 

Test instruction set (length 384, no texture fetch, 100 iterations):

MAD R0.xyz, R0, R0, R1;
MUL R2.x, R2, R3;
MAD R1.x, R1, R1, R3;
MAD R0.xyz, R1, R1, R0;
ADD R2.x, R1, R2;
MUL R3.x, R3, R1;

R600XT (co-issue 3 instructions) - 93.9277 GInstr/sec
8800 GTX - 39.1998 GInstr/sec

 

http://forum.beyond3d.com/showthread.php?t=39173&page=154
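
(For context on the co-issue numbers above: R600's shader ALUs are VLIW-style and can issue several independent scalar/vector operations per cycle, while G80's scalar SPs issue one at a time. Here is a toy greedy scheduler, my own illustrative Python sketch and not ATI's actual hardware logic, showing how issue width and the register dependencies in the test sequence interact. It is conservative: any register overlap with an unfinished earlier instruction blocks issue in the current cycle.)

# (dest, sources) for each instruction in the Beyond3D test sequence.
SEQ = [
    ("R0", ("R0", "R0", "R1")),  # MAD R0.xyz, R0, R0, R1
    ("R2", ("R2", "R3")),        # MUL R2.x,  R2, R3
    ("R1", ("R1", "R1", "R3")),  # MAD R1.x,  R1, R1, R3
    ("R0", ("R1", "R1", "R0")),  # MAD R0.xyz, R1, R1, R0
    ("R2", ("R1", "R2")),        # ADD R2.x,  R1, R2
    ("R3", ("R3", "R1")),        # MUL R3.x,  R3, R1
]

def cycles(width):
    """Issue cycles needed to drain SEQ with a given co-issue width."""
    done = [False] * len(SEQ)
    n_cycles = 0
    while not all(done):
        bundle = []
        for i, (dst, srcs) in enumerate(SEQ):
            if done[i] or len(bundle) == width:
                continue
            # wait for any earlier unfinished instruction touching our registers
            if any(not done[j] and
                   (SEQ[j][0] in srcs or SEQ[j][0] == dst or dst in SEQ[j][1])
                   for j in range(i)):
                continue
            bundle.append(i)
        for i in bundle:
            done[i] = True
        n_cycles += 1
    return n_cycles

for w in (1, 3, 5):
    c = cycles(w)
    print(f"co-issue width {w}: {c} cycles -> {len(SEQ) / c:.2f} instructions/cycle")

(With this particular dependency chain the packing tops out around two instructions per cycle whatever the width, which is one reason the measured ratio depends so heavily on the instruction mix.)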

Mistapi, can you shed some light on this?

[...]

8506252[/snapback]

That's over my head. Better to ask Jawed on the Beyond3D forum.

Hi. I'm wondering whether DailyTech's tests, which show the Radeon HD 2900XTX performing worse than the GeForce 8800GTX, are really accurate. The test comparing the Radeon HD 2900XT against the GeForce 8800GTS shows the HD 2900XT taking a clear lead, so why do they lose with the HD 2900XTX?

 

Regards, LockBreaker :)

8506563[/snapback]

We'll have to wait and see.

 

You want some truth?

 

People who have the cards, are affiliated, and are replying to those DailyTech reports:

 

 

Kombatant: www.kombatant.com AMD/ATI employee, (ex) Beyond3D/Rage3D mod:

 

On a serious note, I still see some stuff out there that has no merit. Patience.

 

They christened the OEM version we've known and loved for quite a few months as an "XTX". That should tell you a lot about their credibility actually.

 

Ok, so before this gets totally out of hand, let me say this, and this will be my final say on the matter until the NDA is lifted: AMD made certain decisions concerning this card. I took a hard look out there, to see what's being leaked, and it seems there is still some stuff that is totally made up - tbh, it smells like a FUD campaign to me, if I take into consideration certain emails that have been flying around lately. Certainly there is some stuff out there that is true, and you will know which is which when the NDA lifts soonish. The journalists that were in Tunis certainly know, and are probably laughing at some of it at this minute.

 

So, to recap: what I've said in the past stands about the card (Sound_Card has been on a mission to put all of my quotes in his sig so that we don't miss anything). Unfortunately I can't reveal more at this point due to the NDA. And for those who are wondering: no, I am not a moderator/staff on Rage3D anymore. I stepped down two months ago, because with the new job I now have it wouldn't be ethical, imo, to continue doing work here.

 

As I said, some info out there is accurate, some is not. Whatever rumours are out there certainly won't force AMD to reveal stuff sooner; that simply doesn't work, whether you're called Intel, AMD, nVIDIA or whatever.

 

 

Bum_JCRules @ THG (under NDA with cards):

 

Total Crap.. well almost:

While I am required to follow the NDA, the stuff up on Daily Tech today is almost worthless. Yes, Anandtech was present in Tunisia (signing non-disclosure agreements, like the Inquirer); why they are posting this stuff is beyond me, because their numbers are off. They must be using only the XP drivers and OS, because the numbers in CF vs the GTX are very different. So until I can officially comment on the architecture and the performance... hold all of this as useless until the rest of the world writes about it.

 

I really would love to comment on this stuff...

 

I understand that DT and Anand are separate, but that is so childish. Derek was there, and his cards got to his place of business before he returned home from Tunisia. That long board they have ... Not what Derek should have gotten in his delivery. That is all I will say before I go too far.

 

 

Kink (under NDA with cards):

 

DailyTech's benchmarks are inaccurate. At least in terms of 3DMark06. (commenting on the HD 2900 XT)

 

 

Metro.cl (under NDA with cards): www.chilehardware.com

 

laughable (whistling emoticon (wasn't me))

 

 

BenchZowner (under NDA and has card):

 

1) These benchies from DailyTech are quite far off from reality, that's all I can say.

 

2) The 2900XTX & the 8800GTX are performing on par in Crysis at the moment (direct info from a developer of the game)

 

3) Is it deja-vu? Remember the Mr Sanders case? The... not-invited-to-the-press-event-at-Ibiza editor of Hardware Analysis who wanted to "punish" ATi by publishing his... numbers of desire for the R580? Hehe, deja-vu.

 

http://i12.tinypic.com/2py29hz.png

 

"Do I really have to express myself here ?

If these guys are so called my colleagues ( as of being reviewers like me ) then I should feel ashamed, really Evil or Very Mad

 

What are we looking at here?

 

a) They used a better test system today, with better drivers (supposedly), and they managed to get the 2900XT to perform worse than in their previous bench session with a worse test system & worse drivers?

How come?

On April 24 they got 84FPS in F.E.A.R. with the 2900XT on a QX6700 and pre-release drivers, and they got 79FPS today with a QX6800 and retail drivers? Oh really? (big grin emoticon)

 

b) Where's the result for the 2900XT in Company Of Heroes today? Why N/A?

 

c) In Half-Life 2: Episode One today they got the 2900XT to score 1FPS more than the 2900XTX.

What could've caused this? A typo? Quite angelic.

Something else? Using a CPU-limited resolution, which would cause both cards to behave like they're the same [see the sketch after this quote]. And then there's the GTX surpassing the R600s by ~40 FPS. Quite the leap over the Radeon X1950XTX in that game. Evil? Heh

 

d) Now, the best part... they scored ~48FPS in Oblivion on the 24th, and now they present us a 54FPS gain from the move from the QX6700 to the QX6800 and the small gain from running the RAM at 1T command rate?

 

e) A reviewer, in order to produce a comparable, valuable & trustworthy review, must use the same testbed; quite the opposite is what they did (if they really went through a performance testing process).

 

f) And for what reason would somebody present unclear results in combination with unclear driver & filter (AF & AA) settings?

***** right boy (big grin emoticon)

 

My two cents (oh wait, I have another one) [I'll save it for later]

 

P.S. The stock core clock for the 2900XTX is currently 800MHz, not 745MHz as they state.

 

P.S.2. That's pretty much all I can say at the moment.

 

P.S.3. Now I have to finish a memory roundup and then pack my stuff for a trip.
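
(On the CPU-limited point in (c) above: a toy model, mine and not BenchZowner's. Per frame the cost is the slower of the CPU work and the GPU work, so at a low, CPU-limited resolution two very different GPUs post identical fps. All numbers below are illustrative.)

# Toy model: per-frame time is the slower of CPU work and GPU work,
# so at CPU-limited settings two very different GPUs score the same.
CPU_MS = 12.0  # per-frame CPU time in ms (illustrative)

def fps(gpu_ms):
    return 1000.0 / max(CPU_MS, gpu_ms)

# GPU time scales roughly with pixel count; 1024x768 is ~0.41x of 1600x1200.
for name, gpu_ms_at_1600x1200 in (("fast card", 14.0), ("faster card", 9.0)):
    for res, scale in (("1024x768", 0.41), ("1600x1200", 1.0)):
        print(f"{name:>11} @ {res:>9}: {fps(gpu_ms_at_1600x1200 * scale):5.1f} fps")

(At 1024x768 both cards print 83.3 fps, CPU-limited and indistinguishable; only at 1600x1200 do they separate.)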

 

 

What did the Crysis lead developer (who has the cards) say about the two (G80 GTX and R600XT)?

 

And remember, Crysis is a DX9 game with an additional DirectX 10 codepath, which will only help in getting better performance (gains) for those with DX10 cards.

 

28th April:

I'm gonna spit out one very last detail (before they put me behind bars for disclosing information)

 

The 2900XTX is 4fps ahead of the 8800GTX (on average) in Crysis (information directly from a developer of the game)

 

The 2900XTX won't be the G80 killer that everybody was expecting, but it'll be in front in most cases, with small to good margins depending on the game and settings.

 

nVIDIA did a great job with the G80 this time, and the performance leap over the previous generation is huge, comparable only to the R300 launch in the past

 

 

That's at STOCK clocks, guys, and it can OC high

 

 

BTW, from Fudzilla, by word of mouth: the GeForce 8800 Ultra gets a score of 14000 in 3DMark06, with no details on the test platform and no driver details:

http://www.fudzilla.com/index....=view&id=678&Itemid=1

 

From DailyTech: the Radeon HD 2900XT gets a score of 14005 in 3DMark06, with full details on the test platform and drivers:

http://www.dailytech.com/Overc...+R600/article7044.htm

 

 

AND NONE of them are with release drivers. Look at the GPU clocks very carefully as well, look for the silicon version, look at the benchmarks used and the detail in each setting (compare), and look at the motherboard used too. And that 8800GTX is nowhere to be found EXCEPT from BFG at $950 as a watercooled card.

PLUS those are EARLY pre-release sample cards, ES samples, if you understand what that means. Do you remember the abysmal performance of the faulty early 8800GTX ES samples with the wrong resistor values at release? What about the driver optimization that took place over months on end to get decent performance?

Joe is on the JEDEC board BTW, and an AMD/ATi employee who helped produce GDDR4 for the X1950XTX and made it a killer product. So now you are telling me that after 2 years spent on it (like nVidia spent 4 years on the G80), and millions of dollars, it simply doesn't beat the X1950 in benchmarks, and with no higher clock on the GDDR4?

BTW, those benchmarks are not remotely correct either. We have hundreds of GTXs spread around our corporation, and my own work system has 2 in SLI at 650/2100 with watercooling. Seriously, some of those benchmarks, for let's say Oblivion, are inflated 100% by h*ll knows what!

 

Hint #200: Memory clocks can go much higher, and the bandwidth (i.e. performance too) is much greater in upcoming titles and at higher res/detail.

 

BTW the owner of Fudzilla, Fuad, is the ex-Graphics Editor of The Inquirer if you didn't already know. Take everything with a brick of salt.

 

 

I'll leave you with a little pre-release bang

You'd only be using one of the cores in that game, though (i.e. the same as a 7900GT). [...]

8504277[/snapback]

 

So I'd only be using 1 out of 4?? STUPID COMPUTERRRR!!!!


So I'd only be using 1 out of 4?? STUPID COMPUTERRRR!!

8506711[/snapback]

Yep. The Xbox 360 doesn't have SLI (it only supports a single GPU), and therefore the PC version doesn't have it either (because the game is pretty much copied straight over from the Xbox 360).

Which in itself results in terrible performance.

