IL-2 Sturmovik: Cliffs of Dover
Latest instalment in the acclaimed IL-2 Sturmovik series from award-winning developer Maddox Games.
#22
Walshy, please don't get personal and stop trying to make yourself look so important. You can be an expert in whatever you like, but that does not change my conclusion, which was meant to give the original poster information for THIS game! You are right about the problems in IL-2, but nVidia had issues too. Workarounds were normal back then, especially without the regular driver support we now get from the hardware companies; updates came less often, so you ran into issues more often.

I never meant to say that overclocking puts no stress on a card. But the BOM of the ATI card is of quite good quality, and its materials are more expensive than nVidia's. I am not saying that directly translates into better performance, but the card itself is well built. The GPU is effectively "underclocked", as ATI itself admits by revising the new cards and selling them as GHz Editions. Neither you nor I can really say anything about lifetime here.

I don't have a good link at hand, only a magazine, but never mind. Some big magazines have already revised their benchmarking methods and admitted that their initial test results were too "pro" nVidia. The results that came out at the 680's release were somewhat odd, and I only saw one or two websites with reasonable testing. Reasonable, to me, means the following: nVidia uses boost whenever it is needed, so in benchmarks it will boost very often or nearly always. That means the cards cannot be compared directly. Since these are high-end cards for enthusiasts who know how to get everything out of them, the fair comparison is stock-cooled reference cards overclocked to their stable maximum. Only a few websites tested that way. The result, comparing an overclocked 680 (I don't remember the exact clocks, but they were higher than what some of the big magazines reached) against an HD 7970 at 1200/1500, was very interesting: they were close together, but the overall feature package from ATI was better, so ATI took the lead and their award for the fastest card. You can never be sure it is all true, but it was the only good benchmarking at release anywhere on the web.

"Overclocking leads to a shorter lifetime" is not a usable argument, because you simply do not know all the influences and material behaviour. What matters is your warranty; after that period the card can break any day and there is nothing you can do. Manufacturers could just as well build in lifetime-limited parts that force the card to die after a few years, as we know from lamps and mobile phones, which are increasingly designed to fail within a certain timeframe so that new products get sold. In my opinion it is simply dumb NOT to overclock your card if you need the performance. Or do you plan to use a card for more than four years? Perhaps your next suggestion will be to undervolt every component to preserve equipment lifetime.

By the way, my HD 7970 does not go above 45 degrees when overclocked. Critical, lifespan-reducing component temperatures are simply not reached by either nVidia or ATI; you will see graphical problems and instability long before you exceed the component specs.

Don't take this so seriously; there is no need for insults. It is just my opinion, and please don't read between the lines to strengthen your diss. Keep it cool. And to shoot back a bit: if you worked in my company and I saw that list of your jobs, I would fire you. That is an unprofessional attitude, and your jobs have nothing to do with graphics cards, or at least you shouldn't present them as private opinion in a gaming forum if you are not officially writing for your company. BTW, I am an asdfjkl engineer at rijfndk ltd. and mine is the longest!

Last edited by Stublerone; 08-16-2012 at 09:46 AM.
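The fair-comparison idea above (both cards at their stable maximum overclock) comes down to simple arithmetic on average FPS. Purely as an illustration, with invented numbers rather than figures from any of the reviews mentioned, here is a minimal Python sketch of that percent-difference calculation:

```python
# Hypothetical sketch: compare two cards from per-run FPS measurements.
# All numbers below are invented for illustration, not real benchmark data.

from statistics import mean

def percent_difference(fps_a: float, fps_b: float) -> float:
    """Relative advantage of card A over card B, in percent."""
    return (fps_a - fps_b) / fps_b * 100.0

# Average FPS per benchmark run at each card's stable maximum overclock.
gtx_680_oc_runs = [61.2, 59.8, 60.5]   # assumed values
hd_7970_oc_runs = [58.9, 60.1, 59.4]   # assumed values

avg_680 = mean(gtx_680_oc_runs)
avg_7970 = mean(hd_7970_oc_runs)

print(f"GTX 680 OC average:  {avg_680:.1f} fps")
print(f"HD 7970 OC average:  {avg_7970:.1f} fps")
print(f"680 vs 7970: {percent_difference(avg_680, avg_7970):+.1f} %")
```

With differences this small, run-to-run variance matters as much as the cards themselves, which is why averaging several runs per card is worth the effort.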
#23
But you are right, both cards do a good job. As for your VRAM statement: tell me your trick. The more VRAM you have, the more of it can potentially be used, which could also be why our reported loads vary.
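For anyone who wants to see how much VRAM CloD actually takes on their own settings, one rough approach on an nVidia card is to poll nvidia-smi while flying. This is only a sketch: it assumes a single GPU with nvidia-smi on the PATH, and AMD cards need a different monitoring tool.

```python
# Rough sketch: log GPU memory usage once per second via nvidia-smi.
# Assumes an nVidia card with nvidia-smi on the PATH; only the first GPU
# is reported. Stop with Ctrl+C.

import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

while True:
    out = subprocess.check_output(QUERY, text=True)
    first_gpu = out.strip().splitlines()[0]               # one line per GPU
    used_mb, total_mb = (int(x) for x in first_gpu.split(","))
    print(f"VRAM: {used_mb} / {total_mb} MiB")
    time.sleep(1)
```

Logging this while flying over London with different texture settings is a quick way to see whether 2 GB is actually the limiting factor on your own system.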
#24
I actually have a GTX 680 and I think this card works much faster than a 7970 for CoD. For Rise of Flight it is the other way around!
#25
It is as simple as that. They are both nearly the same, with the small but decisive advantage that the 7970 has a sufficient amount of VRAM. If we were talking about another version of the 680 this point would be solved, but we are currently talking about the 680 with 2 GB of VRAM. The game could also mitigate the problems with high VRAM load by itself, which would make this discussion unnecessary, but in its current state it is simply as described. So no, the 680 is NOT performing way better. That is simply not true...
#26
This is also brand independent, as I run both ATi and nVidia cards... Even with engine fixes, in one or two years this issue will be back! Get VRAM, 2.5 GB or more!

Just one piece of general advice: set a limit on the money you want to spend and get the best card overall for that money. There are way too many fanboys and pseudo-experts involved here; your best bet is to read general reviews on tech sites. Both brands are really very much equal, typically within +/- 10%. It's not worth the fuss, really.
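To make the budget rule concrete: once you have average FPS numbers from a couple of reviews you trust, "best card for the money" is just a price-per-frame comparison. A tiny sketch with invented prices and FPS figures (not real review data):

```python
# Hypothetical sketch: pick the best card under a budget by average FPS,
# using FPS per unit of currency as the value measure. All numbers invented.

cards = [
    # (name, price in EUR, average FPS across the reviews you trust)
    ("GTX 680 2GB", 450, 60.0),
    ("HD 7970 3GB", 420, 59.0),
    ("Cheaper card", 250, 42.0),
]

budget = 460

affordable = [c for c in cards if c[1] <= budget]
fastest = max(affordable, key=lambda c: c[2])             # fastest within budget
best_value = max(affordable, key=lambda c: c[2] / c[1])   # most FPS per EUR

print("Fastest within budget:", fastest[0])
print("Best FPS per EUR:     ", best_value[0])
```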
#27
S!
Agree with Madfish here. I am by no means an expert with computers, but I can hold my own when configuring a gaming rig, so I use a simple phrase for friends who want to buy a computer: if the word "gaming" is on the requirement list, forget cheap.

Just saw the latest nVidia video promoting Borderlands 2 with PhysX. I must admit they know their stuff when it comes to marketing, but I saw some glaring things which show the video was just that... a marketing video oozing with gimmicks. The devil is in the details, and anyone with a keen eye can see them.
#28
+1, I am also in the same position as you both.
If my friends, some of my clan members or other experienced JG friends ask me about CloD, we will say the same. And that was the question: HD 7970 or 680, which is BEST in CloD? Both are nearly equally fast, but in summary we definitely recommend the HD 7970 for more flexibility and fewer problems in nearly ALL conditions. Everything is easily possible with it, whereas there are situations where you simply cannot run a 680, or people just get frustrated (example: triple monitors). Thanks to the two of you for facing the same direction as me and other sim gamers who have found the strengths and weak points of these cards and can sum them up effectively.
#29
S!
And you have to remember that most reviews use only FPS games and a few RTS titles; they very seldom include any flight sim or similar. AAA titles lean heavily towards one brand or the other because of their popular nature, and that will never change. Therefore I test both brands on the games I actually play to get a picture of the performance, rather than blindly believing the reviews. So far neither brand has disappointed me. I am currently in the AMD camp, and the next-generation card will be nV or AMD depending on how they perform at the time of their release. So no real preference here, nor any foam-mouthed fanboi mentality.
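Testing on the games you actually play, as described above, only needs a frame-time log and a little averaging. A small sketch, assuming a plain text file with one frame time in milliseconds per line; the file name and layout are assumptions, and real loggers such as FRAPS use their own formats and would need adapting.

```python
# Sketch: turn a log of per-frame times (milliseconds, one value per line)
# into min / average / max FPS. File name and format are assumptions.

def load_frame_times_ms(path: str) -> list[float]:
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def summarize(frame_times_ms: list[float]) -> None:
    fps = [1000.0 / ms for ms in frame_times_ms if ms > 0]
    avg = sum(fps) / len(fps)
    print(f"min {min(fps):.1f} / avg {avg:.1f} / max {max(fps):.1f} fps")

if __name__ == "__main__":
    summarize(load_frame_times_ms("clod_frametimes.txt"))
```

Running the same flight on both cards and comparing the minimum FPS, not just the average, tends to show where the real difference lies in a sim like CloD.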
#30
I am very brand-sceptic; what it boils down to is which brand is on top in the tech reviews. In pure speed terms the AMD cards have pushed slightly ahead, but only through being able to flash the BIOS for that extra oomph; as for which is the better card all round, that goes to nVidia at the moment. In terms of who is using the newer chip design, sorry Stublerone, it's nVidia: the chip AMD uses for its card is really long in the tooth, actually getting on for five years under the Evergreen codename, and that is old tech. Sorry, but AMD lost the plot with its processors and its cards, since it kept pumping out variations of its AMD64 processor for far too long after that tech should have been relegated to the dustbin! Sorry, but for all your AMD PR boosting, nVidia is actually using the newer tech.
Last edited by Walshy; 08-17-2012 at 11:16 AM.