IL-2 Sturmovik: The famous combat flight simulator.
#1
To directly answer your questions: it was once stated (by Oleg or Ilya in some interview) that SoW:BoB will support dual-core processors, so we know it will do that for certain. They may have actually coded in support for quads since that interview, but who knows. Secondly, it was directly stated that they will NOT support the PhysX hardware, but again, that was at least six months ago (probably longer) and could have changed since it was originally written. As was already stated, PhysX has made some strides since it was first introduced, though real-world applications are still showing less than stellar performance gains.
TB
#2
Thanks TB,
It would be nice to know whether quad cores were coded in or not, especially for those gamers among us looking to build new systems. IIRC, of the two new Nvidia cards, only the uber GTX 280 will have the PhysX implementation. Maybe Oleg can address these issues in his next update? Flyby out
__________________
the warrior creed: crap happens to the other guy!
#3
I imagine that once you get over the hurdle of making something dual-core capable, moving things around to four cores isn't that big of a deal, as you already have the framework in place. I'm saying this without any real programming knowledge... just some general reading on the subject.
What they should say is multicore programming or multithreaded programming. Once you get past doing everything on one core, I would think you could scale upwards, as long as you had threads the work could be broken out into that still remain synchronized. A rough sketch of what I mean is below.
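To make that concrete, here is a minimal sketch (my own illustration, nothing to do with SoW's actual code, and every name in it is made up): once the per-frame physics work is split into independent chunks, the same code runs on two, four, or more cores without changes.

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Hypothetical example: split a physics update into independent chunks,
// one worker thread per available core. Scaling from dual to quad core
// is automatic because the chunk count follows hardware_concurrency().
void updateChunk(std::vector<float>& positions,
                 const std::vector<float>& velocities,
                 std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i)
        positions[i] += velocities[i] * dt; // each chunk touches disjoint data
}

void updateAll(std::vector<float>& positions,
               const std::vector<float>& velocities, float dt) {
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = positions.size() / cores + 1;

    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end = std::min(positions.size(), begin + chunk);
        if (begin < end)
            workers.emplace_back(updateChunk, std::ref(positions),
                                 std::cref(velocities), begin, end, dt);
    }
    for (auto& w : workers) w.join(); // the frame waits until every chunk is done
}

int main() {
    std::vector<float> pos(100000, 0.0f), vel(100000, 1.0f);
    updateAll(pos, vel, 1.0f / 60.0f);
    return 0;
}
```

The join at the end is the "remain synchronized" part: nothing advances to the next frame until every thread has finished its chunk.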
__________________
Find my missions and much more at Mission4Today.com
#4
Both cards are at the moment basically used for eye candy only, and I think it would be pretty hard to make them worth much more. Consider that not everybody will buy this hardware: if you dedicate parts of the physics to these systems, you still have to make the physics behave the same on single-core and multicore PCs, and on both with or without a physics coprocessor.
As tempting as it may sound, I doubt it will be more than providing additional eye candy.
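For what it's worth, here is a hypothetical sketch (my own, not anything confirmed about SoW) of the split that would keep physics identical everywhere: gameplay-affecting simulation stays on the CPU with a fixed timestep, and a coprocessor only ever handles cosmetic effects that never feed back into the game.

```cpp
#include <vector>

// Hypothetical names throughout - just illustrating the design point above.
struct Plane { float x, y, z, vx, vy, vz; };

// Authoritative simulation: fixed timestep and fixed iteration order, so a
// single-core PC, a quad core, and a PC with a PPU all compute the same result.
void simulateGameplay(std::vector<Plane>& planes, float dt) {
    for (Plane& p : planes) {
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
    }
}

// Cosmetic-only effects (debris, smoke). A coprocessor may take this work if
// present, a cheap CPU path runs otherwise; gameplay never reads the results.
void simulateEffects(bool hasCoprocessor) {
    if (hasCoprocessor) { /* dispatch to PPU/GPU */ }
    else                { /* reduced CPU fallback */ }
}

void frame(std::vector<Plane>& planes, bool hasCoprocessor) {
    const float kFixedDt = 1.0f / 120.0f; // fixed step = identical results everywhere
    simulateGameplay(planes, kFixedDt);
    simulateEffects(hasCoprocessor);
}

int main() {
    std::vector<Plane> planes(32);
    frame(planes, /*hasCoprocessor=*/false);
    return 0;
}
```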
#5
I thought Nvidia was implementing PhysX through its drivers somehow. Maybe I misunderstood what I read (somewhere).
__________________
the warrior creed: crap happens to the other guy!
#6
I doubt that - at least not as the PhysX chip. The statements were merely plans for how it could be implemented, but on the technical side, the interface between the card and the mainboard is still a bottleneck, especially at higher resolutions with 16x AF and 16x FSAA.
Considering the bandwidth and functionality of this chip, I doubt a shared interface will be the solution. On the other hand, parts of the former PhysX chip may end up implemented on the graphics card to support graphics further. It's basically the same thing as the step from pure 2D cards that were aided with 3D acceleration by the 3dfx Voodoo chipset: everyone cried out about how completely useless it was, and a short while later everybody implemented the functionality on their own chips in one way or another.
#7
All Nvidia 8xxx cards have the ability to run the PhysX software. The new ATI boards about to come out will also be able to run PhysX, because ATI went into partnership with Nvidia, as long as they use the CUDA programming model.
#8
I thought ATI went with Intel and Havok... No?
#9
Well, just to comment further about PhysX, if you will...
I would really like to see the damage models improve dramatically, not only from taking damage from weapons but from possibly "bending" your plane as well. Being hit while under a strong G load could cause a catastrophic failure, and if the game models more complex systems like O2 and fuel management, those could fail on you too. Now I know that doesn't really have a lot to do with PhysX, but it would be nice to see this sim take it to the next level.
#10
Don't forget each shader is a programmable core. The GTX 280 has 240 of them. That is some serious parallel power, and they can be programmed using C and C++ (see the sketch below).
I suspect that is why Intel is making a lot of noise lately about how GPUs will never be good for everyday computing. But I reckon they're $hit'n their pants at the prospect that GPGPUs (general-purpose GPUs) are becoming more than just a fleeting hobby now.
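As a rough illustration of that programming model (hypothetical code, not taken from PhysX or any real product), here is a minimal CUDA kernel that steps about a million particle positions at once, one GPU thread per particle:

```cuda
#include <cuda_runtime.h>

// Hypothetical example of the data-parallel style those shader cores run:
// every particle gets its own thread, and thousands execute concurrently.
__global__ void integrate(float3* pos, const float3* vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        pos[i].x += vel[i].x * dt;
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}

int main() {
    const int n = 1 << 20; // about a million particles
    float3 *pos, *vel;
    cudaMalloc(&pos, n * sizeof(float3));
    cudaMalloc(&vel, n * sizeof(float3));
    cudaMemset(pos, 0, n * sizeof(float3));
    cudaMemset(vel, 0, n * sizeof(float3));

    int threads = 256;
    int blocks = (n + threads - 1) / threads; // enough blocks to cover n
    integrate<<<blocks, threads>>>(pos, vel, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();                  // wait for the GPU to finish

    cudaFree(pos);
    cudaFree(vel);
    return 0;
}
```

The same source compiles for any CUDA-capable card; the runtime spreads the blocks across however many cores the card happens to have, which is why cards with very different shader counts can run identical code.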