#1
06-18-2008, 02:22 PM
Thunderbolt56
Approved Member
Join Date: Oct 2007
Location: Daytona Beach, FL
Posts: 398

To directly answer your questions: it was once stated (by Oleg or Ilya in some interview) that SoW:BoB will support dual-core processors, so we know it will do that for certain. They may have actually coded in support for quads since that interview, but who knows. Secondly, it was directly stated that they will NOT support the PhysX hardware, but again, that was at least six months ago (probably longer) and could have changed since it was originally written. As was already stated, PhysX has made some strides since it was first introduced, though real-world applications are still showing less-than-stellar performance gains.


TB
#2
06-18-2008, 04:52 PM
Flyby
Approved Member
Join Date: Oct 2007
Posts: 701

Thanks, TB.
It would be nice to know whether quad cores were coded in or not, especially for those gamers among us looking to build new systems. IIRC, of the two new Nvidia cards, only the über GTX 280 will have the PhysX implementation. Maybe Oleg can address these issues in his next update?
Flyby out
__________________
the warrior creed: crap happens to the other guy!
#3
06-18-2008, 10:10 PM
IceFire
Approved Member
Join Date: Apr 2008
Posts: 1,879

I imagine that once you get over the hurdle of making something dual-core capable, moving things around to four cores isn't that big of a deal, as you already have the framework in place. I'm saying this without any real programming knowledge... just some general reading on the subject.

What they should say is multicore or multithreaded programming. Once you get past doing everything on one core, I would think you could scale upwards... as long as you had work that could be broken out into threads and still remain synchronized. From my limited reading, the basic pattern looks roughly like the sketch below.
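
This is just hypothetical C++ to illustrate the idea (the Aircraft and updateFlightModel names are made up, nothing from the actual SoW code): carve the per-frame work into chunks, hand one chunk to each core, then join the threads so everything is back in sync before the frame is drawn.

[CODE]
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Hypothetical stand-ins, not from any real sim code.
struct Aircraft { double speed = 0.0; };

void updateFlightModel(Aircraft& a, double dt) { a.speed += 9.81 * dt; }

// Split one frame's flight-model updates across all available cores,
// then join so everything is synchronized again before rendering.
void updateFrame(std::vector<Aircraft>& planes, double dt) {
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (planes.size() + cores - 1) / cores;
    std::vector<std::thread> workers;

    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end = std::min(begin + chunk, planes.size());
        if (begin >= end) break;
        workers.emplace_back([&planes, begin, end, dt] {
            for (std::size_t i = begin; i < end; ++i)
                updateFlightModel(planes[i], dt);
        });
    }
    for (auto& w : workers) w.join();  // the "remain synchronized" part
}
[/CODE]

The nice part is the loop doesn't care whether "cores" is 2 or 4, which would be why going from dual to quad isn't a huge extra step once the work is split up like this.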
__________________
Find my missions and much more at Mission4Today.com
#4
06-19-2008, 06:43 AM
Feuerfalke
Approved Member
Join Date: Jan 2008
Location: Germany
Posts: 1,350

Both cards are basically used for eye candy only at the moment. I think it would be pretty hard to make them worth much more. Consider that not everybody will buy this hardware, and if you dedicate parts of the physics to these systems, you have to keep the physics the same for single- and multicore PCs, with and without a physics coprocessor.

As tempting as it may sound, I doubt it will be more than additional eye candy.
#5
06-19-2008, 12:33 PM
Flyby
Approved Member
Join Date: Oct 2007
Posts: 701

I thought Nvidia was implementing PhysX through its drivers somehow. Maybe I misunderstood what I read (somewhere).
__________________
the warrior creed: crap happens to the other guy!
#6
06-19-2008, 12:49 PM
Feuerfalke
Approved Member
Join Date: Jan 2008
Location: Germany
Posts: 1,350

I doubt that - at least not as the PhysX chip. The statements were merely plans for how it could be implemented, but on the technical side, the interface between the card and the motherboard is still a bottleneck, especially at higher resolutions with 16x AF and 16x FSAA.

Considering the bandwidth and functionality of this chip, I doubt a shared interface will be the solution.

On the other hand, parts of the former PhysX chip may end up implemented on the GFX card to support graphics further.

It's basically the same as the step from pure 2D cards that were aided with 3D acceleration by the 3dfx Voodoo chipset. Everyone cried out about how completely useless it was; a short while later, everybody implemented the functionality on their own chips in one way or another.
#7
06-19-2008, 02:44 PM
mondo
Approved Member
Join Date: Oct 2007
Posts: 213

Quote:
Originally Posted by 99th_Flyby
I thought Nvidia was implementing PhysX through its drivers somehow. Maybe I misunderstood what I read (somewhere).
All Nvidia 8xxx cards have the ability to run the PhysX software. The new ATI boards about to come out will also be able to run PhysX, because they went into partnership with Nvidia, as long as they use the CUDA programming model.
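
For anyone curious whether their own card qualifies, a minimal sketch using the standard CUDA runtime API would look something like this (just an illustration - the 8 series and later are the CUDA-capable parts that GPU PhysX needs):

[CODE]
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable GPU: physics would stay on the CPU.\n");
        return 0;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // G80 (GeForce 8800) and later report compute capability >= 1.0.
        std::printf("GPU %d: %s, compute %d.%d, %d multiprocessors\n",
                    i, prop.name, prop.major, prop.minor,
                    prop.multiProcessorCount);
    }
    return 0;
}
[/CODE]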
#8
07-07-2008, 08:24 PM
Zoom2136
Approved Member
Join Date: Oct 2007
Posts: 224

Quote:
Originally Posted by mondo
All Nvidia 8xxx cards have the ability to run the PhysX software. The new ATI boards about to come out will also be able to run PhysX, because they went into partnership with Nvidia, as long as they use the CUDA programming model.

I thought ATI went with Intel and Havok... no?
#9
06-19-2008, 05:57 PM
JG27CaptStubing
Approved Member
Join Date: Dec 2007
Posts: 330

Well, just to comment further about PhysX, if you will...

I would really like to see the damage models improve dramatically - not only taking damage from weapons, but possibly "bending" your plane as well. Being hit while under a strong G-load could cause a catastrophic failure, and if the game models more complex systems like O2 and fuel management, those could fail on you too. Now I know that doesn't really have a lot to do with PhysX, but it would be nice to see this sim take it to the next level.
#10
06-20-2008, 09:19 AM
Codex
Approved Member
Join Date: Nov 2007
Location: Hoppers Crossing, Vic, Australia
Posts: 624

Don't forget each shader is a programmable core. The GTX 280 has 240 of them. That is some serious parallel power, and they can be programmed using C and C++.

I suspect that is why Intel is making a lot of noise lately about how GPUs will never be good for everyday computing. But I reckon they're $hit'n their pants at the prospect that GPGPUs (general-purpose GPUs) are becoming more than just a fleeting hobby now.
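
To give an idea of what "programmed using C and C++" means in practice, here's a minimal CUDA sketch (a made-up saxpy example, not tied to any game): you launch one lightweight thread per array element and let those 240 cores chew through them in parallel.

[CODE]
#include <cuda_runtime.h>

// Each GPU thread handles one element: y[i] = a * x[i] + y[i].
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

// Host side: copy data to the card, launch the kernel, copy back.
void runSaxpy(int n, float a, const float* hx, float* hy) {
    float *dx, *dy;
    cudaMalloc(&dx, n * sizeof(float));
    cudaMalloc(&dy, n * sizeof(float));
    cudaMemcpy(dx, hx, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, n * sizeof(float), cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, a, dx, dy);

    cudaMemcpy(hy, dy, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dx);
    cudaFree(dy);
}
[/CODE]

Independent per-element work like this is exactly the kind of thing physics effects could hand off to the card, which is presumably what has Intel worried.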