InsaneDruid
05-12-2011 09:57 AM
Quote:
Originally Posted by Buchon
(Post 281497)
With 1920x1200 you fill 1GB of VRAM easily (game maxed except SSAO).
1920x1200 = 2.304.000 pixels in 1 GB VRAM
Which means that the appropriate pixel count for 2 GB is roughly:
2.304.000 x 2 = 4.608.000 pixels more or less.
Your system exceeds this:
5760x1080 = 6.220.800 pixels
Try lowering the resolution to 3760x1080; it should be within that margin to play the game maxed (4.060.800 pixels) and you will not notice the difference.
Edit :
And up a video, that Eyefinity setup is really cool :)
Your calculations are a bit off, because VRAM usage is not made up of the front and back buffers holding the image alone (which scale with resolution), but also of textures, geometry data, etc. Going from 1920x1200 to 5760x1080
doesn't take that much more framebuffer memory:
1920 (width) * 1200 (height) * 24 (bit depth) / 8 (bits -> bytes) / 1024 (bytes -> KB) / 1024 (KB -> MB) * 3 (one front buffer, two back buffers) = about 20 MB of framebuffer usage (about 13.5 MB with double buffering instead of triple buffering).
5760 (width) * 1080 (height) * 24 (bit depth) / 8 (bits -> bytes) / 1024 (bytes -> KB) / 1024 (KB -> MB) * 3 (one front buffer, two back buffers) = about 54 MB of framebuffer usage (about 36 MB with double buffering instead of triple buffering).
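If you want to play with the numbers yourself, here is a small Python sketch of the same arithmetic. The 24-bit depth and buffer counts are just the assumptions used above; real drivers often pad to 32 bits per pixel, so treat the results as lower bounds.

# Rough framebuffer memory estimate (illustrative only).
# Assumes 24 bits per pixel and no driver padding.
def framebuffer_mb(width, height, bits_per_pixel=24, buffers=3):
    bytes_total = width * height * (bits_per_pixel / 8) * buffers
    return bytes_total / (1024 * 1024)

print(framebuffer_mb(1920, 1200))             # ~19.8 MB, triple buffered
print(framebuffer_mb(1920, 1200, buffers=2))  # ~13.2 MB, double buffered
print(framebuffer_mb(5760, 1080))             # ~53.4 MB, triple buffered
print(framebuffer_mb(5760, 1080, buffers=2))  # ~35.6 MB, double buffered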
So it's about 34 MB more memory used for the framebuffers with triple buffering (roughly 22 MB more with double buffering). Texture data doesn't change during this process (unless 1C scales the texture quality setting with resolution changes).
Changing the texture quality settings ingame has a LOT more influence on VRAM usage than the difference in resolution. But of course higher resolution means a lot more pixels to calculate, so the processing power of the GPU could be the limiting factor here.
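To put the texture side in perspective, here is a back-of-the-envelope sketch. The texture count, sizes and compression ratio are invented for illustration (not taken from the game), but they show why a texture-quality step dwarfs the ~20-54 MB framebuffer difference: doubling texture dimensions roughly quadruples the memory.

# Illustrative texture-set estimate with made-up numbers.
def texture_set_mb(num_textures, size, bytes_per_texel=4, mipmaps=True):
    texels = size * size
    if mipmaps:
        texels = texels * 4 // 3   # full mip chain adds roughly one third
    return num_textures * texels * bytes_per_texel / (1024 * 1024)

print(texture_set_mb(500, 1024))      # ~2667 MB uncompressed
print(texture_set_mb(500, 1024) / 4)  # ~667 MB with 4:1 DXT-style compression
print(texture_set_mb(500, 2048) / 4)  # ~2667 MB: one quality step up quadruples it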