Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Cyris, Apr 5, 2014.
Tbh I expected him to be banned by now; the mods must be on holiday.
Being poor or unable to own a decent rig, for whatever reason, doesn't give anyone the right to insult people like that.
That's a really bad combination, especially when someone is being disrespectful, among other things.
It was totally unnecessary. I always enjoy seeing new and contributing members and I welcome them, but imho attitudes like that
don't go down well here, or anywhere for that matter.
Enjoy your stay. I doubt it will last long... :thumbup:
Look, you are the one complaining about performance, so if you had a better card this whole question would be moot.
All I was saying is: don't expect miracles with these new drivers.
Will 10% make you happy? That's the best you can hope for with a single card, I bet, except for a few titles.
We will see tomorrow how it pans out.
How the hell did this thread become a "who has the better GPU" war??? Guys, give FEEDBACK ON THE DRIVER.
So he's got a GT 640, SO WHAT? He is happy with what he has; nothing else matters.
I hope BF4 will get a boost; they didn't mention it, but it could be a ninja buff.
On many internet sites I read that NVIDIA is coming with a new driver that supports tiled resources. Is this the same as the CPU overhead optimizations via DX?
Agree, BF4 multiplayer will be a good test for the lower-CPU-utilization optimizations in the driver.
On this Dutch site they say that only DX11.2 cards will support tiled resources in the driver from tomorrow.
How many NVIDIA cards support DX11.2?
I am confused now. What does this driver from NVIDIA bring us tomorrow? Will it bring tiled resources, or a CPU-overhead-lowering driver, or are those two the same thing?
So all Kepler (GK) + Win 8.1 = tiled resources / DX11.2?
No, DX11.2 works on NVIDIA, and DX11.1 as well; NVIDIA just left out some extensions that do nothing for gaming. They do it at the software level, and since BF4, the only DX11.1 game, seems to run well, it appears to be fine.
I really hope this driver is as awesome as they make it out to be; NVIDIA is the only one who can deliver a high-end driver.
I know the driver tomorrow is gonna be awesome. NVIDIA made a lot of noise about it; they wouldn't release it as just another "driver". I guess we're all along for the ride tomorrow.
Yawn, I just read 4 pages of BS...
Why don't you all state your facts tomorrow afternoon, OK?
Maybe my CPU bottleneck in BF4 will be reduced slightly.
AFAIK they fully support DX11.2 on the software side; it's the hardware side they don't. I have no idea exactly what they don't support, but it does not seem to make much of a difference.
There's so much bullsheit in this thread that it would be better to wipe the whole thread clean.
Apparently some have not heard of Direct3D feature levels...
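For those unfamiliar: a toy sketch (Python, with made-up names — not the real Direct3D API) of the distinction the poster is pointing at, between the runtime/API version a driver exposes, the hardware feature level of the GPU, and optional per-adapter capabilities such as tiled resources:

```python
# Illustrative model only: a GPU can run under a newer runtime (e.g. the
# DX11.2 runtime shipped with Windows 8.1) while its hardware feature level
# stays at 11_0, and optional features such as tiled resources are exposed
# as separate capability flags the driver can set.

FEATURE_LEVEL_ORDER = ["9_1", "9_2", "9_3", "10_0", "10_1", "11_0", "11_1"]

class Adapter:
    def __init__(self, feature_level, optional_caps=()):
        self.feature_level = feature_level
        self.optional_caps = set(optional_caps)

    def meets_level(self, required):
        # Higher feature levels include everything from lower ones.
        return (FEATURE_LEVEL_ORDER.index(self.feature_level)
                >= FEATURE_LEVEL_ORDER.index(required))

    def has_cap(self, cap):
        # Optional caps are queried separately from the feature level.
        return cap in self.optional_caps

# A Kepler-class card: hardware feature level 11_0, but a driver update
# can still flip on an optional cap like tiled resources.
kepler = Adapter("11_0", optional_caps={"tiled_resources"})
print(kepler.meets_level("11_1"))         # False: hardware level is 11_0
print(kepler.has_cap("tiled_resources"))  # True: exposed by the driver
```

This is why "supports DX11.2" is ambiguous: it can mean the runtime, the feature level, or an individual optional capability.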
If anyone is interested, tomorrow we will see three new drivers: GRID 335.35, GeForce 337.50, and Linux 337.12 (dated 7th April).
New in the GRID 335.35 driver:
These drivers have been tested to work with Amazon G2 instances running NVIDIA GRID.
Adds support for GRID SDK 2.3.7. For access to the GRID SDK please visit https://developer.nvidia.com/grid-app-game-streaming.
Support for CUDA 5.5
Support for OpenGL 4.3
Support for the Open Computing Language (OpenCL) 1.1
Support for DirectX 9, 10, and 11
New in the Linux 337.12 driver:
Added support for the following GPUs:
GeForce GTX 850M
GeForce GTX 860M
GeForce GTX 870M
GeForce GTX 880M
GeForce GT 705
GeForce GT 720
Added the ability to over- and under-clock certain GeForce GPUs in the GeForce GTX 400 series and later. For GPUs that allow it, an offset can be applied to clock values in some clock domains of some performance levels. This clock manipulation is done at the user's own risk. See the README documentation of the "CoolBits" X configuration option for more details.
Updated the minimum required version of GTK+ from 2.2 to 2.4 for nvidia-settings.
Renamed the RandR output property _GUID to GUID now that it is an official property documented in randrproto.txt:
Reduced CPU utilization and GPU memory utilization of the NVIDIA EGL driver.
Added support for the following EGL extensions:
Renamed the "Clone" setting of the "MetaModeOrientation" X configuration option to "SamePositionAs", to make clear that this setting applies to the position only, and not to the resolution of modes in the MetaMode.
Added NV-CONTROL attribute NV_CTRL_VIDEO_ENCODER_UTILIZATION to query utilization percentage of the video encoder engine.
Added support for the GLX_NV_delay_before_swap extension. For more details, see the extension specification:
Report correct buffer sizes for RGB GLX visuals, GLXFBConfigs, and EGLConfigs. Previously, RGB10 and RGB8 formats were reported as having 32 bits, and RGB5 formats were reported as having 16 bits. Now they are correctly reported as 30, 24, and 15 bit formats respectively as required by the GLX and EGL specifications.
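The CoolBits over/under-clocking option mentioned in the Linux changelog above is enabled through the X configuration; a minimal sketch, assuming a single-GPU setup (the exact bit values and attribute names are documented in the driver README, so treat these as illustrative):

```
Section "Device"
    Identifier "nvidia-gpu"
    Driver     "nvidia"
    # Bit 3 (value 8) exposes the clock-offset controls; values combine
    # to enable multiple CoolBits features. Overclocking is at your own risk.
    Option     "Coolbits" "8"
EndSection
```

After restarting X, an offset can then be applied per performance level via nvidia-settings, e.g. `nvidia-settings -a "[gpu:0]/GPUGraphicsClockOffset[3]=50"`; the performance-level index and supported range differ per GPU, so check the README before assuming these exact names apply to your card.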
This is how it works.
Within Win 8.1 we have DX11.2; now NVIDIA takes one part out of DX11.2, the tiled resources, and lets all the cards that run on DX11.0 make use of this feature.
Tiled resources is an HSA-like technology, simply put. DirectX 11.2 can now use system memory in place of graphics memory to create more detailed visuals on graphics cards that may not have a lot of memory to begin with. So, for instance, instead of using the page file on your hard drive or SSD, tiled resources create a pageable address space in your system RAM that the graphics card can use to store higher-resolution textures.
If you run over the frame buffer on your 1GB graphics card, for instance, the extra textures can be put into system memory instead. The difference here, in comparison to the HSA technology AMD has been working on, is that this works with Intel integrated and NVIDIA discrete graphics. It is a software solution and requires a driver rewrite to take advantage of the feature. One drawback is that it is not addressable by the CPU, so it's not like AMD's hUMA technology in this regard.
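The spill-over idea in that explanation can be sketched as a toy model (Python; tile counts and names are made up for illustration — the real mechanism is the D3D11.2 tiled-resource API plus driver-managed page tables, and D3D tiles are 64 KiB):

```python
# Toy sketch of the paging idea behind tiled resources: a texture is split
# into fixed-size tiles, tiles are mapped into a small VRAM budget, and
# overflow tiles are backed by system RAM instead of failing the allocation.

TILE_SIZE = 64 * 1024  # D3D tiled resources use 64 KiB tiles

class TiledTexture:
    def __init__(self, num_tiles, vram_budget_tiles):
        self.vram_budget = vram_budget_tiles
        self.residency = {}  # tile index -> "vram" or "sysram"
        for t in range(num_tiles):
            self.map_tile(t)

    def map_tile(self, tile):
        # Map the tile into VRAM while the budget allows, else spill to RAM.
        in_vram = sum(1 for loc in self.residency.values() if loc == "vram")
        self.residency[tile] = "vram" if in_vram < self.vram_budget else "sysram"

tex = TiledTexture(num_tiles=16, vram_budget_tiles=10)
vram = sum(1 for v in tex.residency.values() if v == "vram")
print(vram, 16 - vram)  # 10 tiles resident in VRAM, 6 spilled to system RAM
```

The trade-off the later posts raise is visible here too: the spilled tiles still work, but every access to them goes over the PCIe bus to much slower system RAM.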
How much RAM will be used?
Lol, I know... but it's already tomorrow in Middle-earth (New Zealand): Monday morning here and no magical driver...
Now it's the 7th in Sweden!
Windows already allocates a portion of your RAM for your video card.
What you describe, Toncey, is already what happens: when resolutions, AA, or whatever push your VRAM usage to its limit, you start to slow down because the card has to fall back to system RAM.
Which also answers your second question: VRAM is much faster, and it becomes a problem when you start using system RAM in its place.
What BadDriver explains is exclusively for textures.